Big Chemical Encyclopedia

Data preconditioning

After process analysis, data preconditioning is the first important step in the modeling process. If faulty data is used, or if the data is insufficiently rich in information, a poor model will result. An important tool here is principal component analysis, which gives a good indication of whether any points in the data set are abnormal. Bad data points should be removed from the data set, but one should be careful when a dynamic model has to be developed: if data is removed from a dynamic data set, the continuity of the data is lost, and this has consequences for the model. The best approach is then to re-initialize the model at the start of each new section of the data set. In the case of static data, points can be arbitrarily removed without consequences for the model to be developed. [Pg.275]
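The abnormal-point screening described above can be sketched with a small NumPy-only routine. This is a minimal illustration, not from the source: the function name, the number of retained components, and the 4-sigma score threshold are all assumptions.

```python
import numpy as np

def pca_outlier_flags(X, n_components=2, n_sigma=4.0):
    """Flag rows of X whose scores on the leading principal
    components lie far from the bulk of the data (assumed threshold)."""
    Xc = X - X.mean(axis=0)                        # mean-center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T              # project onto leading PCs
    # distance in score space, scaled by the per-PC spread
    d = np.sqrt(((scores / scores.std(axis=0)) ** 2).sum(axis=1))
    return d > n_sigma                             # True = candidate bad point

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                      # synthetic static data set
X[10] += 25.0                                      # one grossly abnormal sample
flags = pca_outlier_flags(X)                       # flags[10] is True
```

For a static data set the flagged rows can simply be dropped; for a dynamic data set, as noted above, each removal breaks continuity and the model must be re-initialized at the start of each remaining section.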

It often happens that collected data is bad; for example, an analyzer reading may be missing because the gas chromatograph was out of service for some period of time. This situation is similar to the one where bad data points have to be removed. Rather than removing bad data points, one could also fill the bad section with good data points, which is a difficult issue. One option is linear interpolation between the last good process value before the gap and the first good value after it. Another is to look for a larger data section that resembles the section containing the bad points and copy that similar section over the bad one; this is probably a better method than linear interpolation. [Pg.275]
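The simpler of the two options, linear interpolation across a bad section, can be sketched as follows (the function name and the -999 offline marker are assumptions for illustration):

```python
import numpy as np

def fill_bad_sections(y, bad):
    """Replace flagged samples by linear interpolation between the
    last good value before and the first good value after each gap."""
    y = np.asarray(y, dtype=float).copy()
    idx = np.arange(y.size)
    y[bad] = np.interp(idx[bad], idx[~bad], y[~bad])
    return y

y = np.array([1.0, 2.0, -999.0, -999.0, 5.0, 6.0])  # -999 = analyzer offline
bad = (y == -999.0)
filled = fill_bad_sections(y, bad)                  # [1. 2. 3. 4. 5. 6.]
```

Copying a similar-looking good section over the gap, as suggested above, would preserve realistic process dynamics inside the gap, which a straight line cannot.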

In the case of dynamic modeling, the data collection frequency is also an important issue. What will be the execution frequency of the model? If this frequency is, for example, once per minute, it would make sense to collect data at a higher frequency, for example every 10 to 20 seconds. If the data collection frequency is high and one wants to use every data point while prediction of the process measurement is only required at a much lower frequency, then the process data will be highly correlated, i.e. the current process reading will depend heavily on the previous one. This is often not a desirable situation for good prediction and/or control. [Pg.275]
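The correlation effect described above can be illustrated numerically. In this sketch (the AR(1) process and its 0.95 coefficient are assumptions standing in for slow process dynamics), consecutive 10-second readings are strongly correlated, while readings taken every sixth point, i.e. at the 1-minute model execution rate, are noticeably less so:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 720                                    # two hours of data at 10 s intervals
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = 0.95 * x[i - 1] + rng.normal()  # assumed slow process dynamics

def lag1_autocorr(v):
    """Correlation between each reading and the previous one."""
    v = v - v.mean()
    return float(v[:-1] @ v[1:] / (v @ v))

r_10s = lag1_autocorr(x)        # successive 10 s readings: close to 0.95
r_60s = lag1_autocorr(x[::6])   # readings at the 1-minute model rate: lower
```

Subsampling the collected data down to the model's execution rate is thus one way to reduce the redundancy between successive training points.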


Rules may represent either guidelines based on experience, or compact descriptions of events, processes, and behaviors with the details and assumptions omitted. In either case, there is a degree of uncertainty associated with the application of the rule to a given situation. Rule-based systems allow for explicit ways of representing and dealing with uncertainty. This includes the representation of the uncertainty of individual rules, as well as the computation of the uncertainty of a final conclusion based on the uncertainty of individual rules and uncertainty in the data. There are numerous approaches to uncertainty within the rule-based paradigm (2,35,36). One of these is based on what are called certainty factors. In this approach, a certainty factor (CF) can be associated with variable-value pairs and with individual rules. The certainty of conclusions is then computed from the CF of the preconditions and the CF of the rule. Consider the following example. [Pg.533]
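A common propagation scheme for certainty factors (the MYCIN-style combination; the rule contents and CF values below are invented for illustration and are not from the source) computes a conclusion's CF as the rule's CF scaled by the weakest precondition, and merges independent supporting rules as shown:

```python
def rule_cf(cf_rule, premise_cfs):
    """A rule fires with the minimum premise CF (clipped at 0),
    scaled by the rule's own certainty factor."""
    return cf_rule * max(0.0, min(premise_cfs))

def combine_cf(cf1, cf2):
    """Combine two positive certainty factors supporting the
    same conclusion: evidence accumulates but never exceeds 1."""
    return cf1 + cf2 * (1.0 - cf1)

# Hypothetical example: two rules both support the same conclusion.
cf_a = rule_cf(0.8, [0.9, 0.7])   # 0.8 * min(0.9, 0.7) = 0.56
cf_b = rule_cf(0.5, [1.0])        # 0.5
total = combine_cf(cf_a, cf_b)    # 0.56 + 0.5 * 0.44 = 0.78
```

The combination function is commutative and bounded, so the order in which supporting rules fire does not change the final certainty.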

McGowen, J.M. and McDaniel, B.W., "The Effects of Fluid Preconditioning and Test Cell Design on the Measurement of Dynamic Fluid Loss Data," SPE Paper 18212, presented at the 1988 SPE Annual Technical Conference and Exhibition, Houston, October 2-5. [Pg.659]

Taking into account the positive influence of ZnCFO on the formation of the complex of physical-mechanical characteristics of vulcanizates based on various rubbers, and the theoretical preconditions about the interrelation "recipe - structure - property", it was of interest to define the morphological features of the elastomeric compositions. For this purpose, the percolation method of analysis of the rheometer data of the rubber mixes was used [10]... [Pg.200]

A precondition for an appropriate decision in the planning of a preparative electroorganic synthesis is sufficient information about the electrochemical reaction. As far as possible, knowledge about the influence of parameters such as temperature, solvent, pH value, and stirring rate should be included. Electroanalytical standard methods to acquire such data have been discussed in Chapter 1: cyclic voltammetry as an especially valuable tool, and its combination with the rotating disk electrode method for additional knowledge. At... [Pg.29]

There are two necessary and related preconditions which must be satisfied for complex mixture analysis by pattern recognition to be successful. First, we must obtain an adequate database of FTIR spectra from which we can derive the spectral patterns we need to recognize. Second, we must demonstrate that there is a suitable measure, or metric, of similarity between the spectra. It is these two conditions which were evaluated by the work presented here. Pattern recognition techniques were most suitable for the evaluation. [Pg.161]
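One widely used similarity metric for spectra is the cosine of the angle between them, which is insensitive to overall intensity scaling (path length, concentration). This sketch is an illustration of such a metric in general, not the specific measure used in the work cited above; the spectra are invented:

```python
import numpy as np

def cosine_similarity(a, b):
    """Spectral match factor: 1.0 for identical band shapes,
    near 0 for unrelated spectra."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

ref   = np.array([0.1, 0.8, 0.3, 0.05])  # hypothetical reference spectrum
same  = 2.5 * ref                        # same shape, different intensity
other = np.array([0.7, 0.1, 0.1, 0.9])   # unrelated band pattern

s_same  = cosine_similarity(ref, same)   # 1.0: identical shape
s_other = cosine_similarity(ref, other)  # much lower
```

A metric with this property lets a pattern recognizer match a mixture component against library spectra recorded at different concentrations.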

The same type of procedure was applied to the Rh4(CO)12 / HMn(CO)5 initiated homogeneously catalyzed hydroformylation of 3,3-dimethyl-1-butene at 298 K [94]. A set of e = 21 experiments with k > 26 spectra and v = 4751 MIR spectral channels was used. The spectra from the 21 hydroformylation experiments were preconditioned and assembled into one matrix A(713×2951). This data matrix was then... [Pg.182]
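The assemble-and-decompose step can be sketched on a small synthetic stand-in for such a spectral matrix. This is not the authors' code: the dimensions, component count, and noise level are invented. Mixture spectra obeying Beer's law span a low-dimensional subspace, so the singular value spectrum drops sharply after the chemical rank:

```python
import numpy as np

rng = np.random.default_rng(2)
pure = rng.random((3, 200))          # 3 hypothetical pure-component spectra
conc = rng.random((60, 3))           # concentration profiles over 60 spectra
# Beer-Lambert mixtures plus a small amount of measurement noise
A = conc @ pure + rng.normal(scale=1e-4, size=(60, 200))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
gap = s[2] / s[3]                    # large gap: only 3 significant vectors
```

The same diagnostic applied to the real 713 x 2951 matrix is what identifies the handful of significant right singular vectors discussed in Figure 4.11.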

Figure 4.11 A singular value decomposition of the preconditioned in situ spectroscopic data showing the 1st, 4th, 5th, and 7th significant vectors and the 713th vector. The marked extrema are those which were used to recover the organometallic pure component spectra, as well as the alkene and aldehyde, by BTEM. Atmospheric moisture and CO2, hexane, and dissolved CO were removed from the experimental data during preconditioning. (C. Li, E. Widjaja, M. Garland, J. Am. Chem. Soc., 2003, 125, 5540-5548.)
Semi-batch, Non-preconditioned, Monometallic Catalytic Data... [Pg.184]

The description of electron motion and electronic states that is at the heart of all of chemistry is included in wave function theory, which is also referred to as self-consistent-field (SCF) or, honouring its originators, Hartree-Fock (HF) theory [7]. In principle, this theory also includes density functional theory (DFT) approaches if one uses densities derived from SCF densities, which is common but not a precondition [2]; therefore, we treat density functional theory in a separate section. Many approaches based on wave function theory date back to a time when desktop supercomputers were not available and scientists had to reduce the computational effort by approximating the underlying equations with data from experiment. This approach and its application to the elucidation of reaction mechanisms are outlined in Section 7.2.3. [Pg.173]

The PMN review process has evolved over time within the constraints set by TSCA. An important constraint is that submitters are required to furnish only test data already in their possession (if any) and are not required to conduct a battery of tests as a precondition for approval. This generalization holds true for basic chemical property data as well as toxicity data, and it is the main reason why TSCA has been such a powerful impetus for developing estimation methods for many of the parameters needed in environmental assessment. To illustrate how extreme the situation is, in one study of more than 8,000 PMNs for class 1 chemical substances (i.e., those for which a specific chemical structure can be drawn) that were received from 1979 through 1990, Lynch et al. (1991) found only 300 that contained any of the property data noted earlier as needed for environmental assessment. The U.S. is unique among industrialized nations in requiring its assessors to work in the virtual absence of test data. [Pg.6]

