
Data interpretation performance

Evidence of the application of computers and expert systems to instrumental data interpretation is found in the new discipline of chemometrics (qv), where the relationship between data and the information sought is explored as a problem of mathematics and statistics (7—10). One of the most useful insights provided by chemometrics is the realization that a cluster of measurements of quantities only remotely related to the actual information sought can be used in combination to determine the information desired by inference. Thus, for example, a combination of viscosity, boiling point, and specific gravity data can be used to characterize the chemical composition of a mixture of solvents (11). The complexity of such a procedure is accommodated by performing a multivariate data analysis. [Pg.394]
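
As a purely illustrative sketch (not from the cited source), the inference step can be pictured as a multivariate least-squares fit relating indirect bulk properties to the composition of interest; all property values and mixture fractions below are invented:

```python
# Minimal sketch (hypothetical data): inferring the fraction of one solvent in a
# binary mixture from three indirect measurements (viscosity, boiling point,
# specific gravity) via ordinary multivariate least squares.
import numpy as np

# Each row: [viscosity (mPa*s), boiling point (deg C), specific gravity]; values are invented.
X = np.array([
    [0.55, 78.0, 0.79],
    [0.70, 82.0, 0.84],
    [0.85, 87.0, 0.89],
    [1.00, 92.0, 0.94],
    [1.20, 97.0, 0.99],
])
y = np.array([0.0, 0.25, 0.50, 0.75, 1.0])   # known mass fraction of solvent B

# Augment with an intercept column and solve the least-squares problem.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict the composition of an unknown mixture from its measured bulk properties.
unknown = np.array([1.0, 0.78, 85.0, 0.87])
print("estimated fraction of B:", unknown @ coef)
```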

Measurement Selection The identification of which measurements to make is an often overlooked aspect of plant-performance analysis. The end use of the data interpretation must be understood (i.e., the purpose for which the data, the parameters, or the resultant model will be used). For example, building a mathematical model of the process to explore other regions of operation is one end use. Another is to use the data to troubleshoot an operating problem. The level of data accuracy, the amount of data, and the sophistication of the interpretation depend upon the accuracy with which the result of the analysis needs to be known. Daily measurements to a great extent, and special plant measurements to a lesser extent, are rarely planned with the end use in mind. The result is typically too little data of too low accuracy, or an inordinate amount with the resultant misuse of resources. [Pg.2560]

Used either as prelaboratory preparation for related laboratory activities or to expose students to additional laboratory activities not available in their program, these modules motivate students to learn by proposing real-life problems in a virtual environment. Students make decisions on experimental design, observe reactions, record data, interpret these data, perform calculations, and draw conclusions from their results. Following a summary of the module, students test their understanding by applying what they have learned to new situations or by analyzing the effect of experimental errors. [Pg.22]

The main characteristics of FAB-MS are indicated in Table 6.15. FAB ionisation is relatively simple to perform. However, parameter optimisation and data interpretation of the resulting FAB spectra can be complex. Matrix selection for additive analysis is crucial. Solubility of the additives in the matrix is essential for production of viable spectra. FAB/FIB is well suited to organic compounds which exhibit some polarity and contain acidic and/or basic functional groups. Compounds with basic groups run well in positive ionisation mode, and those with acidic centres run well in the negative ionisation... [Pg.369]

This chapter provides a complementary perspective to that provided by Kramer and Mah (1994). Whereas they emphasize the statistical aspects of the three primary process monitoring tasks (data rectification, fault detection, and fault diagnosis), we focus on the theory, development, and performance of approaches that combine data analysis and data interpretation into an automated mechanism via feature extraction and label assignment. [Pg.10]

By definition, the exemplar patterns used by these algorithms must be representative of the various pattern classes. Performance is tied directly to the choice and distribution of these exemplar patterns. In light of the high dimensionality of the process data interpretation problem, these approaches leave in question how reasonable it is to accurately partition a space such as R^6+ (six-dimensional representation space) using a finite set of pattern exemplars. This degradation of interpretation performance as the number of possible labels (classes) increases is an issue of output dimensionality. [Pg.51]
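
For illustration only (hypothetical data, not taken from the source), a nearest-neighbour classifier shows how a finite set of labelled exemplar patterns partitions a multidimensional representation space, and why the quality of that partition hinges entirely on the choice and distribution of the exemplars:

```python
# Minimal sketch (hypothetical data): an exemplar-based classifier
# (1-nearest neighbour) partitions the representation space purely through
# the choice and distribution of its exemplar patterns.
import numpy as np

rng = np.random.default_rng(0)
dim = 6                                   # e.g. a six-dimensional feature space
exemplars = rng.normal(size=(20, dim))    # finite set of exemplar patterns
labels = rng.integers(0, 3, size=20)      # each exemplar carries a class label

def classify(x, exemplars, labels):
    """Assign the label of the nearest exemplar (Euclidean distance)."""
    d = np.linalg.norm(exemplars - x, axis=1)
    return labels[np.argmin(d)]

print(classify(rng.normal(size=dim), exemplars, labels))
```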

PDF approaches represent a statistically formal way of accomplishing local kernel definition. Although the intent and overall results are analogous to defining kernels of PCA features, considerable work is currently required for PDF approaches to be viable in practice. It is presently unrealistic to expect them to adequately recreate the underlying densities. Nevertheless, there are advantages to performing data interpretation based on direct PDF estimation and, as a result, work continues. [Pg.56]
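
As an illustrative sketch only (invented data, not the source's method), direct PDF estimation can be pictured as a kernel density estimate built per class, with the label assigned to the class whose estimated density is highest at the query point:

```python
# Minimal sketch (hypothetical data): direct PDF estimation with a Gaussian
# kernel density estimate, followed by label assignment to the class whose
# estimated density is highest at the query point.
import numpy as np

def kde(x, samples, h=0.3):
    """One-dimensional Gaussian kernel density estimate at x with bandwidth h."""
    u = (x - samples) / h
    return np.mean(np.exp(-0.5 * u**2)) / (h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(1)
class_samples = {
    "normal": rng.normal(0.0, 1.0, size=200),   # training patterns for class "normal"
    "fault":  rng.normal(3.0, 0.5, size=200),   # training patterns for class "fault"
}

x_query = 2.4
densities = {c: kde(x_query, s) for c, s in class_samples.items()}
print(max(densities, key=densities.get))        # assigned label
```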

The knowledge required to implement Bayes' formula is daunting in that a priori as well as class-conditional probabilities must be known. Some reduction in requirements can be accomplished by using joint probability distributions in place of the a priori and class-conditional probabilities. Even with this simplification, few interpretation problems are so well posed that the information needed is available. It is possible to employ the Bayesian approach by estimating the unknown probabilities and probability density functions from exemplar patterns that are believed to be representative of the problem under investigation. This approach, however, implies supervised learning, where the correct class label for each exemplar is known. The ability to perform data interpretation is determined by the quality of the estimates of the underlying probability distributions. [Pg.57]
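
Purely as an illustration (invented data; a Gaussian form is assumed for the class-conditional densities), the estimation route described above can be sketched as follows: priors and class-conditional parameters are estimated from labelled exemplars, then combined through Bayes' rule to give posterior class probabilities for a new measurement:

```python
# Minimal sketch (hypothetical data): Bayes' rule with priors and Gaussian
# class-conditional densities estimated from labelled exemplar patterns
# (i.e., supervised learning).
import numpy as np

rng = np.random.default_rng(2)
exemplars = {                                   # labelled exemplar measurements
    "normal": rng.normal(0.0, 1.0, size=300),
    "fault":  rng.normal(2.5, 0.8, size=100),
}

# Estimate priors, means, and standard deviations from the exemplars.
n_total = sum(len(v) for v in exemplars.values())
params = {c: (len(v) / n_total, v.mean(), v.std(ddof=1)) for c, v in exemplars.items()}

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def posterior(x):
    """Posterior class probabilities for a new measurement x."""
    joint = {c: p * gaussian(x, mu, s) for c, (p, mu, s) in params.items()}
    z = sum(joint.values())
    return {c: v / z for c, v in joint.items()}

print(posterior(1.8))
```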

A fixed target value for σ is preferable and can be arrived at in several ways. It could be fixed arbitrarily, with a value based on a perception of how laboratories should perform. It could be an estimate of the precision required for a specific task of data interpretation. σ could be derived from a model of precision, such as the Horwitz curve.15 However, while this model provides a general picture of reproducibility, substantial deviation from it may be experienced for particular methods. [Pg.94]
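
One widely quoted form of the Horwitz curve predicts the between-laboratory relative standard deviation from the analyte concentration alone, RSD(%) = 2^(1 − 0.5 log10 C), with C the mass fraction; a minimal sketch (the example concentrations are arbitrary):

```python
# Minimal sketch: the Horwitz curve as one way to set a target standard
# deviation.  Predicted reproducibility RSD (%) = 2**(1 - 0.5*log10(C)), with
# C the analyte mass fraction (dimensionless).
import math

def horwitz_rsd_percent(c):
    """Predicted between-laboratory RSD (%) at mass fraction c."""
    return 2 ** (1 - 0.5 * math.log10(c))

for c in (1e-2, 1e-4, 1e-6):          # 1 %, 100 ppm, 1 ppm
    print(f"C = {c:g}:  RSD = {horwitz_rsd_percent(c):.1f} %")
# C = 1e-6 (1 ppm) gives 16 %, the commonly quoted value.
```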

The frontier orbital approach (Fukui et al., 1962, 1954b) has met with considerable success in so far as frontier orbital charges correlate well with experimental data. The performance of these indices is often superior to that of others, with the possible exception of localization energies. It is, however, difficult to give meaning to the correlation, since physical interpretations of the role of the frontier electrons in reaction mechanisms are often obscure, and attempts to give substance to Fukui's hypothesis have frequently embodied questionable procedures or models. [Pg.112]

There are several applications to scientific and engineering instrumentation which are especially relevant to chemistry and chemical engineering. These include building expertise in instrument control and data interpretation into instruments, in an attempt to minimize the amount of staff time required to perform routine analyses and to optimize the performance of a system. There are several efforts underway in process control, currently focused in the electrical power and chemical industries. [Pg.7]

Phage antibodies are directly used to label tissue sections. Interpretation of the data is performed by a team of experts, including histochemists, and must be subjected to cross-examination. Images are captured using microscopes with a digital camera attachment and archived in databases for further analysis by pathologists. [Pg.119]

When considering the bending of a beam and attempting to extract a modulus value, one must make several assumptions, the most important being that the modulus in tension is the same as in compression, and is independent of strain (at least for the range of strain involved). The simple Bernoulli-Euler theory is usually used to interpret the data. When performing resonance tests it is particularly useful to find a set of resonances and compare the measured frequency ratios with the theoretical ones given in the previous chapter.
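
As an illustration only (the beam dimensions, density, and resonance frequency below are invented, and a clamped-free geometry is assumed), simple Bernoulli-Euler theory gives both the theoretical frequency ratios to check against measurement and a route from a measured resonance to a flexural modulus:

```python
# Minimal sketch (illustrative numbers): Bernoulli-Euler theory for a uniform
# clamped-free (cantilever) beam.  Natural frequencies are
#   f_n = (lam_n**2 / (2*pi*L**2)) * sqrt(E*I / (rho*A)),
# so measured frequency ratios can be compared with (lam_n/lam_1)**2, and E can
# be extracted from any one mode.
import math

lam = [1.8751, 4.6941, 7.8548]         # roots of the clamped-free frequency equation

# Hypothetical beam: 50 x 10 x 2 mm polymer strip, density 1200 kg/m^3.
L, b, h, rho = 0.050, 0.010, 0.002, 1200.0
A = b * h                              # cross-sectional area, m^2
I = b * h**3 / 12                      # second moment of area, m^4

print("theoretical f_n/f_1:", [round((x / lam[0])**2, 3) for x in lam])

f1_measured = 200.0                    # hypothetical fundamental resonance, Hz
E = (2 * math.pi * f1_measured * L**2 / lam[0]**2) ** 2 * rho * A / I
print(f"flexural modulus E = {E/1e9:.2f} GPa")
```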

When the experimentalist sets the ambitious objective of evaluating micromechanical properties quantitatively, he will predictably encounter a few fundamental problems. First, the continuum description which is usually used in contact mechanics might not be applicable for contact areas as small as 1-10 nm [116,117]. Second, since most polymers demonstrate a combination of elastic and viscous behaviour, an appropriate model is required to derive the contact area and the stress field upon indentation of a viscoelastic and adhesive sample [116,120]. In this case, the duration of the contact and the scanning rate are not unimportant parameters. Moreover, bending of the cantilever results in a complicated motion of the tip, including compression, shear and friction effects [131,132]. Third, plastic or inelastic deformation has to be taken into account in data interpretation. Concerning experimental conditions, the most important requirement is to perform a set of calibration procedures, which includes the (x,y,z) calibration of the piezoelectric transducers, the determination of the spring constants of the cantilever, and the evaluation of the tip shape. The experimentalist also has to eliminate surface contamination and be certain about the chemical composition of the tip and the sample. [Pg.128]
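
Purely as an illustration of the kind of continuum contact model referred to above, a Hertz analysis for a spherical tip can be sketched as follows; the spring constant, tip radius, deflection, and indentation values are invented, and viscoelastic, adhesive, and plastic effects are deliberately ignored:

```python
# Minimal sketch (hypothetical numbers): extracting a reduced modulus from a
# single AFM indentation point with the Hertz model for a spherical tip,
#   F = (4/3) * E_reduced * sqrt(R) * delta**1.5,
# after converting cantilever deflection to force with the calibrated spring
# constant.  Viscoelasticity, adhesion, and plasticity are neglected.
import math

k = 0.5                 # calibrated cantilever spring constant, N/m (assumed)
R = 20e-9               # calibrated tip radius, m (assumed)

deflection = 15e-9      # measured cantilever deflection, m (assumed)
indentation = 40e-9     # piezo displacement minus deflection, m (assumed)

F = k * deflection                                   # applied force, N
E_reduced = 3 * F / (4 * math.sqrt(R) * indentation ** 1.5)
print(f"reduced modulus = {E_reduced/1e6:.0f} MPa")
```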

