Big Chemical Encyclopedia

Data analysis measurement uncertainty

Uncertainty expresses the range of possible values that a measurement or result might reasonably be expected to have. Note that this definition of uncertainty is not the same as that for precision. The precision of an analysis, whether reported as a range or a standard deviation, is calculated from experimental data and provides an estimation of indeterminate error affecting measurements. Uncertainty accounts for all errors, both determinate and indeterminate, that might affect our result. Although we always try to correct determinate errors, the correction itself is subject to random effects or indeterminate errors. [Pg.64]
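The distinction can be made concrete with a small sketch: precision captures only the random scatter of replicates, while the uncertainty budget also carries the standard uncertainty of the determinate-error correction itself. The numbers below are invented for illustration.

```python
import math

# Hypothetical example: a result corrected for a determinate (systematic)
# blank error. All values are illustrative, not from the text.
s_random = 0.12        # standard deviation from replicate measurements (mL)
u_correction = 0.05    # standard uncertainty of the blank correction (mL)

# Precision alone reflects only the indeterminate (random) component;
# uncertainty combines both contributions, here by root-sum-of-squares.
u_combined = math.sqrt(s_random**2 + u_correction**2)

print(f"precision (random only): {s_random:.3f} mL")
print(f"combined uncertainty:    {u_combined:.3f} mL")
```

The combined value is necessarily at least as large as the precision alone, mirroring the point that uncertainty accounts for both error types.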

Because the technical barriers previously outlined increase uncertainty in the data, plant-performance analysts must approach the data analysis with an unprejudiced eye. Significant technical judgment is required to evaluate each measurement and its uncertainty with respect to the intended purpose, the model development, and the conclusions. If there is any bias on the analyst's part, it is likely that this bias will be built into the subsequent model and parameter estimates. Since engineers rely upon the model to extrapolate from current operation, the bias can be amplified and lead to decisions that are inaccurate, unwarranted, and potentially dangerous. [Pg.2550]

Several significant challenges exist in applying data analysis and interpretation techniques to industrial situations. These challenges include (1) the scale (amount of input data) and scope (number of interpretations) of the problem, (2) the scarcity of abnormal situation exemplars, (3) uncertainty in process measurements, (4) uncertainty in process discriminants, and (5) the dynamic nature of process conditions. [Pg.7]

Uncertainty in Process Discriminants. Because processes operate over a continuum, data analysis generally produces distinguishing features that exist over a continuum. This is further compounded by noise and errors in the sensor measurements. Therefore, the discriminants developed to distinguish various process labels may overlap, resulting in uncertainty between data classes. As a result, it is impossible to define completely distinguishing criteria for the patterns. Thus, uncertainty must be addressed inherently. [Pg.8]
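The overlap between data classes can be illustrated with a minimal sketch, assuming two process states whose one-dimensional discriminant feature follows overlapping Gaussian distributions with equal priors (all values invented):

```python
import math

def gauss(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical: two process states ("normal", "fault") whose discriminant
# feature overlaps because of sensor noise. Equal priors assumed.
mu_normal, mu_fault, sigma = 0.0, 1.0, 0.5

def p_fault(x):
    """Posterior probability of the fault class given feature value x."""
    pn = gauss(x, mu_normal, sigma)
    pf = gauss(x, mu_fault, sigma)
    return pf / (pn + pf)

# Near the midpoint the classes cannot be separated crisply:
print(p_fault(0.5))   # maximum ambiguity between the two labels
print(p_fault(1.5))   # far from the overlap region, the label is clearer
```

No threshold on the feature can remove the ambiguity near the midpoint, which is why the text says uncertainty must be addressed inherently rather than eliminated.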

EURACHEM (1995) Quantifying uncertainty in analytical measurement. EURACHEM, Teddington
EURACHEM (1998) The fitness for purpose of analytical methods. EURACHEM, Teddington
Frank IE, Todeschini R (1994) The data analysis handbook. Elsevier, Amsterdam [Pg.330]

Chromatographic procedures applied to the identification of proteinaceous paint binders tend to be rather detailed, consisting of multiple analytical steps: solvent extraction, chromatographic clean-up, hydrolysis, derivatisation reactions, measurement and data analysis. Knowledge of the error introduced at each step is necessary to minimise the cumulative uncertainty. Reliable results are consequently obtained only when laboratory and field blanks are carefully characterised. Additionally, owing to the small amounts of analyte and the high sensitivity of the analysis, the instrument itself must be routinely calibrated with amino acid standards, along with measurements of certified reference proteins. All of these factors must be taken into account because often there is only one chance to take the measurement. [Pg.247]
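Assuming the steps contribute independent relative standard uncertainties (the component values below are purely illustrative, not from the text), the cumulative uncertainty of such a multi-step procedure can be sketched as a quadrature sum:

```python
import math

# Hypothetical relative standard uncertainties (as fractions) for each
# stage of a multi-step binder analysis; values are illustrative only.
steps = {
    "extraction":     0.04,
    "clean-up":       0.02,
    "hydrolysis":     0.03,
    "derivatisation": 0.025,
    "measurement":    0.015,
}

# Assuming independent steps, relative uncertainties combine in quadrature.
u_rel = math.sqrt(sum(u**2 for u in steps.values()))
print(f"combined relative uncertainty: {u_rel:.1%}")
```

The quadrature sum makes the practical point of the excerpt visible: a single poorly characterised step (here, extraction) dominates the budget, so that is where blank characterisation pays off most.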

Risk characterization provides a basis for discussions of risk management between risk assessors and risk managers (US EPA 1998). These discussions are held to ensure that the results of risk analysis are presented completely and clearly for decision makers, thus allowing any necessary mitigation measures to be taken (e.g., monitoring, collecting additional data to reduce uncertainty, etc.). [Pg.12]

When Dunham's formalism is applied to the analysis of molecular spectra, as for GaH and H2, these radial coefficients of seven types represent many Dunham coefficients Y_kl and their auxiliary coefficients Z_kl of various types that collectively allow the wave numbers of observed transitions to be reproduced almost within their uncertainty of measurement through formula 54. Mostly because of inconsistency between reported values of frequencies of pure rotational transitions [118,119], the reduced standard deviation of the fit reported in table 3 is 1.25, slightly greater than the value of unity that would apply for consistent data in which the uncertainty of each measurement had been carefully assigned. [Pg.292]
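A reduced standard deviation such as the 1.25 quoted above can be sketched as the square root of the uncertainty-weighted residual sum of squares divided by the degrees of freedom; the residuals and assigned uncertainties below are invented for illustration:

```python
import math

# Sketch of the dimensionless "reduced standard deviation" of a
# least-squares fit: residuals weighted by each measurement's assigned
# uncertainty. All numbers below are invented.
residuals = [0.010, -0.015, 0.008, -0.012, 0.020, -0.006]   # obs - calc
uncerts   = [0.010,  0.010, 0.010,  0.010, 0.012,  0.010]   # assigned u_i
n_params  = 2                                               # fitted parameters

chi2 = sum((r / u)**2 for r, u in zip(residuals, uncerts))
sigma_red = math.sqrt(chi2 / (len(residuals) - n_params))
print(f"reduced standard deviation: {sigma_red:.2f}")
```

A value near 1 indicates that the assigned measurement uncertainties are consistent with the scatter of the residuals; a value above 1, like the 1.25 in the text, signals inconsistent data or underestimated uncertainties.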

Although the measurement uncertainties limit the conclusions which can be drawn from these results, the data set proved useful for the determination of general influences on rainwater composition in the Seattle area and for the demonstration of the application of these exploratory data analysis techniques. Current efforts to collect and analyze aerosol and rainwater samples over meteorologically appropriate time scales with precise analytical techniques are expected to provide better resolution of the factors controlling the composition of rainwater. [Pg.51]

These include sampling, handling, transport, storage and preparation of items to be tested and/or calibrated, and, where appropriate, an estimation of the measurement uncertainty (see chapter 12 of this book) as well as statistical techniques for analysis of test and/or calibration data. [Pg.36]

We have seen two different approaches to estimating the measurement uncertainty. One used data from control charts, CRM analyses, PT results and/or recovery tests, sometimes supplemented by the experience of the analyst; the other used only the reproducibility standard deviations from interlaboratory tests. In most cases the second method delivers higher estimates. [Pg.266]
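The two routes can be contrasted in a small sketch (all numbers invented): a bottom-up combination of in-house components versus the reproducibility standard deviation of a hypothetical interlaboratory comparison used directly.

```python
import math
import statistics

# Route 1 (bottom-up): combine in-house components in quadrature.
u_precision = 0.8   # e.g. from control charts (mg/L), illustrative
u_bias      = 0.5   # e.g. from CRM analysis / recovery tests (mg/L)
u_bottom_up = math.sqrt(u_precision**2 + u_bias**2)

# Route 2 (top-down): reproducibility standard deviation from an
# interlaboratory study, used directly as the uncertainty estimate.
lab_means = [10.1, 9.0, 11.5, 10.7, 8.5, 11.2, 9.3, 10.9]  # invented
u_top_down = statistics.stdev(lab_means)

print(f"bottom-up estimate: {u_bottom_up:.2f}")
print(f"top-down estimate:  {u_top_down:.2f}")
```

With these illustrative numbers the top-down estimate is the larger of the two, consistent with the observation above that the reproducibility-based route usually delivers higher estimates, since it also captures between-laboratory effects.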

Ellison, S. L. R., and Barwick, V. J. (1998), Using validation data for ISO measurement uncertainty estimation. Part 1. Principles of an approach using cause and effect analysis, Analyst, 123, 1387-1392. [Pg.785]

Special care should also be taken to reduce the uncertainties in emission data and measurements. The validation of an aerosol model requires analysis of the aerosol chemical composition for the main particulate species (ammonium, sulphate, nitrate and secondary organic aerosol). Finding data to perform this kind of more complete evaluation is not always easy, and the same applies to emissions data. The lack of detailed information regarding the chemical composition of aerosols obliges modellers to use previously defined aerosol component distributions found in the literature. Present knowledge of emission processes is still incomplete, especially concerning the suspension and resuspension of deposited particles [37]. [Pg.269]

Exploratory data analysis (EDA). This analysis, also called 'pretreatment of data', is essential to avoid wrong or obvious conclusions. The objective of EDA is to obtain the maximum useful information from each piece of chemico-physical data, because the perception and experience of a researcher cannot be sufficient to single out all the significant information. This step comprises descriptive univariate statistical algorithms (e.g. mean, normality assumption, skewness, kurtosis, variance, coefficient of variation), detection of outliers, cleansing of the data matrix, measures of the quality of the analytical method (e.g. precision, sensitivity, robustness, uncertainty, traceability) (Eurachem, 1998), and the use of basic tools such as box-and-whisker and stem-and-leaf plots. [Pg.157]
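A minimal EDA sketch for one variable, combining a few of the descriptive statistics named above with a box-and-whisker style (1.5 x IQR) outlier screen; the data are invented for illustration:

```python
import statistics

# Invented replicate measurements of one chemico-physical variable,
# with one deliberately anomalous value included.
data = [4.9, 5.1, 5.0, 5.2, 4.8, 5.1, 5.0, 9.7, 5.3, 4.9]

mean = statistics.mean(data)
sd = statistics.stdev(data)
cv = sd / mean                      # coefficient of variation

# Box-and-whisker (Tukey) screen: flag points beyond 1.5 * IQR fences.
q = statistics.quantiles(data, n=4)
q1, q3 = q[0], q[2]
iqr = q3 - q1
lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
outliers = [x for x in data if x < lo or x > hi]

print(f"mean={mean:.2f}  sd={sd:.2f}  cv={cv:.1%}")
print("outliers:", outliers)
```

Note how strongly the single anomalous point inflates both the standard deviation and the coefficient of variation, which is exactly why outlier detection belongs in the pretreatment step.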

A third source of uncertainty is the occurrence of rare or unique events in the measurement, such as an incorrect reading by the observer, or a chance disturbance in the equipment. Such errors can often produce large deviations from the other readings, and are hence termed 'outliers'. There are statistical tests for recognising such data points, but the occurrence of outliers can be a real problem in statistical data analysis. [Pg.297]
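One such test is Dixon's Q test, one of several common outlier screens for small data sets. In the sketch below the data are invented, and the critical value is the commonly tabulated Q value for n = 7 at 95% confidence:

```python
# Dixon's Q test sketch for a suspected high outlier (the value most
# distant from its nearest neighbour). Data are invented.
data = sorted([0.142, 0.156, 0.151, 0.149, 0.153, 0.147, 0.189])

gap = data[-1] - data[-2]          # suspect value vs nearest neighbour
spread = data[-1] - data[0]        # full range of the data
q_stat = gap / spread

Q_CRIT_95 = 0.568                  # n = 7, 95% confidence (standard tables)
is_outlier = q_stat > Q_CRIT_95
print(f"Q = {q_stat:.3f}, outlier: {is_outlier}")
```

If the statistic exceeds the tabulated critical value, the suspect point may be rejected at the chosen confidence level; otherwise it must be retained.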

The study of elementary reactions for a specific requirement such as hydrocarbon oxidation occupies an interesting position in the overall process. At a simplistic level, it could be argued that it lies at one extreme. Once the basic mechanism has been formulated as in Chapter 1, the rate data are measured, evaluated and incorporated in a data base (Chapter 3), embedded in numerical models (Chapter 4) and finally used in the study of hydrocarbon oxidation from a range of viewpoints (Chapters 5-7). Such a mode of operation would fail to benefit from what is ideally an intensely cooperative and collaborative activity. Feedback is as central to research as it is to hydrocarbon oxidation. Laboratory measurements must be informed by the sensitivity analysis performed on numerical models (Chapter 4), so that the key reactions to be studied in the laboratory can be identified, together with the appropriate conditions. A realistic assessment of the error associated with a particular rate parameter should be supplied to enable the overall uncertainty to be estimated in the simulation of a combustion process. Finally, the model must be validated against data for real systems. Such a validation, especially if combined with sensitivity analysis, provides a test of both the chemical mechanism and the rate parameters on which it is based. Therefore, it is important that laboratory determinations of rate parameters are performed collaboratively with both modelling and validation experiments. [Pg.130]
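The role of sensitivity analysis can be illustrated with a minimal sketch: for a hypothetical first-order decay c(t) = c0 exp(-kt) (chosen for illustration, not taken from the text), a finite-difference sensitivity coefficient dc/dk is compared with the analytic value. Coefficients like this identify which rate parameters dominate the simulated output and hence deserve the most careful laboratory study.

```python
import math

# Hypothetical model: first-order decay c(t) = c0 * exp(-k t).
c0, k, t = 1.0, 0.5, 2.0

def c(k):
    """Concentration at time t for rate constant k."""
    return c0 * math.exp(-k * t)

# Local sensitivity dc/dk by central finite difference...
dk = 1e-6
s_fd = (c(k + dk) - c(k - dk)) / (2 * dk)

# ...compared with the analytic derivative dc/dk = -t * c(t).
s_an = -t * c(k)

print(f"finite-difference: {s_fd:.6f}, analytic: {s_an:.6f}")
```

Multiplying such a coefficient by the estimated error in k gives a first-order estimate of the resulting uncertainty in the simulated concentration, which is the feedback loop the excerpt describes.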

The estimation of uncertainty replaces a full validation of the analytical method. It generates the necessary information at the right time. The statistical information obtained from the analysis can be used for the interpretation of the data, and finally the analysis is tailored to the customer's needs. In this case measurement uncertainty is a good alternative to validation. [Pg.78]

Thus the exact incorporation of the instrument resolution function would entail the evaluation of this four-dimensional integral for each data point, in addition to the convolution in t required to incorporate the uncertainty in the measurement of time of flight. To reduce data processing times, the approximation is made in the data analysis that the resolution can be incorporated as a single convolution in t space, with a different resolution function Rm(t) for each mass. Thus (17) is modified to... [Pg.450]
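The single-convolution approximation can be sketched as a discrete convolution of an idealised time-of-flight signal with a per-mass resolution kernel; the arrays below are invented illustrations, not data from the text:

```python
# Sketch of the approximation described above: an ideal model signal in t
# is smeared by one resolution function per mass via a single discrete
# convolution, instead of evaluating a four-dimensional integral.

def convolve(signal, kernel):
    """Plain full discrete convolution, standard library only."""
    n, k = len(signal), len(kernel)
    out = [0.0] * (n + k - 1)
    for i, s in enumerate(signal):
        for j, r in enumerate(kernel):
            out[i + j] += s * r
    return out

signal = [0.0, 0.0, 1.0, 0.0, 0.0]        # idealised sharp arrival in t
r_m = [0.25, 0.5, 0.25]                   # normalised resolution kernel R_m

smeared = convolve(signal, r_m)
print(smeared)
```

Because the kernel is normalised, the total signal area is preserved while the sharp arrival is broadened, which is the intended effect of the resolution function.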

