
The Uncertainty Concept

To assess errors associated with laboratory results in a systematic way, the uncertainty concept has been introduced in laboratory medicine. According to the ISO Guide to the Expression of Uncertainty in Measurement (GUM), uncer- ... [Pg.398]

The uncertainty concept is directed toward the end user (clinician) of the result, who is concerned about the total possible error and is not particularly interested in whether the errors are systematic or random. In the outline of the uncertainty concept it is assumed that any known systematic error components of a measurement method have been corrected, and the specified uncertainty includes the uncertainty associated with correction of the systematic error(s). Although this appears logical, a problem may be that some routine methods have systematic errors that depend on the patient category from which the sample originates. For example, kinetic Jaffé methods for creatinine are subject to positive interference by alpha-keto compounds and to negative interference by bilirubin and its metabolites, which means that the direction of the systematic error will be patient dependent and not generally predictable. [Pg.398]

In the theory of uncertainty, a distinction is made between type A and type B uncertainties. Type A uncertainties are frequency-based estimates of standard deviations (e.g., the SD of the imprecision). Type B uncertainties are uncertainty components for which frequency-based SDs are not available; instead, the uncertainty is estimated by other approaches or from the opinion of experts. Finally, the total uncertainty is derived from a combination of all sources of uncertainty. In this context, it is practical to operate with standard uncertainties (u), which are equivalent to standard deviations. Multiplying a standard uncertainty by a coverage factor (k) yields the uncertainty corresponding to a specified probability level; for example, a coverage factor of two gives a probability level of 95% for a normal distribution. When considering the total uncertainty of an analytical result obtained by a routine method, the preanalytical variation, method imprecision, random matrix-related interferences, and uncertainty related to calibration and bias corrections (traceability) should be taken into account. Expressing the uncertainty components as standard uncertainties, we have the general relation: [Pg.398]
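The excerpt breaks off before the relation itself. In the usual GUM-style treatment it is the root-sum-of-squares combination of the component standard uncertainties, with the expanded uncertainty obtained via the coverage factor; a plausible reconstruction (not taken verbatim from the source) is:

\[
u_{\mathrm{result}} = \sqrt{u_{\mathrm{preanalytical}}^{2} + u_{\mathrm{imprecision}}^{2} + u_{\mathrm{matrix}}^{2} + u_{\mathrm{calibration}}^{2}}, \qquad U = k\,u_{\mathrm{result}}
\]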

Uncertainty can be assessed in various ways, and often a combination of procedures is necessary. In principle, uncertainty can be judged directly from measurement comparisons or indirectly from an analysis of the individual error sources according to the law of error propagation (an "error budget"). A measurement comparison may consist of a method comparison study with a reference method based on patient samples, according to the principles outlined previously, or of measurements of certified matrix reference materials (CRMs). [Pg.399]

Example of Direct Assessment of Uncertainty on the Basis of Measurements of a Certified Reference Material [Pg.399]
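The body of this example is not included in the excerpt. As a hedged illustration of what such a direct assessment typically involves, the sketch below (all CRM values and replicate results are invented for illustration) estimates the bias against a certified value and combines the type A uncertainty of the replicate mean with the CRM's certified (type B) uncertainty:

    # Hypothetical sketch: direct assessment of uncertainty from replicate
    # analyses of a certified reference material (CRM). All numbers invented.
    import math

    certified_value = 95.0   # CRM certified concentration (umol/L), assumed
    u_crm = 1.2              # standard uncertainty of the certified value, assumed

    results = [93.8, 96.1, 94.9, 95.7, 94.2, 95.3, 96.0, 94.6]  # replicates

    n = len(results)
    mean = sum(results) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in results) / (n - 1))

    bias = mean - certified_value   # estimated systematic deviation
    u_mean = sd / math.sqrt(n)      # type A standard uncertainty of the mean

    # Combine the type A estimate with the CRM's type B uncertainty to judge
    # whether the observed bias is analytically significant.
    u_bias = math.sqrt(u_mean ** 2 + u_crm ** 2)
    print(f"bias = {bias:+.2f}, u(bias) = {u_bias:.2f}")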


The significance of the uncertainty concept in analytical chemistry has increased in the last century, even though some consistency was lacking at first. These inconsistencies have since been dispelled (see Thompson [1995]; AMC [1995]), and operational approaches have been presented by Hund et al. [2001]. Numerous examples of application are given in EURACHEM [1995]. [Pg.104]

The uncertainty concept combines the chemists' and the physicists' approaches to handling random deviations, and thus replaces the classical error theories in an advantageous way. [Pg.104]

This model of course makes a lot of assumptions, and there are quite a few problems: it is not possible to have appropriate reference standards for all possible chemical measurements; very often there are no common links (i.e., reference to a common basis, which is ideally the SI); laboratories do not always use the standards in an appropriate way; and very often the uncertainty concept is not used at all ... [Pg.209]

Although human wonder and the human mind are sources of uncertainty in the forms of vagueness, dubiousness, and incompleteness, they also serve to overcome problems through human experience and expert views (Chap. 5). The uncertainty concepts in understanding complex problems depend on observations, experiences, and conscious expert views. When problems are solved, there is always a remaining uncertainty that paves the way for future developments. Thus, scientific solutions cannot be taken as absolute truths in a positivistic manner. [Pg.37]

Our concepts of petroleum reserves and resources and their measurements are changing to reflect the uncertainty associated with these terms. Petroleum reserves have largely been calculated deterministically (i.e., single-point estimates with the assumption of certainty). In the past decade, reserve and resource calculations have incorporated uncertainty into their estimates using probabilistic methodologies. Among the questions now being addressed are: how certain are you that the reserves you estimate are the actual reserves, and what is the range of uncertainty associated with that estimate? New techniques are required to address the critical question of how much petroleum we have and under what conditions it can be developed. [Pg.1007]
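As a hedged illustration of the probabilistic approach described here, the sketch below (with an invented volumetric model and invented input distributions, not taken from the text) propagates input uncertainty by Monte Carlo sampling and reports the P90/P50/P10 reserve percentiles:

    # Illustrative only: probabilistic reserve estimation by Monte Carlo
    # sampling of a simple volumetric model. All distributions are invented.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    area = rng.lognormal(mean=np.log(2.0e7), sigma=0.3, size=n)  # m^2
    thickness = rng.normal(15.0, 2.0, size=n)                    # m
    porosity = rng.uniform(0.12, 0.22, size=n)                   # fraction
    sw = rng.uniform(0.25, 0.40, size=n)                         # water saturation
    recovery = rng.triangular(0.15, 0.30, 0.45, size=n)          # recovery factor

    # Recoverable volume (m^3), a standard volumetric product
    reserves = area * thickness * porosity * (1.0 - sw) * recovery

    # Industry convention: P90 is the value exceeded with 90% probability,
    # i.e. the 10th percentile of the simulated distribution.
    p90, p50, p10 = np.percentile(reserves, [10, 50, 90])
    print(f"P90 = {p90:.3e}  P50 = {p50:.3e}  P10 = {p10:.3e} m^3")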

The initial set of experiments and the first few textbook chapters lay down a foundation for the course. The elements of scientific activity are immediately displayed, including the role of uncertainty. The atomic theory, the nature of matter in its various phases, and the mole concept are developed. Then an extended section of the course is devoted to the extraction of important chemical principles from relevant laboratory experience. The principles considered include energy, rate and equilibrium characteristics of chemical reactions, chemical periodicity, and chemical bonding in gases, liquids, and solids. The course concludes with several chapters of descriptive chemistry in which the applicability and worth of the chemical principles developed earlier are seen again and again. [Pg.482]

Whether this concept can stand up under a rigorous psychological analysis has never been discussed, at least in the literature of theoretical physics. It may even be inconsistent with quantum mechanics in that the creation of a finite mass is equivalent to the creation of energy that, by the uncertainty principle, requires a finite time, ΔE Δt ≈ h. Thus the creation of an electron would require a time of the order of 10⁻²⁰ second. Higher-order operations would take more time, and the divergences found in quantum field theory due to infinite series of creation operations would spread over an infinite time, and so be quite unphysical. [Pg.450]
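For orientation, the quoted order of magnitude follows from the energy-time uncertainty relation with ΔE set to the electron rest energy (a back-of-the-envelope check, not from the source):

\[
\Delta t \approx \frac{h}{m_e c^2} = \frac{6.63\times10^{-34}\ \mathrm{J\,s}}{8.19\times10^{-14}\ \mathrm{J}} \approx 8\times10^{-21}\ \mathrm{s} \sim 10^{-20}\ \mathrm{s}.
\]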

In Science, every concept, question, conclusion, experimental result, method, theory, or relationship is always open to reexamination. Molecules do exist. Nevertheless, there are serious questions about their precise definition. Some of these questions lie at the foundations of modern physics, and some involve states of aggregation or extreme conditions such as intense radiation fields or the region of the continuum. There are some molecular properties that are definable only within limits, for example, the geometrical structure of non-rigid molecules, properties consistent with the uncertainty principle, or those limited by the neglect of quantum-field, relativistic, or other effects. And there are properties which depend specifically on a state of aggregation, such as superconductivity, ferroelectricity (and anti-), ferromagnetism (and anti-), superfluidity, excitons, polarons, etc. Thus, any molecular definition may need to be extended in a more complex situation. [Pg.469]

One of the best definitions is by Attilio Bisio (1): "The successful startup and operation of a commercial size unit whose design and operating procedures are in part based on experimentation and demonstration at a smaller scale of operation." He also points out that Smith (2) argued in 1968 that the starting point for scaleup studies is the ultimate intended commercial unit. The professional should scale down from the design parameters and constraints of that commercial unit, so that the smaller-scale experiments are most useful in reducing the uncertainties of the commercial run. Smith wrote that scaleup from small-scale studies is a misleading concept. [Pg.313]

To obtain the mass emissions of pollutants from e-waste recycling processes, it is essential that the inputs of pollutants are truly e-waste related. To fulfill this requirement, a causal analysis is desirable. However, the concept of causation is rather problematic because causal mechanisms are complex [26]. Nonetheless, we are compelled to identify causes in an attempt to minimize the uncertainties associated with our estimates. In this chapter, the empirical criterion of the strict empiricist David Hume was adopted. This approach requires only a combination of: (1) e-waste processing and environmental pollution are associated in space and time (contiguity); (2) e-waste processing precedes environmental pollution (temporal succession); and (3) e-waste processing is always conjoined with environmental pollution (consistent conjunction). Judging from a number of previous studies, these conditions are always the case [6, 27-35]. [Pg.282]

Traditionally, analytical chemists and physicists have treated uncertainties of measurements in slightly different ways. Whereas chemists have oriented themselves toward classical error theory and used its statistics (Kaiser [1936]; Kaiser and Specker [1956]), physicists commonly use empirical uncertainties (from knowledge and experience), which are then combined according to the law of error propagation. Both ways are combined in the modern uncertainty concept. Uncertainty of measurement is defined as a "parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand" (ISO 3534-1 [1993]; EURACHEM [1995]). [Pg.101]
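As a minimal sketch of the law of error propagation mentioned here, assume a result computed as c = m/V (the relation and all numbers are hypothetical); for a quotient, the relative standard uncertainties add in quadrature:

    # Minimal sketch of the law of error propagation for an assumed relation
    # c = m / V (analyte mass over volume); names and numbers are hypothetical.
    import math

    m, u_m = 25.0, 0.1    # mass in mg, with its standard uncertainty
    V, u_V = 100.0, 0.4   # volume in mL, with its standard uncertainty

    c = m / V
    # For a product or quotient, relative standard uncertainties combine
    # in quadrature: (u_c/c)^2 = (u_m/m)^2 + (u_V/V)^2
    u_c = c * math.sqrt((u_m / m) ** 2 + (u_V / V) ** 2)
    print(f"c = {c:.4f} mg/mL, u(c) = {u_c:.4f} mg/mL")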

The concept of the value of the stochastic solution (VSS) measures the advantage of using a two-stage stochastic program over a deterministic one; in other words, it measures the cost of ignoring the uncertainty. [Pg.197]
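In the standard two-stage stochastic programming formulation (for a minimization problem), writing RP for the optimal value of the recourse problem and EEV for the expected cost of fixing the first-stage decision at the solution of the expected-value (deterministic) problem, this reads:

\[
\mathrm{RP} = \min_{x}\ \mathbb{E}_{\xi}\!\left[f(x,\xi)\right], \qquad \mathrm{EEV} = \mathbb{E}_{\xi}\!\left[f(\bar{x}(\bar{\xi}),\xi)\right], \qquad \mathrm{VSS} = \mathrm{EEV} - \mathrm{RP} \ \ge\ 0.
\]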

In this chapter we investigate the interaction between experimental design and information quality in two-factor systems. However, instead of looking again at the uncertainty of parameter estimates, we will focus attention on the uncertainty in the response surface itself. Although the examples are somewhat specific (i.e., limited to two factors and to full second-order polynomial models), the concepts are general and can be extended to factor spaces of other dimensions and to other models. [Pg.279]
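For reference, the full second-order polynomial model in two factors referred to here has the form

\[
y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_{11} x_1^2 + \beta_{22} x_2^2 + \beta_{12} x_1 x_2 + r,
\]

where r is the residual; the uncertainty of the predicted response then depends on where the point (x_1, x_2) lies in the factor space.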

In general, the consequences of the assumptions about toxicological, statistical, and/or uncertainty factors made in the derivations of the TTC concepts are difficult to survey, since there are uncertainties and drawbacks in more or less all of the available TTC approaches. [Pg.201]

In deciding whether it is appropriate to omit toxicity studies at tonnage levels at or above 100 tons/year, a TTC value might be used in comparison with the available exposure information. However, due to limitations and uncertainties in the derivation of TTC values, as well as the fact that the TTC concept has not yet been evaluated for the diverse group of industrial chemicals or for routes of exposure other than dietary, the Nordic group concluded that it is premature to use the TTC concept within REACH. [Pg.202]

Once a QSAR model is constructed, it must be validated using an external test set. The data points in the test set should not appear in the training set. There are two approaches to improving the prediction accuracy of a given QSAR model. The first approach utilizes the concept of the "domain of applicability," which is used to estimate the uncertainty in the prediction for a particular molecule based on how similar it is to the compounds used to build the model. To make a more accurate prediction for a given molecule in the test set, the structurally similar compounds in the training set are used to construct a model, and that model is used to make the prediction. In some cases, the domain similarity is measured using molecular descriptor similarity rather than structural similarity. The... [Pg.120]
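As a hedged sketch of the descriptor-similarity variant of the applicability domain (everything below, from the descriptor matrix to the threshold rule, is an invented illustration rather than the chapter's own method), one common recipe flags a test compound as outside the domain when its mean distance to the k nearest training compounds exceeds a threshold derived from the training set itself:

    # Invented illustration of a distance-based applicability-domain check.
    import numpy as np

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, 8))   # descriptor matrix, training set
    x_test = rng.normal(size=(8,))        # descriptors of one test compound
    k = 5

    def knn_mean_dist(x, X, k):
        # Mean Euclidean distance from x to its k nearest rows of X
        d = np.sqrt(((X - x) ** 2).sum(axis=1))
        return np.sort(d)[:k].mean()

    # Threshold: mean + 3 SD of the training compounds' own leave-one-out
    # kNN distances, so "unusual" is judged relative to the training set.
    train_dists = np.array([knn_mean_dist(x, np.delete(X_train, i, axis=0), k)
                            for i, x in enumerate(X_train)])
    threshold = train_dists.mean() + 3.0 * train_dists.std()

    in_domain = knn_mean_dist(x_test, X_train, k) <= threshold
    print("inside applicability domain:", in_domain)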

