Variability versus uncertainty

Exposure assessment informs decision-making regarding protection of human health. [Pg.11]

However, the true exposures are rarely known for a given individual and are estimated using modelling procedures based upon available data. Uncertainty regarding exposure estimates arises as a result of the limited availability of empirical information, as well as limitations in the measurements, models or techniques used to develop representations of complex physical, chemical and biological processes. As described by NRC (1994), uncertainty "forces decision makers to judge how probable it is that risks will be overestimated or underestimated for every member of the exposed population". Furthermore, because every individual can have a different exposure level, it is also possible that the estimate of uncertainty can differ among individuals. [Pg.11]

The notions of intraindividual variability and uncertainty are distinct concepts, because they arise for different reasons. Variability is an inherent property of the system being modelled. In contrast, uncertainty can be conceptualized as dependent on the current state of knowledge regarding the system being modelled. From this perspective, uncertainty is more a property of the current state of knowledge than of the system itself. [Pg.11]

Two important considerations in probabilistic exposure assessment are whether to quantify uncertainty and whether to separate it from variability within the analysis and output. [Pg.12]

The strategy or approach is then determined by the desired nature of the output: an estimate for a given percentile of the population with (option 3) or without (option 1) confidence bounds, or the probability of a randomly chosen individual falling below (or above) a given exposure (option 2). [Pg.12]
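
The distinction among these three output options can be made concrete with a small simulation. The sketch below is purely illustrative: it assumes a lognormal distribution of exposures across individuals, invented parameter values and an arbitrary reference exposure of 5 units, and it uses a simple two-dimensional Monte Carlo loop to attach confidence bounds to a percentile (option 3).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: interindividual variability in exposure is lognormal,
# and the log-scale mean and spread of that distribution are themselves uncertain
# because they are estimated from limited data.
N_VAR = 10_000   # draws representing variability across individuals
N_UNC = 200      # draws representing uncertainty about the distribution parameters

# Option 1: a point estimate of a population percentile, ignoring uncertainty.
exposure = rng.lognormal(mean=np.log(2.0), sigma=0.6, size=N_VAR)  # hypothetical units
p95_point = np.percentile(exposure, 95)

# Option 2: probability that a randomly chosen individual exceeds a given exposure.
limit = 5.0  # hypothetical reference exposure
prob_above = np.mean(exposure > limit)

# Option 3: the same percentile with confidence bounds, obtained by also sampling
# the uncertain parameters (a simple two-dimensional Monte Carlo).
p95_samples = []
for _ in range(N_UNC):
    gm = rng.normal(np.log(2.0), 0.1)      # uncertain log-scale mean
    gsd = abs(rng.normal(0.6, 0.05))       # uncertain log-scale spread
    pop = rng.lognormal(mean=gm, sigma=gsd, size=N_VAR)
    p95_samples.append(np.percentile(pop, 95))
ci_low, ci_high = np.percentile(p95_samples, [2.5, 97.5])

print(f"Option 1: 95th percentile (no confidence bounds): {p95_point:.2f}")
print(f"Option 2: P(random individual > {limit}): {prob_above:.3f}")
print(f"Option 3: 95th percentile with 95% CI: [{ci_low:.2f}, {ci_high:.2f}]")
```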


In many process-design calculations it is not necessary to fit the data to within the experimental uncertainty. Here, economics dictates that a minimum number of adjustable parameters be fitted to scarce data with the best accuracy possible. This compromise between "goodness of fit" and number of parameters requires some method of discriminating between models. One way is to compare the uncertainties in the calculated parameters. An alternative method consists of examination of the residuals for trends and excessive errors when plotted versus other system variables (Draper and Smith, 1966). A more useful quantity for comparison is obtained from the sum of the weighted squared residuals given by Equation (1). [Pg.107]
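
As a rough illustration of this kind of model discrimination (not the specific correlation or Equation (1) of the original text), the sketch below fits one- and two-parameter expressions to a small set of invented data points and compares their weighted sums of squared residuals together with the standard errors of the fitted parameters.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical sparse data (x could be a composition, y a measured property),
# with estimated experimental standard deviations used as weights.
x = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
y = np.array([1.26, 1.53, 1.49, 1.33, 1.10])
sigma = np.full(5, 0.03)

# Two candidate models with different numbers of adjustable parameters.
def model_1param(x, a):
    return 1.0 + 4.0 * a * x * (1.0 - x)

def model_2param(x, a, b):
    return 1.0 + 4.0 * (a + b * (1.0 - 2.0 * x)) * x * (1.0 - x)

def weighted_ssr(model, popt):
    resid = (y - model(x, *popt)) / sigma
    return np.sum(resid**2)

for model, p0 in [(model_1param, [0.5]), (model_2param, [0.5, 0.1])]:
    popt, pcov = curve_fit(model, x, y, p0=p0, sigma=sigma, absolute_sigma=True)
    # Parameter uncertainties from the covariance matrix: standard errors that are
    # large relative to the parameter values suggest the extra parameter is not justified.
    perr = np.sqrt(np.diag(pcov))
    print(model.__name__, "weighted SSR =", round(weighted_ssr(model, popt), 2),
          "parameters =", np.round(popt, 3), "std errors =", np.round(perr, 3))
```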

Uncertainty analysis and sensitivity analysis are tools that provide insight into how model predictions are affected by data precision. One of the issues in uncertainty analysis that must be confronted is how to rank both individual inputs and groups of inputs according to their contribution to overall uncertainty. In particular, there is a need to distinguish between the relative contribution of true uncertainty versus variability (i.e. heterogeneity), as well as to distinguish model uncertainty from parameter uncertainty. This case-study illustrates methods of uncertainty representation and variance characterization. [Pg.119]

An uncertainty analysis involves the determination of the variation or imprecision in an output function based on the collective variance of the model inputs. One of the key issues in uncertainty analysis that must be confronted is how to distinguish the relative contribution of variability (i.e. heterogeneity) from that of true uncertainty (i.e. lack of measurement precision) in the characterization of the predicted outcome. Variability refers to quantities that are distributed empirically - such factors as soil characteristics, weather patterns and human characteristics - which come about through processes that we expect to be stochastic because they reflect actual variations in nature. These processes are inherently random or variable and cannot be represented by a single value, so we can determine only their moments (mean, variance, skewness, etc.) with precision. In contrast, true uncertainty or model specification error (e.g. statistical estimation error) refers to an input that, in theory, has a single value which cannot be known with precision due to measurement or estimation error. [Pg.140]
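
The practical difference between the two kinds of quantity can be illustrated with a short simulation. In the hypothetical sketch below, a variability-type input (body weight across a population) keeps its spread no matter how many individuals are measured, whereas an uncertainty-type input (a single true emission rate estimated from noisy measurements) has a standard error that shrinks roughly as 1/sqrt(n) as more data are collected; all numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical population of body weights (a variability-type input): the spread
# is a real feature of the population and does not shrink as more people are measured.
body_weight = rng.normal(70.0, 12.0, size=100_000)  # kg, illustrative numbers

# Hypothetical fixed-but-uncertain quantity (an uncertainty-type input): a single
# true emission rate estimated from n noisy measurements; its standard error
# shrinks roughly as 1/sqrt(n) when more data are collected.
true_rate = 3.2  # the (in practice unknown) single true value
for n in (5, 50, 500):
    measurements = true_rate + rng.normal(0.0, 0.5, size=n)
    estimate = measurements.mean()
    std_error = measurements.std(ddof=1) / np.sqrt(n)
    print(f"n={n:4d}: estimated rate = {estimate:.3f} +/- {std_error:.3f}")

# The spread of body weight, by contrast, is a stable moment of the distribution;
# collecting more measurements only characterizes that spread more precisely.
print("body-weight mean/SD:", round(body_weight.mean(), 1), round(body_weight.std(), 1))
```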

An added benefit of Monte Carlo analysis is that a common by-product of the computation is a sensitivity analysis showing how much each predictor variable contributed to the uncertainty or variability of the predictions. This, in turn, tells both the risk assessor and the risk manager how much of the spread in the predictions arises from natural fluctuation and how much is caused by lack of knowledge. Given this information, decisions can be made about where resources can be allocated most cost-effectively to refine the estimate of exposure and risk. In the example, the sensitivity analysis shown in Figure 4 presents the apportionment of variance for the model. [Pg.1738]
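
Figure 4 itself is not reproduced here, but the kind of variance apportionment it presents can be sketched generically. The example below assumes a simple hypothetical dose model (concentration * intake rate / body weight) and apportions the variance of the prediction among the inputs using normalized squared Spearman rank correlations, the style of "contribution to variance" output that Monte Carlo packages commonly report.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 20_000

# Hypothetical exposure model: dose = concentration * intake_rate / body_weight.
concentration = rng.lognormal(np.log(10.0), 0.8, n)   # e.g. mg/kg in soil
intake_rate = rng.lognormal(np.log(0.1), 0.4, n)      # e.g. g/day ingested
body_weight = rng.normal(70.0, 12.0, n)               # kg
dose = concentration * intake_rate / body_weight

# Apportion the variance of the prediction among the inputs using squared
# Spearman rank correlations, normalized to 100%.
inputs = {"concentration": concentration,
          "intake_rate": intake_rate,
          "body_weight": body_weight}
rho_sq = {name: stats.spearmanr(values, dose)[0]**2 for name, values in inputs.items()}
total = sum(rho_sq.values())
for name, r2 in sorted(rho_sq.items(), key=lambda kv: -kv[1]):
    print(f"{name:15s} {100.0 * r2 / total:5.1f}% of variance")
```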

In the second type of quantitative mass spectrometry for molecular species, analyte concentrations are obtained directly from the heights of the mass spectral peaks. For simple mixtures, it is sometimes possible to find peaks at unique m/z values for each component. Under these circumstances, calibration curves of peak heights versus concentration can be prepared and used for analysis of unknowns. More accurate results can ordinarily be realized, however, by incorporating a fixed amount of an internal standard substance in both samples and calibration standards. The ratio of the peak intensity of the analyte species to that of the internal standard is then plotted as a function of analyte concentration. The internal standard tends to reduce uncertainties arising in sample preparation and introduction. These uncertainties are often a major source of indeterminate error with the small samples needed for mass spectrometry. Internal standards are also used in GC/MS and LC/MS. For these techniques, the ratio of peak areas serves as the analytical variable. [Pg.583]
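
A minimal numerical sketch of the internal-standard approach follows; the calibration concentrations and intensity ratios are invented. The intensity ratio of analyte to internal standard is fitted as a straight line against analyte concentration, and the unknown is read back from the fitted line.

```python
import numpy as np

# Hypothetical calibration data: each standard contains the same fixed amount of the
# internal standard, and the measured quantity is the ratio of the analyte peak
# intensity to the internal-standard peak intensity.
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])          # analyte concentration, ug/mL
ratio = np.array([0.11, 0.21, 0.40, 1.02, 2.05])     # analyte / internal-standard intensity

# Straight-line calibration of intensity ratio versus concentration.
slope, intercept = np.polyfit(conc, ratio, 1)

# Quantify an unknown from its measured intensity ratio (the same fixed amount
# of internal standard having been added to the sample).
unknown_ratio = 0.65
unknown_conc = (unknown_ratio - intercept) / slope
print(f"calibration: ratio = {slope:.4f} * conc + {intercept:.4f}")
print(f"unknown concentration = {unknown_conc:.2f} ug/mL")
```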

Equations (10) and (11) can also be expressed in exponential form, as well as in forms which use ΔG as the dependent variable rather than pK (from the van't Hoff isotherm, ΔG = -RT ln K). For equilibrium reactions in aqueous and other polar solutions, the ΔCp value is expected to have a finite value, due to the significant changes in solvent structure which occur when ionization takes place. For some compounds, the ΔCp value may have a large uncertainty and not be statistically different from zero, depending on the precision of the raw data (e.g., 5,5-di-isopropylbarbituric acid) [89]. In these cases, the pKa temperature dependence is satisfactorily described by the integrated van't Hoff equation [Eq. (10) without the C log T term]. This equation will give a linear van't Hoff plot of pKa versus 1/T. [Pg.32]
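
A small worked example of the linear van't Hoff treatment is sketched below, using invented pKa values at four temperatures. Because ΔG = -RT ln K = 2.303RT·pKa, a plot of pKa versus 1/T has slope ΔH/(2.303R) and intercept -ΔS/(2.303R), so the ionization enthalpy and entropy can be read from a straight-line fit when ΔCp is negligible.

```python
import numpy as np

R = 8.314  # J/(mol*K)

# Hypothetical pKa values measured at several temperatures for an acid whose
# delta-Cp of ionization is negligible, so pKa is expected to be linear in 1/T.
T = np.array([288.15, 298.15, 308.15, 318.15])   # K
pKa = np.array([4.88, 4.76, 4.65, 4.55])         # illustrative values

# Linear van't Hoff plot: pKa = dH/(2.303*R) * (1/T) - dS/(2.303*R)
slope, intercept = np.polyfit(1.0 / T, pKa, 1)
delta_H = 2.303 * R * slope            # J/mol, enthalpy of ionization
delta_S = -2.303 * R * intercept       # J/(mol*K), entropy of ionization

print(f"slope = {slope:.1f} K, intercept = {intercept:.3f}")
print(f"delta_H = {delta_H/1000:.1f} kJ/mol, delta_S = {delta_S:.1f} J/(mol*K)")
```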

