
Bounding uncertainty analyses

There are basically 2 approaches for making calculations in the face of uncertainty. One approach is to approximate an estimate. Much of the machinery of modern analysis and statistics is focused on getting good approximations. Another approach is to bound the value being sought. These 2 approaches are clearly complementary to each other. [Pg.89]

Rowe (1988) reviewed the following advantages of bounding over approximation as a strategy for calculation under uncertainty. Bounds are... [Pg.89]


Bounds are often easier to compute than approximate estimates, which, in contrast, commonly require the solving of integrals. This simplicity of calculation extends to the combination of bounds. If, for instance, one set of information tells us that A is in a particular interval and another set tells us that A is in a different interval, it is straightforward to compute what the aggregate data set is implying simply by taking the intersection of the 2 intervals. When we have 2 estimates from separate approximations, on the other hand, we would have to invoke a much more complicated meta-analysis to combine the estimates. [Pg.90]
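
For instance, intersecting two interval estimates of the same quantity takes only a few lines; the sketch below uses hypothetical bounds:

    def intersect(a, b):
        # Intersect two (lo, hi) intervals; None if they are disjoint.
        lo, hi = max(a[0], b[0]), min(a[1], b[1])
        return (lo, hi) if lo <= hi else None

    # Two independent lines of evidence bounding the same quantity A
    # (illustrative numbers only):
    study_1 = (2.0, 9.0)
    study_2 = (5.0, 12.0)

    print(intersect(study_1, study_2))  # (5.0, 9.0): the aggregate bound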

In many cases, determining the correct decision does not require perfect precision. Analysis can reveal whether the uncertainty makes the best decision unclear. Because bounds on uncertainty tend to tighten as more data are collected, data gathering can stop as soon as the best decision becomes obvious. Approximations that carry no statement about their own reliability offer no such stopping rule, so scientists tend to clamor for more data no matter how much they already have; only an explicit uncertainty analysis can reveal whether additional data are essential to the decision at hand. [Pg.90]
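
A hedged sketch of this stopping rule, assuming the decision hinges on whether a bounded risk estimate exceeds a threshold (all names and numbers below are illustrative):

    def decision_is_clear(lo, hi, threshold):
        # The decision is unambiguous once the whole bound lies on one
        # side of the threshold; otherwise more data are needed.
        if hi < threshold:
            return "acceptable"
        if lo > threshold:
            return "unacceptable"
        return None  # bounds straddle the threshold: keep gathering data

    print(decision_is_clear(0.2, 0.8, 1.0))  # acceptable -> stop here
    print(decision_is_clear(0.2, 1.5, 1.0))  # None -> uncertainty matters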


A common risk evaluation and presentation method is simply to multiply the frequency of each event by the consequence of each event and then sum these products for all situations considered in the analysis. In insurance terms, this is the expected loss per year. The results of an uncertainty analysis, if performed, can be presented as a range defined by upper and lower confidence bounds that contain the best estimates. If the total risk represented by the best estimate or by the range estimate is... [Pg.41]
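
A minimal sketch of this expected-loss calculation, with hypothetical event frequencies (per year) and consequences (loss per event):

    # Illustrative (frequency per year, consequence per event) pairs:
    events = [
        (0.1, 1_000_000),  # rare, severe
        (2.0, 10_000),     # frequent, moderate
        (10.0, 500),       # very frequent, minor
    ]

    expected_loss = sum(f * c for f, c in events)
    print(f"Expected loss per year: {expected_loss:,.0f}")  # 125,000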

Some approaches to uncertainty analysis (e.g., 2D Monte Carlo and P-bounds) enable the assessor to separate variability and uncertainty. Other approaches do not separate them, and some schools of thought regard the distinction between variability and uncertainty as artificial or unhelpful. [Pg.168]
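
To make the separation concrete, the sketch below nests a variability loop inside an uncertainty loop, the basic structure of 2D Monte Carlo; the distributions and parameter names are assumptions for illustration only:

    import random

    N_OUTER, N_INNER = 200, 1000  # uncertainty loop, variability loop

    p95_estimates = []
    for _ in range(N_OUTER):
        # Outer loop: one plausible value of an uncertain parameter
        # (here, the log-scale mean of an exposure distribution).
        mu = random.uniform(1.0, 3.0)

        # Inner loop: variability among individuals given that value.
        doses = sorted(random.lognormvariate(mu, 0.5)
                       for _ in range(N_INNER))
        p95_estimates.append(doses[int(0.95 * N_INNER)])

    # Uncertainty about the variability statistic, not a single number:
    print(f"95th-percentile dose lies roughly in "
          f"[{min(p95_estimates):.1f}, {max(p95_estimates):.1f}]")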

Communication between risk managers, risk assessors, and analysts is essential from the start of the assessment process, not just in communicating results. For example, the choice of uncertainty analysis methods will be dependent on 1) the questions posed by decision makers, 2) the closeness of the risk estimate and its bounds to thresholds of acceptability or unacceptability, 3) the type of decision that must be made, and 4) the consequences of the decision. [Pg.172]

Limitations on data availability are a recurrent concern in discussions about uncertainty analysis and probabilistic methods, but arguably these methods are most needed when data are limited. More work is required to develop tools, methods, and guidance for dealing with limited datasets. Specific aspects that require attention are the treatment of sampling error in probability bounds analysis, and the use of qualitative information and expert judgment. [Pg.174]

Scenario uncertainty characterization may include a description of the information used to define the scenario (scenario definition), including a description of the purpose of the exposure analysis. For regulatory purposes, it is essential to state the tier of the tiered approach, because it determines the choice of data: whether defaults, upper-bound estimates, other single point estimates, or distributions have been used. This choice may in turn govern the kind of uncertainty analysis. [Pg.17]

Recently, however, limited use of best estimate plus uncertainty analysis methods has been undertaken. This is consistent with the international trend toward use of such methods. In this approach, more physically realistic models, assumptions, and plant data are used to yield analysis predictions that are more representative of expected behavior. This requires a corresponding detailed analysis of the uncertainties in the analysis and their effect on the calculated consequences. Typically, the probability of meeting a specific numerical safety criterion, such as a fuel centerline temperature limit, is evaluated together with the confidence limit that results from the uncertainty distributions associated with governing analysis parameters. The best estimate plus uncertainties approach addresses many of the problematic issues associated with conservative bounding analysis by... [Pg.188]
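
As a hedged illustration of this kind of evaluation, the sketch below propagates assumed parameter distributions through a stand-in model and estimates the probability of meeting a limit; the model and all numbers are hypothetical, not drawn from any actual safety analysis:

    import random

    LIMIT = 2800.0  # hypothetical fuel centerline temperature limit (deg C)

    def peak_temp(power_factor, conductivity):
        # Stand-in for a best-estimate physics code.
        return 2000.0 * power_factor / conductivity

    # Assumed uncertainty distributions for governing parameters:
    trials = [peak_temp(random.gauss(1.0, 0.05), random.uniform(0.8, 1.0))
              for _ in range(10_000)]

    frac = sum(t < LIMIT for t in trials) / len(trials)
    print(f"Estimated probability of meeting the limit: {frac:.3f}")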

The uncertainty analysis should determine an as-left band, bounding the equipment performance after calibration. This tolerance should be included in the TSU such that leaving the equipment anywhere in the as-left band will assure a trip before the PPL is reached. [Pg.238]

An uncertainty analysis has been performed in order to assess the robustness of the results of the importance analyses. In fact, the importance measures depend on the values of the failure probabilities of the components of the storage system, which are uncertain. Thus, the results of the importance analysis need to be checked against the variability due to the uncertainty in the model input parameters. To do this, uncertainty in the occurrence probabilities of the most important basic events has been represented by uniform distributions with lower and upper bounds provided by SOL experts. The analysis has highlighted that the uncertainties in the occurrence probabilities of the basic events do not affect the ranking of the most important causes identified for each TOP event; thus, the results discussed above can be considered robust. [Pg.2365]
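
A minimal sketch of such a robustness check, with hypothetical basic events, assumed uniform bounds, and the occurrence probability itself standing in for an importance measure:

    import random

    # Hypothetical basic events with expert-elicited (lower, upper)
    # bounds on their occurrence probabilities:
    bounds = {"valve_fail": (1e-4, 5e-4),
              "sensor_fail": (1e-3, 2e-3),
              "pump_fail": (5e-5, 9e-5)}

    def ranking(probs):
        # Stand-in importance measure: order events by probability.
        return tuple(sorted(probs, key=probs.get, reverse=True))

    rankings = set()
    for _ in range(1000):
        sample = {e: random.uniform(lo, hi) for e, (lo, hi) in bounds.items()}
        rankings.add(ranking(sample))

    # A single ranking across all samples means the importance result
    # is robust to the probability uncertainty:
    print(rankings)  # {('sensor_fail', 'valve_fail', 'pump_fail')}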

A number of studies on uncertainty analysis have been accumulated so far. Most studies are aimed at investigating the upper bound of the structural responses considering the uncertainties of structural parameters (e.g., see Ben-Haim and Elishakoff 1990; Ben-Haim 2001, 2006; ... [Pg.2341]

Figure 3 illustrates the relationship between the robustness function and the feasible domain of structural design satisfying the performance criterion f ≤ f̄ for two-dimensional interval parameters. The robustness function α is derived as the worst case of the objective function, i.e., the upper bound of the objective function f over the uncertainty set U(α, x). However, when the number of combinations of uncertain parameters is extremely large, it may be hard to evaluate the worst case of the objective function reliably. For this reason, an efficient uncertainty analysis method is desired which can evaluate the upper bound of... [Pg.2343]
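
When each uncertain parameter is an interval, the worst case is often sought at combinations of interval endpoints; the sketch below enumerates these by brute force (the objective is a hypothetical stand-in, and the approach is exact only for objectives monotone in each parameter):

    from itertools import product

    # Hypothetical interval parameters (lower, upper):
    intervals = [(0.9, 1.1), (4.0, 6.0)]

    def objective(x1, x2):
        # Stand-in structural response; monotone in each parameter here.
        return x1 ** 2 + 0.5 * x2

    # Upper bound of the objective over all endpoint combinations.
    worst = max(objective(*vertex) for vertex in product(*intervals))
    print(f"Worst-case objective: {worst:.2f}")  # 4.21 at (1.1, 6.0)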

For the ECS phase, the random variables that apply to a given assembly are combined statistically by linear error propagation (a conventional square root of the sum of the squares formula) to determine an overall uncertainty allowance. Inherent in this analysis is the assumption that covariance is negligible and variation from the reference case is linear. The listing of uncertainties for the ECS phase is not nearly as exhaustive as the listing for FI. The ECS uncertainty analysis needs to show that it is bounding with respect to "linear" assumptions and that the inclusion of critical assumptions is complete. This is an open item. [Pg.548]
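
A minimal sketch of the square-root-of-sum-of-squares combination described above, assuming independent (negligible-covariance) components with hypothetical values:

    import math

    # Hypothetical independent uncertainty components (same units):
    components = {"manufacturing": 1.5, "measurement": 2.0, "modeling": 1.0}

    # Linear error propagation with negligible covariance:
    overall = math.sqrt(sum(u ** 2 for u in components.values()))
    print(f"Overall uncertainty allowance: {overall:.2f}")  # 2.69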

Dourson et al. (1996) referred to a number of examinations of subchronic-to-chronic NOAEL ratios, which showed that the average difference between subchronic and chronic values was only 2-3. Based on these examinations, as well as on the analysis by Lewis (1993) and unpublished work at US-EPA, the authors concluded that the routine use of a 10-fold default factor for this area of uncertainty should be examined closely. They noted that short-term (2 weeks) and subchronic (90 days) NOAELs are often available and can give an indication of the possible difference between the subchronic NOAEL and the expected chronic NOAEL. When such data are not available, a 10-fold UF may not be unreasonable, but it should be considered a loose upper-bound estimate of the overall uncertainty. [Pg.267]

In Chapter 3 of this book we discussed the problem of multisite refinery integration under deterministic conditions. In this chapter, we extend the analysis to account for uncertainty in different parameters. Robustness is quantified in terms of both model robustness and solution robustness, where each measure is assigned a scaling factor to analyze the sensitivity of the refinery plan and integration network to variations. We make use of the sample average approximation (SAA) method with statistical bounding techniques to generate different scenarios. [Pg.139]
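
A heavily simplified sketch of the sample average approximation idea with statistical bounds, using a toy one-dimensional planning problem in place of the refinery model; all distributions, costs, and the brute-force solver are assumptions for illustration:

    import random
    import statistics

    def scenario_cost(x, demand):
        # Toy stand-in for the refinery objective: produce x against an
        # uncertain demand, with asymmetric overage/underage unit costs.
        return 1.0 * max(x - demand, 0.0) + 3.0 * max(demand - x, 0.0)

    def solve_saa(sample):
        # Minimize the sample-average cost by brute force over a grid.
        grid = [g / 10.0 for g in range(0, 201)]
        return min(grid, key=lambda x: sum(scenario_cost(x, d) for d in sample))

    random.seed(1)  # reproducibility of the illustration
    batches = [[random.gauss(10.0, 2.0) for _ in range(50)] for _ in range(20)]

    # Statistical lower bound (minimization): mean of the batch optima.
    optima, candidates = [], []
    for batch in batches:
        x_star = solve_saa(batch)
        candidates.append(x_star)
        optima.append(sum(scenario_cost(x_star, d) for d in batch) / len(batch))
    lower = statistics.mean(optima)

    # Statistical upper bound: evaluate one candidate plan on a large
    # independent sample.
    big = [random.gauss(10.0, 2.0) for _ in range(10_000)]
    upper = sum(scenario_cost(candidates[0], d) for d in big) / len(big)

    print(f"SAA statistical bounds on the optimal cost: [{lower:.2f}, {upper:.2f}]")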

The crudest form of bounding analysis is just interval arithmetic (Moore 1966; Neumaier 1990). In this approach the uncertainty about each quantity is reduced to a pair of numbers, an upper bound and a lower bound, that circumscribe all the possible values of the quantity. In the analysis, these numbers are combined in such a way as to obtain sure bounds on the resulting value. Formally, this is equivalent to a worst case analysis (which tries to do the same thing with only 1 extreme value per quantity). The limitations of such analyses are well known. Both interval arithmetic and any simple worst case analysis... [Pg.90]
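
A minimal interval arithmetic sketch in the spirit of Moore (1966), covering addition and multiplication only; a production implementation would need outward rounding and the remaining operations:

    def i_add(a, b):
        # [a_lo + b_lo, a_hi + b_hi]
        return (a[0] + b[0], a[1] + b[1])

    def i_mul(a, b):
        # Min and max over the four endpoint products.
        p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
        return (min(p), max(p))

    dose = (0.5, 2.0)     # illustrative bounds on dose
    potency = (0.1, 0.3)  # illustrative bounds on potency

    print(i_mul(dose, potency))     # (0.05, 0.6): sure bounds on risk
    print(i_add(dose, (1.0, 1.5)))  # (1.5, 3.5)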

Finally, although both probability bounds analysis and robust Bayes methods are fully legitimate applications of probability theory and, indeed, both find their foundations in classical results, they may be controversial in some quarters. Some argue that a single probability measure should be able to capture all of an individual's uncertainty. Walley (1991) has called this idea the "dogma of ideal precision." The attitude has never been common in risk analysis, where practitioners are governed by practical considerations. However, the bounding approaches may precipitate some contention because they contradict certain attitudes about the universal applicability of pure probability. [Pg.115]

Practitioners of ecological risk assessments will frequently experience large uncertainty bounds on the estimates of risk. Unfortunately, characterizing and/or reducing uncertainty can be very costly. However, these costs must be balanced with the need to conduct sufficient analysis to make an informed decision. [Pg.151]

An initial step in addressing such situations should be an analysis of the sensitivity of the risk assessment model to changes in the variable. If the model proves relatively insensitive across conservative bounds on the variable, then further consideration of uncertainty for this variable may be unnecessary and a point estimate may suffice. [Pg.169]
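
A minimal sketch of such a screening step, assuming a hypothetical model, bounds, and materiality threshold: run the model at the conservative bounds of the variable and compare the output spread against the threshold.

    def risk_model(exposure, toxicity=0.02):
        # Hypothetical stand-in risk assessment model.
        return exposure * toxicity

    low, high = 1.0, 1.2  # conservative bounds on the variable
    spread = abs(risk_model(high) - risk_model(low))
    threshold = 0.01      # materiality threshold for the decision

    if spread < threshold:
        print("Insensitive: a point estimate may suffice.")
    else:
        print("Sensitive: carry the variable through the uncertainty analysis.")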

Several papers in this book and in the recent literature (3) discuss the use of pyrolysis techniques coupled with gas chromatography and mass spectrometry to determine forms of organically bound sulfur, but these methods introduce an uncertainty due to the possible interconversion of these sulfur forms during the heating step. For example, it has been shown that when benzyl sulfide was heated to 290°C, tetraphenyl thiophene, hydrogen sulfide, and stilbene were produced (4). Coupled with heat and mass transport limitation considerations, particularly for viscous liquids and solids, it is not unreasonable to question whether at least some of the thiophenic forms observed by these techniques were produced during the analysis and may not have been present in the original sample. [Pg.224]

Often there are cases where the submodels are poorly known or misunderstood, such as for chemical rate equations, thermochemical data, or transport coefficients. A typical example is shown in Figure 1, which was provided by David Garvin at the U.S. National Bureau of Standards. The figure shows the rate constant at 300 K for the reaction HO + O3 → HO2 + O2 as a function of the year of the measurement. We note with amusement and chagrin that if we were modelling a kinetics scheme which incorporated this reaction before 1970, the rate would be uncertain by five orders of magnitude. As shown most clearly by the pair of rate constant values which have an equal upper bound and lower bound, a sensitivity analysis using such poorly defined rate constants would be useless. Yet this case is not atypical of the uncertainty in rate constants for many major reactions in combustion processes. [Pg.336]

