Big Chemical Encyclopedia


Assessment Factors Probabilistic Approach

It has been suggested by, e.g., Slob and Pieters (1998), Kalberlah and Schneider (1998), Vermeire et al. (1999), and KEMI (2003) to use probability distributions for the various types of assessment factors in order to achieve a more precise estimate of the degree of statistical certainty and to avoid the piling up of worst-case assumptions in the overall assessment factor. [Pg.290]

The probabilistic approach allows for a closer link with specific knowledge, or lack of knowledge, in specific assessments. For example, one may be more confident in the magnitude of the possible interspecies difference in one case than in another. This may be expressed in the width of the relevant distribution for the assessment factor. However, in many cases even the range of uncertainty is uncertain, and for those situations default distributions are called for. [Pg.290]

In this method, each assessment factor is considered uncertain and characterized as a random variable with a lognormal distribution with a geometric mean (GM) and a geometric standard deviation (GSD). Propagation of the uncertainty can then be evaluated using Monte Carlo simulation (repeated random sampling from the distribution of values for each of the parameters in a calculation, to derive a distribution of estimates in the population), yielding a distribution of the overall assessment factor. This method requires characterization of the distribution of each assessment factor and of possible correlations between them. As a first approach, it can be assumed that all factors are independent, although in reality this is not strictly correct. [Pg.290]
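As a minimal sketch of this propagation step (the GM/GSD values below are purely illustrative, not defaults from any of the cited sources), the overall assessment factor can be simulated as the product of independently sampled lognormal factors:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Purely illustrative (GM, GSD) pairs for three assessment factors
factors = {
    "interspecies": (4.0, 2.0),
    "interindividual": (10.0, 1.6),
    "subchronic_to_chronic": (2.0, 3.0),
}

# Sample each factor from a lognormal with mean=ln(GM), sigma=ln(GSD)
samples = np.stack([
    rng.lognormal(mean=np.log(gm), sigma=np.log(gsd), size=n)
    for gm, gsd in factors.values()
])

# Assuming independence, the overall assessment factor is the product
overall = samples.prod(axis=0)

print(f"median overall AF:  {np.median(overall):.1f}")
print(f"95th percentile AF: {np.percentile(overall, 95):.1f}")
```

A convenient sanity check: the product of independent lognormals is itself lognormal, with GM equal to the product of the individual GMs (here 4 × 10 × 2 = 80), so the simulated median should sit close to 80.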

Baird et al. (1996) suggested a probabilistic alternative to the practice used by the US-EPA to derive RfDs from a NOAEL and application of UFs. The probabilistic approach expresses the human population threshold for a given substance as a probability distribution of values, rather than a single RfD value, taking into account the major sources of scientific uncertainty in such estimates. The approach was illustrated by using much of the same data that US-EPA used to justify their RfD procedure. For the four key extrapolations that were considered necessary to define the human population threshold based on animal data (interspecies, interindividual, LOAEL-to-NOAEL, and subchronic-to-chronic), the proposed approach used available data to define a probability distribution of each adjustment factor, rather than using available data to define point estimates of UFs. [Pg.290]

Slob and Pieters (1998) have proposed a probabilistic approach for deriving acceptable human intake limits and human health risks from toxicological studies in which it is acknowledged that both the effect parameter (e.g., NOAEL, BMD) and the assessment factors are uncertain and can best be described by lognormal distributions. [Pg.290]


Residential exposure should be estimated by taking into account distributions of exposure factors. Distributions can be assessed through a deterministic or a probabilistic approach (Figure 6.6). The former is often used in preventive risk assessment, in which each default value is determined from its distribution as a "reasonable worst case". The exposures estimated with the deterministic approach are therefore expected to fall in the upper range. For actual risk assessments, the probabilistic approach uses the parameter distributions directly, instead of single values, to calculate distributions of exposure. To characterize exposure, an... [Pg.237]

In a deterministic approach, we obtain a single value for each of the two parameters and can obtain the factor of safety against failure. However, the advantages of assessing using a probabilistic approach are: (i) the outcome is expressed as an (estimated) likelihood of failure, taking into account the various sources of uncertainty involved in the assessment; (ii) sensitivity analysis can be conducted to identify key factors that affect the outcome (the results are useful to improve the... [Pg.2416]
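To make the contrast concrete, the following sketch (with hypothetical lognormal capacity and demand distributions, invented purely for illustration) computes both a deterministic factor of safety from central estimates and a probabilistic likelihood of failure:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical lognormal distributions for capacity and demand
capacity = rng.lognormal(mean=np.log(100.0), sigma=0.2, size=n)
demand = rng.lognormal(mean=np.log(60.0), sigma=0.3, size=n)

# Deterministic view: a single factor of safety from central estimates
fos = np.median(capacity) / np.median(demand)

# Probabilistic view: estimated likelihood that demand exceeds capacity
p_fail = np.mean(capacity < demand)

print(f"factor of safety ~ {fos:.2f}")
print(f"P(failure) ~ {p_fail:.4f}")
```

A sensitivity analysis in this setting amounts to varying the spread (sigma) of each input and observing the effect on the estimated failure probability, which identifies the key contributors to the outcome.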

In conclusion, if no substance-specific data are available, it is recommended as a default to correct for differences in metabolic size (differences in body size between humans and experimental animals) by using allometric scaling based on the caloric requirement approach (see Table 5.4). The assessment factor accounting for remaining interspecies differences should preferentially be described probabilistically as suggested by Vermeire et al. (1999, 2001) and KEMI (2003), or a deterministic default factor of 2.5 could be used for extrapolation of data from rat studies to the human situation. [Pg.243]
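A small sketch of the caloric requirement (allometric) scaling step. The body weights used here are assumptions, taken from commonly quoted conventions (70 kg human, 0.25 kg rat, 0.03 kg mouse), and the exponent of 0.75 reflects the usual assumption that caloric demand scales with body weight to the power 0.75:

```python
def allometric_factor(bw_human_kg: float, bw_animal_kg: float,
                      exponent: float = 0.75) -> float:
    """Interspecies scaling factor for a dose expressed per kg body weight,
    assuming metabolic (caloric) rate scales as body weight ** exponent."""
    return (bw_human_kg / bw_animal_kg) ** (1.0 - exponent)

# Assumed default body weights (illustrative conventions, not mandated values)
for species, bw in {"rat": 0.25, "mouse": 0.03}.items():
    print(f"{species} -> human: factor ~ {allometric_factor(70.0, bw):.1f}")
```

With these assumed weights the factor works out to about 4 for rat-to-human and about 7 for mouse-to-human, consistent with commonly cited allometric scaling defaults.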

In conclusion, the assessment factor for interindividual variability should preferentially be described probabilistically. However, at present there is no database-derived distribution of the interindividual factor, and thus a deterministic default factor of 10, split evenly into sub-factors of 3.16 each for toxicokinetics and toxicodynamics, is recommended in order to account for the interindividual variability in the human population. Alternatively, the pathway-related UF approach suggested by Renwick and Lazarus (1998) and further developed as reviewed by Dorne et al. (2005) could be applied in case the pathway(s) of the metabolism of the chemical in humans... [Pg.260]
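The even split of the default factor of 10 is multiplicative, which is why each sub-factor is the square root of 10 rather than 5:

```python
import math

overall_uf = 10.0

# Even multiplicative split between toxicokinetics and toxicodynamics
tk = td = math.sqrt(overall_uf)

print(f"TK sub-factor: {tk:.2f}")
print(f"TD sub-factor: {td:.2f}")
print(f"product:       {tk * td:.1f}")
```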

As described in detail in this book, the use of assessment factors is an established practice in chemical risk assessment to account for uncertainties inherent in the hazard (effects) assessment and, consequently, in the risk assessment. The use of assessment factors to address this uncertainty is part of the conventional approach that has developed over the years. According to the current risk assessment paradigm, the usual approach is simply to multiply these individual assessment factors in order to establish an overall composite numerical assessment factor (Section 5.10). An alternative to the traditional assessment factor approach is to combine estimates of the ranges that these factors may encompass through a probabilistic assessment; this is essentially a variation of the standard paradigm. [Pg.349]

Because of the variation in assessment factors and their uncertainty (lightning characteristics, attachment locations, damage models, etc.), a probabilistic or statistical approach can be used to express a relationship for the probability of failure of a critical chemical equipment item. For a single equipment item, this relationship can be expressed as ... [Pg.929]

Two major approaches can be cited to characterize human error: the probabilistic and the causal one (Rouse and Rouse 1983). Typically, the probabilistic approach is pursued by reliability engineers and human factors specialists trying to measure human reliability in terms of the same features as those of equipment, to give an estimate of the reliability of the system as a whole (Adams 1982). Human failure rates for particular types of tasks and procedures serve as an input for human risk assessment. In contrast, the causal approach to characterizing human error is based on the premise that errors are rarely random and, in fact, can be traced back to causes and contributing factors in order to propose improvements. [Pg.111]

Approaches for aggregating exposure for simple scenarios have been proposed in the literature (Shurdut et al., 1998; Zartarian et al., 2000). The USEPA's National Exposure Research Laboratory has developed the Stochastic Human Exposure and Dose Simulation (SHEDS) model for pesticides, which can be characterized as a first-generation aggregation model; the developers conclude that to refine and evaluate the model for use as a regulatory decision-making tool for residential scenarios, more robust data sets are needed for human activity patterns, surface residues for the most relevant surface types, and cohort-specific exposure factors (Zartarian et al., 2000). The SHEDS framework was used by the USEPA to conduct a probabilistic exposure assessment for the specific exposure scenario of children contacting chromated copper arsenate (CCA)-treated playsets and decks (Zartarian et al., 2003). [Pg.373]

The following sections in this paper are structured as follows. Section 2 describes the basic ideas and principles of the Bayesian updating approach and explains how this approach can be used for the drilling jar selection case. In Section 3 we present the main features of the extended analysis, emphasising the assessment of the uncertainty factors that extends beyond the probabilistic analysis. In Section 4 we outline how the total uncertainty picture can be presented for the jar case. Section 5 provides some conclusions. [Pg.792]

The remainder of the article is organized as follows. In Section 2, the β-factor model, a typical common cause failure (CCF) factor model, is briefly introduced. Uncertainties introduced by β factors are discussed and serve as motivation for using modified models to deal with dependencies in PRA. The D-S evidence theory and evidential networks are briefly introduced in Section 3. The EN-based approach is then discussed in detail in Section 4. In Section 5, part of a practical probabilistic risk assessment is analyzed using the proposed approach. Conclusions are presented at the end. [Pg.1422]



