Big Chemical Encyclopedia


Probabilistic scenarios

When applying network design models to humanitarian and health logistics applications, there are many unknowns, and obtaining accurate estimates for the model parameters can be quite difficult. Proper quantification of the uncertainty in the model parameters, together with a framework for incorporating that uncertainty into the model, is therefore extremely important. In the military humanitarian logistics example, a stochastic optimization approach was used to incorporate probabilistic scenarios. However, additional work is needed in both examples presented, particularly on estimating the "demand"-related parameters. For example, in the health... [Pg.184]
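The scenario-based stochastic optimization idea mentioned above can be sketched as follows. This is a minimal illustration only; the costs, demand scenarios, and probabilities are hypothetical, not values from the cited example.

```python
# Minimal sketch of stochastic optimization over probabilistic scenarios:
# choose a stock level that minimizes expected cost over demand scenarios,
# trading holding cost against shortage cost. All numbers are assumed.
HOLD_COST = 1.0    # cost per unit stocked but unused (assumed)
SHORT_COST = 10.0  # cost per unit of unmet demand (assumed)

# (demand, probability) scenarios -- probabilities sum to 1
scenarios = [(100, 0.5), (200, 0.3), (400, 0.2)]

def expected_cost(stock):
    """Expected cost of a stock decision, averaged over all scenarios."""
    total = 0.0
    for demand, prob in scenarios:
        leftover = max(stock - demand, 0)
        shortage = max(demand - stock, 0)
        total += prob * (HOLD_COST * leftover + SHORT_COST * shortage)
    return total

# Enumerate candidate stock levels and pick the minimizer
best = min(range(0, 401, 50), key=expected_cost)
```

With these (assumed) costs, the high shortage penalty pushes the optimal decision toward covering even the low-probability high-demand scenario.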

The toxic effects model translates the exposure profiles into casualty probabilities for the personnel, assuming a probabilistic dose-effect relationship. Casualty levels and spectra can be obtained for various types of health effects (e.g. eye effects, inhalation, percutaneous), subdivided into two levels (incapacitating and lethal), and for various protection levels (e.g. no protection, suit only, mask only, mask and suit, and collective protection). Table 1 gives a typical result for one scenario. When no protection is used, 63% of the population dies due to inhalation of sarin and 25% dies due to percutaneous exposure. Clearly, when both mask and suit are worn, the casualty levels drop drastically. [Pg.68]
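A probabilistic dose-effect relationship of the kind described is often expressed as a log-probit model. The sketch below shows the general shape of such a calculation; the slope and median lethal dose are purely illustrative placeholders, not real sarin toxicity parameters, and this is not necessarily the model used in the cited work.

```python
# Hedged sketch of a probabilistic dose-effect relationship using a
# log-probit model. Parameter values are illustrative only.
from math import erf, log, sqrt

def casualty_probability(dose, ld50, probit_slope):
    """Probability of the effect at a given dose under a log-probit model.

    At dose == ld50 the probability is exactly 0.5 by construction.
    """
    z = probit_slope * (log(dose) - log(ld50))
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Illustrative values only (not real toxicity data)
p = casualty_probability(dose=150.0, ld50=100.0, probit_slope=2.0)
```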

The event tree is useful for providing scenarios of possible failure modes. If quantitative data are available, an estimate can be made of the failure frequency; this is used most successfully to modify the design to improve safety. The difficulty is that for most real processes the method can be extremely detailed, resulting in a huge event tree. If a probabilistic computation is attempted, data must be available for every safety function in the event tree. [Pg.491]
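The quantitative event-tree computation amounts to multiplying branch probabilities along each path from the initiating event. A minimal sketch, with an assumed two-function tree and hypothetical failure probabilities:

```python
# Quantitative event tree (illustrative): an initiating event is followed
# by two safety functions, each of which works or fails with an assumed
# probability. Outcome frequencies are products along each branch.
INITIATING_FREQ = 1e-2   # initiating events per year (assumed)
P_ALARM_FAILS = 0.1      # demand failure probability of alarm (assumed)
P_SHUTDOWN_FAILS = 0.05  # demand failure probability of shutdown (assumed)

outcomes = {
    "safe (alarm works)":
        INITIATING_FREQ * (1 - P_ALARM_FAILS),
    "safe (alarm fails, shutdown works)":
        INITIATING_FREQ * P_ALARM_FAILS * (1 - P_SHUTDOWN_FAILS),
    "release (both fail)":
        INITIATING_FREQ * P_ALARM_FAILS * P_SHUTDOWN_FAILS,
}
```

The outcome frequencies necessarily sum back to the initiating-event frequency, which is a useful consistency check when the tree grows large.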

This approach is proposed by Chen/Lee (2004) to reach more robust solutions by considering probabilistic scenarios (in this case for demand quantity) without considering price uncertainty. [Pg.247]

The US Environmental Protection Agency (USEPA 1998) describes problem formulation as an iterative process with 4 main components: integration of available information, definition of assessment endpoints, definition of the conceptual model, and development of an analysis plan. These 4 components also apply to probabilistic assessments. In addition, it is useful to emphasize the importance of a 5th component: definition of the assessment scenarios. The relationships between all 5 components are depicted in Figure 2.1. Note that the bidirectional arrows represent the interdependency of the different components and imply that they may need to be revised iteratively as the formulation of the problem is refined. [Pg.11]

From the standpoint of practical regulatory assessment, it would be desirable to reach a consensus on the selection of methods for routine use in pesticide risk assessments, while recognizing that there may be scientific reasons for preferring alternative methods in particular cases. Such a consensus does not yet exist. Further case studies are required, covering a range of contrasting pesticides and scenarios, to evaluate the available methods more fully. While a consensus is lacking, it is important that reports on probabilistic assessments clearly explain how their methods work and why they were selected. [Pg.24]

The use of uncertainty analysis and probabilistic methods requires systematic and detailed formulation of the assessment problem. To facilitate this: (a) risk assessors and risk managers should be given training in problem formulation; (b) tools to assist appropriate problem formulation should be developed; and (c) efforts should be made to develop generic problem formulations (including assessment scenarios, conceptual models, and standard datasets), which can be used as a starting point for assessments of particular pesticides. [Pg.173]

According to IPCS [18], an exposure model is a conceptual or mathematical representation of the exposure process, designed to reflect real-world human exposure scenarios and processes. There are many different ways to classify exposure models. A consensus appears to be developing around the following classification scheme proposed by the World Health Organization [19], which has been adopted in this chapter: (a) mechanistic or empirical, and (b) deterministic or stochastic (probabilistic). Table 1 lists these model categories. However, alternative classifications may be considered as well. [Pg.264]

As with the shifting probability p in the probabilistic shifting scheme, the threshold T is also an application-specific parameter, whose value affects the tradeoff between the shifting cost (unicast) and the rekey cost (multicast). The tradeoff is difficult to analyze and may vary across application scenarios. Thus, we evaluate the choice of different thresholds through simulation, as discussed in Section 4.2. [Pg.14]
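The simulation-based evaluation of T can be sketched with a toy model. The cost model below is assumed for illustration (it is not the cited paper's protocol): each event either triggers an immediate cheap unicast shift with probability p, or is batched until T pending events force a multicast rekey.

```python
# Toy simulation of the threshold tradeoff: unicast shifts vs. multicast
# rekeys as a function of threshold T. Cost model and parameters assumed.
import random

def simulate(T, n_events, shift_prob=0.5, seed=0):
    """Return (unicast_ops, multicast_ops) for a given threshold T."""
    rng = random.Random(seed)
    pending = 0
    unicast = multicast = 0
    for _ in range(n_events):
        if rng.random() < shift_prob:
            unicast += 1      # probabilistic shift: cheap unicast
        else:
            pending += 1      # batch the event instead
        if pending >= T:      # threshold reached: expensive multicast rekey
            multicast += 1
            pending = 0
    return unicast, multicast

# Sweep thresholds to expose the tradeoff
results = {T: simulate(T, 10_000) for T in (1, 5, 20)}
```

Larger T amortizes more events per multicast rekey at the price of keeping changes pending longer, which is exactly the tradeoff the simulation in Section 4.2 is meant to quantify.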

An uncertainty analysis gives the assessor the opportunity to re-evaluate the scenario, model approaches, and parameters of the analysis, and to consider their influence on the overall analysis. The practical impact of uncertainty analysis is illustrated in the annexed case studies, which also clarify how uncertainty analyses follow a systematic methodology, based on a tiered approach, and consider all possible sources of uncertainty. The first step in uncertainty analysis consists of a screening, followed by a qualitative analysis and two levels of quantitative analysis, using deterministic and probabilistic data. The assessor should be aware that an uncertainty analysis cannot answer all questions and may, moreover, lead to new ones. [Pg.84]

Ab initio and Monte-Carlo calculations. Attempts have appeared in pulse radiolysis to describe the dynamics of free electron production, recombination, and solvation on a microscopic scale [31-34]. This requires knowledge of a number of physical parameters: solvated electron and free ion yields, electron and hole mobilities, slowing-down cross-sections, localization and solvation times, etc. The movement and fate of each reactant is examined step by step in a probabilistic way, and final results are obtained by averaging a number of calculated individual scenarios. [Pg.84]
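The step-by-step probabilistic treatment and averaging over individual scenarios can be illustrated with a deliberately simplified Monte-Carlo sketch. The per-step probabilities below are arbitrary placeholders, not physical rate data; the point is only the structure: follow each particle's fate stochastically, then average over many scenarios.

```python
# Monte-Carlo sketch: follow each electron step by step until it either
# recombines or solvates, then average over many individual scenarios.
# Per-step probabilities are illustrative, not physical values.
import random

P_RECOMBINE = 0.3  # per-step recombination probability (assumed)
P_SOLVATE = 0.1    # per-step solvation probability (assumed)

def single_scenario(rng, max_steps=1000):
    """Follow one electron; return 1 if it solvates, 0 if it recombines."""
    for _ in range(max_steps):
        r = rng.random()
        if r < P_RECOMBINE:
            return 0  # lost to recombination
        if r < P_RECOMBINE + P_SOLVATE:
            return 1  # survives as a solvated electron
    return 1          # still free after max_steps; count as surviving

def solvated_yield(n_scenarios=20_000, seed=1):
    """Average the outcomes of many individual scenarios."""
    rng = random.Random(seed)
    return sum(single_scenario(rng) for _ in range(n_scenarios)) / n_scenarios

y = solvated_yield()
```

With these placeholder rates the analytic survival probability is 0.1 / (0.3 + 0.1) = 0.25, so the simulated average should converge to that value, which is a handy sanity check on the scenario-averaging machinery.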

Approaches for aggregating exposure for simple scenarios have been proposed in the literature (Shurdut et al., 1998; Zartarian et al., 2000). The USEPA's National Exposure Research Laboratory has developed the Stochastic Human Exposure and Dose Simulation (SHEDS) model for pesticides, which can be characterized as a first-generation aggregation model, and the developers conclude that to refine and evaluate the model for use as a regulatory decision-making tool for residential scenarios, more robust data sets are needed for human activity patterns, surface residues for the most relevant surface types, and cohort-specific exposure factors (Zartarian et al., 2000). The SHEDS framework was used by the USEPA to conduct a probabilistic exposure assessment for the specific exposure scenario of children contacting chromated copper arsenate (CCA)-treated playsets and decks (Zartarian et al., 2003). [Pg.373]
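The core idea of probabilistic exposure aggregation can be sketched in a few lines. This is a generic Monte-Carlo illustration, not SHEDS internals: the route list and lognormal distribution parameters are invented for the example, and per iteration each route's dose is sampled and summed to build an aggregate distribution.

```python
# Hedged sketch of aggregating exposure across routes by Monte Carlo.
# Routes and distribution parameters are illustrative, not SHEDS values.
import random

def aggregate_exposure(n=50_000, seed=2):
    """Sample per-route doses, sum them, and return the 95th percentile."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        dermal = rng.lognormvariate(mu=-2.0, sigma=0.8)         # assumed
        hand_to_mouth = rng.lognormvariate(mu=-3.0, sigma=1.0)  # assumed
        inhalation = rng.lognormvariate(mu=-4.0, sigma=0.5)     # assumed
        totals.append(dermal + hand_to_mouth + inhalation)
    totals.sort()
    return totals[int(0.95 * n)]  # high-end aggregate dose

p95 = aggregate_exposure()
```

Reporting a high-end percentile of the aggregate distribution, rather than a single deterministic sum of worst-case route values, is what distinguishes this class of probabilistic aggregation models.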

Guidance for selecting an appropriate mathematical approach for a given exposure scenario (deterministic versus probabilistic) and guidance regarding conduct of acceptable probabilistic assessments. [Pg.375]

Probabilistic tools have also been useful in obtaining more realistic results. Probabilistic risk assessment may appear less precise than the current approach, which involves deterministic methods in the calculation of at least some values. However, the practice for the assessor is to set up a scenario and evaluate it using a number of assumptions. Typically, the experimenter must use good judgment in considering the likelihood of certain events occurring, and in evaluating the probability of those events. The probability of certain situations, pathways, or scenarios has always been at the basis of the evaluation. What is new now is the availability of computerized tools to measure probability. [Pg.626]


See other pages where Probabilistic scenarios is mentioned: [Pg.304]    [Pg.61]    [Pg.65]    [Pg.67]    [Pg.70]    [Pg.300]    [Pg.162]    [Pg.169]    [Pg.170]    [Pg.3161]    [Pg.37]    [Pg.254]    [Pg.11]    [Pg.15]    [Pg.950]    [Pg.484]    [Pg.304]    [Pg.79]    [Pg.256]    [Pg.104]    [Pg.56]    [Pg.189]    [Pg.240]    [Pg.190]    [Pg.26]    [Pg.397]    [Pg.27]    [Pg.95]    [Pg.123]    [Pg.7]    [Pg.179]    [Pg.362]    [Pg.379]    [Pg.124]    [Pg.32]    [Pg.1115]
See also in source #XX -- [ Pg.65 , Pg.67 , Pg.70 , Pg.71 , Pg.72 , Pg.73 ]





© 2024 chempedia.info