Big Chemical Encyclopedia


Probability event and

The addition of planning tools for specific internal and external emergency scenarios that could affect the hospital. These tools target probable events and assist with integration with community agencies as well. The tools include checklists, forms, and so forth. [Pg.140]

A particular kind of electronic relaxation process is electron transfer. In this case (see Chapter 16) the electronic transition is associated with a large rearrangement of the charge distribution and consequently a pronounced change of the nuclear configuration, which translates into a large reorganization energy λ. Nuclear tunneling in this case is a very low-probability event, and room-temperature electron transfer is usually treated as an activated process. [Pg.443]

We see that the Geometric Test outperforms the other two tests for all combinations of probability vectors for sample sizes of 5 and 10. Beyond a sample size of 10 the Fisher Test, in particular, typically outperforms the Geometric Test. However, in this paper we are interested in situations where we have small probabilities and possibly few data, and in this case the Geometric Test gives an improvement, albeit small, over the other two tests. When combined with its superior Type I error probabilities for the trinomial distribution with low-probability events and this new Geomet-... [Pg.1899]

Currently, there are three DBAs that appear enveloping and could potentially limit reactor power: the large break loss-of-coolant accident (LBLOCA) and the loss of pumping accident (LOPA), which are judged to be very low probability events, and the loss of septifoil cooling accident, which has occurred twice during the life of the SRS reactors. [Pg.568]

To evaluate the probability that one event and another will occur, we multiply the probabilities of the individual events. Thus the probability... [Pg.44]
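As a small numerical illustration of this multiplication rule for independent events (an added example, not taken from the cited source):

```latex
P(A \text{ and } B) = P(A)\,P(B),
\qquad \text{e.g. } P(A)=0.5,\; P(B)=0.2 \;\Rightarrow\; P(A \text{ and } B)=0.10
```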

Let us consider an event that must have one of two outcomes. It must either occur with probability p₁ or fail to occur with probability p₀. Since these are exclusive events and the probability that something will happen is unity, it follows that... [Pg.822]
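The omitted conclusion presumably states that these complementary probabilities sum to unity; a hedged reconstruction of that relation, using the subscript notation the snippet appears to employ:

```latex
p_1 + p_0 = 1 \quad\Longrightarrow\quad p_0 = 1 - p_1
```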

Layer of protection analysis (LOPA) is a simplified form of event tree analysis. Instead of analyzing all accident scenarios, LOPA selects a few specific scenarios as representative, or boundary, cases. LOPA uses order-of-magnitude estimates, rather than specific data, for the frequency of initiating events and for the probability that the various layers of protection will fail on demand. In many cases, the simplified results of a LOPA provide sufficient input for deciding whether additional protection is necessary to reduce the likelihood of a given accident type. LOPAs typically require only a small fraction of the effort required for detailed event tree or fault tree analysis. [Pg.37]
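A minimal sketch of the order-of-magnitude arithmetic described here: the mitigated scenario frequency is the initiating-event frequency multiplied by the probability of failure on demand (PFD) of each independent protection layer. The layer names, frequencies, PFDs, and tolerable-frequency target below are hypothetical, chosen only for illustration:

```python
# Order-of-magnitude LOPA arithmetic (all values hypothetical).
initiating_event_freq = 1e-1   # initiating events per year

# Probability of failure on demand (PFD) for each independent protection layer.
protection_layer_pfds = {
    "basic process control system": 1e-1,
    "safety instrumented function": 1e-2,
    "relief valve": 1e-2,
}

# The scenario is realized only if every layer fails on demand.
mitigated_freq = initiating_event_freq
for pfd in protection_layer_pfds.values():
    mitigated_freq *= pfd

print(f"Mitigated scenario frequency: {mitigated_freq:.0e} per year")

# Compare against a tolerable frequency target to decide whether more protection is needed.
tolerable_freq = 1e-5
print("Additional protection needed:", mitigated_freq > tolerable_freq)
```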

Frequency Phase 3: Use Branch Point Estimates to Develop a Frequency Estimate for the Accident Scenarios. The analysis team may choose to assign frequency values for initiating events and probability values for the branch points of the event trees without drawing fault tree models. These estimates are based on discussions with operating personnel, review of industrial equipment failure databases, and review of human reliability studies. This allows the team to provide initial estimates of scenario frequency and avoids the effort of the detailed analysis (Frequency Phase 4). In many cases, characterizing a few dominant accident scenarios in a layer of protection analysis will provide adequate frequency information. [Pg.40]

A logic model that graphically portrays the range of outcomes from the combinations of events and circumstances in an accident sequence. For example, a flammable vapor release may result in a fire, an explosion, or no consequence, depending on meteorological conditions, the degree of confinement, the presence of ignition sources, etc. These trees are often shown with the probability of each outcome at each branch of the pathway... [Pg.76]
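A minimal sketch of how outcome frequencies fall out of such a tree for the flammable-vapor-release example; the branch structure and all numbers are hypothetical, not taken from the source:

```python
# Hypothetical event tree for a flammable vapor release (illustrative numbers only).
release_freq = 1e-3          # releases per year
p_ignition = 0.2             # probability the release ignites
p_confined = 0.3             # probability the ignited release is confined

outcomes = {
    "explosion":      release_freq * p_ignition * p_confined,
    "fire":           release_freq * p_ignition * (1 - p_confined),
    "no consequence": release_freq * (1 - p_ignition),
}

for outcome, freq in outcomes.items():
    print(f"{outcome:15s} {freq:.2e} per year")

# The branch frequencies must sum back to the initiating frequency.
assert abs(sum(outcomes.values()) - release_freq) < 1e-12
```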

A fault tree is a graphical form of a Boolean equation, but the probability of the top event (and lesser events) can be found by substituting failure rates and probabilities for these two-state events. The graphical fault tree is prepared for computer or manual evaluation by "pruning" it of less significant events to focus on more significant events. Even pruned, the tree may be so large that it is intractable and needs division into subtrees for separate evaluations. If this is done, care must be taken to ensure that no information, such as interconnections between subtrees, is lost. [Pg.111]
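A minimal sketch of how a top-event probability can be computed from basic-event probabilities through AND/OR gates, assuming independent two-state events; the tree shape and numbers are invented for illustration:

```python
# Gate arithmetic for independent basic events (illustrative tree only).
def and_gate(*probs):
    """All inputs must fail: multiply the probabilities."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    """Any one input failing suffices: complement of all inputs succeeding."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Basic-event probabilities (hypothetical).
pump_fails, valve_sticks, power_lost, backup_fails = 1e-3, 5e-4, 1e-2, 1e-1

# Top event: loss of cooling = (pump fails OR valve sticks) OR (power lost AND backup fails)
top = or_gate(or_gate(pump_fails, valve_sticks), and_gate(power_lost, backup_fails))
print(f"Top event probability: {top:.2e}")
```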

Function event trees are developed to represent the plant's response to each initiator. The function event tree is not an end product; it is an intermediate step that provides a baseline of information and permits a stepwise approach to sorting out the complex relationships between potential initiating events and the response of the mitigating features. They structure plant responses to accident conditions - possibly as time sequences. The transition labels of function event trees (usually along the top of the event tree) are analyzed to provide the probability of that function occurring or not occurring. [Pg.113]

System event trees connect to the containment event tree with a large number of sequences formed by different combinations of events. The number of sequences is made tractable by grouping sequences in release categories - a process called binning. Two approaches to binning are (1) probability screening and (2) plant-damage bins. [Pg.118]

Appendix III of WASH-1400 presents a database from 52 references that were used in the study. It includes raw data, notes on test and maintenance time and frequency, human-reliability estimates, aircraft-crash probabilities, frequency of initiating events, and information on common-cause failures. Using this information, it assesses the range for each failure rate. [Pg.153]

PREPROCESSOR - Modifies the FTAP punch file output for common cause and dependent analyses (conditional probabilities), removes complemented events, and corrects for mutually exclusive events. [Pg.239]

Uncertainty estimates are made for the total CDF by assigning probability distributions to basic events and propagating the distributions through a simplified model. Uncertainties are assumed to be either log-normal or "maximum entropy" distributions. Chi-squared confidence interval tests are used at 50% and 95% of these distributions. The simplified CDF model includes the dominant cutsets from all five contributing classes of accidents, and is within 97% of the CDF calculated with the full Level 1 model. [Pg.418]
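A hedged sketch of the kind of propagation described: basic-event probabilities sampled from lognormal distributions (parameterized here by a median and an error factor, an assumed convention) and pushed through a simplified sum-of-cutsets model. The cutsets and all numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # Monte Carlo samples

# Basic events: (median probability, error factor); values are hypothetical.
basic_events = {
    "A": (1e-3, 3.0),
    "B": (5e-4, 10.0),
    "C": (2e-2, 3.0),
}

# Sample each basic event from a lognormal defined by its median and error factor.
samples = {}
for name, (median, ef) in basic_events.items():
    sigma = np.log(ef) / 1.645  # EF taken as the 95th/50th percentile ratio
    samples[name] = rng.lognormal(mean=np.log(median), sigma=sigma, size=N)

# Simplified model: CDF approximated as the sum of dominant cutsets (rare-event approximation).
cdf = samples["A"] * samples["C"] + samples["B"]

print("50th percentile:", np.percentile(cdf, 50))
print("95th percentile:", np.percentile(cdf, 95))
```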

The analysis set out above demonstrates the importance of a comprehensive evaluation of the human aspects of a hazardous operation, from the point of view of identifying all contributory events and recovery possibilities. It also indicates the need for a complete evaluation of the operational conditions (procedures, training, manning levels, labeling, etc.) which could impact on these probabilities. [Pg.207]

In this study detailed fault trees with probability and failure rate calculations were generated for the events (1) Fatality due to Explosion, Fire, Toxic Release or Asphyxiation at the Process Development Unit (PDU) Coal Gasification Process and (2) Loss of Availability of the PDU. The fault trees for the PDU were synthesized by Design Sciences, Inc., and then subjected to multiple reviews by Combustion Engineering. The steps involved in hazard identification and evaluation, fault tree generation, probability assessment, and design alteration are presented in the main body of this report. The fault trees, cut sets, failure rate data and unavailability calculations are included as attachments to this report. Although both safety and reliability trees have been constructed for the PDU, the verification and analysis of these trees were not completed as a result of the curtailment of the demonstration plant project. Certain items not completed for the PDU risk and reliability assessment are listed. [Pg.50]

In April 1982, a data workshop was held to evaluate, discuss, and critique data in order to establish a consensus generic data set for the USNRC-RES National Reliability Evaluation Program (NREP). The data set contains component failure rates and probability estimates for loss of coolant accidents, transients, loss of offsite power events, and human errors that could be applied consistently across the nuclear power industry as screening values for initial identification of dominant accident sequences in PRAs. This data set was used in the development of guidance documents for the performance of PRAs. [Pg.82]

Societal Risk - This represents a measure of the risk to a group of people, including the risk of incidents potentially affecting more than one person. Individual risk (see above) is generally not significantly affected by the number of people involved in an incident. The risk to a person at a particular location depends on the probability of occurrence of the hazardous event, and on the probability of an adverse impact at that location should the event occur. [Pg.515]
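Written as a formula (a standard form consistent with the text, not a quotation from the source), the individual risk at a location x combines the two probabilities mentioned:

```latex
\mathrm{IR}(x) \;=\; f_{\text{event}} \times P(\text{adverse impact at } x \mid \text{event})
```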

Hazard, risk, failure, and reliability are interrelated concepts concerned with uncertain events and therefore amenable to quantitative measurement via probability. "Hazard" is defined as a potentially dangerous event, for example the release of toxic fumes, a power outage, or a pump failure. Actualization of the potential danger represented by a hazard results in undesirable consequences associated with risk. [Pg.541]

Risk is defined as the product of two factors: (1) the probability of an undesirable event and (2) the measured consequences of the undesirable event. Measured consequences may be stated in terms of financial loss, injuries, deaths, or other variables. Failure represents an inability to perform some required function. Reliability is the probability that a system or one of its components will perform its intended function under certain conditions for a specified period. The reliability of a system and its probability of failure are complementary in the sense that the sum of these two probabilities is unity. This chapter considers basic concepts and theorems of probability that find application in the estimation of risk and reliability. [Pg.541]
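The two defining relations in this passage can be written out as follows (standard forms, with R the reliability and F the probability of failure):

```latex
\text{Risk} = P(\text{undesirable event}) \times C(\text{consequences}),
\qquad R + F = 1
```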

After defining fundamental terms used in probability and introducing set notation for events, we consider probability theorems facilitating the calculation of the probabilities of complex events. Conditional probability and the concept of independence lead to Bayes' theorem and the means it provides for revision of probabilities on the basis of additional evidence. Random variables, their probability distributions, and expected values provide the means... [Pg.541]
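For reference, the revision rule the passage alludes to is Bayes' theorem in its usual form, with the A_j a set of mutually exclusive and exhaustive events:

```latex
P(A_i \mid B) \;=\; \frac{P(B \mid A_i)\, P(A_i)}{\sum_{j} P(B \mid A_j)\, P(A_j)}
```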

Hazard, risk, failure, and reliability are interrelated concepts concerned with uncertain events and therefore amenable to quantitative measurement via probability. [Pg.566]

This chapter is concerned with special probability distributions and techniques used in calculations of reliability and risk. Theorems and basic concepts of probability presented in Chapter 19 are applied to the determination of the reliability of complex systems in terms of the reliabilities of their components. The relationship between reliability and failure rate is explored in detail. Special probability distributions for failure time are discussed. The chapter concludes with a consideration of fault tree analysis and event tree analysis, two special techniques that figure prominently in hazard analysis and the evaluation of risk. [Pg.571]
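The reliability-failure rate relationship referred to here takes the standard form shown below; the constant-failure-rate (exponential) case is the usual special case treated in such chapters:

```latex
R(t) = \exp\!\left(-\int_0^t \lambda(\tau)\, d\tau\right)
\quad\xrightarrow{\ \lambda\ \text{constant}\ }\quad
R(t) = e^{-\lambda t}
```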

Figure 21.1.3 depicts the risk to society in terms of the annual probability of N or more deaths as a result of the occurrence of incident I or incident II. Note that the scales in Fig. 21.1.3 are logarithmic. The plotted probabilities are obtained by summing the probabilities of the events resulting in N or more deaths for N = 0, 3, 6, 13. Table 21.3.1 lists these events and probabilities. [Pg.613]
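A small sketch of how such a societal-risk (F-N) curve is assembled: for each N, sum the annual probabilities of all events resulting in N or more deaths. The event list below is invented for illustration; Table 21.3.1 is not reproduced here:

```python
# Hypothetical incident outcomes: (annual probability, number of deaths).
events = [
    (1e-2, 0),
    (3e-3, 3),
    (8e-4, 6),
    (1e-4, 13),
]

# F-N curve: annual probability of N or more deaths, for each N of interest.
for n in (0, 3, 6, 13):
    p_n_or_more = sum(p for p, deaths in events if deaths >= n)
    print(f"P(>= {n:2d} deaths) = {p_n_or_more:.2e} per year")
# Plotted on log-log axes, these points form the societal-risk (F-N) curve.
```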

In mathematics, Laplace s name is most often associated with the Laplace transform, a technique for solving differential equations. Laplace transforms are an often-used mathematical tool of engineers and scientists. In probability theory he invented many techniques for calculating the probabilities of events, and he applied them not only to the usual problems of games but also to problems of civic interest such as population statistics, mortality, and annuities, as well as testimony and verdicts. [Pg.702]

