Big Chemical Encyclopedia


Probability assessment historic data

Typical events that are considered are fire, explosion, ship collision, and the failure of pressurized storage vessels, for which historical data established the failure frequencies. Assessment of consequences was based partly on conservative treatment of past experience. For example, the assessment of the number of casualties from the release of a toxic material was based on past history, conditioned by knowledge of the toxicology and the prevailing weather conditions. An alternative approach used fault trees to estimate probabilities and identify the consequences. Credit is taken in this process for preventive measures in design, operation, and maintenance procedures. Historical data provide the reliability expected from plant components and humans. [Pg.433]

The hazard frequency assessment includes two parts: temporal frequency and spatial frequency. Temporal frequency is the likelihood of rockfall occurrence per unit time period, i.e. one year. It indicates the rock instability or failure in the source area. Temporal frequency can be inferred from a historical catalogue or by using a relative rock failure rating system (Jaboyedoff et al., 2005). Distribution laws have been proposed for rockfalls based on statistical analysis of historical data sets to derive their recurrence probability (Dussauge et al., 2002, 2003). [Pg.53]
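Inferring temporal frequency from a historical catalogue is commonly done by treating event counts as a Poisson process. The sketch below illustrates the idea with hypothetical catalogue numbers (12 events over 40 years); the function names and data are illustrative, not from the cited studies.

```python
import math

def annual_rate(n_events, years):
    """Maximum-likelihood Poisson rate estimated from a historical catalogue."""
    return n_events / years

def prob_at_least_one(rate, horizon_years=1.0):
    """P(one or more events within the horizon) under a Poisson model."""
    return 1.0 - math.exp(-rate * horizon_years)

# Hypothetical catalogue: 12 rockfalls recorded over 40 years of observation
lam = annual_rate(12, 40)       # 0.3 events per year
p = prob_at_least_one(lam)      # probability of at least one event next year
print(f"rate = {lam:.2f}/yr, P(>=1 event in a year) = {p:.3f}")
```

The same rate estimate feeds directly into recurrence-probability statements of the kind derived from the statistical analyses mentioned above.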

One of the concerns during Human and Organizational Factor (HOF) assessment in process industries is the absence of a database for HOF quantification. An effective HOF assessment therefore requires an authentic source for quantification. In this work, it was decided to rely on data from past accidents for HOF quantification. The IEC Standard (2009, p. 15) also illustrates the use of relevant historical data to predict the probability of occurrence of future events. [Pg.997]

Risk assessments commonly use frequencies of observed events (i.e. historical data) as a major part of the basis for the risk assessment. Deciding whether available historical data are relevant for the future situation may be challenging. When data are lacking, or when we believe the past is not relevant for assessing the future, we turn to experts who possess knowledge that we consider relevant and want to apply in the assessments. We ask the experts to provide their opinion, and quantify degrees of belief using subjective probabilities. [Pg.1440]
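One standard way to combine an expert's subjective probability with sparse historical data is a conjugate Bayesian update: encode the degree of belief as a Beta prior and update it with observed demands and failures. The prior parameters and counts below are hypothetical, chosen only to illustrate the mechanics.

```python
def beta_update(alpha, beta, failures, trials):
    """Conjugate update of a Beta(alpha, beta) prior with binomial failure data."""
    return alpha + failures, beta + (trials - failures)

def beta_mean(alpha, beta):
    """Mean of a Beta distribution, used as the point estimate of failure probability."""
    return alpha / (alpha + beta)

# Hypothetical: expert belief encoded as Beta(1, 99), i.e. a mean failure
# probability of 0.01; then 30 historical demands with 1 observed failure.
a, b = beta_update(1, 99, failures=1, trials=30)
print(f"posterior mean failure probability = {beta_mean(a, b):.4f}")
```

The posterior smoothly shifts from the expert's opinion toward the historical frequency as more data accumulate, which is exactly the trade-off the paragraph above describes.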

The risk probability is assessed based on input from analysts at the transferring site, the receiving site's experience with methods and products, and historical method performance, e.g. method/product process capability data, stability trends, OOS history, etc. (Raska et al., 2010). [Pg.35]

Probably the first major publication of a process model for the autoclave curing process is one by Springer and Loos [14]. Their model is still the basis, in structure if not in detail, for many autoclave cure models. There is little information about results obtained by the use of this model, only instructions on how to use it for trial-and-error cure cycle development. Lee [16], however, used a very similar model, modified to run on a personal computer, to do a parametric study on variables affecting the autoclave cure. A cure model developed by Pursley was used by Kays in parametric studies for thick graphite epoxy laminates [18]. Quantitative data on the reduction in cure cycle time obtained by Kays were not available, but he did achieve about a 25 percent reduction in cycle time for thick laminates relative to historical experience. A model developed by Dave et al. [17] was used to do parametric studies and develop general rules for the prevention of voids in composites. Although the value of this sort of information is difficult to assess, especially without production trials, there is a potential impact on rejection rates. [Pg.455]

The X chart considers only the current data value in assessing the status of the process. Run rules have been developed to include historical information such as trends in the data. The run rules sensitize the chart, but they also increase the false alarm probability. The warning limits are useful in developing additional run rules to increase the sensitivity of Shewhart charts. The warning limits are established at the 2-sigma level, which corresponds to α/2 = 0.02275. Hence,...
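The limits and the quoted tail probability can be sketched directly: the 2-sigma warning limits leave a one-sided normal tail probability of 0.02275, and a typical run rule (here, 2 of 3 consecutive points beyond the same warning limit — one of the classic Western Electric rules, shown as an illustrative choice) brings historical information into the chart at the cost of a higher false alarm rate.

```python
from math import erf, sqrt

def shewhart_limits(mean, sigma):
    """3-sigma control limits and 2-sigma warning limits for an X chart."""
    return {
        "UCL": mean + 3 * sigma, "LCL": mean - 3 * sigma,
        "UWL": mean + 2 * sigma, "LWL": mean - 2 * sigma,
    }

def tail_beyond(z):
    """One-sided standard normal tail probability P(Z > z)."""
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

def run_rule_2of3(points, mean, sigma):
    """Flag indices where 2 of 3 consecutive points fall beyond the same
    2-sigma warning limit (an example run rule, assumed for illustration)."""
    alarms = []
    for i in range(2, len(points)):
        window = points[i - 2:i + 1]
        high = sum(x > mean + 2 * sigma for x in window)
        low = sum(x < mean - 2 * sigma for x in window)
        if high >= 2 or low >= 2:
            alarms.append(i)
    return alarms

print(f"P(Z > 2) = {tail_beyond(2):.5f}")  # 0.02275, the alpha/2 value above
```

Each extra run rule adds its own false alarm probability on top of the base 3-sigma rule, which is why sensitizing a chart always trades off against more false alarms.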

If it is available, historical information such as records of runup heights, tide gauge records and reports of observed tsunamis and the damage they caused should be used to assess the validity of the computational methods used for determining the near shore effects of tsunamis. The justification of the analytical methods presented for determining the probable maximum tsunamis should be supported to the extent possible by evidence of satisfactory agreement with data from observations, but in any case the results should be demonstrated to be conservative. [Pg.55]

As with any quantitative risk assessment technique, it is important that where probabilities or frequencies are assigned numerical values, these values are supported by evidence. Wherever possible, historical performance data should be gathered to support the assumptions made. Where literature sources are used, analysts should justify their use as part of the LOPA report. [Pg.94]
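The arithmetic behind those numerical assignments in a LOPA is simple: the mitigated scenario frequency is the initiating event frequency multiplied by the probability of failure on demand (PFD) of each independent protection layer. The scenario and PFD values below are hypothetical placeholders, standing in for the evidence-backed numbers the paragraph calls for.

```python
def mitigated_frequency(initiating_freq, pfds):
    """LOPA: scenario frequency = initiating event frequency x product of IPL PFDs."""
    f = initiating_freq
    for pfd in pfds:
        f *= pfd
    return f

# Hypothetical scenario: initiating event at 0.1/yr, with an operator
# alarm-and-response layer (PFD 0.1) and a relief valve (PFD 0.01);
# all values assumed for illustration only.
f = mitigated_frequency(0.1, [0.1, 0.01])
print(f"mitigated frequency = {f:.1e} per year")
```

Because the result scales multiplicatively with every PFD, an unsupported optimistic value for any single layer distorts the final frequency by the same factor, which is why each input must be justified in the report.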



© 2024 chempedia.info