Big Chemical Encyclopedia


Computer risk assessment analysis

The popularity of Monte Carlo methods for risk-based uncertainty analysis is driven in part by the fact that Monte Carlo is fundamentally easy to implement, particularly with the advent of the personal computer and graphically based software such as Crystal Ball (www.decisioneering.com) and Risk (www.palisade.com/risk.html). The availability of such software generally promotes the use of uncertainty analysis in ecological risk assessments, reducing the amount of mathematical and statistical knowledge required of the user to implement the... [Pg.54]
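The ease of implementation noted above can be illustrated with a minimal Monte Carlo sketch. The exposure model and the input distributions below are illustrative assumptions chosen for the example, not values from the text:

```python
import random
import statistics

def simulate_exposure(n_trials=10_000, seed=42):
    """Monte Carlo propagation of uncertain inputs through a simple
    exposure model: dose = concentration * intake / body_weight.
    All distributions here are hypothetical, for illustration only."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n_trials):
        concentration = rng.lognormvariate(mu=0.0, sigma=0.5)  # mg/L
        intake = rng.uniform(1.0, 3.0)                          # L/day
        body_weight = rng.gauss(70.0, 10.0)                     # kg
        doses.append(concentration * intake / body_weight)
    return doses

doses = simulate_exposure()
print("median dose:", statistics.median(doses))
# Upper percentile often reported as a conservative risk estimate
print("95th percentile:", sorted(doses)[int(0.95 * len(doses))])
```

Repeated sampling of the inputs yields a full distribution of the output, from which percentiles can be read off directly instead of reporting a single point estimate.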

The overall concept behind all of the following tools is that of risk analysis or risk assessment. Risk analysis helps to decide whether an aspect is GMP-critical or not. The risk analysis can be performed in a formal or a more informal way. Following are two popular and important types of risk analysis. Another method, fault tree analysis (FTA), has recently been used in the area of computer validation. This method is not described here, as it is a complex form of risk analysis. [Pg.488]

Written procedures shall define how the system will be used and controlled, and periodic review of these procedures and the validation documentation status must be carried out. The periodic review procedure should define responsibilities and should include predetermined criteria for reporting that computer system validation is being satisfactorily maintained in alignment with the project validation plan. A GMP risk assessment should form part of each periodic review to reconfirm (or not) the findings of the previous risk analysis and provide information for any revalidation that is considered necessary. [Pg.629]

Iman, R.L. and Conover, W.J. (1980). Small sample sensitivity analysis techniques for computer models, with an application to risk assessment. Communications in Statistics A—Theory and Methods, 9, 1749-1842. [Pg.326]

The risk assessment process can be conducted by examining record types to see if they are GxP or non-GxP, and then applying severity, likelihood, and probability-of-detection criteria, as illustrated in Figure 15.2. The most severe scenarios should be linked to direct patient/consumer impact. GxP noncompliance and broken license conditions are severe in their own right but not as critical as patient/consumer health in this analysis. The likelihood will be influenced by the degree of human error in how the record is input and used. The probability of detection needs to take into account the probability of the impacted record being used. Once failure modes are understood, the appropriate design controls can be introduced. These should be documented and validated as part of the computer system life cycle discussed earlier in this book. [Pg.359]
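The severity/likelihood/detection scoring described above can be sketched as an FMEA-style ranking. The 1-5 scales, record types, and scores below are hypothetical assumptions for illustration, not criteria from Figure 15.2:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One record-level failure mode scored FMEA-style.
    Each score runs 1 (best) to 5 (worst); detection is scored 5
    when the failure is hardest to detect. Scales are illustrative."""
    record_type: str
    severity: int     # 5 = direct patient/consumer impact
    likelihood: int   # 5 = frequent human error in input/use
    detection: int    # 5 = failure unlikely to be detected

    def risk_priority(self) -> int:
        # Risk priority number: higher means controls are needed sooner
        return self.severity * self.likelihood * self.detection

modes = [
    FailureMode("batch record", severity=5, likelihood=3, detection=4),
    FailureMode("training log", severity=2, likelihood=2, detection=2),
]
# Rank failure modes so design controls target the worst first
for fm in sorted(modes, key=FailureMode.risk_priority, reverse=True):
    print(fm.record_type, fm.risk_priority())
```

The ranking makes explicit why a severe, hard-to-detect failure on a patient-impacting record outranks a frequent but benign one.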

Uncertainties inherent to the risk assessment process can be quantitatively described using, for example, statistical distributions, fuzzy numbers, or intervals. Corresponding methods are available for propagating these kinds of uncertainties through the process of risk estimation, including Monte Carlo simulation, fuzzy arithmetic, and interval analysis. Computationally intensive methods (e.g., the bootstrap) that work directly from the data to characterize and propagate uncertainties can also be applied in ERA. Implementation of these methods for incorporating uncertainty can lead to risk estimates that are consistent with a probabilistic definition of risk. [Pg.2310]
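Of the methods listed above, the bootstrap is the most direct "work from the data" approach. A generic sketch of a nonparametric percentile bootstrap follows; the effect-concentration values are made up for illustration:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=5000, alpha=0.05, seed=1):
    """Nonparametric bootstrap confidence interval: resample the
    observed data with replacement many times and read off
    percentiles of the resampled statistic."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(stat(rng.choices(data, k=n)) for _ in range(n_boot))
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot)]
    return lo, hi

# Hypothetical measured effect concentrations (mg/L)
ec50s = [3.1, 4.7, 2.8, 5.2, 3.9, 4.1, 3.3, 4.8]
low, high = bootstrap_ci(ec50s)
print(f"95% CI for the mean: ({low:.2f}, {high:.2f})")
```

Because no distributional form is assumed, the interval reflects only the spread actually present in the data, which is the appeal of such computationally intensive methods in ERA.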

There is at least one major area of activity pertaining directly to the environment for which the reader will seek in vain. The complexity of environmental problems and the availability of personal computers have led to extensive studies on models of varying sophistication. A discussion and evaluation of these lie well beyond the competence of an old-fashioned experimentalist; this gap is left for others to fill, but attention is drawn to a review that covers recent developments in the application of models to the risk assessment of xenobiotics (Barnthouse 1992), a book (Mackay 1991) that is devoted to the problem of partition in terms of fugacity (a useful term taken over from classical thermodynamics), and a chapter in the book by Schwarzenbach et al. (1993). Some superficial comments are, however, presented in Section 3.5.5 in an attempt to provide an overview of the dissemination of xenobiotics in natural ecosystems. It should also be noted that pharmacokinetic models have a valuable place in assessing the dynamics of uptake and elimination of xenobiotics in biota, and a single example (Clark et al. 1987) is noted parenthetically in another context in Section 3.1.1. In a similar vein, statistical procedures for assessing community effects are only superficially noted in Section 7.4. Examples of the application of cluster analysis to analyze bacterial populations of interest in the bioremediation of contaminated sites are given in Section 8.2.6.2. [Pg.20]

Risk limits must always be regarded in relation to the procedure used for risk assessment. The degree of conservativeness, detail, etc. in determining risks influences the calculated result (see Chaps. 9 and 10). In order to ensure equitable treatment, the application of risk limits requires a largely unified, convention-based procedure of analysis. This is achieved, for example in the Netherlands, by using the computer program PHAST [31]. It contains algorithms for the methods of analysis and default values for many of the required input parameters. [Pg.276]

Figure 9.4 Risk assessment for an aquatic environment based on a probabilistic procedure into which the concept of varying sensitivity in multispecies communities is incorporated (Nendza, Volmer and Klein, 1990). Exposure and effects are determined separately from experimental or, if not available, QSAR data. Physico-chemical data and information on bioaccumulation and biotransformation are the input for computer simulations of transport and distribution processes that estimate the concentrations of a potential contaminant in a selected river scenario, using, for example, the EXAMS model (Burns, Cline and Lassiter, 1982). For the effects assessment, the log-normal sensitivity distribution is calculated from ecotoxicological data and the effective concentrations for the most sensitive species are determined. The exposure concentrations and toxicity data are then compared by analysis of variance to give a measure of risk for the environment. Modified from Nendza, Volmer and Klein (1990) with kind permission from Kluwer Academic Publishers, Dordrecht.
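The log-normal sensitivity distribution step described for Figure 9.4 can be sketched as fitting a species sensitivity distribution (SSD) and reading off a hazardous concentration for a small fraction of species. This is a simplified illustration, not the cited study's procedure, and the EC50 values are hypothetical:

```python
import math
import statistics
from statistics import NormalDist

def hc5_from_ssd(effect_concs, fraction=0.05):
    """Fit a log-normal species sensitivity distribution to effect
    concentrations and return the hazardous concentration for the
    given fraction of species (HC5 by default)."""
    logs = [math.log10(c) for c in effect_concs]
    mu = statistics.mean(logs)
    sigma = statistics.stdev(logs)
    # Concentration below which `fraction` of species are affected
    z = NormalDist().inv_cdf(fraction)
    return 10 ** (mu + z * sigma)

# Hypothetical EC50 values (mg/L) for several test species
ec50s = [0.8, 1.5, 3.2, 6.0, 12.5, 20.0]
print(f"HC5 = {hc5_from_ssd(ec50s):.3f} mg/L")
```

Comparing the estimated exposure concentration against such a percentile of the fitted distribution gives a probabilistic measure of risk rather than a single safe/unsafe verdict.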
Franchin, P.: A Computational Framework for Systemic Seismic Risk Analysis of Civil Infrastructural Systems. In: Pitilakis, K., et al. (eds.) SYNER-G: Systemic Seismic Vulnerability and Risk Assessment of Complex Urban, Utility, Lifeline Systems and Critical Facilities. Geotechnical, Geological and Earthquake Engineering, vol. 31, pp. 23-56. Springer Science+Business Media, Dordrecht (2014), doi:10.1007/978-94-017-8835-9_2 [Pg.337]

IMAN, R.L. and CONOVER, W.J., Small Sample Sensitivity Analysis Techniques for Computer Models with an Application to Risk Assessment, Communications in Statistics. Theory and Methods, A9 (17), 1980, 1749-1842. [Pg.408]

