Risk analysis, computer systems

RISKMAN is an integrated Microsoft Windows, personal computer software system for performing quantitative risk analysis. Used for PSAs for aerospace, nuclear power, and chemical processes, it has five main modules: Data Analysis, Systems Analysis, External Events Analysis, Event Tree Analysis, and Important Sequences. There are also modules for software system maintenance, backup, restoration, software updates, printer font, and page control. PEG has also integrated the fault tree programs CAFTA, SETS, NRCCUT, and IRRAS into RISKMAN. [Pg.143]

Table 11.4 Example of a risk analysis matrix for assessment of a computer system.
Written procedures shall define how the system will be used and controlled, and periodic review of these procedures and the validation documentation status must be carried out. The periodic review procedure should define responsibilities and should include predetermined criteria for reporting that computer system validation is being satisfactorily maintained in alignment with the project validation plan. A GMP risk assessment should form part of each periodic review to reconfirm (or not) the findings of the previous risk analysis and provide information for any revalidation that is considered necessary. [Pg.629]

Another important activity is the process of evaluating new security technologies and their integration with computer systems. The procedural controls implemented as a result of a risk analysis, which address the security-related implementation requirements identified during that analysis, provide a starting point when looking for technologies that can replace them. [Pg.125]

The risk assessment process can be conducted by examining record types to see if they are GxP or non-GxP, and then applying severity, likelihood, and probability of detection criteria, as illustrated in Figure 15.2. The most severe scenarios should be linked to direct patient/consumer impact. GxP noncompliance and broken license conditions are severe in their own right but not as critical as patient/consumer health in this analysis. Likelihood will be influenced by the degree of human error in how the record is input and used. The probability of detection needs to take into account the probability of the impacted record being used. Once failure modes are understood, the appropriate design controls can be introduced. These should be documented and validated as part of the computer system life cycle discussed earlier in this book. [Pg.359]
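As an illustration only, the sketch below encodes such a severity, likelihood and detectability scoring as an FMEA-style risk priority number; the record names, 1-5 scales and threshold are assumed for the example and are not taken from the cited text.

```python
# Minimal sketch of a severity x likelihood x detectability scoring of record types.
# Record names, scales, and the threshold are illustrative assumptions.

RECORDS = {
    # name: (severity, likelihood, detectability), each scored 1 (low) to 5 (high);
    # a high detectability score means the failure is UNLIKELY to be detected.
    "batch_release_record": (5, 2, 4),
    "training_log":         (2, 3, 2),
    "audit_trail_entry":    (4, 2, 3),
}

def risk_priority(severity: int, likelihood: int, detectability: int) -> int:
    """FMEA-style risk priority number: higher means more critical."""
    return severity * likelihood * detectability

def prioritise(records: dict, threshold: int = 30) -> list:
    """Return records whose score meets the threshold, most critical first."""
    scored = [(name, risk_priority(*vals)) for name, vals in records.items()]
    return sorted((r for r in scored if r[1] >= threshold),
                  key=lambda r: r[1], reverse=True)

if __name__ == "__main__":
    for name, score in prioritise(RECORDS):
        print(f"{name}: RPN = {score} -> document a design control")
```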

Ale, B. J. M. and Whitehouse, R. J. (1986) Computer based systems for risk analysis of process plant, in Heavy Gas and Risk Analysis, Hartwig, S. (ed.) (Reidel). [Pg.394]

The problem of risk analysis in pipelines, especially when treated with a focus on Decision Theory, Utility Theory and Multi-Attribute Utility Theory (MAUT), requires a decision support system with a very rich computational capacity in all its components. The decision support system for risk analysis in the installation of pipelines under a MAUT approach should, at the same time, have substantial databases and models that meet the needs of the mathematical modeling of the problem, and a structured dialogue component to maintain an intense communication with the decision-maker. [Pg.92]

Risk analysis of pipeline installations, especially when tackled with a multi-attribute utility theory approach, calls for well-structured computational tools. The decision support system for risk analysis of pipeline installations under a MAUT approach... [Pg.93]
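As a small illustration of the computational side of a MAUT-based evaluation, the sketch below aggregates single-attribute utilities with an additive model, which is only one of the aggregation forms used under MAUT; the attribute names, weights and utility values are hypothetical.

```python
# Minimal sketch of an additive multi-attribute utility aggregation, assuming the
# single-attribute utilities (0 = worst, 1 = best) have already been elicited
# from the decision-maker.  All names and numbers are illustrative only.

from typing import Dict

def additive_utility(weights: Dict[str, float],
                     utilities: Dict[str, float]) -> float:
    """Weighted sum u(x) = sum_i k_i * u_i(x_i); weights should sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[a] * utilities[a] for a in weights)

# Hypothetical alternatives for a pipeline routing decision.
weights = {"human_risk": 0.5, "environmental_risk": 0.3, "financial_risk": 0.2}
alternatives = {
    "route_A": {"human_risk": 0.9, "environmental_risk": 0.6, "financial_risk": 0.4},
    "route_B": {"human_risk": 0.7, "environmental_risk": 0.8, "financial_risk": 0.7},
}

best = max(alternatives, key=lambda a: additive_utility(weights, alternatives[a]))
print("Preferred alternative:", best)
```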

We describe some models of the spreading of disastrous events. This is a very important part of risk analysis. Despite the strong dependency between successive events, after some arrangements we can use Markov models for the description. This allows us to compute several characteristics of such a system. When we consider a system of objects among which a disastrous event could spread, we can compute the probability distribution of absorbing states, first passage times for any of the objects, and many others, as sketched below. This modeling can help us to make preventive decisions or to prepare disaster recovery plans. In the paper, the model is described and some computations are outlined. Keywords: risk, safety, successive event, disastrous event, Markov chain. [Pg.1127]
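A minimal sketch of the absorbing Markov chain computations referred to above (expected time to absorption and the distribution over absorbing states), using the standard fundamental-matrix construction; the three-object transition matrix below is invented for illustration and is not data from the paper.

```python
# Canonical decomposition of an absorbing chain: Q = transient -> transient,
# R = transient -> absorbing.  Fundamental matrix N = (I - Q)^-1.
import numpy as np

# Transient states 0-2: objects through which the disastrous event may still spread.
# Absorbing states: 0 = "event contained", 1 = "object network destroyed".
Q = np.array([[0.2, 0.4, 0.1],
              [0.1, 0.3, 0.3],
              [0.0, 0.2, 0.3]])
R = np.array([[0.2, 0.1],
              [0.2, 0.1],
              [0.4, 0.1]])

N = np.linalg.inv(np.eye(Q.shape[0]) - Q)   # fundamental matrix
expected_steps = N.sum(axis=1)              # expected number of steps before absorption
absorption_prob = N @ R                     # distribution over absorbing states

print("Expected steps to absorption per starting object:", expected_steps)
print("Absorption probability distribution (rows = starting object):")
print(absorption_prob)
```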

For this reason, a systematic analysis or a consistent theory of risk acceptance, in the exact sense of a rational system of human and social motivation and readiness to take risks, is not merely completely lacking thus far [9-24]. Nor can it be expected from future progress of the empirical social sciences or of social psychology, neither as a computation system for decision-making, nor as a yardstick system for the "democratic" justification of demands to take risks, nor as an instruction... [Pg.419]

In the last twenty years, various non-deterministic methods have been developed to deal with optimum design under environmental uncertainties. These methods can be classified into two main branches, namely reliability-based methods and robust-based methods. The reliability methods, based on the known probability distribution of the random parameters, estimate the probability distribution of the system's response, and are predominantly used for risk analysis by computing the probability of system failure. However, variation is not minimized in reliability approaches (Siddall, 1984) because they concentrate on rare events at the tail of the probability distribution (Doltsinis and Kang, 2004). The robust design methods are commonly based on multiobjective minimization problems. They are commonly referred to as Multiple Objective Robust Optimization (MORO) and find a set of optimal solutions that optimize a performance index in terms of its mean value and, at the same time, minimize its resulting dispersion due to input parameter uncertainty. The final solution is less sensitive to parameter variation but still maintains feasibility with regard to probabilistic constraints. This is achieved by optimizing the design vector so as to make the performance minimally sensitive to the various causes of variation. [Pg.532]
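A minimal sketch of a robust objective of the kind described above: the mean and dispersion of a performance index under random input parameters are estimated by Monte Carlo sampling and combined by a crude equal-weight scalarization; the performance function, distributions and candidate designs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
params = rng.normal(loc=1.0, scale=0.2, size=10_000)   # uncertain environmental parameter

def performance(design: float, p: np.ndarray) -> np.ndarray:
    """Hypothetical performance index to be minimized."""
    return (design - p) ** 2 + 0.1 * design

def robust_objective(design: float) -> tuple:
    f = performance(design, params)
    return f.mean(), f.std()          # the two objectives a MORO scheme trades off

# A crude scalarization (equal-weight sum of mean and dispersion) over candidate designs;
# a full MORO approach would instead search for the whole Pareto set.
candidates = np.linspace(0.0, 2.0, 21)
best = min(candidates, key=lambda d: sum(robust_objective(d)))
mean_f, std_f = robust_objective(best)
print(f"least-sensitive design: {best:.2f}  (mean={mean_f:.3f}, std={std_f:.3f})")
```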

A hazard analysis should be carried out for the computer system's architecture and the functionality within it to identify any specific risks that might compromise the safety function and thus to indicate any need for changes to the architecture or additions of functions (such as self-checks) to mitigate the effects of hazards. Such an analysis should be included in the safety demonstration. [Pg.39]

Complementary to the hazard mitigating functions that are added in response to specific risks identified through the hazard analysis for the computer system, features such as watchdog timers, program sequence checking, reinitialization of variables and other fault detection mechanisms constitute good practice. Because of the need for simplicity of the system, these features are added only to the extent that they do not make the software unnecessarily more complex. [Pg.39]
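As an illustration of one such fault detection mechanism, the sketch below shows a simple program sequence check; the step names and the response to a sequence error are illustrative assumptions, and a real safety system would transition to a defined safe state rather than raise an exception.

```python
# Minimal sketch of a program sequence check: each processing step records a
# token, and a final check verifies that the expected order was followed.

EXPECTED_SEQUENCE = ["read_inputs", "validate", "compute", "write_outputs"]

class SequenceMonitor:
    def __init__(self):
        self.trace = []

    def checkpoint(self, step: str) -> None:
        self.trace.append(step)

    def verify(self) -> None:
        if self.trace != EXPECTED_SEQUENCE:
            # Placeholder for a defined safe-state reaction.
            raise RuntimeError(f"program sequence error: {self.trace}")

monitor = SequenceMonitor()
for step in EXPECTED_SEQUENCE:      # the normal execution path
    monitor.checkpoint(step)
monitor.verify()
print("program sequence check passed")
```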

Franchin, P.: A Computational Framework for Systemic Seismic Risk Analysis of Civil Infrastructural Systems. In: Pitilakis, K., et al. (eds.) SYNER-G: Systemic Seismic Vulnerability and Risk Assessment of Complex Urban, Utility, Lifeline Systems and Critical Facilities. Geotechnical, Geological and Earthquake Engineering, vol. 31, pp. 23-56. Springer Science+Business Media, Dordrecht (2014), doi: 10.1007/978-94-017-8835-9_2 [Pg.337]

Malinowski, J., 2013. A method of computing the interstate transition intensities for multi-state series-parallel systems. In: Safety, Reliability and Risk Analysis: Beyond the Horizon. Proceedings of the European Safety and Reliability Conference, ESREL 2013. [Pg.240]

Lund, M.S., Solhaug, B., Stølen, K. (2011). Risk analysis of changing and evolving systems using CORAS. In: Foundations of Security Analysis and Design VI (FOSAD 2011), Lecture Notes in Computer Science, Volume 6858, pp. 231-274. Springer. [Pg.1537]

This discussion uses a typical MC simulation to illustrate the need to emphasize foundational issues: how to perceive risk and how to assure the quality of the assessment procedure. Students in fields relying on the use of quantitative assessments, such as risk, reliability or system analysis, learn about MC simulation and different ways to quantitatively describe uncertainty. It is important not to end there, but to give the students, and future quantitative assessors, the ability to fully understand what this uncertainty represents and on what scientific principles it can be assessed. We would like to point to Bayesian Evidence Synthesis (BES) as a suitable framework for teaching predictive statistical principles involving simulations from a computer model. BES is useful both in scientific research and risk assessment and can therefore close the gap between these two highly important processes of knowledge production. [Pg.1597]

Nowadays, there is an increasing interest in system protection against intentional threats of a physical nature [8,19]. In this regard, model-based vulnerability assessment is a crucial phase in the risk analysis of critical infrastructures. In fact, typical risk models include the computation of three logically sequential factors: the probability or frequency of threats (P); the probability that threats are successful in their intent (i.e., vulnerability, V); and the consequences of successful threats (i.e., expected damage, D). Therefore, in order to evaluate infrastructure risks (R), it is essential to be able to compute the vulnerability of the system with respect to the threats [11]. One of the most widespread and intuitive models for the evaluation of the risk is [21] R = P · V · D. This model is based on a quantitative notion of vulnerability, different from other definitions also commonly used,... [Pg.230]
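A minimal sketch of applying the model R = P · V · D to a set of threats; the threat names and all numerical values below are hypothetical and chosen only to show the arithmetic.

```python
# Expected annual damage per threat under R = P * V * D.

threats = [
    # (name, threat frequency P [1/year], success probability V, expected damage D [damage units])
    ("sabotage_of_pumping_station", 0.05, 0.30, 12.0),
    ("intrusion_into_control_room", 0.10, 0.10,  4.0),
]

for name, p, v, d in threats:
    print(f"{name}: R = {p * v * d:.3f} damage units/year")

total_risk = sum(p * v * d for _, p, v, d in threats)
print(f"aggregate risk: {total_risk:.3f} damage units/year")
```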

Mikulski, J.: Malfunctions of Railway Traffic Control Systems Failure Rate Analysis. Proceedings of the 3rd International Conference on Computer Simulation in Risk Analysis and Hazard Mitigation, 2002, pp. 141-147. [Pg.201]

An important issue related to the fragility curve construction and implicitly to the risk assessment is the selection of an appropriate earthquake intensity measure (IM) that characterizes the strong ground motion that best correlates with the response of each element, for example, building, pipeline, or harbor facilities like cranes. Several measures of the intensity of ground motion (IMs) have been developed. Each intensity measure may describe different characteristics of the motion, some of which may be more adverse for the structure or the system under consideration. The use of a particular IM in seismic risk analysis should be guided by the extent to which the measure corresponds to damage to the components of a system. Optimum intensity measures are defined in terms of practicality, effectiveness, efficiency, sufficiency, robustness, and computability (Mackie and Stojadinovic 2003). [Pg.3148]
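As a concrete illustration of how a fragility relation links an IM to component damage, the short sketch below evaluates a lognormal fragility curve; the choice of PGA as the IM and the median/dispersion values are illustrative assumptions, not values from a published fragility set.

```python
# P(damage state exceeded | IM) modelled as a lognormal CDF in the intensity measure.
from math import erf, log, sqrt

def fragility(im: float, median: float, beta: float) -> float:
    """P(damage state exceeded | IM) = Phi(ln(im / median) / beta)."""
    z = log(im / median) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Hypothetical fragility parameters for an "extensive damage" state of one component.
MEDIAN_PGA = 0.4   # g
BETA = 0.6         # lognormal dispersion

for pga in (0.1, 0.2, 0.4, 0.8, 1.2):   # candidate IM values in g
    print(f"PGA = {pga:.2f} g -> P(extensive damage) = {fragility(pga, MEDIAN_PGA, BETA):.2f}")
```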

The popularity of Monte Carlo for risk-based uncertainty analysis is somewhat driven by the fact that Monte Carlo is fundamentally easy to implement, particularly with the advent of the personal computer, and graphically based software like Crystal Ball (www.decisioneering.com) and Risk (www.palisade.com/risk.html). The availability of such software systems generally promotes the use of uncertainty analysis in ecological risk assessments, reducing the amount of mathematical and statistical knowledge required of the user to implement the... [Pg.54]
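A minimal sketch of the kind of Monte Carlo uncertainty propagation such tools automate, for a toy exposure model dose = concentration x intake rate / body weight; the distributions and parameter values are illustrative assumptions, not data from any assessment.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

conc   = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)    # mg/L
intake = rng.normal(loc=1.5, scale=0.3, size=n).clip(min=0)    # L/day
weight = rng.normal(loc=70.0, scale=10.0, size=n).clip(min=1)  # kg

dose = conc * intake / weight                                  # mg/kg/day
print("median dose:", np.median(dose))
print("95th percentile dose:", np.percentile(dose, 95))
print("P(dose > 0.1 mg/kg/day):", (dose > 0.1).mean())
```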

