Big Chemical Encyclopedia


And human error

FTA breaks down an accident into its contributing equipment failures and human errors (70). The method is therefore a reverse-thinking technique, i.e., the analyst begins with an accident or undesirable event that is to be avoided and identifies the immediate causes of that event. Each of the immediate causes is examined in turn until the analyst has identified the basic causes of each event. The fault tree is a diagram that displays the logical interrelationships between these basic causes and the accident. [Pg.83]

The result of the FTA is a list of combinations of equipment and human failures that are sufficient to result in the accident (71). These combinations of failures are known as minimal cut sets. Each minimal cut set is the smallest set of equipment and human failures sufficient to cause the accident if all the failures in that minimal set exist simultaneously. Thus a minimal cut set is logically equivalent to the undesired accident stated in terms of equipment failures and human errors. [Pg.83]
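The minimal-cut-set idea above can be sketched in a few lines of code. This is a minimal illustration, not any particular tool's algorithm: the fault tree is a nested tuple of AND/OR gates over named basic events (all names here are invented for the example), cut sets are expanded recursively, and non-minimal sets are filtered out.

```python
from itertools import product

def cut_sets(node):
    """Expand a gate into a list of cut sets (each a frozenset of basic events)."""
    if isinstance(node, str):                      # a basic event is its own cut set
        return [frozenset([node])]
    gate, *children = node
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":                               # any child's cut set suffices
        return [cs for sets in child_sets for cs in sets]
    # AND: one cut set from each child must occur together, so merge every combination
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal_cut_sets(node):
    """Keep only cut sets that have no proper subset among the others."""
    sets = set(cut_sets(node))
    return {cs for cs in sets if not any(other < cs for other in sets)}

# Illustrative top event: a spill occurs if (operator error AND alarm ignored)
# OR the tank-level sensor fails outright.
tree = ("OR", ("AND", "OP_ERR", "ALARM_IGNORED"), "SENSOR_FAIL")
for cs in sorted(minimal_cut_sets(tree), key=len):
    print(sorted(cs))
```

Here the two minimal cut sets are {SENSOR_FAIL} and {OP_ERR, ALARM_IGNORED}: the single sensor failure is sufficient on its own, while the two human failures must coincide.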

Frequency Phase 1: Perform Qualitative Study, Typically Using HAZOP, FMEA, or What-if Analysis. To perform a qualitative study you should first (1) define the consequences of interest, (2) identify the initiating events and accident scenarios that could lead to the consequences of interest, and (3) identify the equipment failure modes and human errors that could contribute to the accident... [Pg.39]

Environmental factors such as severe hailstorms and human errors such as impact by aircraft service equipment also cause in-service damage to bonded assemblies. Bonded honeycomb sandwich assemblies are particularly prone to such damage because of their customary use as lightly loaded fairings and flight control surfaces, with consequently thin facesheets and relative fragility. [Pg.1170]

THE TRADITIONAL SAFETY ENGINEERING APPROACH TO ACCIDENTS AND HUMAN ERROR... [Pg.46]

In addition to their descriptive functions, TA techniques provide a wide variety of information about the task that can be useful for error prediction and prevention. To this extent, there is considerable overlap between Task Analysis and the Human Error Analysis (HEA) techniques described later in this chapter. HEA methods generally take the result of TA as their starting point and examine what aspects of the task can contribute to human error. In the context of human error reduction in the CPI, a combination of TA and HEA methods will be the most suitable form of analysis. [Pg.161]

From a human reliability perspective, a number of interesting points arise from this example. A simple calculation shows that the frequency of a major release (3.2 x 10" per year) is dominated by human errors. The major contribution to this frequency is the frequency of a spill during truck unloading (3 x 10" per year). An examination of the fault tree for this event shows that this frequency is dominated by event B15, "Insufficient volume in tank to unload truck," and B16, "Failure of, or ignoring, LIA-1." Of these events, B15 could be due to a prior human error, and B16 would be a combination of instrument failure and human error. (Note, however, that we are not necessarily assigning the causes of the errors solely to the operator. The role of management influences on error will be discussed later.) Apart from the dominant sequence discussed above, human-caused failures are likely to occur throughout the fault tree. It is usually the case that human error dominates a risk assessment, if it is properly considered in the analysis. This is illustrated in Bellamy et al. (1986) with an example from the analysis of an offshore lifeboat system. [Pg.205]

One of the most noticeable ergonomic deficiencies in both control rooms was the number of panels that had to be scanned and monitored during the scenarios, and the number of rapid actions required at opposite ends of the control room. The need for virtually simultaneous actions and movements would have been discovered and resolved in the design stage had a task analysis and human error analysis been carried out on an emergency operation. [Pg.339]

Bainbridge, L. (1987). Ironies of Automation. In J. Rasmussen, K. Duncan, and J. Leplat (Eds.), New Technology and Human Error. New York: Wiley. [Pg.366]

In April 1982, a data workshop was held to evaluate, discuss, and critique data in order to establish a consensus generic data set for the USNRC-RES National Reliability Evaluation Program (NREP). The data set contains component failure rates and probability estimates for loss of coolant accidents, transients, loss of offsite power events, and human errors that could be applied consistently across the nuclear power industry as screening values for initial identification of dominant accident sequences in PRAs. This data set was used in the development of guidance documents for the performance of PRAs. [Pg.82]

The successful and profitable conduct of a frozen food quality program requires suitable and workable standards of product quality and condition, and suitable methods for determining the degree of product conformance with the standards. The methods used may be either subjective or objective in character, or a combination of both; the most consistent and reproducible results will be obtained with objective methods. Present-day frozen food standards are, for the most part, based upon subjective methods of quality determination. These are certainly better than none at all, but they are subject to considerable misinterpretation and human error, and for this reason leave much to be desired in providing a sound basis for quality control. [Pg.29]

FTA is a deductive method that uses Boolean logic symbols (i.e., AND gates, OR gates) to break down the causes of the top event into basic equipment failures and human errors. The analysts begin with the top event and identify the causes and the logical relationships between the causes and the top event. Each of the causes, called intermediate events, is examined in the same manner until the basic causes for every intermediate event have been identified. [Pg.71]
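Once the basic events and gate logic are laid out as described above, a top-event probability can be computed by walking the same tree. The sketch below assumes independent basic events (AND = product of probabilities, OR = complement of all causes failing to occur); the tree shape and the probability values are illustrative assumptions, not data from the text.

```python
def top_event_probability(node, p):
    """Evaluate a nested ("AND"/"OR", ...) fault tree over basic-event probabilities p."""
    if isinstance(node, str):
        return p[node]                      # basic equipment failure or human error
    gate, *children = node
    probs = [top_event_probability(c, p) for c in children]
    if gate == "AND":                       # all contributing causes must occur
        result = 1.0
        for q in probs:
            result *= q
        return result
    # OR: the top event occurs unless every cause is absent
    all_absent = 1.0
    for q in probs:
        all_absent *= (1.0 - q)
    return 1.0 - all_absent

# Illustrative tree: release if (pump fails AND operator misses it) OR valve sticks.
tree = ("OR", ("AND", "PUMP_FAIL", "OPERATOR_MISS"), "VALVE_STUCK")
p = {"PUMP_FAIL": 0.05, "OPERATOR_MISS": 0.1, "VALVE_STUCK": 0.002}
print(top_event_probability(tree, p))       # 1 - (1 - 0.005)(1 - 0.002)
```

The intermediate AND gate contributes 0.05 × 0.1 = 0.005, which the OR gate then combines with the valve failure; with dependent events or shared causes this simple multiplication would no longer be valid.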

A part of the test plan must include testing for the consequences of equipment malfunction, deviations in process conditions, and human error. Bench-scale equipment, for example, the RC1, is quite suitable for such experiments. By analysis of the process, critical conditions can be defined, which then need to be tested in order to be able to proceed safely from the laboratory to pilot plant studies. In testing abnormal conditions or process deviations, caution is required to assure that no uncontrollable hazard is created in the laboratory. Typical deviations, including impact on the process, are discussed in the following paragraph. [Pg.134]

Lees (Loss Prevention in the Process Industries, 2d ed., Butterworths, London, 1996), BP (Hazards of Trapped Pressure and Vacuum, 2003), and Kletz (What Went Wrong? Case Histories of Process Plant Disasters, Gulf Publishing Company, 1989) include additional case histories providing valuable lessons about how equipment failures and human errors can combine to inflict vacuum damage. [Pg.35]

Another type of logic tree, the event tree, is an inductive technique. Event Tree Analysis (ETA) also provides a structured method to aid in understanding and determining the causes of an incident. While the fault tree starts at the undesired event and works backward to identify root causes, the event tree looks forward, graphically displaying the progression of the various combinations of equipment failures and human errors that result in the incident. [Pg.56]
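The forward-looking progression of an event tree can be sketched by enumerating branch outcomes. This is a minimal illustration with assumed names and numbers: starting from an initiating-event frequency, each safety function along the path either works or fails, and multiplying the branch probabilities gives the frequency of each end state.

```python
def event_tree(init_freq, functions):
    """functions: list of (name, failure_probability) in the order challenged.
    Returns a dict mapping each branch path to its outcome frequency."""
    outcomes = {(): init_freq}
    for name, p_fail in functions:
        nxt = {}
        for path, freq in outcomes.items():
            nxt[path + ((name, "works"),)] = freq * (1.0 - p_fail)
            nxt[path + ((name, "fails"),)] = freq * p_fail
        outcomes = nxt
    return outcomes

# Illustrative initiating event at 0.1/yr, with two layers of protection
# (failure probabilities are assumptions for the example).
paths = event_tree(0.1, [("alarm", 0.01), ("operator response", 0.1)])
worst = paths[(("alarm", "fails"), ("operator response", "fails"))]
print(f"{worst:.1e}")   # frequency of the path where both protections fail
```

Because every branch is enumerated, the outcome frequencies always sum back to the initiating-event frequency, which is a useful sanity check on a hand-drawn tree.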

Accidents are very rare relative to the number of near accidents and human errors. Fortunate as this may seem, it poses a real problem for complex systems with a high catastrophe potential (nuclear power plants, chemical plants, commercial aviation): few accidents mean few cases to analyse and hardly any feedback to learn from. This leads to the undesirable situation of ad hoc corrective measures after each single accident, because the database is far too small to generate statistically sensible preventive measures. [Pg.20]

Equipment failure and human error cause most releases at facilities. [Pg.500]

Kletz, T., Chung, P., and Shen-Orr, C. (1995), Computer Control and Human Error, Institution of Chemical Engineers, Rugby, U.K. [Pg.204]







© 2024 chempedia.info