Big Chemical Encyclopedia

Latent human error

In the case of a latent human error, the consequences of the error may become apparent only after a period of time, when the condition caused by the error combines with other errors or with particular operational conditions. Two types of latent error can be distinguished. One category originates at the operational level and leads to some required system function being degraded or unavailable. Maintenance and inspection operations are a frequent source of this type of latent failure. [Pg.40]

Management policies are the source of many of the preconditions that give rise to systems failures. For example, if no explicit policy exists or if resources are not made available for safety critical areas such as procedures design, the effective presentation of process information, or for ensuring that effective communication systems exist, then human error leading to an accident is, at some stage, inevitable. Such policy failures can be regarded as another form of latent human error, and will be discussed in more detail in Section 2.7. [Pg.41]

Latent Human Error/Failure (management level): an inadequate or nonexistent management policy that creates the preconditions for active or latent human, hardware, or software failures. [Pg.42]

Reason, J. (1990). The contribution of latent human errors to the breakdown of complex systems. Philosophical Transactions of the Royal Society of London, Series B, 327, 475-484. [Pg.97]

This belief is understandable, as the underlying cause of the loss-producing event may be a well-hidden latent human error incorporated into the overall design of the job, its steps, and its tasks, as noted in the literature on human performance improvement (Human Performance Improvement Handbook, Volume 1: Concepts and Principles, 2009). From a risk standpoint, because a separation in time exists between the hazard, its associated risk, and the moment when a loss-producing event occurs, most of us simply cannot make the connection. We have referred to this mindset as "no loss = no risk". [Pg.201]

We routinely see goals that aim to reduce injuries to a specific number. A goal stated as a certain number of injuries and illnesses is not meaningful: it ignores the latent human error-related hazards that have not yet resulted in an injury, as well as the near-miss incidents that could have resulted in greater severity, or other loss-producing events that by circumstance did not involve human injury. [Pg.312]

The safety system brings together behavioral science and human performance improvement, and promotes safety as an important value within an organization. It is sometimes forgotten that behavioral observations are only one part of a process. Consideration should also be given to latent human errors that build hidden hazards and associated risk into the process and that may not be identified by observation alone. [Pg.39]

Engineering controls design the environment, the process, equipment, and/or materials to directly eliminate or control the hazard(s). The hazard may be completely removed or, if it still exists, mechanisms are in place to contain it. This level attempts to remove latent human error potential from the process as well as control the scope and nature of hazards and associated risks (Prevention through Design, 2012). [Pg.158]

Which steps, tasks, or decision points are redundant? The team may find that steps and tasks contain unnecessary inspections, out-of-date procedures, changes in technology, techniques that are no longer needed, latent human error potential that has been designed into the process, etc. [Pg.389]

It is therefore useful to distinguish between active and latent errors or failures. An active human error has an immediate effect in that it either directly causes a hazardous state of the system or is the direct initiator of a chain of events which rapidly leads to the undesirable state. [Pg.40]

Latent component failures, human errors, and related unsafe acts are all results of weaknesses in our management systems. This is why the terms root cause and management system weaknesses are used interchangeably. The term latent failure or latent error is still used in some academic settings. [Pg.38]

Organizational Error: a latent management system problem that can result in human error. [Pg.438]

Reason (21) has described a model for looking at human error that portrays a battle between the sources of error and the system-based defenses against them. This model is often referred to as the "Swiss cheese model" because the defenses against error are displayed as thin layers with holes that are described as latent error in the system. Figure 26.5 demonstrates the model as applied to medication error. Each opportunity for error is defended by the prescriber, pharmacist, nurse, and patient. When a potential error is identified and corrected (e.g., dose error, route of administration error) the event becomes a "near miss" rather than an ADE. In those cases in which the holes in the Swiss cheese line up, a preventable medication error occurs. The Swiss cheese model provides an interesting framework for research in this field. [Pg.409]
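The layered-defense idea behind the Swiss cheese model can be sketched as a small simulation. This is a hypothetical illustration, not part of the cited source: each defensive layer (prescriber, pharmacist, nurse, patient) is assumed to catch an error independently with some probability, and the catch probabilities below are invented for demonstration only.

```python
import random

# Hypothetical sketch of Reason's "Swiss cheese" model applied to
# medication error: an adverse drug event (ADE) occurs only when the
# "holes" in every defensive layer line up. The catch probabilities
# are illustrative assumptions, not measured data.
CATCH_PROB = {"prescriber": 0.90, "pharmacist": 0.85, "nurse": 0.80, "patient": 0.50}

def slips_through(rng):
    """True if an error passes the hole in every defensive layer."""
    return all(rng.random() > p for p in CATCH_PROB.values())

def simulate(n_errors, seed=0):
    """Fraction of potential errors that become preventable ADEs."""
    rng = random.Random(seed)
    return sum(slips_through(rng) for _ in range(n_errors)) / n_errors

# With independent layers, P(ADE) = 0.10 * 0.15 * 0.20 * 0.50 = 0.0015,
# so almost every potential error ends as a "near miss" instead.
print(f"simulated ADE rate: {simulate(100_000):.4f}")
```

Under these assumed numbers, catching an error at any single layer is enough to turn the event into a near miss, which is why thinning any one layer's holes (e.g., better prescriber decision support) disproportionately reduces the end-to-end error rate.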

Figure 5 illustrates the relations between factors that influence the possibility of latent errors being committed, which can in turn induce active human errors. Relevant factors should be considered during analysis of protection layers, especially the alarm system and the human-machine interface (Scarborough et al. 2005). [Pg.310]

James Reason's Human Error, which was previously mentioned, is also a highly recommended resource. First published in 1990, it has since had 12 reprintings. Reason discusses: the nature of error; studies of human error; performance levels and error types; cognitive underspecification and error forms; a design for a fallible machine; the detection of errors; latent errors and system disasters; and assessing and reducing the human error risk. [Pg.71]

Per references from the nuclear power industry, loss-producing events are said to split roughly 80 percent human error and 20 percent physical equipment failure. Of the 80 percent allocated to human error, 70 percent stems from organizational weakness (latent errors); only 30 percent comes from individual mistakes. If 70 percent of incidents are due to organizational... [Pg.42]
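The attribution arithmetic quoted above can be worked out explicitly. The percentages are taken as cited from the nuclear power industry references, not independently verified:

```python
# 80% of loss-producing events are human error; of those, 70% trace to
# organizational weakness (latent errors) and 30% to individual mistakes.
# Percentages as quoted in the text, not independently verified.
human_error, equipment_failure = 0.80, 0.20
organizational_share, individual_share = 0.70, 0.30  # shares of human-error events

latent_fraction = human_error * organizational_share  # 0.56 of all events
individual_fraction = human_error * individual_share  # 0.24 of all events

print(f"organizational (latent): {latent_fraction:.0%} of all events")   # 56%
print(f"individual mistakes:     {individual_fraction:.0%} of all events")  # 24%
print(f"equipment failure:       {equipment_failure:.0%} of all events")    # 20%
```

In other words, on these figures, latent organizational weaknesses would account for more of all loss-producing events (56 percent) than individual mistakes and equipment failures combined (44 percent).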

The final two steps (Phases three and four) of this research program involve the design (Phase three) and conduct (Phase four) of a proof-of-concept pilot study on road user errors and latent conditions at intersections in Victoria. It is proposed that the pilot study be used to test and refine a novel methodology for collecting human error data within road transport and also to refine and validate the prototype model of road user error and contributing conditions presented in Figure 13.2. The proposed Phase four pilot study has the following three main aims ... [Pg.151]

Loss-producing events are triggered by an initiating action that may be due to human error. These errors are not necessarily the result of lower-level employees physically doing something that creates the event; they can stem from latent errors built into the organization (Human Performance Improvement Handbook, Volume 1: Concepts and Principles, 2009). [Pg.151]

Latent roots: the deficiencies in the management systems or the management approaches that allow human errors to continue unchecked. [Pg.479]

The systems approach seeks to identify situations or factors likely to contribute to human error. James Reason's analysis of industrial accidents revealed that catastrophic safety failures almost never result from isolated errors committed by individuals. Most incidents result from multiple smaller errors in components and environments with underlying system flaws. Reason's Swiss Cheese Model describes this phenomenon: errors made by individuals can result in disastrous consequences due to flawed systems, which are represented by the holes in the cheese. Reason believed human error would happen in complex systems; striving for perfection or punishing individuals who make errors does not appreciably improve safety. A systems approach stresses efforts to catch or anticipate human errors before they occur. Reason used the terms active errors and latent errors to distinguish individual errors from system errors. Active errors almost always involve frontline personnel; they occur at the point of contact between a human and some element of a larger system. Latent errors occur due to failures of the organization or designs that allow inevitable active errors to cause harm. The terms sharp end and blunt end correspond to active error and latent error, respectively. The systems approach provides a framework for analysis of errors and for efforts to improve safety. [Pg.81]

Human error as a simple catch-all explanation for accidents is now discredited. The term, if it means anything at all, does not provide an adequate description of the many ways in which the failure of people at all levels in organisations can contribute to the complex phenomenon we call an accident. It is more useful to think about "human failure", which involves both errors and violations, and also to distinguish between active failures and latent failures. [Pg.120]

Human performance. The commission first reviews the tasks performed by the operators to identify human errors and to evaluate the adequacy of recovery actions and emergency response. Influences from design, procedures, and other circumstances (including emotions) should be evaluated; see Chapter 8. The commission should also look into errors and omissions made by management (latent failures). [Pg.177]



