Big Chemical Encyclopedia

Feedback human error

The model of human error held by management and the plant culture constitutes the environment in which the data collection system operates. Within this environment, all data collection systems need to address the topics listed in Figure 6.1. These topics, from the types of data collected, to the feedback systems that need to be in place, will be addressed in subsequent sections of this chapter. [Pg.251]

The people most knowledgeable about a particular task are the people who perform it every day. Their help is essential for reducing the associated risks. Continuous feedback from the worker will provide the framework for improvements to the job. This feedback can only be fostered in an atmosphere of trust. If an incident occurs in which human error is a suspected cause, man-... [Pg.349]

Accidents are very rare relative to the number of near accidents and human errors. Fortunate as it may seem, this poses a real problem for complex systems with a high catastrophe potential (nuclear power plants, chemical plants, commercial aviation): few accidents mean few cases to analyse and hardly any feedback to learn from. This leads to the undesirable situation of ad-hoc corrective measures after each single accident, because the database is far too small to generate statistically sensible preventive measures. [Pg.20]

Experimentation is important at all levels of control [166]. For manual tasks where the optimization criteria are speed and smoothness, the limits of acceptable adaptation and optimization can only be known from the error experienced when occasionally crossing a limit. Errors are an integral part of maintaining a skill at an optimal level and a necessary part of the feedback loop to achieve this goal. The role of such experimentation in accidents cannot be understood by treating human errors as events in a causal chain separate from the feedback loops in which they operate. [Pg.42]

The second important difference between human and automated controllers is that, as noted by Thomas [199], while automated systems have basically static control algorithms (although they may be updated periodically), humans employ dynamic control algorithms that they change as a result of feedback and changes in goals. Human error is best modeled and understood using feedback loops, not as a chain of directly related events or errors as found in traditional accident causality models. Less successful actions are a natural part of the search by operators for optimal performance [164]. [Pg.229]

The effect of the controller's actions: This feedback is used to detect human errors. As discussed in the section on design for error tolerance, the key to making errors observable—and therefore remediable—is to provide feedback about them. This feedback may be in the form of information about the effects of controller actions, or it may simply be information about the action itself, on the chance that it was inadvertent. [Pg.296]

Operator inputs to the design process as well as extensive simulation and testing will assist in designing usable computer displays. Remember that the overall goal is to reduce the mental workload of the human in updating their process models and to reduce human error in interpreting feedback. [Pg.305]

Kjellen, Urban. 1987. Deviations and the feedback control of accidents. In New Technology and Human Error, ed. Jens Rasmussen, Keith Duncan, and Jacques Leplat, 143-156. New York: John Wiley. [Pg.525]

Error reduction, however, is not the only approach to the problem of error. The second line of attack is directed towards the elimination of disastrous consequences of human error. The design of equipment (including the monitoring/feedback loop to its... [Pg.358]

The Confidential Human Factors Incident Reporting Programme (CHIRP): Administered by an independent body, it provides sensitive follow-up and feedback on reports of human errors that have been rendered anonymous. [Pg.78]

First off, the balance of plant (BOP) would have no nuclear safety function. Moreover, the STAR-H2 heat source reactor is being designed not only for passive safety response to anticipated transients without scram (ATWS) initiators but also for passive load follow. The only information flow path from the BOP to the reactor would be the fused salt intermediate heat transport loop, which will convey the BOP heat request to the reactor by means of its flow rate and return temperature (see Fig. XXIV-3). In this way, the reactor could passively adjust its power to match heat demand while remaining in a safe operating regime. The safety implication of passive load follow is that the reactor would safely respond to all possible combinations and timings of ATWS initiators taken more than one at a time; it would also safely respond to all conceivable human errors of the maintenance crew and the operator. In summary, all faults exterior to the reactor vessel might be safely accommodated on the basis of passive thermo-structural feedbacks. [Pg.686]

A Technique for Human Event ANAlysis (ATHEANA) was developed by the nuclear industry in response to a perceived need for a human error analysis tool that more closely modeled actual operational events and put a stronger focus on contextual factors. The quantification is based around three calculations: first, the probability of an error-forcing condition (EFC), i.e. the probability that the plant will be in a state which may induce an error, determined by a combination of plant conditions and performance shaping factors (PSFs); second, the probability of an unsafe action (UA); and third, the probability of not recovering from the initial UA. This third term incorporates the possibility of alarms and/or feedback to the operator allowing them to correct the UA. [Pg.1096]
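
The three calculations combine multiplicatively into a scenario human error probability. The sketch below illustrates that arithmetic only; the function name and the example numbers are invented placeholders, not values from any real ATHEANA analysis.

```python
# Sketch of the ATHEANA-style quantification: the human error probability
# for a scenario is the product of the three probabilities described above.
def atheana_hep(p_efc, p_ua_given_efc, p_no_recovery):
    """P(error-forcing condition) x P(unsafe action | EFC) x P(no recovery)."""
    for p in (p_efc, p_ua_given_efc, p_no_recovery):
        if not 0.0 <= p <= 1.0:
            raise ValueError("probabilities must lie in [0, 1]")
    return p_efc * p_ua_given_efc * p_no_recovery

# e.g. an error-forcing plant state on 1% of demands, a 10% chance it induces
# an unsafe action, and a 50% chance alarms/feedback fail to prompt recovery:
hep = atheana_hep(0.01, 0.10, 0.50)
print(f"{hep:.1e}")   # 5.0e-04
```

Note how the third factor is where operator feedback enters: better alarms and clearer feedback lower the non-recovery probability and hence the overall estimate.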

Complicating factors and multidiscipline effects must be included in any human factors/human error analysis. Feedback from the working environment contributes to the likelihood of error. [Pg.209]

To avoid accidents altogether, a complete elimination of human errors may be seen as the ultimate goal. This goal is not very practical, however, and will have severe side effects. Due to the intrinsic variability in human performance, errors will occur. Errors also provide the operators with task feedback and on-the-job learning about the systems that they operate. This experience is extremely valuable in situations when the operators have to handle unanticipated situations to avoid shut-downs or accidents, often under tight time constraints and psychological stress. [Pg.102]

Monitoring of system feedback provides an opportunity for error detection and recovery when there is a mismatch between the expected and observed outcome of an action. Human errors are not easily detected in complex systems. There may be a significant time lag between action and observed effects that makes detection difficult. The effects of human actions may be masked by actions taken by the technical control system. Biased error attribution on the part of the operator may also impede error recovery. Co-workers and supervisors are important resources for error recovery in this context. There are, however, some important preconditions for colleagues and supervisors to be able to contribute. They must co-operate closely with the erring operator to be able to observe performance and distinguish between erroneous and correct acts. There must also be a climate of trust and willingness to correct each other's behaviour. [Pg.103]
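
The core detection step can be reduced to comparing the outcome an operator expects from an action against the observed system feedback. A toy illustration, with an invented valve example and tolerance:

```python
# Minimal sketch of feedback-based error detection: flag a mismatch between
# the outcome expected from an action and the observed feedback signal.
# The valve scenario and the 5% tolerance are illustrative assumptions.
def mismatch(expected, observed, tolerance=0.05):
    """Return True when observed feedback deviates from the expected outcome."""
    return abs(observed - expected) > tolerance

# valve commanded to 60% open; position feedback reads 58%: within tolerance
print(mismatch(0.60, 0.58))   # False
# feedback reads 40%: a detectable mismatch, prompting error recovery
print(mismatch(0.60, 0.40))   # True
```

In a real plant the comparison is complicated by exactly the factors listed above: time lags between action and effect, and compensating actions by the automatic control system that mask the deviation.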

Personnel-related measures to reduce the frequency of human errors and promote safe behaviour include personnel selection, education and training, and safety campaigns and performance feedback (Salvendy, 1987; Sanders and McCormick, 1992). There are three main objectives of the personnel-related counter-measures ... [Pg.105]

Successful reporting of human errors has been accomplished by giving pilots and air-traffic controllers benefits for reporting such events. Most important is the fact that they receive immunity from punitive actions as soon as they have filed a near-accident report. This limited immunity covers negligence but not criminal acts such as drug trafficking. They also receive feedback on the results of their reporting. [Pg.157]

Human Factors Engineering/Ergonomics approach (control of error by design, audit, and feedback of operational experience): occupational/process safety; manual/control operations; routine operation; task analysis; job design; workplace design; interface design; physical environment evaluation; workload analysis; infrequent... [Pg.44]

Advocates of the global approach would argue that human activities are essentially goal-directed (the cognitive view expressed in Chapter 2), and that this cannot be captured by a simple decomposition of a task into its elements. They also state that if an intention is correct (on the basis of an appropriate diagnosis of a situation), then errors of omission in skill-based actions are unlikely, because feedback will constantly provide a comparison between the expected and actual results of the task. From this perspective, the focus would be on the reliability of the cognitive rather than the action elements of the task. [Pg.225]

Ravden and Johnson (1989) evaluate usability of human-computer interfaces. They identify nine top-level attributes: visual clarity, consistency, compatibility, informative feedback, explicitness, appropriate functionality, flexibility and control, error prevention and correction, and user guidance and support. They disaggregate each into a number of more measurable attributes. These attributes can be used as part of a standard multiple-attribute evaluation. [Pg.134]
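
A multiple-attribute evaluation over these nine attributes can be as simple as a weighted mean of ratings. The sketch below uses the attribute names from Ravden and Johnson as listed above; the rating scale, weights, and example scores are invented for illustration.

```python
# Hypothetical multiple-attribute usability scoring over the nine
# Ravden & Johnson top-level attributes.  Ratings are assumed 1-5;
# weights default to equal importance.
ATTRIBUTES = ["visual clarity", "consistency", "compatibility",
              "informative feedback", "explicitness",
              "appropriate functionality", "flexibility and control",
              "error prevention and correction",
              "user guidance and support"]

def usability_score(ratings, weights=None):
    """Weighted mean of per-attribute ratings keyed by attribute name."""
    if weights is None:
        weights = {a: 1.0 for a in ATTRIBUTES}
    total_w = sum(weights[a] for a in ATTRIBUTES)
    return sum(weights[a] * ratings[a] for a in ATTRIBUTES) / total_w

ratings = {a: 4 for a in ATTRIBUTES}
ratings["error prevention and correction"] = 2   # weak spot found in review
print(round(usability_score(ratings), 2))        # 3.78
```

Disaggregating each top-level attribute into its more measurable sub-attributes, as the authors do, would simply nest the same weighted-mean step one level down.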

The human factors literature is rich in task analysis techniques for situations and jobs requiring rule-based behavior (e.g., Kirwan and Ainsworth 1992). Some of these techniques can also be used for the analysis of cognitive tasks where well-practiced work methods must be adapted to task variations and new circumstances. This can be achieved provided that task analysis goes beyond the recommended work methods and explores task variations that can cause failures of human performance. Hierarchical task analysis (Shepherd 1989), for instance, can be used to describe how operators set goals and plan their activities in terms of work methods, antecedent conditions, and expected feedback. When the analysis is expanded to cover not only normal situations but also task variations or changes in circumstances, it would be possible to record possible ways in which humans may fail and how they could recover from errors. Table 2 shows an analysis of a process control task where operators start up an oil refinery furnace. This is a safety-critical task because many safety systems are on manual mode, radio communications between control room and on-site personnel are intensive, side effects are not visible (e.g., accumulation of fuel in the fire box), and errors can lead to furnace explosions. [Pg.1028]

As with personal attributes, characteristics of the machinery, tools, technology, and materials used by the worker can influence the potential for an exposure or accident. One consideration is the extent to which machinery and tools influence the use of the most appropriate and effective perceptual/ motor skills and energy resources. The relationship between the controls of a machine and the action of that machine dictates the level of perceptual/motor skill necessary to perform a task. The action of the controls and the subsequent reaction of the machinery must be compatible with basic human perceptual/motor patterns. If not, significant interference with performance can occur which may lead to improper responses that can cause accidents. In addition, the adequacy of feedback about the action of the machine affects the performance efficiency that can be achieved and the potential for an operational error. [Pg.1160]

Feedback error learning (FEL) is a hybrid technique [113] using the mapping to replace the estimation of parameters within the feedback loop in a closed-loop control scheme. FEL is a feed-forward neural network structure, under training, learning the inverse dynamics of the controlled object. This method is based on contemporary physiological studies of the human cortex [114], and is shown in Figure 15.6. [Pg.243]
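
The FEL idea can be demonstrated numerically: a feed-forward term learns the inverse dynamics, using the feedback controller's output as its teaching signal, so that feedback effort shrinks as learning proceeds. The sketch below is a minimal linear stand-in for the neural-network scheme of [113]; the plant, gains, features, and learning rate are all invented toy assumptions.

```python
import numpy as np

# Toy feedback-error-learning loop for a first-order plant x' = a*x + b*u.
a, b = -1.0, 2.0
dt, kp, eta = 0.01, 8.0, 0.5    # step size, feedback gain, learning rate

def run_trial(w):
    """One pass tracking x_d(t) = sin(t); returns mean |u_fb| and new weights."""
    x, fb_total = 0.0, 0.0
    ts = np.arange(0.0, 2 * np.pi, dt)
    for t in ts:
        feats = np.array([np.sin(t), np.cos(t)])   # desired state and its rate
        u_ff = w @ feats                  # feed-forward: learned inverse model
        u_fb = kp * (np.sin(t) - x)       # feedback controller output
        x += dt * (a * x + b * (u_ff + u_fb))      # Euler-integrate the plant
        w = w + eta * u_fb * feats * dt   # feedback error is the training signal
        fb_total += abs(u_fb)
    return fb_total / len(ts), w

w = np.zeros(2)
e_first, w = run_trial(w)
for _ in range(20):
    e_last, w = run_trial(w)
print(e_last < e_first)   # feedback effort shrinks as inverse dynamics are learned
```

For this plant the exact inverse model is u = (x_d' - a*x_d)/b, i.e. weights of [0.5, 0.5] on the two features, and the learned weights approach those values; a real FEL controller replaces the linear map with a feed-forward neural network.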

Humans can play a variety of roles in a control system. In the simplest cases, they create the control commands and apply them directly to the controlled process. For a variety of reasons, particularly speed and efficiency, the system may be designed with a computer between the human controller and the system. The computer may exist only in the feedback loop to process and present data to the human operator. In other systems, the computer actually issues the control instructions with the human operator either providing high-level supervision of the computer or simply monitoring the computer to detect errors or problems. [Pg.275]

In contrast, studies of more advanced automation in aircraft find that errors of omission are the dominant form of error [181]. Here the controller does not implement a control action that is required. The operator may not notice that the automation has done something because that automation behavior was not explicitly invoked by an operator action. Because the behavioral changes are not expected, the human controller is less likely to pay attention to relevant indications and feedback, particularly during periods of high workload. [Pg.280]

Ji-Chang L, Chien-Hsing Y (1999) A heuristic error-feedback learning algorithm for fuzzy modeling. IEEE Trans Syst Man Cybernet Part A Syst Humans 29(6): 686-691 [Pg.63]

