Big Chemical Encyclopedia


Cognitive error

Gull, 1990. An Analysis of Nuclear Incidents Resulting from Cognitive Error. 11th Advances in Reliability Technology Symposium, University of Liverpool. Elsevier, April... [Pg.480]

Despite these difficulties, the issue of cognitive errors is sufficiently important that we will describe some of the approaches that have been applied to process industry systems. These techniques can be used in both proactive... [Pg.179]

The CADET technique can be applied both proactively and retrospectively. In its proactive mode, it can be used to identify potential cognitive errors, which can then be factored into CPQRA analyses to help generate failure scenarios arising from mistakes as well as slips. As discussed in Chapter... [Pg.180]

Some aspects of cognitive errors, that is, planning errors, can be addressed. [Pg.195]

A mistake (sometimes referred to as a cognitive error) occurs when a person acts on an incorrect train of reasoning, often because he was not properly informed as to what to do or how to do it. A mistake can be defined as follows ... [Pg.686]

A preliminary list of the user's failure modes is: the user omits a known requirement (forgets it); the user is not aware of a specific requirement; the user specifies an incorrect requirement (misunderstands his or her own needs); and the user specifies a requirement that conflicts with another requirement. Failure modes of the analyst are: the analyst omits (forgets) a specific user requirement; the analyst misunderstands a specific user requirement (cognitive error); the analyst mistransposes a user requirement (typographical error); and the analyst fails to notice a conflict between two users' requirements (cognitive error). The communication failure mode is that a requirement is misheard. Each such failure mode is applicable to all types of requirements. [Pg.2310]
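As a rough illustration, the failure modes above amount to a checklist that is applied to every requirement (since each failure mode "is applicable to all types of requirements"). The following sketch arranges them that way; the data structure, function name, and sample requirements are invented for this example, not taken from the source.

```python
# Hypothetical checklist of the requirement failure modes listed above.
# The role/mode wording follows the text; the structure is an assumption.
FAILURE_MODES = {
    "user": [
        "omits a known requirement (forgets it)",
        "is not aware of a specific requirement",
        "specifies an incorrect requirement (misunderstands own needs)",
        "specifies a requirement that conflicts with another",
    ],
    "analyst": [
        "omits (forgets) a specific user requirement",
        "misunderstands a user requirement (cognitive error)",
        "mistransposes a user requirement (typographical error)",
        "fails to notice a conflict between two users' requirements",
    ],
    "communication": [
        "requirement is misheard",
    ],
}

def checklist(requirements):
    """Pair every requirement with every failure mode, as the text prescribes."""
    return [(req, role, mode)
            for req in requirements
            for role, modes in FAILURE_MODES.items()
            for mode in modes]

rows = checklist(["R1: maximum operating pressure", "R2: relief valve set point"])
print(len(rows))  # 2 requirements x 9 failure modes = 18 checklist rows
```

Enumerating the full cross product is what makes this a systematic review rather than an ad hoc one: no requirement/failure-mode pair is silently skipped.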

The mistakes can be cognitive errors ("I didn't know that") or confirmation bias ("I misunderstood the situation"), which can lead to overconfidence. [Pg.7]

As discussed in Chapter 1, accidents happen when people under pressure make mistakes. The mistakes may have either immediate or delayed consequences. They may be mistakes of commission (doing something in error) or omission (not doing something they should have done). They may be cognitive errors ("I didn't know that") or confirmation bias ("I misunderstood the situation"), which can lead to overconfidence. [Pg.139]

(i) The operator (me) did not know that the recirculation isolation valve had a left-hand thread (a cognitive error). [Pg.319]

The key cognitive error was that I did not know that the recirculation isolation valve had a left-hand thread. If I had known to turn the hand-wheel clockwise, I would have immediately realized that the valve was actually already open, which would have made me look harder and think again - and then I would have noticed the corroded limit switch bracket. I showed impulsiveness to fix a problem that had not yet been adequately diagnosed. A more mature head than mine would have considered the problem much more carefully before jumping to a diagnosis and a solution. [Pg.319]

Taxonomy-based HEI techniques use external error mode (EEM) taxonomies to identify potential errors within complex sociotechnical systems. Typically, EEMs are considered for each component step in a particular task or scenario to determine credible errors that may arise during human-machine interaction. Techniques such as the Systematic Human Error Reduction and Prediction Approach (SHERPA) (Embrey, 1986), the Human Error Template (HET) (Stanton et al., 2006), the Technique for the Retrospective and Predictive Analysis of Cognitive Errors (TRACEr) (Shorrock and Kirwan, 2002), and the Cognitive Reliability and Error Analysis Method (CREAM) (Hollnagel, 1998) all use domain-specific EEM taxonomies. Taxonomic approaches to HEI are typically the most successful in terms of sensitivity and are also the least expensive, quickest, and easiest to use however,... [Pg.345]
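The loop that these taxonomy-based techniques share - considering each external error mode against each component task step and keeping the credible combinations - can be sketched as follows. The EEM list and the credibility rule below are simplified placeholders standing in for analyst judgement, not the actual SHERPA, HET, TRACEr, or CREAM taxonomies.

```python
# Illustrative sketch of the taxonomy-based HEI loop described above:
# each external error mode (EEM) is considered for each task step, and the
# analyst keeps only the credible (step, EEM) combinations.
# This EEM list is a generic example, not a real domain-specific taxonomy.
EEMS = [
    "action omitted",
    "action too early",
    "action too late",
    "wrong action on right object",
    "right action on wrong object",
]

def identify_errors(task_steps, credible):
    """Return the (step, EEM) pairs judged credible.

    `credible` is a predicate supplied by the analyst; in practice this is
    expert judgement guided by the taxonomy, not a simple function.
    """
    return [(step, eem)
            for step in task_steps
            for eem in EEMS
            if credible(step, eem)]

steps = ["open isolation valve", "start recirculation pump"]
# Toy judgement rule for the sketch: only omissions are credible here.
found = identify_errors(steps, lambda step, eem: eem == "action omitted")
print(found)
```

The exhaustive step-by-EEM sweep is what gives these techniques their sensitivity: every plausible error mode is prompted for every step, at the cost of the analyst having to screen many non-credible combinations.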

OR crisis events are often the results of unforeseen internal or external problems, and can frequently be attributed to human cognitive error or complex system safety cultures. There is seldom a single cause leading to an accident. The error chain is a concept that describes human error accidents as the result of a sequence of events that, uninterrupted, may culminate in serious injury and death. The links of these error chains are identifiable by means of up to ten clues (Table 1). Recognizing and breaking one link in the error chain will likely prevent the potential adverse event. [Pg.111]
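The error-chain logic above - the adverse event requires every link to remain intact, so breaking any single link averts it - can be expressed in a few lines. The link names below are invented examples for the sketch, not the ten clues of Table 1.

```python
# Minimal sketch of the error-chain concept: an accident propagates only if
# no link in the chain has been recognized and broken.
def accident_occurs(links):
    """True only if every (name, intact) link in the chain is still intact."""
    return all(intact for _name, intact in links)

# Invented example links, not the actual clues from Table 1.
chain = [
    ("ambiguous handover", True),
    ("unfamiliar valve thread", True),
    ("no independent check", True),
]
print(accident_occurs(chain))   # chain uninterrupted: event propagates

chain[1] = ("unfamiliar valve thread", False)  # one link recognized and broken
print(accident_occurs(chain))   # event averted
```

The `all(...)` condition is the whole point: defenses do not need to break every link, only one.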

The hypothesis of different causes is supported by a rich vocabulary for various types of malfunctions and errors - but strangely enough not for successful actions. In the early 1980s there were just two categories, namely errors of omission and errors of commission. But these were soon complemented by various types of cognitive errors, deviations and violations. As an illustration of that, consider the following variations of non-compliance in Table 3.1. [Pg.53]

There is, however, a practical problem, namely that a terminology is not readily available. When we want to describe various types of individual and organisational failures, errors and malfunctions, a rich vocabulary is at our disposal. Starting from simple categories such as errors of omission and commission, we not only have multiple theories (violations, non-compliance, loss of situation awareness, cognitive errors, error-forcing conditions, etc.) but also a rich flora of terms within each theory. But when we want to describe what people actually do, there is little to start from. [Pg.156]

Shorrock, S.T. and Kirwan, B. 1998. TRACEr: a technique for the retrospective analysis of cognitive errors in Air Traffic Management. In D. Harris (Ed.), Engineering Psychology and Cognitive Ergonomics, Vol. 3, pp. 163-171. Aldershot, UK: Ashgate. [Pg.1100]

Jerome Groopman, How Doctors Think (Boston: Houghton Mifflin Co., 2007). This highly readable and informative book deals specifically with cognitive errors in medical decision making. [Pg.151]

Seek help from colleagues to dissect erroneous decisions to uncover the cognitive errors involved, their triggers, and what might have prevented the errors. [Pg.174]

Technique for the retrospective analysis of cognitive error (TRACEr)... [Pg.284]




