Big Chemical Encyclopedia


Pilot error

The critical incident technique was first described by Flanagan (1954) and was used during World War II to analyze "near-miss incidents." The wartime studies of "pilot errors" by Fitts and Jones (1947) are the classic studies using this technique. The technique can be applied in different ways. The most common application is to ask individuals to describe situations involving errors made by themselves or their colleagues. Another, more systematic approach is to get them to fill in reports on critical incidents on a weekly basis. One recent development of the technique has been used in the aviation world to solicit reports from aircraft crews, in an anonymous or confidential way, on incidents in aircraft operations. Such data collection systems will be discussed more thoroughly in Chapter 6. [Pg.157]

Fitts, P. M. and Jones, R. E. (1947). Analysis of Factors Contributing to 460 "Pilot Error" Experiences in Operating Aircraft Controls. Reprinted in H. W. Sinaiko (Ed.) (1961), Selected Papers on Human Factors in the Design and Use of Control Systems. New York: Dover. [Pg.369]

Pilot Error In the rush to start the descent, the pilot executed a change of course without verifying its effect on the flight path. [Pg.23]

It is still common to see statements that 70 percent to 80 percent of aircraft accidents are caused by pilot error, or that 85 percent of work accidents are due to unsafe acts by workers rather than unsafe conditions. However, closer examination shows that the data may be biased and incomplete: the less that is known about an accident, the more likely it is to be attributed to operator error [93]. Thorough investigation of serious accidents almost invariably finds other factors. [Pg.37]

Even if a technical failure precedes the human action, the tendency is to put the blame on an inadequate response to the failure by an operator. Perrow claims that even in the best of industries, there is rampant attribution of accidents to operator error, to the neglect of errors by designers or managers [155]. He cites a U.S. Air Force study of aviation accidents demonstrating that the designation of human error (pilot error in this case) is a convenient classification for mishaps whose real cause is uncertain, complex, or embarrassing to the organization. [Pg.38]

These two contributory factors are highly related to the third cause—the pilots' lack of situational awareness. Even using an event-chain model of accidents, the FMS-related events preceded and contributed to the pilot errors. There seems to be no reason why, at the least, they should be treated any differently than the labeled causes. There were also many other factors in this accident that were not reflected in either the identified causes or contributory factors. [Pg.40]

This case, however, does not seem to have had much impact on the attribution of pilot error in later aircraft accidents. [Pg.40]

Another useful distinction is between errors of omission and errors of commission. Sarter and Woods [181] note that in older, less complex aircraft cockpits, most pilot errors were errors of commission that occurred as a result of a pilot control action. Because the controller, in this case the pilot, took a direct action, he or she is likely to check that the intended effect of the action has actually occurred. The short feedback loops allow the operators to repair most errors before serious consequences result. This type of error is still the prevalent one for relatively simple devices. [Pg.280]

Operational (such as pilot error, weather and operating procedures) or... [Pg.2]

Step 2 is all about checking that the evolving design concept is correct, complete and accomplishable (i.e. "Are we building the right thing, which will minimise pilot error?"). The output of Step 2 is thus a validated Specification (refer Fig. 1.3). [Pg.337]

Updated Manuals and Flight Crew Reference Cards which are traceable to the safety decisions leading to their creation. With due consideration of the apt quote by the American baseball player Yogi Berra ("I don't want to make the wrong mistake"), such ICA (refer Section 11.2.3) must include explicit warning of those pilot errors which may cause catastrophic or hazardous events. [Pg.348]

Fundamentally, safety is underpinned or undermined by human capabilities and limitations, and to err is human. Accident investigators have moved away from the position of regarding the phrase "pilot error" as an appropriate explanatory cause. It is far more appropriate to ask why the error was made and why it was not detected and corrected in time, including the creation of barriers (or defensive layers) through the design of suitable interfaces, training, operational procedures etc. [Pg.358]

Fitts, P. M. and Jones, R. E. (1961). Analysis of factors contributing to 460 "pilot error" experiences in operating aircraft controls. In Sinaiko, H. W. (Ed.), Selected Papers on Human Factors in the Design and Use of Control Systems. Dover Publications Inc. [Pg.1192]

The same growth in travel demand... has caused commuters to reach down into the ranks of pilots in air taxis and cargo operators at a faster rate. In the same way an influx of less-experienced pilots from commuters may have increased pilot error in jet carriers, an influx of replacement commuter pilots from air taxis and cargo operators may have increased commuter pilot error rates. [Pg.19]

Systematic failures have nothing to do with random component failures (although they should eliminate or reduce the hazardous effects of a single random failure) and are often associated with humans (humans, after all, design the processes). The world's most sophisticated machines, designed with the highest safety goals in mind—jet airliners—still crash as a result of pilot error. [Pg.6]

In the area of engineering, an important social division was caused by the Smolensk disaster of 2010, when the Polish president and many other people from the Polish elite died in an airplane crash near the Russian city of Smolensk. The official aviation investigation was an object of critique from the Catholic-right side of Polish politics. Smolensk became a kind of mystical symbol and an important aviation-engineering public case. Some lecturers from technical universities criticized the official technical explanation, which emphasized pilot error. This critique is known (in a derogatory way) as Smolensk-physics. This political- and religion-based division also applies to engineering students and faculties. [Pg.135]

Coping mechanisms are a behavioral tool used by pilots and crew to offset or overcome stress and adversity. Research on coping mechanisms has helped to understand pilot error and challenges in crew resource management. Stress resistance has been part of selection methods and its understanding has much to offer to improve training in the aviation sector. [Pg.2]

Even more so, Green et al. (1996) further referred to stress as a person's evaluation downgrading his/her own ability to meet the perceived demands. All evidence considered, stress is subjective rather than objective. Li et al. (2001) analyzed a set of 29,891 records related to aviation crashes for the years 1983-1996 and found that pilot error was a probable cause in 85 percent of... [Pg.81]

Alkov, R.A., Gaynor, J.A. and Borowsky, M.S. (1985). Pilot error as a symptom of inadequate stress coping. Aviation, Space, and Environmental Medicine, 56,... [Pg.95]

Li, G., Baker, S.P., Grabowski, J.G. and Rebok, G.W. (2001). Factors associated with pilot error in aviation crashes. Aviation, Space, and Environmental Medicine, 72, 52-8. [Pg.95]









© 2024 chempedia.info