Big Chemical Encyclopedia


Error, analytical human

Chapter 4, Analytical Methods for Predicting and Reducing Human Error, contains a discussion and critique of the various methods that are available for analyzing a process for its potential for human error. [Pg.2]

The first component of the systems approach to error reduction is the optimization of human performance by designing the system to support human strengths and minimize the effects of human limitations. The human factors engineering and ergonomics (HFE/E) approach described in Section 2.7 of Chapter 2 indicates some of the techniques available. Design data from the human factors literature for areas such as equipment, procedures, and the human-machine interface are available to support the designer in the optimization process. In addition, the analytical techniques described in Chapter 4 (e.g., task analysis) can be used in the development of the design. [Pg.19]

In addition to the proactive uses of the SRK model described in the two previous sections, it can also be employed retrospectively as a means of identifying the underlying causes of incidents attributed to human error. This is a particularly useful application, since causal analyses can be used to identify recurrent underlying problems that may be responsible for errors which appear very different at a surface level. It has already been indicated in Section 2.4.1 that the same observable error can arise from a variety of alternative causes. In this section it will be shown how several of the concepts discussed up to this point can be combined to provide a powerful analytical framework that can be used to identify the root causes of incidents. [Pg.81]

Several examples have already been provided of the use of cognitive models of error to evaluate the possible causes of accidents that have already occurred. This form of retrospective analysis performs a vital role in providing information on the recurring underlying causes of accidents in which human error is implicated. The advantage of an analytical framework driven by a model of human error is that it specifies the nature of the questions that need... [Pg.84]

Analytical Methods for Predicting and Reducing Human Error... [Pg.153]

The various analytical methods for predicting and reducing human error can be assigned to four groups or sections. In order to make a start on any form of analysis or prediction of human error, it is obviously necessary to gather information. The first section therefore describes a number of techniques that can be applied to acquire data about what the worker does, or what happened in an accident. [Pg.153]

The intention of this chapter has been to provide an overview of analytical methods for predicting and reducing human error in CPI tasks. The data collection methods and ergonomics checklists are useful in generating operational data about the characteristics of the task, the skills and experience required, and the interaction between the worker and the task. Task analysis methods organize these data into a coherent description or representation of the objectives and work methods required to carry out the task. This task description is subsequently utilized in human error analysis methods to examine the possible errors that can occur during a task. [Pg.200]

The fifteen samples analyzed for amino acid composition were compared to a human type 1 collagen standard. There are no significant differences in amino acid composition among these samples, which range in age from stillbirth to adult. Analytical error is 10%. All samples show the typical composition of type 1 collagen (Table 1.1). Based on these results there is no reason to suspect differential preservation of specific amino acids in the small bones of infants in comparison to bones of adults and older children. [Pg.5]
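A within-error comparison of the kind described above can be sketched as follows. This is a minimal illustration only: the residue names and residues-per-1000 values are hypothetical, not the data of Table 1.1.

```python
def within_analytical_error(sample, standard, error_frac=0.10):
    """Return True if every residue's abundance in `sample` lies within
    the stated fractional analytical error (default 10%) of the
    corresponding `standard` value."""
    return all(
        abs(sample[res] - standard[res]) <= error_frac * standard[res]
        for res in standard
    )

# Hypothetical residues-per-1000 values (type 1 collagen is roughly
# one-third glycine); these numbers are invented for illustration:
standard = {"Gly": 330.0, "Pro": 120.0, "Hyp": 95.0}
sample = {"Gly": 325.0, "Pro": 126.0, "Hyp": 99.0}
print(within_analytical_error(sample, standard))  # True
```

Under this criterion, a sample is flagged only when some residue deviates from the standard by more than the analytical error, which is the sense in which the fifteen samples show "no significant differences."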

Thus, to further the goals of quality and good analytical practice for which RMs are intended, EQA schemes should combine some aspects of both objectives according to the political purposes for which the scheme is being organized. Whether used for educational or licensing purposes, the ultimate intention is to ensure a certain standard of analysis is achieved and maintained in order that the user of results may be protected against errors which could be costly, in financial or human terms. [Pg.120]

The famous adage "to err is human, to forgive divine" literally means that it is natural for people to make mistakes. However, errors in analytical chemistry, or more precisely in pharmaceutical drug analysis, are normally of three types, namely ... [Pg.8]

While analytical science is in many ways quite miraculous, it is by no means without problems. Errors can easily be made. Analyses are not always readily reproducible in different laboratories. Some technologies are exceedingly expensive. And while analytical methods are well worked out for many chemicals, they are not available at all for many more. (Indeed, if we are interested in the naturally occurring chemicals that human beings are exposed to, we will find that only a tiny fraction of these can now be analyzed for with anything except fairly sophisticated research tools; most such chemicals are still unknown,... [Pg.34]

ΔQ/Q0 = (Q − Q0)/Q0, where Q0 is the charge consumption without analyte and Q is that at a given thrombin concentration. An example of a calibration curve for two independently prepared electrodes is shown in Fig. 47.3. The results are well reproducible. Statistical analysis performed earlier [4] revealed that the standard error is approximately 11%. Interferences of this aptasensor with other compounds, human serum albumin (HSA) and human IgG, are relatively low. An example is shown in Fig. 47.4, where the calibration curve for thrombin is compared with those for HSA and IgG. Note that the concentrations of HSA and IgG are much higher than that of thrombin. [Pg.1274]
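The relative charge change used as the sensor response above can be computed as in this short sketch. The function name and the numeric readings are hypothetical, chosen only to illustrate the formula; they are not data from the study.

```python
def relative_charge_change(q, q0):
    """Relative sensor response dQ/Q0 = (Q - Q0) / Q0, where Q0 is the
    charge consumption without analyte and Q the charge consumption at
    a given thrombin concentration."""
    if q0 == 0:
        raise ValueError("Q0 must be nonzero")
    return (q - q0) / q0

# Hypothetical readings in arbitrary charge units (not the study's data):
q0 = 120.0   # charge without analyte
q = 138.0    # charge at some thrombin concentration
print(round(relative_charge_change(q, q0), 3))  # 0.15
```

Plotting this response against thrombin concentration for a series of standards would yield a calibration curve of the kind shown in Fig. 47.3.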

A selective, sensitive, and rapid hydrophilic interaction liquid chromatography method with electrospray ionization tandem mass spectrometry was developed for the determination of donepezil in human plasma [32]. Donepezil was twice extracted from human plasma using methyl-tert-butyl ether at basic pH. The analytes were separated on an Atlantis HILIC Silica column with a mobile phase of acetonitrile-ammonium formate (50 mM, pH 4.0) (85:15, v/v) and detected by tandem mass spectrometry in the selective reaction monitoring mode. The calibration curve was linear (r = 0.9994) over the concentration range of 0.10-50.0 ng/ml, and the lower limit of quantification was 0.1 ng/ml using a 200 μl plasma sample. The CV and relative error for intra- and inter-assay at four quality control levels were 2.7% to 10.5% and −10.0% to 0.0%, respectively. There was no matrix effect for donepezil and cisapride. The method was successfully applied to a pharmacokinetic study of donepezil after an oral dose of donepezil hydrochloride (10 mg tablet) to healthy male volunteers. [Pg.141]
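Figures of merit like the CV and relative error reported above are computed from quality-control replicates. The sketch below shows the standard calculations; the replicate values and the nominal concentration are invented for illustration and are not the study's data.

```python
import statistics

def cv_percent(values):
    """Coefficient of variation: sample standard deviation as a
    percentage of the mean (intra-assay precision)."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def relative_error_percent(measured_mean, nominal):
    """Relative error (bias): deviation of the measured mean from the
    nominal QC concentration, as a percentage of the nominal."""
    return 100.0 * (measured_mean - nominal) / nominal

# Hypothetical replicate results (ng/ml) at a nominal 0.10 ng/ml QC level:
qc = [0.097, 0.101, 0.099, 0.103, 0.100]
print(round(cv_percent(qc), 1))
print(round(relative_error_percent(statistics.mean(qc), 0.10), 1))
```

Running these calculations at each of the four QC levels, for runs within one day and across days, gives the intra- and inter-assay ranges quoted in the excerpt.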

Definitive data are obtained with rigorous analytical methods, such as EPA-approved methods or other standard analytical methods. For the data to be definitive, either analytical or total measurement error must be determined. Definitive data, which are analyte-specific and have a high degree of confidence in analyte identity and concentration, are used for decisions that have consequences for human health and the environment, such as site closure, risk assessment, and compliance monitoring of water effluents and air emissions. Definitive data may be generated at a field (mobile) laboratory or at an off-site (fixed-base) laboratory. [Pg.47]

Why do we need to perform DQA The need for DQA arises from the very existence of total error. The collection of planned data may go astray due to unforeseen field conditions, human errors or analytical deficiencies that may alter the type, quality, and quantity of the data compared to what has been planned. We use DQA as a tool that enables us to evaluate various components of total error and to establish their effect on the amount of valid and relevant data collected for each intended use. [Pg.265]

Reliable evaluation of the potential for human exposure to CDDs depends in part on the reliability of supporting analytical data from environmental samples and biological specimens. Historically, CDD analysis has been both complicated and expensive, and the analytical capabilities to conduct such analysis have been available through only a relatively few analytical laboratories. Limits of detection have improved greatly over the past decade with the use of high-resolution mass spectrometry, improvements in materials used in sample clean-up procedures, and with the use of known labeled and unlabeled chemical standards. Problems associated with chemical analysis procedures of CDDs in various media are discussed in greater detail in Chapter 6. In reviewing data on CDD levels monitored or estimated in the environment, it should be noted that the amount of the chemical identified analytically is not necessarily equivalent to the amount that is bioavailable (see Section 2.3) and that every measurement is accompanied with a certain analytical error. [Pg.455]





