
The Analyst

The larger variance is placed in the numerator. For example, the F test allows judgment regarding the existence of a significant difference in the precision between two sets of data or between two analysts. The hypothesis assumed is that both variances are indeed alike and a measure of the same σ. [Pg.204]
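As an illustrative sketch of this F test (the replicate data and the 95% confidence level are assumptions, not from the cited source), the larger variance goes in the numerator and the ratio is compared with the critical value:

```python
# Minimal sketch of the F test for comparing the precision of two
# data sets; the data and confidence level are illustrative assumptions.
import numpy as np
from scipy import stats

set_a = np.array([10.2, 10.4, 10.1, 10.5, 10.3])   # analyst A, hypothetical
set_b = np.array([10.0, 10.6, 10.9, 10.1, 10.7])   # analyst B, hypothetical

var_a, var_b = np.var(set_a, ddof=1), np.var(set_b, ddof=1)

# Place the larger variance in the numerator so F >= 1.
if var_a >= var_b:
    f_calc, dfn, dfd = var_a / var_b, len(set_a) - 1, len(set_b) - 1
else:
    f_calc, dfn, dfd = var_b / var_a, len(set_b) - 1, len(set_a) - 1

# Two-tailed test at the 95% confidence level.
f_crit = stats.f.ppf(0.975, dfn, dfd)
print(f"F = {f_calc:.2f}, F_crit = {f_crit:.2f}")
if f_calc > f_crit:
    print("The difference in precision is significant.")
else:
    print("No significant difference in precision.")
```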

The result of an analysis is influenced by three factors the method, the sample, and the analyst. The influence of these factors can be studied by conducting a pair of experiments in which only one factor is changed. For example, two methods can be compared by having the same analyst apply both methods to the same sample and examining the resulting means. In a similar fashion, it is possible to compare two analysts or two samples. [Pg.88]
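A minimal sketch of such a comparison (hypothetical replicate results from two methods applied by the same analyst to the same sample) uses a two-sample t test on the means:

```python
# Hedged sketch: comparing the mean results of two methods applied by
# the same analyst to the same sample (data are illustrative).
import numpy as np
from scipy import stats

method_1 = np.array([5.12, 5.08, 5.15, 5.10, 5.11])
method_2 = np.array([5.20, 5.18, 5.24, 5.19, 5.22])

t_stat, p_value = stats.ttest_ind(method_1, method_2)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The two methods give significantly different means.")
else:
    print("No significant difference between the methods.")
```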

The response surfaces in Figure 14.2 are plotted for a limited range of factor levels (0 < A < 10, 0 < B < 10), but can be extended toward more positive or more negative values. This is an example of an unconstrained response surface. Most response surfaces of interest to analytical chemists, however, are naturally constrained by the nature of the factors or the response or are constrained by practical limits set by the analyst. The response surface in Figure 14.1, for example, has a natural constraint on its factor since the smallest possible concentration for the analyte is zero. Furthermore, an upper limit exists because it is usually undesirable to extrapolate a calibration curve beyond the highest concentration standard. [Pg.667]
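The sketch below evaluates a hypothetical second-order response surface over the constrained region 0 <= A <= 10, 0 <= B <= 10; the response function is an illustrative stand-in, not the one in the cited figure:

```python
# Sketch of searching a response surface over constrained factor ranges.
import numpy as np

def response(a, b):
    # Hypothetical second-order response surface (illustrative only).
    return 5.0 + 1.2 * a + 0.8 * b - 0.10 * a**2 - 0.06 * b**2 + 0.05 * a * b

# Constrained factor levels: 0 <= A <= 10, 0 <= B <= 10.
a = np.linspace(0, 10, 101)
b = np.linspace(0, 10, 101)
A, B = np.meshgrid(a, b)
R = response(A, B)

# Locate the best response within the constrained region; note that the
# unconstrained optimum may lie outside it, so the maximum can sit on a boundary.
i, j = np.unravel_index(np.argmax(R), R.shape)
print(f"Maximum response {R[i, j]:.2f} at A = {A[i, j]:.1f}, B = {B[i, j]:.1f}")
```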

Single-operator characteristics are determined by analyzing a sample whose concentration of analyte is known to the analyst. The second step in verifying a method is the blind analysis of standard samples, where the analyte's concentration remains unknown to the analyst. The standard sample is analyzed several times, and the average concentration of the analyte is determined. This value should be within three, and preferably two, standard deviations (as determined from the single-operator characteristics) of the analyte's known concentration. [Pg.683]
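A sketch of that acceptance check (the known value, single-operator standard deviation, and blind results are illustrative assumptions):

```python
# Sketch of the blind-analysis check described above: is the mean result
# within two (preferably) or three standard deviations of the known value?
import numpy as np

known = 25.0                                  # known analyte concentration
s_single = 0.40                               # single-operator std. deviation
results = np.array([25.3, 24.8, 25.1, 25.6, 24.9])   # blind replicates

mean = results.mean()
z = abs(mean - known) / s_single
if z <= 2:
    print(f"Mean {mean:.2f} is within 2s of the known value: acceptable.")
elif z <= 3:
    print(f"Mean {mean:.2f} is within 3s: marginally acceptable.")
else:
    print(f"Mean {mean:.2f} differs by more than 3s: the method fails the check.")
```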

The goal of a collaborative test is to determine the expected magnitude of all three sources of error when a method is placed into general practice. When several analysts each analyze the same sample one time, the variation in their collective results (Figure 14.16b) includes contributions from random errors and those systematic errors (biases) unique to the analysts. Without additional information, the standard deviation for the pooled data cannot be used to separate the precision of the analysis from the systematic errors of the analysts. The position of the distribution, however, can be used to detect the presence of a systematic error in the method. [Pg.687]
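One common way to test the position of the distribution against an accepted value is a one-sample t test; the sketch below assumes hypothetical per-analyst results and an accepted value of 10.0:

```python
# Sketch: using the position of the pooled distribution to test for a
# systematic error in the method (one result per analyst; the data and
# the accepted value are illustrative assumptions).
import numpy as np
from scipy import stats

results = np.array([9.8, 10.3, 9.9, 10.5, 10.1, 10.4, 9.7])  # one per analyst
accepted = 10.0                                               # accepted value

t_stat, p_value = stats.ttest_1samp(results, accepted)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("The distribution is displaced: a systematic method error is indicated.")
else:
    print("No systematic method error is detected.")
```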

An introduction to several of the more common methods of surface and interface analysis has been presented in this article. This treatment is certainly not comprehensive. An ever-expanding number of methods for the interrogation of surfaces and interfaces are available to the analyst. The ones chosen for discussion here were meant to be representative of methods that can answer the more general questions posed at the beginning of this article. The reader is encouraged to pursue further reading on other techniques for specific applications in the many excellent monographs on the subject of surface and interface analysis. [Pg.288]

Method of Variation of Parameters. This method is applicable to any linear equation. The technique is developed for a second-order equation but extends immediately to higher order. Let the equation be y'' + a(x)y' + b(x)y = R(x), and let the solution of the homogeneous equation, found by some method, be y = c1 f1(x) + c2 f2(x). It is now assumed that a particular integral of the differential equation is of the form P(x) = u f1 + v f2, where u, v are functions of x to be determined by two equations. One equation results from the requirement that u f1 + v f2 satisfy the differential equation; the other is a degree of freedom open to the analyst. The best choice proves to be... [Pg.455]
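The excerpt breaks off before stating the choice. For reference, the standard "best choice" in variation of parameters is to impose u' f1 + v' f2 = 0 as the second equation; the resulting system and its solution (standard textbook material, not from the cited page) are:

```latex
% Standard variation-of-parameters system (textbook material, for reference):
\begin{aligned}
u' f_1 + v' f_2 &= 0 \\
u' f_1' + v' f_2' &= R(x)
\end{aligned}
% Solving by Cramer's rule with the Wronskian W(x) = f_1 f_2' - f_2 f_1'
% (nonzero for linearly independent f_1, f_2):
u' = -\frac{f_2\,R(x)}{W}, \qquad v' = \frac{f_1\,R(x)}{W},
% after which u and v follow by integration, giving P(x) = u f_1 + v f_2.
```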

Analysts. The above is a formidable barrier. Analysts must use limited and uncertain measurements to operate and control the plant and understand the internal process. Multiple interpretations can result from analyzing limited, sparse, suboptimal data. Both intuitive and complex algorithmic analysis methods add bias. Expert and artificial intelligence systems may ultimately be developed to recognize and handle all of these limitations during model development. However, the current state of the art requires the intervention of skilled analysts to draw accurate conclusions about plant operation. [Pg.2550]

Clearly and uniquely specify the top event to the precise requirements without over-specification, which might exclude important failure modes. Even if constructed by different analysts, the top event specification should produce trees that have the same Boolean equation... [Pg.105]

A fragility curve for a component is calculated by knowing the conditions that will fail it. These conditions are calculated deterministically by a structural analyst. The PSA specialist determines the variability of the conditions, which together give the probability of failure vs. acceleration forces for given operating conditions. A separate fragility curve is needed for each mode that must be considered. [Pg.192]

Another link exists between the PIF concept and the sociotechnical assessment methods described in Section 2.7. The checklists used in the TRIPOD methodology are essentially binary questions which evaluate whether the sets of PIFs making up each of the general failure types are adequate or not. The hierarchical sets of factors in HRAM are essentially PIFs which are expressed at increasingly finer levels of definition, as required by the analyst. The audit tool which forms MANAGER also comprises items which can be regarded as PIFs, and which assess both management-level and direct PIFs such as procedures. [Pg.104]

This technique is the longest established of all the human reliability quantification methods. It was developed by Dr. A. D. Swain in the late 1960s, originally in the context of military applications, and was subsequently developed further in the nuclear power industry. A comprehensive description of the method and the database used in its application is contained in Swain and Guttmann (1983). Further developments are described in Swain (1987). The THERP approach is probably the most widely applied quantification technique, largely because it provides its own database and uses methods, such as event trees, that are readily familiar to the engineering risk analyst. The most extensive application of THERP has been in nuclear power, but it has also been used in the military, chemical processing, transport, and other industries. [Pg.227]

The value of fractional distillation in the examination of essential oils cannot be overestimated. The various fractions may be examined and their specific gravities, optical rotations, and refractive indices determined. The combination of these figures will often give the experienced analyst the most useful information and save him many hours of needless work. Experience alone, however, will teach the chemist to make the fullest use of the results so obtained. In most cases distillation under reduced pressure is necessary on account of the risk of decomposing the various constituents of the oil. The use of a Brühl receiver (or any similar contrivance), which is easily obtained from any apparatus maker, is recommended. [Pg.310]

This definition outlines in very broad terms the scope of analytical chemistry. When a completely unknown sample is presented to an analyst, the first requirement is usually to ascertain what substances are present in it. This fundamental problem may sometimes be encountered in the modified form of deciding what impurities are present in a given sample, or perhaps of confirming that certain specified impurities are absent. The solution of such problems lies within the province of qualitative analysis and is outside the scope of the present volume. [Pg.3]

Correctly used, statistics is an essential tool for the analyst. The use of statistical methods can prevent hasty judgements being made on the basis of limited information. It has only been possible in this chapter to give a brief resume of some statistical techniques that may be applied to analytical problems. The approach, therefore, has been to use specific examples which illustrate the scope of the subject as applied to the treatment of analytical data. There is a danger that this approach may overlook some basic concepts of the subject and the reader is strongly advised to become more fully conversant with these statistical methods by obtaining a selection of the excellent texts now available. [Pg.149]

While it is true that in many cases the quality of data acquired during analysis is directly proportional to the quality of the result that may be obtained, it is also true that in many cases the power of modern computer systems attached to analytical equipment of all sorts can be used to provide better results than might be thought possible from a cursory examination of the raw data. Even when chromatography is used to separate the components of a mixture and simplify the job of the analyst, the computer may still allow information hidden in the vast amount of data generated to be extracted. [Pg.74]

It has become an accepted wisdom that the use of RMs or CRMs will help to improve the accuracy and precision of an analytical process. This belief has led to a rapid growth in the use of RMs and CRMs in commercial laboratories. The authors and many analysts the world over support this view, but also recognize that in far too many cases inexperience and carelessness conspire together with the result that error accumulates and often unreliable data are produced. [Pg.236]

The literature includes a number of mismatches, the following standing as examples among many. The use of bovine liver and other animal tissues for QC in the analysis of human body fluids should not be considered by analysts: the matrix and the levels of trace elements do not match the levels to be analyzed, which may lead to serious errors. An even more severe misuse was recently reported by Schuhmacher et al. (1996) for NIST SRM 1577a Bovine Liver, which was used for QC in the analysis of trace elements in plant materials and soil samples in the vicinity of a municipal waste incinerator. Also recently, Cheung and Wong (1997) described how the quality control for the analysis of trace elements in clams (shellfish) and sediments was performed with the same material, NIST SRM 1646 Estuarine Sediment. Whilst the selected SRM was appropriate for sediments, its usefulness as a QC tool for clams is difficult to prove; see also Chapter 8. This inappropriate use is all the more mystifying because a broad selection of suitable shellfish RMs from various producers is available. [Pg.239]

Laboratory performance study. Laboratories use the method of their choice to measure one or more quantities on one or more homogeneous and stable test samples in order to assess the performance of the laboratory or analyst. The reported results are compared among themselves, with those of other laboratories, or with the known or assigned reference value, usually with the objective of evaluating or improving laboratory performances (IUPAC Orange Book [1997, 2000]). [Pg.252]
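In practice, results in such studies are often compared with the assigned value through a z-score; the sketch below uses illustrative numbers and the common |z| <= 2 / |z| <= 3 interpretation bands, which are conventions rather than details from the cited definition:

```python
# Sketch of comparing a reported result with an assigned reference value
# in a laboratory performance study via a z-score (illustrative data).
reported = 52.1          # laboratory's reported result
assigned = 50.0          # assigned reference value
sigma_p = 1.5            # standard deviation for proficiency assessment

z = (reported - assigned) / sigma_p
if abs(z) <= 2:
    verdict = "satisfactory"
elif abs(z) <= 3:
    verdict = "questionable"
else:
    verdict = "unsatisfactory"
print(f"z = {z:.2f}: {verdict}")
```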

High performance liquid chromatography (HPLC) and capillary electrophoresis (CE) are two instrumental separation techniques that are applicable to the separation of proteins and peptides. The advantage of HPLC and CE techniques is that they afford the analyst the freedom to resolve a complex mixture by different routes employing different... [Pg.365]

Analysis of Variance (ANOVA) is a useful tool for comparing sets of analytical results, for example to determine whether there is a statistically meaningful difference between results for a sample analyzed by different methods, or at different locations, or by different analysts. The reader is referred to reference [1] and other basic books on statistical methods for discussions of the theory and applications of ANOVA; examples of such texts are [2, 3]. [Pg.179]
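A minimal one-way ANOVA sketch (hypothetical results for the same sample from three analysts):

```python
# Sketch of a one-way ANOVA comparing results for the same sample from
# three analysts (hypothetical data; see the cited texts for theory).
from scipy import stats

analyst_1 = [10.2, 10.4, 10.3, 10.1]
analyst_2 = [10.6, 10.8, 10.5, 10.7]
analyst_3 = [10.2, 10.3, 10.4, 10.2]

f_stat, p_value = stats.f_oneway(analyst_1, analyst_2, analyst_3)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("At least one analyst's mean differs significantly.")
else:
    print("No statistically significant difference between analysts.")
```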

Due to a lack of understanding in the use of analytical skills, most people resort to disconnected problem-solving techniques to analyze their problems in lieu of a structured, logical approach. This stems from the fact that we do not give analysts the tools necessary to do their jobs. Simply put, many of today's analysts lack the proper mentoring and training necessary to accomplish the desired result: the elimination of problems. Without these tools, these analysts revert to their inherent, God-given analytical techniques, i.e., inference, perceptions, assumptions, intuition, and reports by others. [Pg.42]

Blind samples are samples inserted into the analytical batch without the knowledge of the analyst - the analyst may be aware that blind samples are present but not know which they are. Blind samples may be sent by the customer as a check on the laboratory, or by laboratory management as a check on a particular system. Results from blind samples are treated in the same way as repeat samples - the customer or laboratory manager examines the sets of results to determine whether the level of variation, between repeat measurements on the blind sample or between the observed results and an expected value, is acceptable, as described in Section 5.4.3. [Pg.118]

If the analytical method used by participants in the proficiency testing round has been validated by means of a formal collaborative trial, then the repeatability and reproducibility data from the trial can be used. The repeatability standard deviation gives an estimate of the expected variation in replicate results obtained in a single laboratory over a short period of time (with each result produced by the same analyst). The reproducibility standard deviation gives an estimate of the expected variation in replicate results obtained in different laboratories (see Chapter 4, Section 4.3.3 for further explanation of these terms). [Pg.188]
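The following sketch estimates both quantities from a balanced collaborative trial using the standard one-way ANOVA decomposition (the data are illustrative; the formulas follow the usual ISO 5725 approach rather than anything specific to the cited chapter):

```python
# Sketch of estimating repeatability (s_r) and reproducibility (s_R)
# standard deviations from a balanced collaborative trial.
import numpy as np

# Rows: laboratories; columns: replicate results within each laboratory.
data = np.array([
    [10.1, 10.3, 10.2],
    [10.6, 10.5, 10.7],
    [10.0, 10.2, 10.1],
    [10.4, 10.3, 10.5],
])
p, n = data.shape                       # p laboratories, n replicates each

lab_means = data.mean(axis=1)
grand_mean = data.mean()

# One-way ANOVA mean squares: within-lab and between-lab.
ms_within = ((data - lab_means[:, None]) ** 2).sum() / (p * (n - 1))
ms_between = n * ((lab_means - grand_mean) ** 2).sum() / (p - 1)

s_r = np.sqrt(ms_within)                          # repeatability std. dev.
s_L2 = max(0.0, (ms_between - ms_within) / n)     # between-lab variance
s_R = np.sqrt(s_r**2 + s_L2)                      # reproducibility std. dev.
print(f"s_r = {s_r:.3f}, s_R = {s_R:.3f}")
```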

The meetings then revolve around potential safety issues identified by the analysts. The analysts are encouraged to voice any potential safety concern in terms of questions that begin with "what-if." However, any process safety concern can be voiced, even if it is not phrased as a question. For example ... [Pg.45]

Adequate resolution of the components of a mixture in the shortest possible time is nearly always a principal goal. Establishing the optimum conditions by trial and error is inefficient and relies heavily on the expertise of the analyst. The development of computer-controlled HPLC systems has enabled systematic automated optimization techniques, based on statistical experimental design and mathematical resolution functions, to be exploited. The basic choices of column (stationary phase) and detector are made first, followed by an investigation of the mobile phase composition and possibly other parameters. This can be done manually, but computer-controlled optimization has the advantage of releasing the analyst for other... [Pg.139]
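As an illustrative sketch of such a systematic optimization (the resolution and run-time models below are hypothetical stand-ins for measured chromatographic behaviour), one can search the mobile-phase composition for the shortest run that still meets a resolution criterion:

```python
# Sketch of a systematic optimization of mobile-phase composition using a
# mathematical resolution function; both models are illustrative assumptions.
import numpy as np

def resolution(percent_organic):
    # Hypothetical resolution of the critical peak pair vs % organic modifier.
    return 2.5 - 0.002 * (percent_organic - 45.0) ** 2

def run_time(percent_organic):
    # Hypothetical analysis time: stronger eluent, shorter run.
    return 60.0 - 0.5 * percent_organic

best = None
for phi in np.arange(20.0, 81.0, 1.0):
    rs = resolution(phi)
    if rs >= 1.5:                       # adequate-resolution constraint
        t = run_time(phi)
        if best is None or t < best[2]:
            best = (phi, rs, t)

phi, rs, t = best
print(f"Optimum: {phi:.0f}% organic, Rs = {rs:.2f}, run time = {t:.1f} min")
```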

