Big Chemical Encyclopedia


Calibration and data analysis

The current practice of size exclusion chromatography has been described in Chapter 2. In contrast to the early application of this technique, the experimental practice is now often relatively straightforward. Also, the complex data manipulation is now normally carried out using a dedicated data-handling system which produces the calculated molecular mass averages and distributions at the touch of a few keys. Unfortunately, this apparent ease of operation and calculation tends to conceal the real complexity of the situation, and the results are often treated as more definitive than they in fact are. [Pg.42]

SEC is a particularly valuable tool for examining the molecular mass distributions (MMDs) of polymers. However, appropriate levels of caution are required in the interpretation of the results. There are many difficulties in obtaining the best applicable calibration of an SEC system, and even if this is achieved, the data manipulation can introduce further uncertainty. This is well illustrated by pointing out that many computer programs for SEC contain an error in the calculation (see section 3.3). [Pg.42]
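The molecular mass averages that SEC data systems report are simple weighted sums over the chromatogram slices, which is also where a calculation error can creep in unnoticed. A minimal sketch of the standard slice formulas, assuming a calibration curve that supplies log10(M) at each elution-volume slice (function name and slice data are illustrative, not from any particular SEC package):

```python
import numpy as np

def sec_averages(heights, log_m):
    """Number- and weight-average molecular mass from SEC slice data.

    heights: detector response h_i at each elution-volume slice
             (proportional to the mass of polymer in the slice)
    log_m:   log10(M_i) from the calibration curve at each slice
    """
    h = np.asarray(heights, dtype=float)
    m = 10.0 ** np.asarray(log_m, dtype=float)
    mn = h.sum() / (h / m).sum()   # Mn = sum(h_i) / sum(h_i / M_i)
    mw = (h * m).sum() / h.sum()   # Mw = sum(h_i * M_i) / sum(h_i)
    return mn, mw

# sanity check: a monodisperse peak (all slices at M = 1e5) must give Mn = Mw
mn, mw = sec_averages([1.0, 2.0, 1.0], [5.0, 5.0, 5.0])
```

For any real (polydisperse) distribution Mw > Mn, so a program that reports Mw < Mn has the weighting inverted somewhere.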

Having appreciated the problems associated with the conventional methods of calibrating SEC systems, many users of the technique seek to use detectors which should impart an absolute quality to the data. These extensions of the SEC technique can also contribute to a misinterpretation of the data. [Pg.42]


All three of these PhD theses contain sections on the theory of operation of each component of the mass/heat-flow sensor and experimental details such as block diagrams of the apparatus, sample preparation, data acquisition and control, calibration, and data analysis. [Pg.164]

Table 3.1 Narrow MMD polymer calibrants ... [Pg.43]

Modern NIR technology relies heavily on the computer (and the microprocessor in particular), not only for its ability to control and acquire data from the instrument, but to facilitate calibration and data analysis. The foundations of data analysis were laid down in the 1930s. Work on the diffuse scattering of light in both transmission and reflection, by Kubelka and Munk [14] in 1931, opened the door to NIR measurements on solids. In 1933, Hotelling [15] wrote a classic paper on principal components analysis (PCA), and Mahalanobis formulated a mathematical approach for representing data clustering and separation in multidimensional space. [Pg.5]
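Mahalanobis's measure of clustering and separation in multidimensional space remains a workhorse for sample-selection and outlier screening in NIR calibration. A minimal numpy sketch (the function name and example data are illustrative):

```python
import numpy as np

def mahalanobis(x, data):
    """Mahalanobis distance of point x from the centroid of `data`.

    data: (n_samples, n_variables) array, e.g. NIR absorbances at
          several wavelengths; the covariance scales each direction.
    """
    data = np.asarray(data, dtype=float)
    mu = data.mean(axis=0)
    cov = np.cov(data, rowvar=False)          # sample covariance matrix
    diff = np.asarray(x, dtype=float) - mu
    # d^2 = diff^T . cov^-1 . diff, solved without forming the inverse
    return float(np.sqrt(diff @ np.linalg.solve(cov, diff)))
```

Unlike plain Euclidean distance, the covariance weighting makes the measure insensitive to the (often very different) scales of the individual variables.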

Sample preparation, injection, calibration, and data collection must be automated for process analysis. Methods used for flow injection analysis (FIA) are also useful for reliable sampling for process LC systems.1 Dynamic dilution is a technique that is used extensively in FIA.13 In this technique, sample from a loop or slot of a valve is diluted as it is transferred to an HPLC injection valve for analysis. As the diluted sample plug passes through the HPLC valve, it is switched and the sample is injected onto the HPLC column for separation. The sample transfer time is typically determined with a refractive index detector and valve switching, which can be controlled by an integrator or computer. The transfer time is very reproducible. Calibration is typically done by external standardization using normalization by response factor. Internal standardization has also been used. To detect upsets or for process optimization, absolute numbers are not always needed. An alternative to... [Pg.76]
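The response-factor calibration described above amounts to a simple proportionality, with the dynamic dilution entering as a known correction factor. A hedged sketch (all numeric values and names are hypothetical):

```python
def response_factor(area_std, conc_std):
    """Response factor from an external standard: peak area per unit
    concentration, measured under the same dilution conditions."""
    return area_std / conc_std

def quantify(area_sample, rf, dilution_factor=1.0):
    """Analyte concentration in the process stream.

    dilution_factor: ratio of original to diluted concentration from
    the dynamic-dilution transfer, assumed known and reproducible.
    """
    return area_sample / rf * dilution_factor

# hypothetical run: 50 ppm standard gives area 125000; the process
# sample (diluted 10x in transfer) gives area 50000
rf = response_factor(area_std=125000.0, conc_std=50.0)
c = quantify(area_sample=50000.0, rf=rf, dilution_factor=10.0)
```

The reproducible transfer time noted above is what makes a fixed dilution factor defensible; if transfer conditions drift, the factor must be re-established.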

There can be no doubt that instrumental methods of analysis have revolutionized analytical chemistry, in terms of increased sensitivity, more rapid throughput, multielement capability, computerized calibration, and data handling, etc. There is a cost, too, of course - increased capital expenditure, increased instrumental complexity, and, above all, the current tendency to believe implicitly the output of a computer. Just because a machine gives an analysis to 12 places of decimals doesn't mean that it is true (see Chapter 13) ... [Pg.42]

Experimental Techniques. Chromatography was performed on a Varian model 5060 HPLC equipped with a RI-3 refractive index detector. A Vista Plus Gel Permeation Chromatography (GPC) data system was used, consisting of a Vista 401 chromatography data system serially connected to an Apple II microcomputer. The Vista 401 performs data acquisition and provides data storage and automation capability, while all SEC data processing is performed on the Apple II by means of user-interactive GPC software for automated, on-line calibration and polymer analysis. [Pg.77]

Finally, there is the need for proper documentation, which can be in written or electronic forms. These should cover every step of the measurement process. The sample information (source, batch number, date), sample preparation/analytical methodology (measurements at every step of the process, volumes involved, readings of temperature, etc.), calibration curves, instrument outputs, and data analysis (quantitative calculations, statistical analysis) should all be recorded. Additional QC procedures, such as blanks, matrix recovery, and control charts, also need to be a part of the record keeping. Good documentation is vital to prove the validity of data. Analyt-... [Pg.27]

Since ISEs can be used in continuous flow systems or in flow systems with sample injection (flow injection analysis, FIA)21 their application is wide, not limited to discrete samples. Analysis time becomes shorter, with faster recycling. Additionally, in flow systems the experimental assembly and data analysis can be controlled automatically by microcomputer, including periodic calibration. Another development is the use of sensors for the detection of eluents of chromatographic columns in high-pressure liquid chromatography (HPLC). Miniaturization has permitted an increase in the use of sensors in foods, biological tissues, and clinical analyses in general. [Pg.308]

S. Burke, Regression and calibration, LC-GC Europe Online Supplement Statistics and Data Analysis (2001), 13-18. [Pg.500]

As direct spectral interpretation is limited in NIR spectroscopy, multivariate mathematical methods are used to obtain useful information. These techniques are used to develop mathematical models that correlate spectral features to properties of interest. For quantitative work, calibration models are needed that relate the concentration of a sample-analyte to spectral data. Information on developing calibration models and data analysis is provided elsewhere [128-132]. [Pg.126]
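As one deliberately simplified illustration of such a calibration model, concentration can be regressed directly on absorbances by inverse least squares; practical NIR work normally prefers factor-based methods such as PCR or PLS, which cope better with many collinear wavelengths. The function names and synthetic data below are illustrative:

```python
import numpy as np

def fit_ils(spectra, conc):
    """Inverse least squares: regress concentration on absorbances.

    spectra: (n_samples, n_wavelengths) absorbance matrix
    conc:    (n_samples,) reference concentrations
    Returns coefficients [intercept, b_1, ..., b_k].
    """
    X = np.column_stack([np.ones(len(spectra)), spectra])  # add intercept
    b, *_ = np.linalg.lstsq(X, np.asarray(conc, dtype=float), rcond=None)
    return b

def predict(b, spectrum):
    """Predicted concentration for one spectrum."""
    return float(b[0] + np.asarray(spectrum, dtype=float) @ b[1:])
```

With only two wavelengths this is trivially determined; the point of the factor-based methods cited in the text is to make the regression stable when hundreds of correlated wavelengths are available.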

Reis MM, Gurden SP, Smilde AK, Ferreira MMC, Calibration and detailed analysis of second-order flow-injection analysis data with rank overlap, Analytica Chimica Acta, 2000, 422, 21-36. [Pg.364]

Oil from the process stream simultaneously flows through the X-ray sampling flow cell and a densitometer. The densitometer measures the oil density and temperature and inputs these values into the data processing software for corrections to the calibration and final analysis results. [Pg.112]

While the original EPA definition of the MDL (Glaser 1981) did strive to achieve a useful compromise between scientific and statistical defensibility on the one hand, and cost in terms of time, money and effort on the other, there appears to be a growing acceptance (EPA 2004) that a more rigorous approach is required. Because there is always some variability in the calibration data, the precision of a measurement of x for an unknown sample will obviously be poorer than the precision estimated from replicate measurements of response Y for the same sample. Thus, in estimating any parameter such as the LOD, LLOQ etc., one must take into account the variability in both the calibration and the analysis of the unknown sample. [Pg.423]
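For a straight-line calibration y = a + bx, the combined variability can be made explicit: the standard error of a concentration x0 read off the line is commonly estimated as s_x0 = (s_y/x / b) * sqrt(1/m + 1/n + (y0 - ybar)^2 / (b^2 * Sxx)), where m is the number of replicate measurements of the unknown, n the number of calibration points, and s_y/x the residual standard deviation of the fit. A minimal sketch (data values are illustrative):

```python
import numpy as np

def sx0(x, y, y0_mean, m_reps):
    """Standard error of a concentration read off a straight-line
    calibration y = a + b*x, combining calibration and sample variability.

    x, y:    calibration concentrations and responses
    y0_mean: mean response of the unknown sample
    m_reps:  number of replicate measurements of the unknown
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    b, a = np.polyfit(x, y, 1)                   # slope, intercept
    resid = y - (a + b * x)
    s_yx = np.sqrt((resid ** 2).sum() / (n - 2))  # residual std deviation
    sxx = ((x - x.mean()) ** 2).sum()
    return (s_yx / b) * np.sqrt(
        1 / m_reps + 1 / n + (y0_mean - y.mean()) ** 2 / (b ** 2 * sxx)
    )
```

Note that the 1/m term shrinks with replication of the unknown, but the calibration terms (1/n and the Sxx term) do not; replicating the sample alone cannot remove the calibration's contribution to the uncertainty.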

Determination of uranium in soil samples can be carried out by nondestructive analysis (NDA) methods that do not require separation of uranium (needed for alpha spectrometry or TIMS) or even digestion of the soil for analysis by ICPMS, ICPAES, or some other spectroscopic methods. These NDA methods can be divided into passive techniques that utilize the natural radioactive emission (gamma and x-ray) of the uranium and progeny radionuclides or active methods where neutrons or electromagnetic radiation are used to excite the uranium and the resultant emissions (gamma, x-rays, or neutrons) are monitored. In many cases, sample preparation is simpler for these nondestructive methods but the requirement of a neutron source (from a nuclear reactor in many cases) or a radioactive source (x-ray or gamma) and relatively complex calibration and data interpretation procedures make the use of these techniques competitive only in some applications. In addition, the detection limits are usually inferior to the mass spectrometric techniques and the isotopic composition is not readily obtainable. [Pg.135]

With this brief discussion of multivariate analysis and linear equations completed, we are ready to discuss the calibration of instruments. Often underemphasized, the reliable calibration of equipment and instrumentation is the foundation of data quality and reliability. Accordingly, before leaping into the topic of calibration, we must formally introduce quality assurance and quality control, which in turn will integrate the statistical concepts introduced here and in the previous chapter. The discussion that follows will provide our link between statistics, calibration, and data quality. [Pg.59]

The second part of the book (Chapters 9-12) presents some selected applications of chemometrics to different topics of interest in the field of food authentication and control. Chapter 9 deals with the application of chemometric methods to the analysis of hyperspectral images, that is, of those images where a complete spectrum is recorded at each of the pixels. After a description of the peculiar characteristics of images as data, a detailed discussion on the use of exploratory data analytical tools, calibration and classification methods is presented. The aim of Chapter 10 is to present an overview of the role of chemometrics in food traceability, starting from the characterisation of soils up to the classification and authentication of the final product. The discussion is accompanied by examples taken from the different ambits where chemometrics can be used for tracing and authenticating foodstuffs. Chapter 11 introduces NMR-based metabolomics as a potentially useful tool for food quality control. After a description of the bases of the metabolomics approach, examples of its application for authentication, identification of adulterations, control of the safety of use, and processing are presented and discussed. Finally, Chapter 12 introduces the concept of interval methods in chemometrics, both for data pretreatment and data analysis. The topics... [Pg.18]





© 2024 chempedia.info