Extrapolation Practice validity

The "method of standard additions" has been employed as a technique for standardization of atomic absorption analyses of metals In biological fluids (13,21) In this procedure, several concentrations of standard analyte are added to samples of the biological fluid to be analyzed The calibration curve which Is obtained after additions of the standard analyte to the biological fluid should parallel that obtained when aqueous standards are analyzed Extrapolation of the standard additions curve back to a negative Intercept on the abscissa furnishes an estimate of the concentration of the analyte In the original sample (21) This technique Is helpful In assessing the validity of methods of trace metal analysis (11,13,58) However, In the author s opinion, the "method of standard additions" Is neither practical nor reliable as a routine method for standardization... [Pg.255]

It is well known that mechanisms can vary with temperature, pressure, etc. Accordingly, there always will be a risk that even true kinetic expressions developed at laboratory conditions are not valid for conditions used at a large scale. This is the well-known extrapolation risk. In practice, the situation is worse. Identification of the true mechanism often is troublesome and time-consuming. Therefore, we often deal with kinetic expressions that are empirical to a high degree rather than physically sound. [Pg.306]
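A minimal sketch of that extrapolation risk with invented rate laws: an empirical power-law expression fitted over a narrow laboratory concentration range can track an assumed saturation-type (Langmuir-Hinshelwood-like) rate closely within that range, yet diverge badly at the higher concentrations relevant to a large-scale unit.

```python
import numpy as np

# Assumed "true" rate law with saturation behaviour; constants are arbitrary.
k, K = 2.0, 0.8
def true_rate(c):
    return k * c / (1.0 + K * c)

# Fit an empirical power law r = a * c**n over a narrow lab range only.
c_lab = np.linspace(0.1, 1.0, 20)
n, log_a = np.polyfit(np.log(c_lab), np.log(true_rate(c_lab)), 1)
a = np.exp(log_a)

# Compare inside and far outside the fitted range.
for c in (0.5, 1.0, 5.0, 20.0):
    r_true, r_fit = true_rate(c), a * c**n
    print(f"c={c:5.1f}  true rate={r_true:.3f}  power-law fit={r_fit:.3f}")
```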

Experimentally there are two methods of determining the extracolumn band broadening of a chromatographic instrument. The linear extrapolation method, discussed above, is relatively straightforward to perform and interpret but rests on the validity of equations (5.1) and (5.3). The assumption that the individual contributions to the extracolumn variance are independent may not be true in practice, and it may be necessary to couple some of the individual contributions to obtain the most accurate values for the extracolumn variance [20]. It is assumed in equation (5.3)... [Pg.280]
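Equations (5.1) and (5.3) are not reproduced in this excerpt; the sketch below therefore only assumes the commonly used additivity of independent variances, variance(observed) = variance(column) + variance(extracolumn), in which case a linear fit of observed peak variance against the calculated column variance yields the extracolumn variance as the intercept. All values are invented.

```python
import numpy as np

# Invented data: calculated column-only peak variance (uL^2) for a series of
# test solutes, and the corresponding observed (total) peak variance.
var_column = np.array([40.0, 90.0, 160.0, 250.0, 360.0])                 # uL^2
var_observed = var_column + 25.0 + np.array([1.5, -2.0, 0.8, -1.1, 0.4])  # uL^2

# If the contributions are independent and additive,
#   var_observed = var_column + var_extracolumn,
# so the fitted line should have near-unit slope and the intercept estimates
# the extracolumn variance of the instrument.
slope, intercept = np.polyfit(var_column, var_observed, 1)
print(f"slope ~ {slope:.2f} (ideally 1), extracolumn variance ~ {intercept:.1f} uL^2")
```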

The problems particular to accelerated tests are related to the extrapolation process. It was stated earlier that it is essential that extrapolation rules from the test conditions to those of service are known and have been verified. In practice this is only an ideal as extrapolation procedures have not generally been comprehensively validated and almost certainly will not give accurate predictions in all cases. The only choice is to use the best techniques available and apply them with caution. [Pg.61]

Before there can be any extrapolation there must be confidence in the model or rules being used. In practice this often has to involve an element of faith because of lack of validation data, particularly where the rule is empirical. The theory or model should be no more complex than is necessary to fit the data. The accuracy of fit to, for example, a creep curve can often be made more precise by applying ever higher order polynomial expressions, but outside the range of points these functions diverge rapidly to infinity (or minus infinity) leading to predictions that are ridiculous. [Pg.136]
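A minimal sketch of that divergence with invented creep-like data: a ninth-order polynomial reproduces the measured points almost exactly but, a short distance beyond the last point, typically runs far away from the trend that a second-order fit continues to follow.

```python
import numpy as np

# Invented creep-like data: strain rising smoothly with time (time scaled to 0-1).
x = np.linspace(0.01, 1.0, 12)                      # scaled time
strain = 0.010 * np.log(100 * x) + 0.002 * np.sqrt(100 * x)

low = np.polyfit(x, strain, 2)    # modest polynomial
high = np.polyfit(x, strain, 9)   # "ever higher order" fit

# Inside the fitted range both agree; extrapolating a short distance beyond
# the last point typically sends the high-order polynomial far from the data.
for x_eval in (1.0, 1.5, 3.0):
    print(f"x={x_eval:3.1f}  2nd-order: {np.polyval(low, x_eval):+10.4f}  "
          f"9th-order: {np.polyval(high, x_eval):+10.4f}")
```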

Under this project, an IPCS Harmonization Project Document on the Principles of Characterizing and Applying Human Exposure has been published (WHO/IPCS 2005). This document sets out the characteristics of exposure assessment models that should be described to aid in model selection by exposure assessors. The document summarizes current practice in exposure modeling and principles for evaluating exposure models, but does not provide a comprehensive list of existing exposure models. The focus of the document is on the discussion of general properties of exposure models and how they should be described. The characteristics of different modeling frameworks are examined, and 10 principles are recommended for characterization, evaluation, and use of exposure models in order to help model users select and apply the most appropriate models. The report also discusses issues such as validation, input data needs, time resolution, and extrapolation of the model results to different populations and scenarios. [Pg.317]

Besides the difficulty of meeting its assumptions, the application of SSD in risk assessment to extrapolate from the population level to the community level raises other problems. First, when use is made of databases (such as ECOTOX; USEPA 2001) from which it is difficult to check the validity of the data, one does not know what is being modeled. In practice, a combination of differences between laboratories, endpoints, test durations, test conditions, genotypes, phenotypes, and eventually species is modeled. Another issue is the ambiguous integration of the SSD with the exposure distribution to calculate risk (Verdonck et al. 2003). These authors showed that, in order to set threshold levels using probabilistic risk assessment and to interpret the risk associated with a given exposure concentration distribution and SSD, the spatial and temporal interpretations of the exposure concentration distribution must be known. [Pg.121]
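As a minimal sketch of one way an SSD and an exposure concentration distribution can be combined numerically (all parameters invented; this is not the formulation of Verdonck et al.), the expected potentially affected fraction can be estimated by averaging the SSD over Monte Carlo draws from the exposure distribution:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Invented parameters on a log10 scale: exposure concentration distribution
# and species sensitivity distribution (SSD) of no-effect concentrations.
mu_exp, sd_exp = 0.0, 0.4      # log10 exposure (ug/L)
mu_ssd, sd_ssd = 1.0, 0.5      # log10 species sensitivities (ug/L)

# Monte Carlo: draw exposures, evaluate the SSD (a cumulative distribution)
# at each draw, and average to get the expected potentially affected fraction.
log_c = rng.normal(mu_exp, sd_exp, size=100_000)
paf = norm.cdf(log_c, loc=mu_ssd, scale=sd_ssd).mean()

# Closed form for two normal distributions, as a consistency check.
paf_exact = norm.cdf((mu_exp - mu_ssd) / np.hypot(sd_exp, sd_ssd))
print(f"Expected affected fraction: {paf:.3f} (closed form {paf_exact:.3f})")
```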

Extrapolation results are, by definition, predictions on the performance of entities for which data are lacking. Both from a scientific perspective and from the perspective of practical decisions based on extrapolation, there is a need to consider not only the outcomes of an extrapolation per se but also the question of whether the outcome is supported by a certain degree of validity. All this relates to the issue generally referred to as "validation". [Pg.265]

Validation by practical use occurs frequently, especially for the lower-tier methods; scientific validation is less well developed, and one extrapolation approach can be considered valid for one assessment type but not for another. [Pg.322]

For food allergens, validated animal models for dose-response assessment are not available, and human studies (double-blind placebo-controlled food challenges [DBPCFCs]) are the standard way to establish thresholds. It is practically impossible to establish the real population threshold this way. Such a population threshold can be estimated, but the estimate is subject to major statistical and other uncertainties of low-dose extrapolation and of patient recruitment and selection. In fact, the uncertainties are of such an order of magnitude that a reliable estimate of the population threshold is currently not possible. The result of the dose-response assessment can instead be described as a threshold distribution rather than a single population threshold. Such a distribution can effectively be used in probabilistic modeling as a tool in quantitative risk assessment (see Section 15.2.5). [Pg.389]
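A minimal sketch of how such a threshold distribution can enter probabilistic modeling (invented parameters; not the method of Section 15.2.5): individual thresholds and individual intakes are sampled, and the predicted reaction probability is the fraction of occasions on which intake exceeds the threshold.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Invented log-normal threshold distribution for allergic individuals (mg protein)
# and an invented log-normal distribution of allergen intake per eating occasion.
thresholds = rng.lognormal(mean=np.log(10.0), sigma=1.2, size=n)   # mg
intakes    = rng.lognormal(mean=np.log(0.5),  sigma=1.0, size=n)   # mg

# Probabilistic risk estimate: fraction of occasions on which the intake
# exceeds the individual's threshold.
risk = np.mean(intakes > thresholds)
print(f"Predicted reactions per eating occasion (allergic users): {risk:.4f}")
```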

First of all, such calculations can be carried out with sufficient accuracy only for rather simple structures and most frequently the results obtained cannot be accurately extrapolated even to related (but more complicated) systems. Secondly, these calculations refer to ideal situations such as the behavior of an isolated molecule. The validity of these results, strictly speaking, is thus limited to gas phase reactions. For these reasons, quantum mechanical calculations have not yet become daily working instruments in chemical practice and it is hardly to be expected that this approach might ever become a universal tool for the solution of chemical problems. At the same time, there are plenty of examples of entirely different and more fruitful ways for the application of quantum chemistry to organic chemistry. [Pg.456]

In using a regression equation it must always be remembered that it is only valid over the range of the independent variable that occurred in the data used in calculating it. Extrapolation is most unwise, except when there is a very sound theoretical basis for it. It is good practice to quote the range of the independent variable used, in order to discourage extrapolation. [Pg.66]
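A minimal sketch of that advice with hypothetical data: store the range of the independent variable alongside the fitted coefficients and refuse, or at least flag, any prediction requested outside that range.

```python
import numpy as np

# Hypothetical data used to fit the regression equation.
x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([1.1, 2.0, 3.2, 3.9, 5.1])

slope, intercept = np.polyfit(x, y, 1)
x_min, x_max = x.min(), x.max()   # quote this range alongside the equation

def predict(x_new):
    """Apply the regression; reject values outside the fitted range."""
    if not (x_min <= x_new <= x_max):
        raise ValueError(
            f"x={x_new} is outside the fitted range [{x_min}, {x_max}]; "
            "extrapolation is not supported")
    return slope * x_new + intercept

print(predict(5.0))   # fine: interpolation within the fitted range
# predict(15.0)       # raises: would be extrapolation
```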

Titration curves have the virtue of simplicity, while thermodynamic models have, at least in principle, greater capacity to be adapted to changing process conditions. In practice, the amount of effort required to develop a thermodynamic model based on real chemical species and physical properties is usually prohibitive. Semiempirical models based on notional species and concentration equilibria can be developed quite readily (Gustafsson and Waller, 1983). However, these have extrapolation properties identical to those of the original titration curves. In both cases the pH of a mixture of components can be predicted accurately, subject to the assumptions that mixing two solutions of equal pH results in a solution with the same pH and that the pH measurement is ideal (Gustafsson, 1982; Luyben, 1990). The validity of this assumption is discussed in Walsh (1993). [Pg.354]

A large percentage (41%) of the articles we reviewed did not include a comparison group. They did not incorporate a study design that would allow one to control variance, which therefore makes it difficult for the reader to confirm the validity or extrapolate the results to other practice settings. This is not to say that these articles are without value, however. Many are excellent descriptive reports that provide insight and experience from which others may learn. [Pg.305]

Composite Sorption Magnitude and Isotherm Nonlinearity. Accurate assessment of the extent to which the global isotherm for a system is nonlinear is important for accurate portrayal of sorption processes in that system. From a practical point of view, the extrapolation of linear approximations of weakly nonlinear or near-linear sorption isotherms to concentration ranges beyond which they are valid can result in significant errors in projections of contaminant fate and transport (1). From a conceptual point of view, observations of isotherm nonlinearity over specific concentration ranges may be employed in conjunction with models such as the DRM to probe and evaluate the extent to which multiple sorption mechanisms are operative in a particular system. [Pg.375]
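A minimal numerical sketch of that extrapolation error (invented Freundlich parameters, not data from the cited study): a linear partition coefficient fitted at low concentrations increasingly over-predicts sorption as it is extrapolated into the range where the weak nonlinearity matters.

```python
import numpy as np

# Invented weakly nonlinear Freundlich isotherm: q = Kf * C**n with n < 1.
Kf, n = 2.0, 0.8
def freundlich(c):
    return Kf * c**n

# Fit a linear approximation (q = Kd * C, through the origin) over a low
# concentration range only.
c_low = np.linspace(0.01, 1.0, 25)
Kd = np.sum(freundlich(c_low) * c_low) / np.sum(c_low**2)   # least-squares slope

# Extrapolate the linear model beyond the range over which it was fitted.
for c in (1.0, 10.0, 100.0):
    q_true, q_lin = freundlich(c), Kd * c
    print(f"C={c:6.1f}  Freundlich q={q_true:7.2f}  linear q={q_lin:7.2f}  "
          f"error={100 * (q_lin - q_true) / q_true:+.0f}%")
```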

In the study of thermal stability, accelerated testing in the form of elevated temperatures has been used by many pharmaceutical companies to minimize time involved in the testing process. This procedure is only valid for simple formulations in which the single major ingredient is broken down by a thermal reaction. In practice, regulatory authorities demand that a shelf-life determined by extrapolation of accelerated test data should be supported by actual stability data obtained by normal temperature storage (Carstensen, 1995). This is because degradation of a product by microbial contamination may well be inhibited at elevated temperatures. [Pg.64]
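A minimal sketch of the kind of extrapolation involved (invented rate constants; an Arrhenius treatment is assumed here, since the excerpt does not name the model): first-order degradation rate constants from elevated-temperature storage are fitted to the Arrhenius equation and extrapolated to 25 °C to give a tentative shelf-life (t90), which must then be confirmed by real-time stability data.

```python
import numpy as np

# Invented first-order degradation rate constants from accelerated storage.
T_C = np.array([40.0, 50.0, 60.0])          # storage temperatures (deg C)
k   = np.array([2.0e-3, 6.1e-3, 1.7e-2])    # rate constants (1/day)

# Arrhenius: ln k = ln A - Ea/(R*T).  Fit ln k against 1/T.
T_K = T_C + 273.15
slope, ln_A = np.polyfit(1.0 / T_K, np.log(k), 1)
Ea = -slope * 8.314                          # activation energy (J/mol)

# Extrapolate to 25 C and estimate t90 (time to 10% loss, first-order kinetics).
k_25 = np.exp(ln_A + slope / (25.0 + 273.15))
t90 = np.log(1.0 / 0.9) / k_25
print(f"Ea ~ {Ea / 1000:.0f} kJ/mol, predicted t90 at 25 C ~ {t90:.0f} days")
```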

Quantitative methods are usually based on a comparison of the response from an analyte in a sample with the response from standards of the analyte in solution at known concentrations. In method development and validation, the calibration curve should first be determined to assess the detector response to standards over a range of concentration. These concentrations should cover the full range of analytical interest, and, although it is usually recommended practice to include a suitable blank with the calibration samples, this does not imply that it is acceptable to extrapolate into the region of the curve below the lowest calibration standard or to force the curve through the origin. [Pg.275]
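A minimal sketch of those two constraints with hypothetical calibration data: the curve is fitted with a free intercept rather than forced through the origin, and sample responses below the lowest calibration standard are reported as outside the calibrated range instead of being extrapolated.

```python
import numpy as np

# Hypothetical calibration standards (concentration, detector response).
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
resp = np.array([0.9, 1.7, 4.4, 8.6, 17.1, 43.5])

# Free intercept -- the curve is not forced through the origin.
slope, intercept = np.polyfit(conc, resp, 1)
resp_low = slope * conc.min() + intercept    # response at the lowest standard

def quantify(sample_resp):
    """Report a concentration only within the calibrated response range."""
    if sample_resp < resp_low:
        return "< lowest calibration standard (do not extrapolate)"
    return f"{(sample_resp - intercept) / slope:.2f} (same units as standards)"

print(quantify(12.0))   # within the calibrated range
print(quantify(0.3))    # below the lowest standard
```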

