Big Chemical Encyclopedia

Chemical substances, components, reactions, process design ...


Regression validation

Foremost among the methods for interpolating within a known data relationship is regression: the fitting of a line or curve to a set of known data points on a graph, and the interpolation (estimation) of this line or curve in areas where we have no data points. The simplest of these regression models is linear regression (valid... [Pg.931]

Several steps are involved in rapid analysis method development. These include gathering appropriate calibration samples, chemical characterization of the calibration samples, developing spectroscopic methods for the rapid technique, projection-to-latent-structures (PLS) regression, validation of the PLS algorithm, and the development of QA/QC procedures.128... [Pg.1475]

A crucial decision in PLS is the choice of the number of principal components used for the regression. A good approach to solve this problem is the application of cross-validation (see Section 4.4). [Pg.449]

A linear regression analysis should not be accepted without evaluating the validity of the model on which the calculations were based. Perhaps the simplest way to evaluate a regression analysis is to calculate and plot the residual error for each value of x. The residual error for a single calibration standard, r_i, is given as the difference between the measured response and that predicted by the regression model, r_i = y_i - ŷ_i. [Pg.124]
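A minimal numeric sketch of this residual check (the calibration values are invented for illustration):

```python
import numpy as np

# Hypothetical calibration: concentrations x and instrument responses y.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.02, 0.98, 2.05, 2.96, 4.04, 4.97])

slope, intercept = np.polyfit(x, y, 1)        # ordinary least-squares fit
residuals = y - (slope * x + intercept)       # r_i = y_i - yhat_i

# For a valid linear model the residuals scatter randomly about zero,
# with no systematic trend against x.
```

A residual plot with curvature or a funnel shape would indicate, respectively, a nonlinear relationship or non-constant variance, both of which invalidate the simple linear model.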

The second task discussed is the validation of the regression models with the aid of cross-validation (CV) procedures. The leave-one-out (LOO) as well as the leave-many-out CV methods are used to evaluate the predictive ability of QSAR models. In the case of noisy and/or heterogeneous data, the LM method is shown to substantially outperform the LS one with respect to the suitability of the regression models built. The distinctions between the LS and LM methods are especially noticeable under the LOO CV criterion. [Pg.22]
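The LOO procedure itself can be sketched with ordinary least squares in plain numpy (the LS/LM comparison from the excerpt is not reproduced; the data below are synthetic):

```python
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated q2 for an OLS model with intercept."""
    n = len(y)
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        A = np.column_stack([np.ones(n - 1), X[mask]])      # train without point i
        b, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        pred = np.concatenate([[1.0], X[i]]) @ b            # predict the held-out point
        press += (y[i] - pred) ** 2
    return 1.0 - press / float(np.sum((y - y.mean()) ** 2))

# Synthetic "QSAR" table: 20 compounds, 3 descriptors.
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(scale=0.1, size=20)
q2 = loo_q2(X, y)   # approaches 1 for a genuinely predictive model
```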

The results obtained by linear regression (LR) and by partial least squares (PLS) regression methods have been compared to quantify the O-H signal in anhydrite samples. The PLS quality is characterized by a correlation coefficient of 0.9942 (cross-validation) using four factors and a root mean square error of calibration (RMSEC) of 0.058. The correlation coefficient obtained by the LR method was 0.9753. [Pg.200]

It is often helpful to examine the regression errors for each data point in a calibration or validation set with respect to the leverage of each data point or its distance from the origin or from the centroid of the data set. In this context, errors can be considered as the difference between expected and predicted (concentration, or y-block) values for the regression, or, for PCA, PCR, or PLS, errors can instead be considered in terms of the magnitude of the spectral... [Pg.185]
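Leverage can be computed as the diagonal of the hat matrix H = X(XᵀX)⁻¹Xᵀ; a small numpy sketch with one deliberately outlying point (all data invented):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=10)
x[0] = 6.0                              # one point far from the centroid
X = np.column_stack([np.ones(10), x])   # design matrix with intercept

H = X @ np.linalg.inv(X.T @ X) @ X.T    # hat matrix
leverage = np.diag(H)

# Leverages sum to the number of fitted parameters (here 2), and the
# point far from the centroid carries the largest leverage.
```

High-leverage points deserve scrutiny because their regression errors understate how strongly they pull the fitted model toward themselves.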

Recorded kinetic curves were fitted to the five-parameter Equation (1). The parameters p_j with their errors and the standard deviation of the regressions are summarized in Tables 1-6. Comparison of the data confirms the previously reported (refs. 8, 12) similarity in the behavior of the two isomers in the presence of strong bases, despite the different shapes of the kinetic curves. The relatively good agreement of the exponents p2 and p4 computed for the diastereomers at the same temperature and amine concentration demonstrates the validity of the model used. From comparison of Equations (4) and (7) it follows that both reactions must give the same exponent. [Pg.268]
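Equation (1) and its parameters are not reproduced in this excerpt; as a hedged stand-in, the same fit-then-check workflow can be illustrated with a simple first-order decay, linearized so that plain least squares applies (all values invented):

```python
import numpy as np

# First-order kinetics c(t) = c0 * exp(-k t); linearize: ln c = ln c0 - k t.
t = np.linspace(0.0, 10.0, 11)
rng = np.random.default_rng(3)
c = 2.0 * np.exp(-0.35 * t) * np.exp(rng.normal(scale=0.01, size=t.size))

slope, intercept = np.polyfit(t, np.log(c), 1)   # least-squares line in log space
k_est, c0_est = -float(slope), float(np.exp(intercept))

# Agreement of the fitted parameters across replicate runs (here, with the
# generating values) supports the validity of the assumed kinetic model.
```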

The trend log10(CV) vs log10(c) appears reasonably linear (compare this with Ref. 177; some points are from the method validation phase, where various impurities were purposely increased in level). A linear regression (line B) is used to represent the average trend (slope = -0.743). The target level for any given impurity is estimated by a simple model. Because the author-... [Pg.196]

Note that the array R(I, J) contains the values: I is the measurement number (the I-th measurement) and J is the item or dimension number. For a program based on linear regression (LINREG, VALID, SHELFLIFE), since the array R(,) must have M >= 2 columns, it is up to the user to decide which column will be identified with abscissa X (index K) and which with ordinate Y (index L): R(I, K) is the independent variable X, and R(I, L) is the dependent variable Y. K (and L, if necessary) are established by clicking on the column(s) after the file has been selected. When any program is started, the available data in the chosen file will be shown for review. [Pg.363]

As the solute descriptors (E, S, A, B and V) represent the solute influence on various solute-solvent phase interactions, the regression coefficients e, s, a, b and v correspond to the complementary effect of the solvent phases on these interactions. As an example, consider the product aA in Eq. (4). Since A is the H-bond acidity of the solute, a is the H-bond basicity of the system. In other words, the intermolecular forces discussed in Sections 12.1.1.2 and 12.1.1.3 are present in all Abraham's log P factorization equations, with the exception of those interactions involving ions. This is the reason why Abraham's equations are valid for neutral species only. [Pg.323]
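As a worked illustration of these product terms, the solvation equation log P = c + eE + sS + aA + bB + vV can be evaluated directly; the coefficient and descriptor values below are hypothetical placeholders, not fitted Abraham coefficients:

```python
# Hypothetical system coefficients (c, e, s, a, b, v) and solute
# descriptors (E, S, A, B, V) -- illustrative numbers only.
c, e, s, a, b, v = 0.09, 0.56, -1.05, 0.03, -3.46, 3.81
E, S, A, B, V = 0.80, 0.90, 0.30, 0.60, 1.00

# Each product pairs a solute property with the complementary system
# property, e.g. a*A couples solute H-bond acidity A with system basicity a.
logP = c + e * E + s * S + a * A + b * B + v * V
```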

An important aspect of all methods to be discussed concerns the choice of the model complexity, i.e., choosing the right number of factors. This is especially relevant if the relations are developed for predictive purposes. Building validated predictive models for quantitative relations based on multiple predictors is known as multivariate calibration. The latter subject is of such importance in chemo-metrics that it will be treated separately in the next chapter (Chapter 36). The techniques considered in this chapter comprise Procrustes analysis (Section 35.2), canonical correlation analysis (Section 35.3), multivariate linear regression... [Pg.309]

M. Stone and R.J. Brooks, Continuum regression: cross-validated sequentially constructed prediction embracing ordinary least squares, partial least squares, and principal component regression. J. Roy. Stat. Soc. B52 (1990) 237-269. [Pg.347]

Fig. 36.10. Prediction error (RMSPE) as a function of model complexity (number of factors) obtained from leave-one-out cross-validation using PCR (o) and PLS ( ) regression.


© 2024 chempedia.info