
Standard error of cross validation

SE = standard error; R² = coefficient of determination; SEC = standard error of calibration; SE(CV) = standard error of cross-validation; PLS terms = number of terms used for modified partial least squares regression. [Pg.764]

Standard error of prediction (SEP) for independent test sets, except for the ATR study, where the standard error of cross-validation for the calibration set is given. [Pg.9]

Table 19.11 Comparison of the accuracy of some methods of predicting silage dry matter intake (expressed as the standard error of cross validation, g/kg metabolic liveweight per day)...
SECV = standard error of cross-validation; DM = dry matter; CP = crude protein; R²CV = coefficient of determination in cross-validation; SEP = standard error of prediction; n/a = data not available (g/kg DM basis). [Pg.315]

The optimum number of PLS factors used in the models was determined by a cross-validation method. In cross-validation, five samples were temporarily removed from the calibration set to be used for validation. With the remaining samples, a PLS model was developed and applied to predict the respective milk component for each sample in the group of five. Results were compared with the respective reference values. This procedure was repeated several times until predictions had been obtained for all samples. Performance statistics were accumulated for each group of removed samples. The validation errors were combined into a standard error of cross-validation. The optimum number of PLS factors in each equation was defined to be that which corresponded to the lowest standard error of cross-validation. The performance of each regression was tested with an independent validation set of samples. [Pg.382]
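The excerpt above gives no code; the following is only an illustrative sketch of the same idea, assuming spectra in a NumPy array X and reference values in y, and using scikit-learn's PLSRegression as a stand-in for the modified PLS the authors used. Groups of five samples are left out in turn, a SECV is computed for each candidate number of factors (here with the common n − 1 denominator convention), and the factor count with the lowest SECV is selected.

```python
# Illustrative sketch only -- not the authors' software.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import KFold

def secv(X, y, n_factors, group_size=5, seed=0):
    """Standard error of cross-validation for a given number of PLS factors."""
    n_samples = len(y)
    kf = KFold(n_splits=max(2, n_samples // group_size), shuffle=True, random_state=seed)
    residuals = []
    for train_idx, test_idx in kf.split(X):
        model = PLSRegression(n_components=n_factors)
        model.fit(X[train_idx], y[train_idx])       # model built without the left-out group
        pred = model.predict(X[test_idx]).ravel()   # predict the left-out samples
        residuals.extend(pred - y[test_idx])        # compare with reference values
    residuals = np.asarray(residuals)
    return np.sqrt(np.sum(residuals ** 2) / (len(residuals) - 1))

def optimum_n_factors(X, y, max_factors=15):
    """Return the factor count giving the lowest SECV, plus the full SECV curve."""
    curve = {k: secv(X, y, k) for k in range(1, max_factors + 1)}
    return min(curve, key=curve.get), curve
```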

Statistic Standard Error of Cross-Validation (SECV)... [Pg.145]

N = number of samples; SD = standard deviation; SECV = standard error of cross-validation; R²CV = coefficient of determination for cross-validation. [Pg.390]

Root Mean Square Error of Cross-Validation for PCA Plot (Model Diagnostic) As described above, the residuals from a standard PCA calculation indicate how well the PCA model fits the samples that were used to construct it. Specifically, they are the portion of the sample vectors that is not described by the model. Cross-validation residuals are computed in a different manner. A subset of samples is removed from the data set and a PCA model is constructed; the residuals for the left-out samples are then calculated (cross-validation residuals). The subset of samples is returned to the data set and the process is repeated for different subsets until each sample has been excluded from the data set exactly once. These cross-validation residuals are the portion of the left-out sample vectors that is not described by a PCA model constructed from an independent sample set. In this sense they are like prediction residuals (vs. fit residuals). [Pg.230]
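As a hedged illustration of the procedure just described (not the book's own code), the sketch below computes cross-validation residuals for a PCA model with scikit-learn: each left-out subset is projected onto a model built without it, and the unreconstructed part of those samples gives the cross-validation residuals, which are summarized as a root mean square error of cross-validation. The simple project-and-reconstruct step shown here is the naive scheme; more elaborate element-wise approaches exist.

```python
# Illustrative sketch, assuming samples are rows of a NumPy array X.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import KFold

def pca_rmsecv(X, n_components, n_splits=10, seed=0):
    """Root mean square error of cross-validation for a PCA model."""
    X = np.asarray(X, dtype=float)
    sq_resid = np.zeros(X.shape[0])
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    for train_idx, test_idx in kf.split(X):
        pca = PCA(n_components=n_components).fit(X[train_idx])  # model without the subset
        scores = pca.transform(X[test_idx])                     # project left-out samples
        reconstructed = pca.inverse_transform(scores)           # portion described by the model
        sq_resid[test_idx] = np.sum((X[test_idx] - reconstructed) ** 2, axis=1)
    return np.sqrt(sq_resid.mean())
```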

Table 3. Results of PLS models for fresh Duke berry samples (r = coefficient of correlation; RMSEC = root mean square of the standard error in calibration; RMSECV = root mean square of the standard error in cross-validation; LV = latent variables). All data were preprocessed by second derivative of reduced and smoothed data.
Now we come to the Standard Error of Estimate and the PRESS statistic, which show interesting behavior indeed. Compare the values of these statistics in Tables 25-1B and 25-1C. Note that the value in Table 25-1C is lower than the value in Table 25-1B. Thus, using either of these as a guide, an analyst would prefer the model of Table 25-1C to that of Table 25-1B. But we know a priori that the model in Table 25-1C is the wrong model. Therefore we come to the inescapable conclusion that in the presence of error in the X variable, the use of SEE, or even cross-validation as an indicator, is worse than useless, since it actively misleads us as to the correct model to use to describe the data. [Pg.124]
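For readers who want to reproduce such a comparison on their own data, here is a minimal sketch (assumed names, ordinary least squares) of the two statistics being compared: the Standard Error of Estimate from the fit residuals, and PRESS from leave-one-out prediction errors via the standard OLS leverage shortcut.

```python
import numpy as np

def see_and_press(X, y):
    """Standard Error of Estimate and PRESS for an OLS model y ~ X."""
    X1 = np.column_stack([np.ones(len(y)), np.asarray(X, dtype=float)])  # add intercept
    y = np.asarray(y, dtype=float)
    n, p = X1.shape
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    see = np.sqrt(np.sum(resid ** 2) / (n - p))      # SEE with n - p degrees of freedom
    # PRESS: sum of squared leave-one-out errors, e_i / (1 - h_ii)
    leverage = np.diag(X1 @ np.linalg.pinv(X1.T @ X1) @ X1.T)
    press = np.sum((resid / (1.0 - leverage)) ** 2)
    return see, press
```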

SEP Standard error of prediction, SEPCV for cross validation, SEPtest for an... [Pg.307]

Calibration was carried out for the 0-100%wt PP content region. Two PCs fully described the model, with a root mean square error of prediction (RMSEP) equal to 0.91%wt. Since the method is intended to be used for determination of very small amounts of PP in recycled HDPE, which has been carefully separated from PP and other household plastics, a PLS model over a narrower interval, 0-15%wt PP, was designed. The model was validated by either cross-validation or a test set consisting of three samples with 0.5, 7.0 and 12.0%wt PP content. The RMSEP obtained from the calibration curves was equal to 0.21%wt. [Pg.221]
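For completeness, the RMSEP quoted above is simply the root mean square difference between predicted and reference values on an independent test set; a minimal sketch (illustrative names only, the paper's data are not reproduced here) is:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def rmsep(model, X_test, y_test):
    """Root mean square error of prediction on an independent test set."""
    pred = model.predict(X_test).ravel()
    return np.sqrt(np.mean((pred - np.asarray(y_test, dtype=float)) ** 2))

# Usage sketch: pls = PLSRegression(n_components=2).fit(X_cal, y_cal)
#               print(rmsep(pls, X_test, y_test))
```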

The statistics reported for the fit are the number of compounds used in the model (n), the squared multiple correlation coefficient (R²), the cross-validated multiple correlation coefficient (R²CV), the standard error of the fit (s), and the F statistic. The squared multiple correlation coefficient can take values between 0 (no fit at all) and 1 (a perfect fit), and when multiplied by 100 it gives the percentage of variance in the response explained by the model (here 83%). This equation is actually quite a good fit to the data, as can be seen from the plot of predicted against observed values shown in Figure 7.6. [Pg.172]
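A compact way to obtain these fit statistics for an ordinary least-squares regression is sketched below. This is illustrative only; variable names are assumptions, and the cross-validated R² is computed by explicit leave-one-out cross-validation.

```python
import numpy as np

def regression_statistics(X, y):
    """n, R^2, cross-validated R^2, standard error s, and F for y ~ X."""
    X1 = np.column_stack([np.ones(len(y)), np.asarray(X, dtype=float)])
    y = np.asarray(y, dtype=float)
    n, p = X1.shape                                # p includes the intercept
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    ss_res, ss_tot = np.sum(resid ** 2), np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    s = np.sqrt(ss_res / (n - p))                  # standard error of the fit
    f = (r2 / (p - 1)) / ((1.0 - r2) / (n - p))    # F statistic
    # Cross-validated R^2 by leave-one-out
    loo_pred = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        b, *_ = np.linalg.lstsq(X1[mask], y[mask], rcond=None)
        loo_pred[i] = X1[i] @ b
    r2_cv = 1.0 - np.sum((y - loo_pred) ** 2) / ss_tot
    return {"n": n, "R2": r2, "R2_cv": r2_cv, "s": s, "F": f}
```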

Statistical parameters, when available, indicating the significance of each descriptor's contribution to the final regression equation are listed under its corresponding term in the equation. These include the standard errors (written as ± values), the Student t-test values, and the VIF. The significance of the equation will be indicated by the sample size, n; the variance explained, r²; the standard error of the estimate, s; the Fisher index, F; and the cross-validated correlation coefficient, q². When known, outliers will be mentioned. The equations are followed by a discussion of the physical significance of the descriptor terms. [Pg.232]
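Two of the per-descriptor statistics named above can be computed as in the hedged sketch below (assumed names, not the procedure used to compile the original tables): the t value is each coefficient divided by its standard error, and the VIF of descriptor j is 1/(1 − R²ⱼ), where R²ⱼ comes from regressing descriptor j on the remaining descriptors.

```python
import numpy as np

def t_values_and_vif(X, y):
    """Student t values for the coefficients and VIF for each descriptor column of X."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    X1 = np.column_stack([np.ones(len(y)), X])
    n, p = X1.shape
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    sigma2 = np.sum(resid ** 2) / (n - p)
    cov = sigma2 * np.linalg.pinv(X1.T @ X1)
    t_values = beta / np.sqrt(np.diag(cov))        # coefficient / its standard error
    vifs = []
    for j in range(X.shape[1]):
        others = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
        bj, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
        rj = X[:, j] - others @ bj
        r2j = 1.0 - np.sum(rj ** 2) / np.sum((X[:, j] - X[:, j].mean()) ** 2)
        vifs.append(1.0 / (1.0 - r2j))             # VIF_j = 1 / (1 - R_j^2)
    return t_values, np.array(vifs)
```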

Fit all the surfaces using kriging and validate the model. Once all the variables (surfaces) have been estimated by kriging, it is important to validate the metamodel, i.e. to use cross-validation, which allows us to assess the accuracy of the model without extra sampling [2]. A kriging model can be considered correct if all the cross-validation errors lie within the interval [-3, +3] standard errors. [Pg.554]
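The check described above can be scripted roughly as follows. This is only a sketch under stated assumptions: scikit-learn's Gaussian process regressor stands in for the kriging metamodel, and each point's standardized cross-validation error is its leave-one-out residual divided by the predictive standard error at that point; the model passes if every value falls inside [-3, +3].

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

def kriging_cross_validation_check(X, y):
    """Return (passes, standardized_errors) for the +/-3 standard-error criterion."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    std_errors = []
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
        gp.fit(X[mask], y[mask])                       # refit without point i
        mean, std = gp.predict(X[i:i + 1], return_std=True)
        std_errors.append((y[i] - mean[0]) / std[0])   # standardized CV error
    std_errors = np.array(std_errors)
    return bool(np.all(np.abs(std_errors) <= 3.0)), std_errors
```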

David Haaland et al. added to the modeling literature with a 1992 paper [9]. This work used whole blood for the model. Scanning from 1500 to 2400 nm, the authors developed a PLS equation on glucose-spiked whole blood. The range between 0.17 and 41.3 mM yielded an equation with a standard error of 1.8 mM. Four patients were used as subjects for this project. Cross-validated PLS standard errors for glucose concentration based on data obtained from all four subjects were 2.2 mM. [Pg.143]


