Big Chemical Encyclopedia


Error eigenvector

This is a statistical test designed by Malinowski [43] which compares the variance contributed by a structural eigenvector with that of the error eigenvectors. Let us suppose that ... is the variance contributed by the last structural eigen-... [Pg.143]
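The idea behind the test can be sketched numerically: compare the variance (eigenvalue) carried by a candidate structural eigenvector with the pooled variance of the eigenvectors presumed to carry only error. This is a simplified illustration of the idea only, with a hypothetical function name; Malinowski's published test works with reduced eigenvalues and degrees-of-freedom corrections not shown here.

```python
import numpy as np

def eigenvector_variance_ratio(eigenvalues, r):
    """F-style ratio comparing the variance of the r-th eigenvector
    (1-based) with the pooled variance of the remaining, presumed
    error, eigenvectors.  A large ratio suggests eigenvector r is
    still structural; a ratio near 1 suggests it belongs to the noise.
    Simplified sketch, not Malinowski's exact reduced-eigenvalue test."""
    ev = np.sort(np.asarray(eigenvalues, dtype=float))[::-1]
    pooled_error = ev[r:].mean()      # mean of the presumed error eigenvalues
    return ev[r - 1] / pooled_error
```

For eigenvalues such as [100, 50, 1, 1, 1, 1], the ratio is large for the second eigenvector and drops to about 1 for the third, marking the transition to error eigenvectors.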

As more structural eigenvectors are included we expect PRESS to decrease up to a point when the structural information is exhausted. From this point on we expect PRESS to increase again as increasingly more error eigenvectors are included. In order to determine the transition point r one can compare PRESS(r+1) with the previously obtained PRESS(r). The number of structural eigenvectors r is reached when the ratio ... [Pg.145]
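A minimal sketch of this transition-point search, assuming PRESS has already been computed for successive numbers of eigenvectors and taking 1 as the critical value of the ratio (the published criterion compares the ratio against a threshold near 1):

```python
def rank_from_press(press):
    """Return the number of structural eigenvectors r: the first r for
    which PRESS(r+1)/PRESS(r) >= 1, i.e. adding another eigenvector no
    longer improves prediction.  press[i] holds PRESS(i+1).  The
    threshold of 1 is the simplest choice; a critical value slightly
    below or above 1 can be substituted."""
    for r in range(len(press) - 1):
        if press[r + 1] / press[r] >= 1.0:
            return r + 1
    return len(press)
```

With PRESS values [10, 4, 2, 2.5, 3], PRESS decreases through the third component and rises afterwards, so the sketch returns r = 3.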

So, we can discard the third eigenvector and, along with it, that portion of the variance in our spectra that displaced the data out of the plane of the noise-free data. We are, in fact, discarding a portion of the noise without significantly distorting the spectra. The portion of the noise we discard is called the extracted error, or the residuals. Remember that the noise we added also displaced the points to some extent within the plane of the noise-free data. This portion of the noise remains in the data because it is spanned by the eigenvectors that we must retain. The noise that remains is called the imbedded error. The total error is sometimes called the real error. The relationship among the real error (RE), the extracted error (XE), and the imbedded error (IE) is... [Pg.95]
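Malinowski relates these three quantities through RE² = XE² + IE². A sketch of the decomposition computed from the eigenvalues of XᵀX for a data matrix with n_rows rows and n_cols columns, following Malinowski's standard expressions (the helper name is illustrative):

```python
import numpy as np

def error_decomposition(eigenvalues, n_factors, n_rows, n_cols):
    """Real (RE), imbedded (IE) and extracted (XE) error from the
    eigenvalues of X^T X, retaining n_factors structural eigenvectors.
    The three satisfy RE**2 == XE**2 + IE**2."""
    ev = np.sort(np.asarray(eigenvalues, dtype=float))[::-1]
    residual = ev[n_factors:].sum()                       # error eigenvalues
    RE = np.sqrt(residual / (n_rows * (n_cols - n_factors)))
    IE = RE * np.sqrt(n_factors / n_cols)                 # stays in the data
    XE = np.sqrt(RE**2 - IE**2)                           # removed with the
    return RE, IE, XE                                     # discarded factors
```

As the comments note, only the imbedded part IE survives the truncation; the extracted part XE leaves with the discarded eigenvectors.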

When we regard each of our spectra as a unique point in the n-dimensional absorbance space, we can say that the error in our data is isotropic. By this, we mean that the net effect of the errors in a given spectrum is to displace that spectrum some random distance in some random direction in the n-dimensional data space. As a result, when we find the eigenvectors for our data, each eigenvector will span its equivalent share of the error. But recall, we said that we must take degrees-of-freedom into account in order to understand what is meant by equivalent share. [Pg.104]
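The isotropy claim can be checked numerically: for a matrix of pure noise, the eigenvalues of EᵀE are all of comparable size, i.e. each eigenvector spans a comparable share of the error. This is only a sketch with synthetic noise; the exact spread of the eigenvalues is governed by the degrees of freedom mentioned above.

```python
import numpy as np

rng = np.random.default_rng(1)
E = rng.normal(scale=0.01, size=(500, 6))   # pure isotropic noise
lam = np.linalg.eigvalsh(E.T @ E)[::-1]     # eigenvalues, descending

# With many rows, the largest and smallest error eigenvalues are of
# the same order: no direction carries a dominant share of the noise.
spread = lam.max() / lam.min()
```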

Again, it is clear from these plots that the errors are minimized when 5 factors are used. Thus, we will construct our calibration matrices using a basis space composed of the first 5 eigenvectors (factors). [Pg.116]
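Constructing such a truncated basis can be sketched with an SVD, keeping the first 5 right singular vectors as the factor space. The matrix, its dimensions, and the variable names below are placeholders, not the chapter's calibration data:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(40, 20))       # stand-in for a matrix of spectra
k = 5                               # number of factors retained

U, s, Vt = np.linalg.svd(A, full_matrices=False)
basis = Vt[:k]                      # first k eigenvectors (loadings)
scores = A @ basis.T                # coordinates used for calibration
A_hat = scores @ basis              # rank-k reconstruction of the data
```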

It is assumed that the structural eigenvectors explain successively less variance in the data. The error eigenvalues, however, when they account for random errors in the data, should be equal. In practice, one expects that the curve on the Scree-plot levels off at a point r when the structural information in the data is nearly exhausted. This point determines the number of structural eigenvectors. In Fig. 31.15 we present the Scree-plot for the 23x8 table of transformed chromatographic retention times. From the plot we observe that the residual variance levels off after the second eigenvector. Hence, we conclude from this evidence that the structural pattern in the data is two-dimensional and that the five residual dimensions contribute mostly noise. [Pg.143]
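The Scree-plot criterion can be sketched by tabulating the residual variance left after each eigenvector and looking for the point where the curve levels off. The eigenvalues below are made up for illustration, not taken from the 23x8 retention-time table:

```python
import numpy as np

def scree_residuals(eigenvalues):
    """Fraction of total variance remaining after each successive
    eigenvector -- the quantity plotted (often on a log scale) on a
    Scree-plot.  The curve levels off once only roughly equal error
    eigenvalues remain."""
    ev = np.asarray(eigenvalues, dtype=float)
    return 1.0 - np.cumsum(ev) / ev.sum()
```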

Principal component analysis is based on the eigenvalue-eigenvector decomposition of the n×n empirical covariance matrix C = XᵀX (refs. 22-24). The eigenvalues are denoted by λ₁ ≥ λ₂ ≥ ... ≥ λₙ > 0, where the last inequality follows from the presence of some random error in the data. Using the eigenvectors u₁, u₂, ..., uₙ, define the new variables... [Pg.65]
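The decomposition described here can be sketched directly, with synthetic data standing in for X. The new variables are the projections of the (centered) data onto the eigenvectors, and their empirical covariance is diagonal with the eigenvalues on the diagonal:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
X -= X.mean(axis=0)                 # column-center the data

C = X.T @ X                         # empirical covariance matrix
lam, U = np.linalg.eigh(C)          # eigh returns ascending order
order = np.argsort(lam)[::-1]
lam, U = lam[order], U[:, order]    # enforce lam1 >= lam2 >= ... > 0

Z = X @ U                           # the new variables (scores)
```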



© 2024 chempedia.info