Karhunen-Loeve transform

Figure 2. Projection of hours with complaints and hours without complaints of air pollution on the two most significant eigenvectors of the Karhunen-Loeve transformed, seven-dimensional feature space. Reproduced with permission from Ref. 7. Copyright 1984, ...
The eigenvectors of this matrix are linear combinations of the measurements, and the eigenvalues are a direct measure of the fraction of total variance accounted for by the corresponding eigenvector. This analysis is the basis for the Karhunen-Loeve transformation, in which the data are projected onto the plane of the two eigenvectors with the largest eigenvalues. This choice of axes displays more of the data variance than any other. [Pg.163]
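
The projection can be written down in a few lines of linear algebra. The sketch below is illustrative only: the 200 x 7 feature matrix is synthetic, not the data of Ref. 7. It computes the covariance matrix, reads the variance fractions off the eigenvalues, and projects the samples onto the plane spanned by the two leading eigenvectors.

```python
# Hedged sketch of the Karhunen-Loeve projection described above.
# `X` is a hypothetical 200 x 7 feature matrix, not data from the cited source.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 7))                 # placeholder measurements

Xc = X - X.mean(axis=0)                       # mean-centre the measurements
cov = np.cov(Xc, rowvar=False)                # 7 x 7 covariance matrix

eigvals, eigvecs = np.linalg.eigh(cov)        # symmetric eigendecomposition (ascending order)
order = np.argsort(eigvals)[::-1]             # re-order: largest eigenvalue first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

variance_fraction = eigvals / eigvals.sum()   # fraction of total variance per eigenvector
scores_2d = Xc @ eigvecs[:, :2]               # projection onto the two leading eigenvectors

print(variance_fraction[:2])                  # variance displayed by the chosen plane
print(scores_2d.shape)                        # (200, 2)
```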

The basic idea of the Karhunen-Loeve Transform (KLT) is that, if the correlation in the image is known, then it is possible to calculate the optimal mathematical basis by using an eigen-decomposition. The optimal basis is defined here as the one that minimises the overall root-mean-square distortion. [Pg.462]
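
As an illustration of that idea, the following sketch uses assumed parameters throughout (a synthetic correlated 64 x 64 "image", 8 x 8 blocks, 16 retained coefficients): it estimates the covariance of the image blocks, eigen-decomposes it to obtain a KLT basis, and measures the distortion left after truncating the expansion.

```python
# Hedged sketch of a block-based KLT; image, block size and k are assumptions.
import numpy as np

rng = np.random.default_rng(1)
# A correlated synthetic "image" (2-D random walk), standing in for real data
image = np.cumsum(np.cumsum(rng.normal(size=(64, 64)), axis=0), axis=1)

# Collect 8x8 blocks as 64-dimensional vectors
blocks = np.array([image[i:i + 8, j:j + 8].ravel()
                   for i in range(0, 64, 8) for j in range(0, 64, 8)])
blocks_c = blocks - blocks.mean(axis=0)

cov = np.cov(blocks_c, rowvar=False)             # estimated block covariance
eigvals, basis = np.linalg.eigh(cov)
basis = basis[:, np.argsort(eigvals)[::-1]]      # KLT basis, strongest component first

k = 16                                           # number of retained coefficients
coeffs = blocks_c @ basis[:, :k]                 # forward KLT, truncated
reconstruction = coeffs @ basis[:, :k].T         # inverse KLT

rel_distortion = np.linalg.norm(blocks_c - reconstruction) / np.linalg.norm(blocks_c)
print(rel_distortion)                            # relative RMS distortion with k of 64 coefficients
```

Keeping more coefficients drives the distortion toward zero, which is the sense in which the eigenvector basis is optimal for this criterion.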

Generation of new orthogonal features can be carried out by principal component analysis. A Karhunen-Loeve transformation (Chapter 8.2) yields new, uncorrelated vector components. [Pg.104]
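
A quick numerical check of that decorrelation property (synthetic, deliberately correlated features; not taken from the chapter cited) is:

```python
# After a Karhunen-Loeve transformation the new components are uncorrelated,
# i.e. their covariance matrix is (numerically) diagonal. `X` is synthetic.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))   # correlated features

Xc = X - X.mean(axis=0)
_, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
scores = Xc @ eigvecs                        # KL-transformed (orthogonal) features

S = np.cov(scores, rowvar=False)
off_diag = S - np.diag(np.diag(S))
print(np.allclose(off_diag, 0.0, atol=1e-10))   # True: new components are uncorrelated
```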

If the noisy signal is projected onto the signal subspace, a least squares (LS) estimate is obtained. The projection can be accomplished by applying the Karhunen-Loeve transform (KLT) (Van Trees, 1968) to the noisy signal and nulling transform components obtained from eigenvectors in the noise subspace. [Pg.1468]
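
A rough sketch of that procedure is given below, under assumed values for the frame length and the signal-subspace rank; the synthetic sinusoids and noise level are likewise placeholders, and this is not the exact formulation of Van Trees (1968).

```python
# Hedged sketch: KLT the noisy frames, null the noise-subspace components,
# and invert the transform to obtain a least-squares signal estimate.
import numpy as np

rng = np.random.default_rng(3)
n, frame = 4000, 40
t = np.arange(n)
clean = np.sin(2 * np.pi * 0.053 * t) + 0.5 * np.sin(2 * np.pi * 0.121 * t)
noisy = clean + 0.8 * rng.normal(size=n)

frames = noisy.reshape(-1, frame)                    # non-overlapping frames
mean = frames.mean(axis=0)
frames_c = frames - mean

eigvals, eigvecs = np.linalg.eigh(np.cov(frames_c, rowvar=False))
eigvecs = eigvecs[:, np.argsort(eigvals)[::-1]]      # strongest eigenvectors first

p = 4                                                # assumed signal-subspace rank (two real sinusoids)
coeffs = frames_c @ eigvecs                          # KLT of each frame
coeffs[:, p:] = 0.0                                  # null components in the noise subspace
estimate = (coeffs @ eigvecs.T + mean).ravel()       # inverse KLT -> LS estimate

print(np.mean((noisy - clean) ** 2))                 # noise power before
print(np.mean((estimate - clean) ** 2))              # residual error after subspace projection
```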

PCA is a method based on the Karhunen-Loeve transformation (KL transformation) of the data points in the feature space. In the KL transformation, the data points in the feature space are rotated so that the new coordinates of the sample points become linear combinations of the original coordinates. The first principal component is chosen as the direction along which the distribution of the sample points shows the largest variation. After the KL transformation, the components along which the sample points show only minor variation can be discarded, reducing the dimensionality without significant loss of information about the distribution of the sample points in the feature space. PCA is probably the most widespread multivariate statistical technique used in chemometrics. Within the chemical community, the first major applications of PCA were reported in the 1970s, and it forms the foundation of many modern chemometric methods. Conventional approaches are univariate, using only one independent variable per sample, but this misses much of the information in a multivariate problem such as SAR, in which many descriptors are available for a number of candidate compounds. PCA is one of several multivariate methods that allow us to explore patterns in multivariate data, answering questions about the similarity and classification of samples on the basis of their projections onto the principal components. [Pg.192]
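
In practice this workflow is only a few lines with standard tooling. The sketch below uses scikit-learn's PCA (a tooling assumption, not something prescribed by the text) on a hypothetical descriptor matrix, reporting the variance carried by each retained component and the reduced-dimension scores.

```python
# Hedged PCA workflow sketch; the descriptor matrix is synthetic and illustrative.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
# Hypothetical SAR-style descriptor table: 60 compounds x 10 correlated descriptors
descriptors = rng.normal(size=(60, 10)) @ rng.normal(size=(10, 10))

pca = PCA(n_components=3)
scores = pca.fit_transform(descriptors)       # samples expressed in the new (rotated) coordinates

print(pca.explained_variance_ratio_)          # fraction of variation per retained component
print(scores.shape)                           # (60, 3): reduced-dimension representation
```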

G.W. Wornell, A Karhunen-Loeve-like Expansion for 1/f Processes via Wavelets, IEEE Transactions on Information Theory, 36 (1990), 859. [Pg.436]

PCR is a two-step multivariate calibration method involving compression of the data (X) matrix into latent variables by principal components analysis (PCA), followed by MLR. PCA (also known as Karhunen-Loeve expansion or eigenvector analysis) mathematically transforms a number of possibly correlated variables into a smaller number of uncorrelated variables called eigenvectors (or PCs). Essentially, PCA is the decomposition of the original data matrix (X) into the product of a scores matrix (T) and a loadings matrix (L). The loadings matrix describes the directions of the PCs. These relationships can be represented by the equation ... [Pg.593]
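
A minimal PCR sketch consistent with that two-step description is shown below; the spectral matrix, response vector, and number of retained components are assumptions. It compresses X into scores T with loadings L (so that X is approximately T times L) via the SVD, then regresses the response on T.

```python
# Hedged PCR sketch: PCA compression followed by MLR on the scores.
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(50, 20))                     # e.g. 50 samples x 20 wavelengths (synthetic)
y = X @ rng.normal(size=20) + 0.1 * rng.normal(size=50)

Xc = X - X.mean(axis=0)                           # column-centred data matrix
yc = y - y.mean()

# Step 1: PCA via SVD;  Xc ~= T @ L  with scores T and loadings L
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5                                             # assumed number of latent variables
T = U[:, :k] * s[:k]                              # scores matrix
L = Vt[:k]                                        # loadings matrix

# Step 2: MLR of the (centred) response on the scores
b = np.linalg.lstsq(T, yc, rcond=None)[0]
y_fit = T @ b + y.mean()

print(np.corrcoef(y, y_fit)[0, 1])                # fit quality of the PCR model
```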

Principal component analysis (PCA), or Karhunen-Loeve analysis, is useful for the analysis of highly correlated multi-spectral remotely sensed data [13, 14]. Transforming raw remote sensor data with PCA can yield new principal component images that may be more interpretable than the original data [14, 15]. In PCA the transformation is applied to a correlated set of multi-spectral data; applying it to the correlated remote sensor data produces another, uncorrelated multi-spectral dataset with certain ordered variance properties. The transformation can be conceptualized by considering the two-dimensional distribution of pixel values obtained in two bands, which can be labeled X1 and X2. The spread or variance of the distribution of points is an indication of the correlation and the quality of information associated with both bands; if all the points are clustered in an extremely tight zone in two-dimensional space, these data provide very little information. [Pg.65]
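
The two-band picture can be reproduced numerically. In the sketch below the band values (labelled X1 and X2 as above) are synthetic and strongly correlated by construction; the PCA rotation yields uncorrelated component "images" with ordered variances.

```python
# Hedged two-band PCA sketch; the pixel values are placeholders for real remote-sensing data.
import numpy as np

rng = np.random.default_rng(6)
x1 = rng.normal(50.0, 10.0, size=10000)           # band X1 pixel values
x2 = 0.9 * x1 + rng.normal(0.0, 3.0, size=10000)  # band X2, strongly correlated with X1

bands = np.column_stack([x1, x2])
bands_c = bands - bands.mean(axis=0)

eigvals, eigvecs = np.linalg.eigh(np.cov(bands_c, rowvar=False))
order = np.argsort(eigvals)[::-1]
pcs = bands_c @ eigvecs[:, order]                 # uncorrelated principal-component "images"

print(np.corrcoef(x1, x2)[0, 1])                  # high correlation between the raw bands
print(np.var(pcs, axis=0))                        # ordered variances: PC1 >> PC2
print(np.corrcoef(pcs[:, 0], pcs[:, 1])[0, 1])    # ~0: the PCs are uncorrelated
```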

