Big Chemical Encyclopedia


PCA

Successive PCA and wavelet-analysis processing improves small-flaw detection (Figure 14), because small flaws involve linear physical processes, for which PCA is efficient. [Pg.364]

Step 2 This ensemble is subjected to a principal component analysis (PCA) [61] by diagonalizing the covariance matrix C, ... [Pg.91]

As oversimplified examples of criteria to be used for the clustering of datasets, we may consider some high-quality Kohonen maps, PCA plots, or hierarchical clustering. [Pg.208]

We have to apply projection techniques which allow us to plot the hyperspaces onto two- or three-dimensional space. Principal Component Analysis (PCA) is a method that is fit for performing this task; it is described in Section 9.4.4. PCA operates with latent variables, which are linear combinations of the original variables. [Pg.213]
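The projection described above can be sketched in a few lines of NumPy. This is a minimal illustration on hypothetical random data, not the book's own example: the loadings define the latent variables, and multiplying the centered data by the first two loading vectors projects every object into a plottable two-dimensional plane.

```python
import numpy as np

# Hypothetical data: 6 objects described by 5 variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 5))
Xc = X - X.mean(axis=0)            # mean-center each variable

# PCA via SVD of the centered matrix: the rows of Vt are the loadings,
# i.e. the linear combinations that define the latent variables.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores_2d = Xc @ Vt[:2].T          # project 5-D objects into the PC1/PC2 plane

print(scores_2d.shape)             # (6, 2): each object is now plottable in 2-D
```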

UV-scaling concentrates the relevant information into the same range for all the variables (or, at least, for those subjected to this method). The loading matrix yielded by PCA will then show the importance of the initial variables. [Pg.215]
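UV- (unit-variance) scaling can be sketched as follows; the two-variable matrix here is invented purely for illustration. Each column is mean-centered and divided by its standard deviation, so a variable measured in hundreds no longer dominates one measured in single digits.

```python
import numpy as np

# Two variables on very different scales.
X = np.array([[1.0, 100.0],
              [2.0, 300.0],
              [3.0, 200.0]])

# UV (unit-variance) scaling: mean-center each column, then divide by its
# standard deviation, so every variable carries comparable weight in PCA.
Xc = X - X.mean(axis=0)
X_uv = Xc / X.std(axis=0, ddof=1)

print(X_uv.std(axis=0, ddof=1))    # [1. 1.]
```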

If the task is multivariate calibration, for example, the proper choice of a pre-processing method will essentially affect the quality of the resultant model. For more details about the use of these techniques together with PCA and PLS, readers are advised to consult the fundamental monograph by Eriksson et al. [8]. [Pg.215]

It may look odd to treat the Singular Value Decomposition (SVD) technique as a tool for data transformation, simply because SVD is the same as PCA. However, if we recall how PCR (Principal Component Regression) works, then we are indeed allowed to handle SVD in the way mentioned above. What we do with PCR is, first of all, to transform the initial data matrix X in the way described by Eqs. (10) and (11). [Pg.217]

Here W is the diagonal matrix of singular values, V^T is the transpose of the second resultant matrix (actually the same as the loading matrix in PCA), and X is the matrix which is applied for further modeling. [Pg.217]

Now, one may ask: what if we are going to use Feed-Forward Neural Networks with the Back-Propagation learning rule? Then, obviously, SVD can be used as a data transformation technique. PCA and SVD are often used as synonyms. Below we shall use PCA in the classical context and SVD in the case when it is applied to the data matrix before training a neural network, e.g., Kohonen's Self-Organizing Maps or Counter-Propagation Neural Networks. [Pg.217]
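The SVD-as-data-transformation step described above can be sketched as follows, on hypothetical random data. The decomposition X = U W V^T yields the singular-value matrix W and the loading-like matrix V^T; keeping only the first k components gives a compressed representation that could serve as the input block for a network.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 5))        # 8 objects, 5 (possibly correlated) inputs

# SVD of the data matrix: X = U W V^T, with W the diagonal matrix of
# singular values and V^T the analogue of the PCA loading matrix.
U, w, Vt = np.linalg.svd(X, full_matrices=False)

# Keep the first k components as the compressed inputs for a network.
k = 3
T = U[:, :k] * w[:k]               # transformed data matrix, shape (8, 3)

# With all components retained, the product U W V^T reproduces X exactly.
X_full = (U * w) @ Vt
```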

Figure 4-8. Plot of the first two column vectors of the loadings matrix of PCA.
First, one can check whether a randomly compiled test set is within the modeling space, before employing it for PCA/PLS applications. Suppose one has calculated the scores matrix T and the loading matrix P with the help of a training set. Let z be the characteristic vector (that is, the set of independent variables) of an object in a test set. Then, we first must calculate the scores vector of the object (Eq. (14)). [Pg.223]
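The projection of a test object onto a training-set model, t = zP as in Eq. (14), can be sketched like this. The data and the simple "bounding-box" membership check are illustrative assumptions; the excerpt itself only specifies the score calculation, and Hotelling's T² would be the rigorous in-model test.

```python
import numpy as np

rng = np.random.default_rng(2)
X_train = rng.normal(size=(10, 4))         # training set: 10 objects, 4 variables
mean = X_train.mean(axis=0)
Xc = X_train - mean

# Training-set PCA: loading matrix P and scores matrix T with Xc ~ T P^T.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:2].T                               # keep two components
T = Xc @ P

# A test object z is projected with the *training* loadings, t = z P:
z = rng.normal(size=4)
t = (z - mean) @ P

# A crude in-model check: do the test scores fall inside the box spanned
# by the training scores?
inside = np.all((t >= T.min(axis=0)) & (t <= T.max(axis=0)))
```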

Variable and pattern selection in a dataset can be done by genetic algorithm, simulated annealing or PCA... [Pg.224]

L. Eriksson, E. Johansson, N. Kettaneh-Wold, S. Wold. Introduction to Multi- and Megavariate Data Analysis using Projection Methods (PCA & PLS). Umetrics AB, Umeå, 1999. [Pg.226]

Kohonen network Conceptual clustering Principal Component Analysis (PCA) Decision trees Partial Least Squares (PLS) Multiple Linear Regression (MLR) Counter-propagation networks Back-propagation networks Genetic algorithms (GA)... [Pg.442]

Sections 9A.2-9A.6 introduce different multivariate data analysis methods, including Multiple Linear Regression (MLR), Principal Component Analysis (PCA), Principal Component Regression (PCR) and Partial Least Squares regression (PLS). [Pg.444]

PCA is a frequently used method which is applied to extract the systematic variance in a data matrix. It helps to obtain an overview of dominant patterns and major trends in the data. [Pg.446]

The aim of PCA is to create a set of latent variables which is smaller than the set of original variables but still explains as much as possible of the variance in the matrix X of the original variables. [Pg.446]

In mathematical terms, PCA transforms a number of correlated variables into a smaller number of uncorrelated variables, the so-called principal components. [Pg.447]
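The decorrelation property can be verified numerically; the two deliberately correlated variables below are an illustrative assumption. The covariance between the original variables is large, while the off-diagonal covariance of the principal-component scores vanishes to machine precision.

```python
import numpy as np

# Two strongly correlated variables (synthetic example).
rng = np.random.default_rng(3)
x1 = rng.normal(size=50)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=50)
X = np.column_stack([x1, x2])
Xc = X - X.mean(axis=0)

# Scores on the principal components:
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
T = Xc @ Vt.T

# The original variables covary; the PC scores do not:
print(abs(np.cov(Xc, rowvar=False)[0, 1]) > 0.1)   # True
print(abs(np.cov(T, rowvar=False)[0, 1]) < 1e-10)  # True
```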

Prior to PCA the data are often pre-processed to convert them into a form most suitable for the application of PCA. Commonly used pre-processing methods for PCA are scaling and mean-centering of the data, which are described in Section 4.3. [Pg.447]

From a geometric point of view PCA can be described as follows ... [Pg.447]

A second piece of important information obtained by PCA is the loadings, which are denoted by P1, P2, etc. They indicate which variables influence a model and how the variables are correlated. In algebraic terms, the loadings indicate how the variables are combined to build the scores. Figure 9-8 shows a loading plot: each point is a feature of the data set, and features that are close in the plot are correlated. [Pg.448]

In matrix notation, PCA approximates the data matrix X, which has n objects and m variables, by the product of two smaller matrices: the scores matrix T (n objects and d components) and the loadings matrix P (m variables and d components), where X ≈ TP^T. [Pg.448]
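The decomposition X ≈ TP^T can be sketched directly; the matrix sizes below are arbitrary illustrative choices. With d components retained, T is n × d, P is m × d, and their product is a rank-d approximation of the (centered) data.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m, d = 7, 5, 2                  # objects, variables, components kept
X = rng.normal(size=(n, m))
Xc = X - X.mean(axis=0)

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
T = U[:, :d] * s[:d]               # scores matrix,   n x d
P = Vt[:d].T                       # loadings matrix, m x d

X_hat = T @ P.T                    # rank-d approximation of Xc
residual = np.linalg.norm(Xc - X_hat)
```

Keeping more components shrinks the residual; with d = min(n, m) the reconstruction is exact.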

An advantage of PCA is its ability to cope with almost any kind of data matrix, e.g., it can also deal with matrices with many rows and few columns or vice versa. [Pg.448]

One widely used algorithm for performing a PCA is the NIPALS (Nonlinear Iterative Partial Least Squares) algorithm, which is described in Ref. [5]. [Pg.448]
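A minimal sketch of the NIPALS iteration for the first component is given below; it follows the standard alternating score/loading projections rather than any particular listing from Ref. [5]. For a mean-centered matrix it converges to the same component as the SVD, up to an overall sign.

```python
import numpy as np

def nipals_first_pc(X, tol=1e-12, max_iter=500):
    """Extract the first principal component of a mean-centered matrix X
    with the NIPALS iteration; returns the score vector t and loading p."""
    t = X[:, 0].copy()                     # any column serves as a start
    for _ in range(max_iter):
        p = X.T @ t / (t @ t)              # regress X on t -> loading vector
        p /= np.linalg.norm(p)             # normalize the loading vector
        t_new = X @ p                      # regress X on p -> score vector
        if np.linalg.norm(t_new - t) < tol * np.linalg.norm(t_new):
            return t_new, p
        t = t_new
    return t, p

rng = np.random.default_rng(5)
X = rng.normal(size=(20, 4))
X -= X.mean(axis=0)
t, p = nipals_first_pc(X)

# Agrees with the first SVD loading vector (up to an overall sign):
U, s, Vt = np.linalg.svd(X, full_matrices=False)
print(np.allclose(np.abs(p), np.abs(Vt[0]), atol=1e-6))   # True
```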

PCR is a combination of PCA and MLR, which are described in Sections 9.4.4 and 9.4.3, respectively. First, a principal component analysis is carried out which yields a loading matrix P and a scores matrix T as described in Section 9.4.4. For the ensuing MLR, only PCA scores are used for modeling Y. The PCA scores are inherently uncorrelated, so they can be employed directly for MLR. A more detailed description of PCR is given in Ref. [5]. [Pg.448]

The selection of relevant effects for the MLR in PCR can be quite a complex task. A straightforward approach is to take those PCA scores which have a variance above a certain threshold. By varying the number of PCA components used, the... [Pg.448]
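The two-step PCR procedure can be sketched as follows, on synthetic data; the choice d = 3 stands in for the variance-threshold selection discussed above. PCA supplies the scores, and ordinary least squares then regresses y on those uncorrelated scores.

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(30, 6))
y = X @ rng.normal(size=6) + 0.05 * rng.normal(size=30)

# Step 1: PCA of the mean-centered X -> scores T for the d retained PCs.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
d = 3                                       # stand-in for a variance threshold
T = U[:, :d] * s[:d]

# Step 2: ordinary MLR of y on the uncorrelated PCA scores.
yc = y - y.mean()
b = np.linalg.lstsq(T, yc, rcond=None)[0]
y_hat = T @ b + y.mean()

residual_var = np.var(y - y_hat)
```

Repeating the fit for d = 1, 2, ... and monitoring the prediction error is the usual way to pick the number of components.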

As described above, PCA can be used for similarity detection: the score plot of two principal components can be used to indicate which objects are similar. [Pg.449]

PLS is a linear regression extension of PCA which is used to connect the information in two blocks of variables X and Y to each other. It can be applied even if the features are highly correlated. [Pg.481]
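A one-component PLS1 step, shown on synthetic data, illustrates how PLS differs from PCA: the weight vector is taken along the direction of maximal covariance between X and y rather than of maximal X-variance alone. This is a didactic single-component sketch, not a full multi-component PLS implementation.

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(25, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.1 * rng.normal(size=25)

Xc = X - X.mean(axis=0)
yc = y - y.mean()

# One PLS component: the weight vector w points along the direction of
# maximal covariance between X and y (unlike a PCA loading).
w = Xc.T @ yc
w /= np.linalg.norm(w)
t = Xc @ w                          # X-scores for this component
q = (t @ yc) / (t @ t)              # inner regression coefficient
y_hat = t * q + y.mean()

residual_var = np.var(y - y_hat)
```

Further components would be extracted the same way after deflating X and y by this component's contribution.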

On the other hand, techniques like Principal Component Analysis (PCA) or Partial Least Squares regression (PLS) (see Section 9.4.6) are used for transforming the descriptor set into smaller sets with higher information density. The disadvantage of such methods is that the transformed descriptors may not be directly related to single physical effects or structural features, and the derived models are thus less interpretable. [Pg.490]

The previously mentioned data set with a total of 115 compounds has already been studied by other statistical methods such as Principal Component Analysis (PCA), Linear Discriminant Analysis, and the Partial Least Squares (PLS) method [39]. Thus, the choice and selection of descriptors has already been accomplished. [Pg.508]

Spectral features and their corresponding molecular descriptors are then applied to mathematical techniques of multivariate data analysis, such as principal component analysis (PCA) for exploratory data analysis or multivariate classification for the development of spectral classifiers [84-87]. Principal component analysis results in a scatter plot that exhibits spectra-structure relationships by clustering similarities in spectral and/or structural features [88, 89]. [Pg.534]

The dimensionality of a data set is the number of variables that are used to describe each object. For example, a conformation of a cyclohexane ring might be described in terms of the six torsion angles in the ring. However, it is often found that there are significant correlations between these variables. Under such circumstances, a cluster analysis is often facilitated by reducing the dimensionality of a data set to eliminate these correlations. Principal components analysis (PCA) is a commonly used method for reducing the dimensionality of a data set. [Pg.513]









Some PCA results

Administration Routes During PCA

Aims of PCA

Epidural PCA

Fuzzy PCA

Fuzzy PCA (Nonorthogonal Procedure)

Fuzzy PCA (Orthogonal)

GRID/PCA

Geometrical Description of PCA

Linear PCA

Loading in PCA

Mathematical Description of PCA

Non-linear PCA

Non-linear PCA biplot

Nonlinear PCA

PCA (Precipitation with Compressed

PCA Calculations Demonstrated with a Simple Example

PCA Decomposition

PCA Graphical User Interface

PCA Pumps

PCA analysis

PCA and PCR

PCA and cluster analysis

PCA calibration

PCA component

PCA levels

PCA loading

PCA loading plots

PCA model

PCA of Perfluorinated Phases

PCA scores

PCA to Traditional Methods of Analgesic Administration

PCA with multiple linear regression analysis

PCA — Spin Model Equivalence

PCAS method

PCAs

Patient-controlled analgesia PCA pumps

Peripheral PCA

Pharmacokinetic Basis for PCA

Polychlorinated n-Alkanes (PCAs)

Polymerase chain assembly (PCA)

Principal component analysis (PCA) scores

Principal components analysis (PCA)

Problems with PCA Delivery

Score in PCA

Sodium PCA

Special Concerns for PCA in Rehabilitation Patients

Three-way PCA

Totalistic PCA

Transdermal PCA

Types of Analgesics Used for PCA

© 2024 chempedia.info