Big Chemical Encyclopedia


Least squares, partial

Partial least-squares in latent variables (PLS) is sometimes called partial least-squares regression, or PLSR. As we are about to see, PLS is a logical, easy-to-understand variation of PCR. [Pg.131]

In addition to the set of new coordinate axes (basis space) for the spectral data (the x-block), we also find a set of new coordinate axes (basis space) for the concentration data (the y-block). [Pg.131]

In addition to expressing the spectral data as projections onto the spectral factors (basis vectors), we express the concentration data as projections onto the concentration factors (basis vectors). [Pg.131]

On a rank-by-rank (i.e. factor-by-factor) basis, we rotate, or perturb, each pair of factors (one spectral factor and its corresponding concentration factor) towards each other to maximize the fit of the linear regression between the projections of the spectra onto the spectral factor and the projections of the concentrations onto the concentration factor. [Pg.132]
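As an illustration of this factor-by-factor extraction, here is a minimal NIPALS-style PLS1 sketch in Python; this is one common realization of the idea, not the text's own implementation, and the function name and synthetic data are this sketch's own assumptions:

```python
import numpy as np

def pls1_nipals(X, y, n_factors):
    """Minimal NIPALS-style PLS1: extracts factor pairs one rank at a time."""
    X, y = X.copy().astype(float), y.copy().astype(float)
    W, P, q = [], [], []
    for _ in range(n_factors):
        # Spectral factor (weight vector): the direction in x-space that
        # maximizes covariance with the current concentration residual.
        w = X.T @ y
        w /= np.linalg.norm(w)
        t = X @ w                    # projections of the spectra onto the factor
        tt = t @ t
        q.append((t @ y) / tt)       # regression of concentrations on the scores
        p = X.T @ t / tt             # spectral loading used for deflation
        X -= np.outer(t, p)          # remove this factor's contribution (x-block)
        y -= t * q[-1]               # ... and from the concentrations (y-block)
        W.append(w)
        P.append(p)
    return np.array(W).T, np.array(P).T, np.array(q)
```

Each pass extracts one spectral factor together with its concentration counterpart, then deflates both blocks before the next rank, mirroring the pairwise procedure described above.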

Partial Least Squares (PLS) attempts to find so-called latent variables that capture the variance in the data and, at the same time, achieve maximum correlation between the predicted variables Y and the predictor variables X. [Pg.317]

Originally, PLS was a technique that produced a static linear model, although non-linear and dynamic versions have also been published in the literature. As with principal components in principal component analysis, the use of latent variables in PLS can reduce the dimensionality of the problem considerably. [Pg.317]
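A short sketch of that dimensionality reduction, using scikit-learn's PLSRegression on made-up collinear data (all names and numbers here are illustrative assumptions, not from the text):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
Z = rng.normal(size=(50, 5))                     # 5 underlying factors
X = Z @ rng.normal(size=(5, 200)) + 0.01 * rng.normal(size=(50, 200))  # 200 collinear predictors
y = Z[:, 0] - 2.0 * Z[:, 1] + rng.normal(scale=0.1, size=50)

pls = PLSRegression(n_components=3).fit(X, y)    # 3 latent variables instead of 200 x's
T = pls.transform(X)                             # (50, 3) scores: the reduced representation
print(T.shape, round(pls.score(X, y), 3))        # R^2 of the 3-LV model
```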

The method of partial least squares (PLS) is also a regression technique, one that makes use of quantities like the PCs derived from the set of independent variables. The PCs in PLS regression models are called latent variables (LV), as shown in the PLS equation, eqn (7.9). [Pg.153]

As in PCA, PLS will generate as many latent variables (q) as the smaller of N (dimensions) or P (samples). Thus far, PLS appears to generate identical models to PCR, so what is the difference (other than terminology)? The answer is that the PLS procedure calculates the latent variables and the regression coefficients in eqn (7.9) all at the same time. The algorithm is actually an iterative procedure (Wold 1978) but the effect is to combine the [Pg.153]

PCA step of PCR with the regression step. Latent variables, like PCs, are calculated to explain most of the variance in the x set while remaining orthogonal to one another. Thus, the first latent variable (LV1) will explain most of the variance in the independent set, LV2 the next largest amount of variance, and so on. The important difference between PLS and PCR is that the latent variables are constructed so as to maximize their correlation with the dependent variable. Unlike PCR equations, where the PCs do not enter in any particular order (see eqns 7.6 to 7.8), the latent variables will enter PLS equations in the order one, two, three, etc. The properties of latent variables are  [Pg.154]
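A small demonstration of that ordering difference on synthetic data (the setup is an assumption for illustration): the first PC chases the high-variance direction, while the first latent variable chases the direction correlated with y.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 10))
X[:, 0] *= 10.0                            # dominant variance direction...
y = X[:, 5] + 0.05 * rng.normal(size=60)   # ...but y depends on a minor one

# PCR components are ranked purely by x-variance explained.
pcs = PCA(n_components=5).fit_transform(X)
corr_pcs = [abs(np.corrcoef(pcs[:, i], y)[0, 1]) for i in range(5)]

# PLS latent variables are built to correlate with y, so the useful
# direction surfaces first.
pls = PLSRegression(n_components=5).fit(X, y)
lvs = pls.transform(X)
corr_lvs = [abs(np.corrcoef(lvs[:, i], y)[0, 1]) for i in range(5)]
print(np.round(corr_pcs, 2), np.round(corr_lvs, 2))
```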

Note that this sum of squares looks similar to the residual sum of squares (RSS) given by eqn (6.12), but it is different: in eqn (6.12) the ŷi is predicted from an equation that includes that data point; here the yi is not in the model, hence the term predictive residual sum of squares. The difference in predictive ability of two PLS models can be evaluated by comparison of their PRESS values. [Pg.154]

The E statistic compares a PLS model of i components with the model containing one component fewer; in order to evaluate PRESS for the one-component model, PRESS for the model containing no components is calculated by comparing predicted values with the mean. A critical value of 0.4 has been suggested for E (Wold 1978): when this is exceeded, the PLS equation with i components is doing no better (or worse) in prediction than the model with i−1 latent variables. [Pg.155]
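A sketch of PRESS and the E ratio under leave-one-out cross-validation, following the rule of thumb quoted above; the data are synthetic and scikit-learn stands in for whichever PLS implementation is at hand:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 15))
y = X[:, 0] - X[:, 1] + 0.1 * rng.normal(size=30)

def press(X, y, n_comp):
    """Leave-one-out PRESS: each y_i is predicted by a model that never saw it."""
    total = 0.0
    for train, test in LeaveOneOut().split(X):
        if n_comp == 0:
            pred = y[train].mean()   # zero-component model: predict the mean
        else:
            m = PLSRegression(n_components=n_comp).fit(X[train], y[train])
            pred = m.predict(X[test]).ravel()[0]
        total += (y[test][0] - pred) ** 2
    return total

# E for i components vs i-1; per the rule of thumb above, stop adding
# components once E exceeds the suggested critical value of 0.4.
P = [press(X, y, i) for i in range(5)]
for i in range(1, 5):
    print(i, round(P[i] / P[i - 1], 3))
```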

Another solution of eqn (10.5), called bidiagonalization, is used in Partial Least Squares (PLS). Here the original input matrix X is linked to two orthogonal matrices O and W by a bidiagonal matrix L. [Pg.323]
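In symbols, the decomposition described here can be sketched as follows, with k factors (whether L is upper or lower bidiagonal depends on the variant):

$$X = O\,L\,W^{\mathsf{T}}, \qquad L = \begin{pmatrix} \alpha_1 & \beta_1 & & \\ & \alpha_2 & \ddots & \\ & & \ddots & \beta_{k-1} \\ & & & \alpha_k \end{pmatrix}$$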

In contrast with PCR, the PLS factors (vectors w) are not independent of each other. [Pg.323]

The selection of PLS factors is therefore simplified because they are a priori ranked according to their value. [Pg.323]

The validity of the parameterized techniques discussed thus far relies on the linear and stable relationship between concentration and response of individual sensors. With this caveat in mind, these data reduction and evaluation tools can also be used for optimization of arrays in terms of numbers and orthogonality. [Pg.323]

Another well-established method for building predictive models is partial least squares (PLS) regression. PLS is a modern relative of MLR, having been established in the 1960s by Wold. The method reaches beyond linear regression by replacing the descriptors with a matrix of latent variables distilled from both the structural features of the training compounds and their experimental results. In PLS, the use of the term latent variables differs from its formal definition in other regression methods.  [Pg.367]

The use of these latent variables provides PLS with a greater capacity than MLR to distill most of the information contained in a large number of descriptors and properties into a smaller number of factors and thereby avoid some of the ravages of the Curse of Dimensionality. It is especially suited to dealing with systems containing many highly correlated descriptors. PLS is related to the methods of principal components analysis (PCA) and maximum redundancy analysis (MRA), in that all three methods augment the raw descriptors with matrices derived from variance found in the descriptors themselves (PCA), the properties to be modeled (MRA), or a combination of both (PLS).  [Pg.367]

The performance of PLS is comparable to that of methods such as ridge regression and neural networks, particularly when it is used to model properties that have a principally linear relationship to combinations of the descriptors. An excellent example of the use of PLS in the interpretation of the spectra of chemical mixtures is provided by Tobias.  [Pg.367]

In Section 33.2.2 we showed how LDA classification can be described as a regression problem with class variables. As a regression model, LDA is subject to the problems described in Chapter 10. For instance, the number of variables should not exceed the number of objects. One solution is to apply feature selection or [Pg.232]

PLS is often presented as the major regression technique for multivariate data. In fact its use is not always justified by the data, and the originators of the method were well aware of this, but, that being said, in some applications PLS has been spectacularly successful. In some areas such as QSAR, or even biometrics and psychometrics, [Pg.297]

An important feature of PLS is that it takes into account errors in both the concentration estimates and the spectra. A method such as PCR assumes that the concentration estimates are error free. Much traditional statistics rests on this assumption: that all the errors are in the measured variables (the spectra). If in medicine it is decided to determine the concentration of a compound in the urine of patients as a function of age, it is assumed that age can be estimated exactly, the statistical variation being in the concentration of the compound and the nature of the urine sample. Yet in chemistry there are often significant errors in sample preparation, for example in the accuracy of weighings and dilutions, and so the independent variable itself also contains errors. Classical and inverse calibration force the user to choose which variable contains the error, whereas PLS assumes that it is equally distributed in both the x and c blocks. [Pg.298]

The most widespread approach is often called PLS1. Although there are several algorithms, the main ones being due to Wold and to Martens, the overall principles are fairly straightforward. Instead of modelling exclusively the x variables, two sets of models are obtained, as follows:  [Pg.298]
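In the usual notation, the two models can be sketched as below, with T the scores common to both blocks, P and q the x- and c-block loadings, and E and f residuals (the symbols are the conventional ones, reconstructed from the description rather than quoted from the text):

$$X = T\,P^{\mathsf{T}} + E, \qquad c = T\,q + f$$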

Additionally, the analogy to ga, or the eigenvalue of a PC, involves multiplying the sum of squares of both ta and pa together, so we define the magnitude of a PLS component as [Pg.299]
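That is, for component a, with t_a the scores and p_a the loadings (reconstructed from the sentence above):

$$g_a = \Bigl(\sum_{i} t_{ia}^{2}\Bigr)\Bigl(\sum_{j} p_{aj}^{2}\Bigr)$$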

This will have the property that the sum of the values of ga for all nonzero components adds up to the sum of squares of the original (preprocessed) data. Note that, in contrast to PCA, the size of successive values of ga does not necessarily decrease as each component is calculated. This is because PLS does not only model the x data; it is a compromise between x and c block regression. [Pg.299]


Some methods that partly cope with the above-mentioned problem have been proposed in the literature. The subject has been treated in areas like chemometrics, econometrics, etc., giving rise, for example, to the methods Partial Least Squares (PLS), Ridge Regression (RR), and Principal Component Regression (PCR) [2]. In this work we have chosen to illustrate the multivariable approach using PCR as our regression tool, mainly because it has a relatively easy interpretation. The basic idea of PCR is described below. [Pg.888]
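A minimal sketch of that basic idea, on synthetic collinear data: project onto the leading principal components, then regress on the orthogonal (hence well-conditioned) scores. The pipeline and data here are illustrative assumptions, not the work's own code:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
X = rng.normal(size=(40, 25))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=40)   # two nearly collinear inputs
y = X[:, 0] + rng.normal(scale=0.1, size=40)

# PCR = PCA projection followed by ordinary regression on the scores.
pcr = make_pipeline(PCA(n_components=5), LinearRegression()).fit(X, y)
print(round(pcr.score(X, y), 3))
```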

Another problem is to determine the optimal number of descriptors for the objects (patterns), such as for the structure of the molecule. A widespread observation is that the number of descriptors should be kept below about 20% of the number of objects in the dataset. However, this is correct only in the case of ordinary Multilinear Regression Analysis. Some more advanced methods, such as Projection of Latent Structures (or Partial Least Squares, PLS), use so-called latent variables to achieve both modeling and predictions. [Pg.205]

After an alignment of a set of molecules known to bind to the same receptor, a comparative molecular field analysis (CoMFA) makes it possible to determine and visualize molecular interaction regions involved in ligand-receptor binding [51]. Furthermore, statistical methods such as partial least squares regression (PLS) are applied to search for a correlation between CoMFA descriptors and biological activity. The CoMFA descriptors have been one of the most widely used set of descriptors. However, their apex has been reached. [Pg.428]

To gain insight into chemometric methods such as correlation analysis, Multiple Linear Regression Analysis, Principal Component Analysis, Principal Component Regression, and Partial Least Squares regression/Projection to Latent Structures... [Pg.439]

Kohonen network Conceptual clustering Principal Component Analysis (PCA) Decision trees Partial Least Squares (PLS) Multiple Linear Regression (MLR) Counter-propagation networks Back-propagation networks Genetic algorithms (GA)... [Pg.442]

One widely used algorithm for performing a PCA is the NIPALS (Nonlinear Iterative Partial Least Squares) algorithm, which is described in Ref. [5]. [Pg.448]

Partial Least Squares Regression/Projection to Latent Structures (PLS) [Pg.449]

Partial Least Squares Regression, also called Projection to Latent Structures, can be applied to establish a predictive model, even if the features are highly correlated. [Pg.449]

On the other hand, techniques like Principal Component Analysis (PCA) or Partial Least Squares Regression (PLS) (see Section 9.4.6) are used for transforming the descriptor set into smaller sets with higher information density. The disadvantage of such methods is that the transformed descriptors may not be directly related to single physical effects or structural features, and the derived models are thus less interpretable. [Pg.490]

The previously mentioned data set with a total of 115 compounds has already been studied by other statistical methods such as Principal Component Analysis (PCA), Linear Discriminant Analysis, and the Partial Least Squares (PLS) method [39]. Thus, the choice and selection of descriptors has already been accomplished. [Pg.508]

Partial Least Squares and Molecular Field Analysis... [Pg.724]

The ability of partial least squares to cope with data sets containing very many x values is considered by its proponents to make it particularly suited to modern-day problems, where it is very easy to compute an extremely large number of descriptors for each compound (as in CoMFA). This contrasts with the traditional situation in QSAR, where it could be time-consuming to measure the required properties or where the analysis was restricted to traditional substituent constants. [Pg.727]

S Wold, E Johansson and M Cocchi 1993. PLS - Partial Least-Squares Projections to Latent Structures. In Kubinyi H (Editor), 3D QSAR in Drug Design. Leiden, ESCOM, pp. 523-550. [Pg.742]

The field points must then be fitted to predict the activity. There are generally far more field points than known compound activities to be fitted. The least-squares algorithms used in QSAR studies do not function for such an underdetermined system. A partial least squares (PLS) algorithm is used for this type of fitting. This method starts with matrices of field data and activity data. These matrices are then used to derive two new matrices containing a description of the system and the residual noise in the data. Earlier studies used a similar technique, called principal component analysis (PCA). PLS is generally considered to be superior. [Pg.248]
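A sketch of why PLS copes where ordinary least squares cannot: far more columns (stand-ins for field points) than rows (compounds), fitted through a few latent variables. The data are synthetic and scikit-learn is used purely for illustration:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
n_compounds, n_points = 20, 1500                 # far more field points than activities
X = rng.normal(size=(n_compounds, n_points))     # stand-in for a CoMFA field matrix
y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=n_compounds)  # stand-in activities

# Ordinary least squares is underdetermined here (1500 unknowns, 20 equations);
# PLS fits anyway, because it regresses on a few latent variables instead.
pls = PLSRegression(n_components=3).fit(X, y)
print(round(pls.score(X, y), 3))
```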

PLS (partial least-squares): algorithm used for 3D QSAR calculations
PM3 (parameterization method three): a semiempirical method
PMF (potential of mean force): a solvation method for molecular dynamics calculations [Pg.367]

Partial least-squares path modeling with latent variables (PLS), a newer, general method of handling regression problems, is finding wide application in chemometrics. This method allows the relations between many blocks of data, i.e., data matrices, to be characterized (32-36). Linear and multiple regression techniques can be considered special cases of the PLS method. [Pg.426]

Other chemometrics methods to improve calibration have been advanced. The method of partial least squares has been useful in multicomponent calibration (48-51). In this approach the concentrations are related to latent variables in the block of observed instrument responses. Thus PLS regression can solve the collinearity problem and provide all of the advantages discussed earlier. Principal components analysis coupled with multiple regression, often called Principal Component Regression (PCR), is another calibration approach that has been compared and contrasted to PLS (52-54). Calibration problems can also be approached using the Kalman filter as discussed (43). [Pg.429]
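A hedged sketch of such a multicomponent calibration: one PLS2 model relating a block of simulated mixture responses to three analyte concentrations at once. All data are synthetic (a Beer-Lambert-like mixing model) and scikit-learn stands in for a chemometrics package:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
S = np.abs(rng.normal(size=(3, 100)))       # stand-in pure-component "spectra"
C = np.abs(rng.normal(size=(40, 3)))        # stand-in concentrations, 3 analytes
X = C @ S + 0.01 * rng.normal(size=(40, 100))  # mixture responses, strongly collinear

# One PLS2 model relates the whole response block to all three
# concentrations simultaneously, despite the collinear channels.
pls = PLSRegression(n_components=3).fit(X, C)
print(round(pls.score(X, C), 3))
```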

Mancozeb is a dithiocarbamate pesticide with a very low solubility in organic and inorganic solvents. In this work we have developed a solvent-free, accurate and fast photoacoustic FTIR-based methodology for Mancozeb determination in commercial fungicides. The proposed procedure was based on the direct measurement of the solid samples in the middle infrared region using a photoacoustic detector. A multivariate calibration approach based on the use of partial least squares (PLS) was employed to determine the pesticide content in commercially available formulations. [Pg.93]

DETERMINATION OF NUTRITIONAL PARAMETERS OF YOGHURT SAMPLES THROUGH PARTIAL-LEAST-SQUARES ATTENUATED TOTAL REFLECTANCE FOURIER TRANSFORM INFRARED SPECTROMETRY... [Pg.142]

The aim of this work is the determination of several nutritional parameters, such as Energetic Value, Protein, Fat, and Carbohydrate content, in commercially available yoghurt samples by using Attenuated Total Reflectance Fourier Transform Infrared (ATR-FT-IR) spectrometry and a partial least squares approach. [Pg.142]

The aim of this research consists of the development of an analytical methodology to quantify the Ca(OH)2 content in neutralized anhydrite samples, using FT-IR spectroscopy and a partial least squares quantitative analysis technique. [Pg.200]

Most of the 2D QSAR methods are based on graph theoretic indices, which have been extensively studied by Randic [29] and Kier and Hall [30,31]. Although these structural indices represent different aspects of molecular structures, their physicochemical meaning is unclear. Successful applications of these topological indices combined with multiple linear regression (MLR) analysis are summarized in Ref. 31. On the other hand, parameters derived from various experiments through chemometric methods have also been used in the study of peptide QSAR, where partial least squares (PLS) [32] analysis has been employed [33]. [Pg.359]

S Wold, A Ruhe, H Wold, WJ Dunn III. The collinearity problem in linear regression. The partial least squares (PLS) approach to generalized inverses. SIAM J Sci Stat Comput 5:735-743, 1984. [Pg.367]





Analytical methods partial least squares regression

Bootstrapping, partial least squares

Caffeine partial least-squares

Chapter 5 Partial Least-Squares Regression

Chemometrical partial least squares

Chemometrics partial least squares

Correlation partial least squares

GRID partial least squares

Genetic algorithm/Partial least squares

Genetic partial least squares

INDEX partial least squares

L-shaped partial least squares

Latent structures, partial least squares, projection

Molecular descriptors partial least squares

Moving window partial least-squares regression

Multi-way partial least squares

Multiple linear regression and partial least squares

Multivariate calibration techniques partial least squares

Multivariate partial least squares

Multivariate statistical analysis partial least squares projections

Multivariate statistical models Partial least square analysis

Non-linear iterative partial least squares

Non-linear iterative partial least squares (NIPALS)

Nonlinear Iterative Partial Least Squares

Nonlinear Iterative Partial Least Squares (NIPALS)

Nonlinear iterative partial least squares (NIPALS) algorithm

Nonlinear partial least squares

PLS Partial Least Squares Projections to Latent Structures

PLS, Partial least squares

PLS, partial least squares regression

Partial Least Squares (PLS) Analysis and Other Multivariate Statistical Methods

Partial Least Squares Projection of Latent

Partial Least Squares Projection of Latent Structures

Partial Least Squares Projection of Latent Structures (PLS)

Partial Least Squares Projections to Latent Structures (PLS) in Chemistry

Partial Least Squares calibration

Partial Least Squares case study

Partial Least Squares examples

Partial Least Squares extensions

Partial Least Squares formulation

Partial Least Squares prediction

Partial Least Squares regression

Partial least square differential analysis

Partial least square regression modeling

Partial least squares , pattern recognition technique

Partial least squares (PLS) and principal example

Partial least squares (PLS) and principal summary of validation

Partial least squares (NIPALS)

Partial least squares Quantitative Structure-Activity

Partial least squares Relationship)

Partial least squares Subject

Partial least squares algorithm

Partial least squares analysis

Partial least squares basis

Partial least squares block

Partial least squares chemometrical analysis

Partial least squares coefficient matrix

Partial least squares components

Partial least squares convergence

Partial least squares cross-validation

Partial least squares discriminant

Partial least squares discriminant analysis

Partial least squares discriminant analysis , exploratory

Partial least squares discriminant analysis (PLS-DA)

Partial least squares discriminant data classification

Partial least squares discriminate analysis

Partial least squares discriminate analysis (PLS-DA)

Partial least squares efficiency

Partial least squares extraction, component

Partial least squares factors

Partial least squares inner relations

Partial least squares method

Partial least squares model

Partial least squares model analysis

Partial least squares model chemometrical analysis

Partial least squares model chromatography

Partial least squares model modelling

Partial least squares modeling

Partial least squares models accuracy

Partial least squares models cross-validation

Partial least squares models dimensionality

Partial least squares models prediction

Partial least squares models selectivity

Partial least squares models sensitivity

Partial least squares models statistics

Partial least squares multiple responses

Partial least squares nonlinear iterative algorithm

Partial least squares principles

Partial least squares problem

Partial least squares projection

Partial least squares projections to latent structure

Partial least squares quantification

Partial least squares regression Subject

Partial least squares regression coefficients

Partial least squares regression models

Partial least squares regression, analytical

Partial least squares residuals matrices

Partial least squares weight vectors

Partial least squares-discriminant analysis classification

Partial least squares-discriminant analysis components

Partial least squares-discriminant analysis vectors, regression

Partial least-square methodology

Partial least-squares analysis between different

Partial least-squares analysis sites

Partial least-squares in latent variables

Partial least-squares regression analysis

Partial least-squares regression method

Partial least-squares technique

Partial least-squares technique properties

Partial least-squares technique regression model

Principal Component Regression and Partial Least Squares

Quantitative structure-activity relationship partial least square method

Regression on principal components and partial least squares

Statistical models partial least squares

Targeted orthogonal partial least squares
