Big Chemical Encyclopedia


Chapter 5 Partial Least-Squares Regression

Human perception of flavor arises from the combined sensory responses elicited by the proteins, lipids, carbohydrates, and Maillard reaction products in the food. Proteins (Chapters 6, 10, 11, 12) and their constituents, together with sugars (Chapter 12), primarily affect the sense of taste, whereas the lipids (Chapters 5, 9) and Maillard products (Chapter 4) primarily affect the sense of smell (olfaction). Therefore, when studying a particular food or when designing a new food, it is important to understand the structure-activity relationship of all the variables in the food. To this end, several powerful multivariate statistical techniques have been developed, such as factor analysis (Chapter 6) and partial least squares regression analysis (Chapter 7), to relate a set of independent or "causative" variables to a set of dependent or "effect" variables. Statistical results obtained via these methods are valuable, since they will permit the food... [Pg.5]

In the above examples there is a natural way to order the complete data set in two blocks, where both blocks have one mode in common. In Chapter 3 the methods of multiple linear regression, principal component regression, and partial least squares regression will be discussed on an introductory level. [Pg.9]
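As a minimal illustration of this two-block arrangement (not taken from the cited text), the sketch below builds an X block of "causative" variables and a Y block of "effect" variables that share the sample mode and relates them with PLS; the Python/scikit-learn code, names, and dimensions are assumptions for illustration only.

```python
# Minimal sketch of the two-block data layout: both blocks share the sample
# mode (rows); X holds the "causative" variables and Y the "effect" variables.
# All names and dimensions are illustrative placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

n_samples, n_x_vars, n_y_vars = 40, 200, 2
X = rng.normal(size=(n_samples, n_x_vars))       # e.g. spectral variables
Y = X[:, :5] @ rng.normal(size=(5, n_y_vars))    # properties driven by X
Y += 0.05 * rng.normal(size=Y.shape)             # measurement noise

# PLS relates the two blocks through a small number of latent variables
# instead of regressing on all 200 X columns individually.
pls = PLSRegression(n_components=3)
pls.fit(X, Y)
print("calibration R^2:", round(pls.score(X, Y), 3))
```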

Olive oil is an ideal candidate for multivariate analysis. For economic reasons, the labelling of olive oils is frequently falsified (Collins, 1993; Firestone and Reina, 1987; Firestone, Carson and Reina, 1988; Firestone et al., 1985; Li-Chan, 1994; Simpkins and Harrison, 1995b; Zamora, Navarro and Hidalgo, 1994), so there is a need for easy and cheap methods for identification. This chapter concentrates on the application of multivariate methods to nuclear magnetic resonance (NMR) and pyrolysis mass spectrometry (PyMS) data. It provides a brief introduction to principal components analysis (PCA), principal components regression (PCR), partial least squares regression (PLS) and the use of artificial neural networks (ANNs), then moves on to variable selection and its application to olive oil data. [Pg.318]

This book contains several different NIR applications in food analysis, and many of them use multivariate data handling. Our aim in this chapter is to discuss the aspects of latent variable decomposition in principal component analysis and partial least squares regression and to illustrate their use by an application in the NIR region. [Pg.146]
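The following sketch shows the latent-variable decomposition X ≈ TPᵀ + E that both PCA and PLS rely on; the simulated NIR-like spectra and all variable names are assumptions for illustration and are not data from the cited application.

```python
# Sketch of latent-variable decomposition X ~= T P^T + E, as used by both
# PCA and PLS; the simulated "spectra" below are placeholders only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
wavelengths = np.linspace(1100, 2500, 300)            # nm, NIR-like axis
bands = np.array([[1450], [1940]])                    # two hypothetical bands
pure = np.exp(-0.5 * ((wavelengths[None, :] - bands) / 60) ** 2)
conc = rng.uniform(0, 1, size=(30, 2))                # two hidden constituents
X = conc @ pure + 0.01 * rng.normal(size=(30, 300))   # mixture spectra
y = conc[:, 0]                                        # property of interest

pca = PCA(n_components=2).fit(X)
T = pca.transform(X)           # scores: samples in latent-variable space
P = pca.components_            # loadings: latent variables in wavelength space
print("scores", T.shape, "loadings", P.shape,
      "variance explained", pca.explained_variance_ratio_.round(3))

# PLS builds a similar decomposition, but chooses its latent variables to be
# predictive of y rather than only descriptive of X.
pls = PLSRegression(n_components=2).fit(X, y)
print("PLS X-scores", pls.x_scores_.shape, "X-loadings", pls.x_loadings_.shape)
```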

For the principal components analysis (PCA) and partial least squares regression (PLS) in Chapters 22 and 23, this book makes use of a PLS Toolbox, which is a product of Eigenvector Research, Inc. The PLS Toolbox is a collection of essential and advanced chemometric routines that work within the MATLAB computational environment. We are grateful to Eigenvector for permission. For Eigenvector product information please contact ... [Pg.561]

Partial Least Squares (PLS) regression (Section 35.7) is one of the more recent advances in QSAR which has led to the now widely accepted method of Comparative Molecular Field Analysis (CoMFA). This method makes use of local physicochemical properties such as charge, potential and steric fields that can be determined on a three-dimensional grid that is laid over the chemical structures. The determination of steric conformation, by means of X-ray crystallography or NMR spectroscopy, and the quantum mechanical calculation of charge and potential fields are now performed routinely on medium-sized molecules [10]. Modern optimization and prediction techniques such as neural networks (Chapter 44) also have found their way into QSAR. [Pg.385]
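A hedged sketch of the PLS step behind a CoMFA-style analysis is given below: each molecule is described by far more grid-point field values than there are molecules, and PLS compresses these into a few latent variables. The data, dimensions, and names are invented for illustration and do not reproduce any published CoMFA study.

```python
# Sketch of the PLS step in a CoMFA-style QSAR: each molecule is described by
# steric/electrostatic field values on a 3-D grid (flattened here into one
# long descriptor vector), and activity is regressed on them with PLS.
# All data below are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_molecules, n_grid_points = 25, 3000       # typical p >> n situation
fields = rng.normal(size=(n_molecules, n_grid_points))
activity = fields[:, :10].sum(axis=1) + 0.1 * rng.normal(size=n_molecules)

# Ordinary least squares cannot cope with 3000 descriptors and 25 molecules;
# PLS handles the collinearity by working with a handful of latent variables.
pls = PLSRegression(n_components=3)
q2 = cross_val_score(pls, fields, activity, cv=5, scoring="r2")
print("cross-validated R^2 per fold:", q2.round(2))
```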

Since most quantitative applications are on mixtures of materials, complex mathematical treatments have been developed. The most common programs are Multiple Linear Regression (MLR), Partial Least Squares (PLS), and Principal Component Analyses (PCA). While these are described in detail in another chapter, they will be described briefly here. [Pg.173]

This chapter ends with a short description of the important methods, Principal Component Regression (PCR) and Partial Least-Squares (PLS). Attention is drawn to the similarity of the two methods. Both methods aim at predicting properties of samples based on spectroscopic information. The required information is extracted from a calibration set of samples with known spectrum and property. [Pg.5]
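The sketch below illustrates this similarity on a synthetic calibration set (not from the cited text): PCR compresses the spectra with PCA and then regresses on the scores, while PLS performs the compression and regression in one step aimed at the property to be predicted.

```python
# Sketch of the PCR/PLS comparison on a calibration set with known spectra
# and property. Data are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
factors = rng.normal(size=(60, 4))                       # hidden factors
X = factors @ rng.normal(size=(4, 120)) + 0.05 * rng.normal(size=(60, 120))
y = factors[:, 0] + 0.05 * rng.normal(size=60)           # property to predict

X_cal, X_test, y_cal, y_test = train_test_split(X, y, random_state=0)

pcr = make_pipeline(PCA(n_components=4), LinearRegression()).fit(X_cal, y_cal)
pls = PLSRegression(n_components=4).fit(X_cal, y_cal)

# With a well-chosen number of components the two methods usually give very
# similar predictions, which is the similarity the text draws attention to.
print("PCR test R^2:", round(pcr.score(X_test, y_test), 3))
print("PLS test R^2:", round(pls.score(X_test, y_test), 3))
```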

If the system is not simple, an inverse calibration method can be employed, where it is not necessary to obtain the spectra of the pure analytes. The three inverse methods discussed later in this chapter include multiple linear regression (MLR), principal components regression (PCR), and partial least squares (PLS). When using MLR on data sets found in chemistry, variable selection is... [Pg.98]
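To illustrate the variable-selection point, the sketch below contrasts MLR on a few selected channels with full-spectrum PLS; the data and the simple univariate filter are assumptions for illustration, not the selection procedure of the cited text.

```python
# Sketch of the variable-selection issue: MLR cannot use every wavelength when
# there are more wavelengths than calibration samples, so a small subset of
# channels must be selected; full-spectrum methods such as PLS do not need
# this step. Data and the selection filter are illustrative only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
factors = rng.normal(size=(50, 3))
X = factors @ rng.normal(size=(3, 400)) + 0.05 * rng.normal(size=(50, 400))
y = factors[:, 0] + 0.05 * rng.normal(size=50)

# MLR after selecting a handful of channels (a simple univariate filter stands
# in here for the many selection strategies used in practice).
mlr = make_pipeline(SelectKBest(f_regression, k=8), LinearRegression()).fit(X, y)

# PLS uses all 400 channels directly through a few latent variables.
pls = PLSRegression(n_components=3).fit(X, y)

print("selected-channel MLR R^2:", round(mlr.score(X, y), 3))
print("full-spectrum PLS R^2:", round(pls.score(X, y), 3))
```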

For introductory purposes multiple linear regression (MLR) is used to relate the experimental response to the conditions, as is common to most texts in this area, but it is important to realise that other regression methods such as partial least squares (PLS) are applicable in many cases, as discussed in Chapter 5. Certain designs, such as those of Section 2.3.4, have direct relevance to multivariate calibration. In some cases multivariate methods such as PLS can be modified by inclusion of squared and interaction terms as described below for MLR. It is important to remember, however, that in many areas of chemistry a lot of information is available about a dataset, and conceptually simple approaches based on MLR are often adequate. [Pg.19]
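A brief sketch of the expanded MLR model mentioned above follows; the two-factor design, the factor names, and the simulated response are invented for illustration.

```python
# Sketch of MLR on a designed experiment with squared and interaction terms
# added to the design matrix. The coded two-factor design and the simulated
# response below are illustrative placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(6)

# Coded levels for a small two-factor design (e.g. temperature and pH).
design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1],
                   [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0], [0, 0]], float)

# Simulated response with linear, interaction and curvature contributions.
x1, x2 = design[:, 0], design[:, 1]
response = (5 + 1.2 * x1 + 0.8 * x2 + 0.5 * x1 * x2 - 0.3 * x1 ** 2
            + 0.05 * rng.normal(size=len(design)))

# PolynomialFeatures(degree=2) appends the squared and interaction terms
# to the linear ones before the least-squares fit.
model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                      LinearRegression())
model.fit(design, response)
print("fitted coefficients:", model.named_steps["linearregression"].coef_.round(2))
```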

Although the emphasis in this chapter is on multiple linear regression techniques, it is important to recognise that the analysis of design experiments is not restricted to such approaches, and it is legitimate to employ multivariate methods such as principal components regression and partial least squares as described in detail in Chapter 5. [Pg.36]

A general requirement for P-matrix analysis is n = rank(R). Unfortunately, for most practical cases, the rank of R is greater than the number of components, i.e., rank(R) > n, and rank(R) = min(m, p). Thus, P-matrix analysis is associated with the problem of substituting R with a rank-n approximation. This is mostly done by orthogonal decomposition methods, such as principal components analysis, partial least squares (PLS), or continuum regression [4]. Dimension requirements of involved matrices for these methods are m > n and p > n. If the method of least squares is used, additional constraints on matrix dimensions are needed [4]. The approach of P-matrix analysis does not require quantitative concentration information of all constituents. Specifically, calibration samples with known concentrations of the analytes under investigation satisfy the calibration needs. The method of PLS will be used in this chapter for P-matrix analysis. [Pg.27]
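The rank problem and its usual remedy can be sketched as follows; the notation loosely follows the paragraph, but the synthetic data, the truncated-SVD step, and the single-analyte PLS calibration are illustrative assumptions rather than the procedure of the cited chapter.

```python
# Sketch of the rank problem: the measured response matrix R usually has rank
# min(m, p) rather than the number of constituents n, so it is replaced by a
# rank-n approximation before the regression coefficients (the P-matrix) are
# computed. All data below are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
m, p, n = 20, 100, 3                       # samples, channels, constituents
C = rng.uniform(0.1, 1.0, size=(m, n))     # concentrations of the n analytes
S = rng.uniform(0, 1, size=(n, p))         # pure-component responses
R = C @ S + 0.01 * rng.normal(size=(m, p))

print("rank(R) =", np.linalg.matrix_rank(R), "(noise pushes it to min(m, p))")

# Rank-n approximation of R via a truncated singular value decomposition.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
R_n = (U[:, :n] * s[:n]) @ Vt[:n, :]
print("rank of approximation =", np.linalg.matrix_rank(R_n))

# PLS can supply the P-matrix directly; only the analytes of interest need
# known concentrations in the calibration set.
pls = PLSRegression(n_components=n).fit(R, C[:, 0])   # calibrate one analyte
print("predicted vs true (first 3 samples):",
      pls.predict(R[:3]).ravel().round(2), C[:3, 0].round(2))
```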

In the case of many X variables, principal component analysis (PCA) or partial least squares (PLS) analysis (for reviews, see, e.g., Ref. 52) can be used instead of regression analysis, often leading to more stable models. Both methods are discussed in other chapters of this book (see, e.g., Chapters 22 and 25). [Pg.548]


See other pages where Chapter 5 Partial Least-Squares Regression is mentioned: [Pg.373]    [Pg.1]    [Pg.107]    [Pg.127]    [Pg.472]    [Pg.477]    [Pg.28]    [Pg.3]    [Pg.107]    [Pg.128]    [Pg.209]    [Pg.124]    [Pg.473]    [Pg.278]    [Pg.215]    [Pg.8]    [Pg.298]    [Pg.166]    [Pg.3]    [Pg.211]    [Pg.297]    [Pg.84]    [Pg.158]    [Pg.163]    [Pg.181]    [Pg.332]    [Pg.33]    [Pg.26]    [Pg.274]    [Pg.332]    [Pg.352]    [Pg.220]    [Pg.216]    [Pg.111]    [Pg.141]    [Pg.216]   

