Big Chemical Encyclopedia


Regression least squares

CLS has a number of advantages over linear least-squares regression. First, it is a multivariate technique, so the restriction to spectrally isolated components is removed. Second, the entire spectrum can be used; a large number of wavenumber positions has an averaging effect on the data and reduces errors. CLS can also accommodate some baseline effects; that is, it can have nonzero intercepts for some components. (As noted above, at zero concentration the absorbance need not be zero because of spectral or instrumental anomalies.) [Pg.210]

There is one serious shortcoming to CLS: the set of equations must be calibrated for every component in the mixture. This is a direct consequence of the relationship between absorbance and concentration: the measured absorbance is the sum of the contributions of all the components in the mixture, and if one or more components are ignored, the equations are invalid. The corollary is that if an unknown or uncalibrated component appears in an unknown mixture, the model will be unable to predict any of the components correctly. [Pg.210]

Other problems are a little more subtle. If any of the constituents interact chemically (including hydrogen bonding), the bands of the pure components in the calibration mixtures will distort as the concentration is changed. This is equivalent to having an unknown component, which will cause the model to fail. The compositions of the calibration mixtures must be random and must encompass the concentrations in the unknown mixtures. If the concentration of a component of an unknown mixture is lower or higher than all of the concentrations for that component in the calibration mixtures, the method will fail unless the system is completely linear. In that case the method may be able to extrapolate to values outside the data set, but it is difficult to assure that the system is completely linear. Finally, the method will fail if any of the calibration mixtures are linearly related. A linear relation simply means that the concentrations of all the components in one mixture are a constant factor of all the component concentrations in another mixture. In this case the method fails because it is mathematically impossible to invert the C matrix: the inversion becomes indeterminate, which is equivalent to dividing by zero. [Pg.210]
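The calibration and prediction algebra behind these points can be sketched in a few lines. The sketch below uses entirely synthetic spectra and concentrations (the values are illustrative, not from the text): calibration estimates the pure-component spectra K from mixtures of known composition, prediction inverts that relationship for an unknown, and a pair of linearly related calibration mixtures makes the concentration matrix rank-deficient, which is exactly the failure mode described above.

```python
import numpy as np

# Classical least squares (CLS): A = C K, with A the (mixtures x wavenumbers)
# absorbance matrix and C the (mixtures x components) concentration matrix.
# All data here are synthetic, for illustration only.
rng = np.random.default_rng(0)
K_true = rng.random((2, 50))           # pure-component spectra (2 components, 50 points)
C_cal = np.array([[1.0, 0.2],
                  [0.3, 1.0],
                  [0.6, 0.6],
                  [0.9, 0.1]])         # calibration concentrations
A_cal = C_cal @ K_true + 1e-4 * rng.standard_normal((4, 50))

# Calibration step: K_hat = (C^T C)^-1 C^T A
K_hat = np.linalg.solve(C_cal.T @ C_cal, C_cal.T @ A_cal)

# Prediction step: c_hat = A_unk K^T (K K^T)^-1
c_true = np.array([0.5, 0.7])
A_unk = c_true @ K_true
c_hat = np.linalg.solve(K_hat @ K_hat.T, K_hat @ A_unk)
print(np.round(c_hat, 3))              # close to [0.5, 0.7]

# Linearly related calibration mixtures make C^T C singular:
C_bad = np.array([[1.0, 0.2], [2.0, 0.4]])     # second row = 2 x first row
print(np.linalg.matrix_rank(C_bad.T @ C_bad))  # rank 1 < 2, so inversion fails
```

Note that the prediction step uses the estimated K, so an unknown containing an uncalibrated third component would violate the A = C K model and bias every predicted concentration, as the text warns.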

Despite its shortcomings, CLS is a valuable technique that can be used for quite complex mixtures. The shortcomings can be overcome in large part through rearrangement of Beer's law, as described in the following section. [Pg.210]

The greatest shortcomings of CLS are that all the components of a mixture must be known and that no new components can be present in unknown mixtures. Only under very strictly controlled conditions will this be true, and such a situation is not applicable to many real-world problems. Furthermore, it is often the case that only a subset of the components is of interest; therefore, it is preferable that only those components be built into the model. This condition can be met if Beer's [Pg.210]


After an alignment of a set of molecules known to bind to the same receptor, a comparative molecular field analysis (CoMFA) makes it possible to determine and visualize molecular interaction regions involved in ligand-receptor binding [51]. Subsequently, statistical methods such as partial least squares regression (PLS) are applied to search for a correlation between CoMFA descriptors and biological activity. The CoMFA descriptors have been among the most widely used sets of descriptors; however, their use appears to have peaked. [Pg.428]

To gain insight into chemometric methods such as correlation analysis, Multiple Linear Regression Analysis, Principal Component Analysis, Principal Component Regression, and Partial Least Squares regression/Projection to Latent Structures... [Pg.439]

Partial Least Squares Regression/Projection to Latent Structures (PLS)... [Pg.449]

Partial Least Squares Regression, also called Projection to Latent Structures, can be applied to establish a predictive model, even if the features are highly correlated. [Pg.449]

On the other hand, techniques like Principal Component Analysis (PCA) or Partial Least Squares Regression (PLS) (see Section 9.4.6) are used for transforming the descriptor set into smaller sets with higher information density. The disadvantage of such methods is that the transformed descriptors may not be directly related to single physical effects or structural features, and the derived models are thus less interpretable. [Pg.490]
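The ability of PLS to cope with highly correlated descriptors can be illustrated with a minimal one-component-at-a-time PLS1 (NIPALS) sketch. This is a bare-bones illustration on synthetic data, not a production routine; in practice a library implementation would be used.

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """Minimal PLS1 (NIPALS), returning a regression vector b so y_hat = X b.
    X and y are assumed mean-centered. Illustrative sketch only."""
    W, P, Q = [], [], []
    Xr, yr = X.copy(), y.copy()
    for _ in range(n_comp):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)              # weight vector
        t = Xr @ w                          # scores
        p = Xr.T @ t / (t @ t)              # X loadings
        q = (yr @ t) / (t @ t)              # y loading
        Xr = Xr - np.outer(t, p)            # deflate X and y
        yr = yr - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.solve(P.T @ W, Q)  # b = W (P^T W)^-1 q

# Two nearly collinear descriptors, which would destabilize plain MLR:
rng = np.random.default_rng(1)
x1 = rng.standard_normal(30)
X = np.column_stack([x1, x1 + 1e-3 * rng.standard_normal(30),
                     rng.standard_normal(30)])
y = 2.0 * x1 + 0.5 * X[:, 2]
Xc, yc = X - X.mean(0), y - y.mean()

b = pls1_fit(Xc, yc, n_comp=2)
print(np.round(np.corrcoef(Xc @ b, yc)[0, 1], 3))   # near 1.0
```

Because PLS extracts a few latent variables instead of inverting the full descriptor covariance, the near-collinearity of the first two columns causes no numerical trouble, at the interpretability cost the paragraph above describes.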

Dichromate-permanganate determination is an artificial problem because the matrix of coefficients can be obtained as the slopes of A vs. x from four univariate least squares regression treatments, one on solutions containing only at... [Pg.84]

The results obtained by linear regression (LR) and by partial least squares regression (PLS) methods have been compared to quantify the O-H signal in anhydrite samples. The PLS quality is characterized by a correlation coefficient of 0.9942 (cross-validation) using four factors and a root mean square error of calibration (RMSEC) of 0.058. The correlation coefficient of the LR method obtained was 0.9753. [Pg.200]

A reading of Section 2.2 shows that all of the methods for determining reaction order can lead also to estimates of the rate constant, and very commonly the order and rate constant are determined concurrently. However, the integrated rate equations are the most widely used means for rate constant determination. These equations can be solved analytically, graphically, or by least-squares regression analysis. [Pg.31]

The simplest procedure is merely to assume reasonable values for A∞ and to make plots according to Eq. (2-52). The value of A∞ yielding the best straight line is taken as the correct value. (Notice how essential it is that the reaction be accurately first-order for this method to be reliable.) Williams and Taylor have shown that the standard deviation about the line shows a sharp minimum at the correct A∞. Holt and Norris describe an efficient search strategy in this procedure, using as their criterion minimization of the weighted sum of squares of residuals. (Least-squares regression is treated later in this section.) [Pg.36]
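The search strategy described here can be sketched directly, assuming the quantity being varied is the infinite-time absorbance A∞ (for first-order kinetics A_t = A∞ + (A₀ − A∞)e^(−kt), so ln(A_t − A∞) is linear in t only at the correct A∞). The data and values below are synthetic, for illustration only.

```python
import numpy as np

# Synthetic noiseless first-order absorbance trace.
t = np.linspace(0, 10, 20)
A_inf_true, A0, k = 0.10, 1.00, 0.35
A = A_inf_true + (A0 - A_inf_true) * np.exp(-k * t)

def residual_sd(A_inf_trial):
    """Scatter about the best line through ln(A - A_inf_trial) vs t."""
    y = np.log(A - A_inf_trial)        # linear in t only at the true A_inf
    slope, intercept = np.polyfit(t, y, 1)
    return np.std(y - (slope * t + intercept))

# Grid-search trial endpoints; the standard deviation about the line
# shows a minimum near the correct value.
trials = np.linspace(0.0, 0.099, 100)
best = trials[np.argmin([residual_sd(a) for a in trials])]
print(round(best, 3))                  # minimum near the true A_inf = 0.10
```

With real, noisy data the minimum is less sharp, which is why the reaction must be accurately first-order for the criterion to be trustworthy.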

The most widely used method for fitting a straight line to integrated rate equations is by linear least-squares regression. These equations have only two variables, namely, a concentration or concentration ratio and a time, but we will develop a more general relationship for use later in the book. [Pg.41]

Because there is only one independent variable, the subscript has been omitted. We now note that Σx/n = x̄ and Σy/n = ȳ, so we find Eqs. (2-75) as the normal equations for unweighted univariate least-squares regression. [Pg.44]

Carrying through the treatment as before yields Eqs. (2-78) as the normal equations for weighted linear univariate least-squares regression. [Pg.44]
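The unweighted and weighted normal equations for a straight line y = a + bx can be sketched in a single routine; the specific numbered equations from the text are not reproduced here, but unit weights reduce the weighted sums to the unweighted case. The data values are illustrative.

```python
import numpy as np

def linfit(x, y, w=None):
    """Solve the normal equations for y = a + b x.
    With w = None all weights are 1 (unweighted case)."""
    w = np.ones_like(x) if w is None else w
    S, Sx, Sy = w.sum(), (w * x).sum(), (w * y).sum()
    Sxx, Sxy = (w * x * x).sum(), (w * x * y).sum()
    b = (S * Sxy - Sx * Sy) / (S * Sxx - Sx ** 2)   # slope
    a = (Sy - b * Sx) / S                           # intercept
    return a, b

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1.5 + 2.0 * x
print(linfit(x, y))                      # (1.5, 2.0) for exact data

# Weighted fit: weights 1/sigma_i^2 downweight the noisier observations.
sigma = np.array([0.1, 0.1, 0.1, 0.1, 1.0])
print(linfit(x, y, w=1.0 / sigma ** 2))  # still (1.5, 2.0) for exact data
```

For exact data both fits return the generating parameters; with noisy data the weighted fit pulls the line toward the more precise points.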

Least-squares regression of ln c on t then gives estimates of ln c₀ and k. Because time is usually measured with much greater accuracy than concentration, we need only consider the uncertainty in the dependent variable. [Pg.45]
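This first-order fit is a direct application of the straight-line regression just described: c = c₀e^(−kt) gives ln c = ln c₀ − kt, so the intercept estimates ln c₀ and minus the slope estimates k. The concentrations and rate constant below are synthetic, for illustration only.

```python
import numpy as np

# Synthetic first-order decay data.
t = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
c0_true, k_true = 0.50, 0.12
c = c0_true * np.exp(-k_true * t)

# Unweighted least-squares regression of ln c on t.
slope, intercept = np.polyfit(t, np.log(c), 1)
k_hat, c0_hat = -slope, np.exp(intercept)
print(round(k_hat, 3), round(c0_hat, 3))   # 0.12 0.5 for noiseless data
```

Because the logarithmic transform distorts the error structure of c, a weighted fit (weights roughly proportional to c²) is often preferred when the concentration errors are appreciable.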

Calculation of the confidence intervals about the least-squares regression line, using Eq. (2-100), reveals that the confidence limits are curved, the interval being smallest at xᵢ = x̄. [Pg.49]

Referring to the earlier treatment of linear least-squares regression, we saw that the key step in obtaining the normal equations was to take the partial derivatives of the objective function with respect to each parameter, setting these equal to zero. The general form of this operation is... [Pg.49]

Now if the function is linear in the parameters, the derivative ∂y/∂aⱼ does not contain the parameters, and the resulting set of equations can be solved for the parameters. If, however, the function is nonlinear in the parameters, the derivative contains the parameters, and the equations cannot in general be solved for them. This is the basic problem in nonlinear least-squares regression. [Pg.49]
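One standard response to this problem is to linearize about a current parameter estimate and iterate (Gauss-Newton iteration). The sketch below applies this to the model y = a·e^(−bx), which is nonlinear in b; the data and starting values are illustrative, and no damping or convergence safeguards are included.

```python
import numpy as np

def gauss_newton(x, y, a, b, n_iter=20):
    """Gauss-Newton iteration for y = a * exp(-b * x). Illustrative sketch:
    no step damping or convergence test, so a reasonable start is assumed."""
    for _ in range(n_iter):
        f = a * np.exp(-b * x)
        J = np.column_stack([np.exp(-b * x),           # df/da
                             -a * x * np.exp(-b * x)]) # df/db
        r = y - f
        # Linearized normal equations for the parameter corrections:
        da, db = np.linalg.solve(J.T @ J, J.T @ r)
        a, b = a + da, b + db
    return a, b

x = np.linspace(0, 5, 25)
y = 2.0 * np.exp(-0.7 * x)        # synthetic noiseless data
print(tuple(round(v, 3) for v in gauss_newton(x, y, a=1.0, b=1.0)))
```

Each iteration solves a linear least-squares problem for the corrections, which is why the linear machinery developed above reappears inside the nonlinear method.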

Obtain the weighting function required to carry out weighted least-squares regression analysis of Eq. (2-15). [Pg.57]

We wish to apply weighted linear least-squares regression to Eq. (6-2), the linearized form of the Arrhenius equation. Let us suppose that our kinetic studies have provided us with data consisting of Tᵢ, kᵢ, and σᵢ for at least three temperatures, where σᵢ is the experimental standard deviation of kᵢ. We will assume that the error in T is negligible relative to that in k. For convenience we write Eq. (6-2) as... [Pg.247]
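A sketch of this weighted Arrhenius fit follows, assuming the linearized form ln k = ln A − (Eₐ/R)(1/T). Because the error in ln kᵢ is approximately σᵢ/kᵢ, each point receives weight wᵢ = (kᵢ/σᵢ)². The rate constants, A, and Eₐ below are synthetic, for illustration only.

```python
import numpy as np

R = 8.314                        # J mol^-1 K^-1
A_true, Ea_true = 1.0e10, 50_000.0
T = np.array([290.0, 300.0, 310.0, 320.0])
k = A_true * np.exp(-Ea_true / (R * T))   # synthetic rate constants
sigma = 0.02 * k                 # assumed 2% standard deviation in each k_i

# Weighted normal equations for ln k = intercept + slope * (1/T),
# with weights w_i = (k_i / sigma_i)^2 from propagating sigma_i into ln k.
w = (k / sigma) ** 2
x, y = 1.0 / T, np.log(k)
S, Sx, Sy = w.sum(), (w * x).sum(), (w * y).sum()
Sxx, Sxy = (w * x * x).sum(), (w * x * y).sum()
slope = (S * Sxy - Sx * Sy) / (S * Sxx - Sx ** 2)
intercept = (Sy - slope * Sx) / S

Ea_hat, A_hat = -slope * R, np.exp(intercept)
print(round(Ea_hat))             # ~50000 J/mol for noiseless data
```

With real data the σᵢ usually grow with kᵢ, so the weights differ between temperatures and the weighted fit can give noticeably different activation parameters than an unweighted one.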

Equation (6-19) was said to provide a fit as good as or better than those with other equations. The parameters were evaluated by fixing C and carrying out a linear least-squares regression of ln k on T; C was then altered and the procedure was repeated. The residual sum of squares was taken as the criterion of best fit. [Pg.253]

Abstracted from the compilation by Jaffe, where original references may be found. Value of log k on the least-squares regression line where σ = 0; the time unit is seconds. [Pg.319]

Partial least-squares in latent variables (PLS) is sometimes called partial least-squares regression, or PLSR. As we are about to see, PLS is a logical, easy to understand, variation of PCR. [Pg.131]

Donahue, S.M., Brown, C.W., Scott, M.J., "Analysis of Deoxyribonucleotides with Principal Component and Partial Least-Squares Regression of UV Spectra after Fourier Processing", Appl. Spec. 1990 (44) 407-413. [Pg.194]

Geladi, P., Kowalski, B.R., "Partial Least-Squares Regression: A Tutorial", Anal. Chim. Acta, 1986 (185) 1-17. [Pg.194]

The method of least squares provides the most powerful and useful procedure for fitting data. Among other applications in kinetics, least squares is used to calculate rate constants from concentration-time data and to calculate other rate constants from the set of -concentration values, such as those depicted in Fig. 2-8. If the function is linear in the parameters, the application is called linear least-squares regression. The more general but more complicated method is nonlinear least-squares regression. These are examples of linear and nonlinear equations ... [Pg.37]

Data analysis, methods for (see Least-squares regression)... [Pg.278]

Langmuir adsorption isotherm, 93
Laser flash photolysis, 263-266
Least-squares regression, 37-40
  linear, 37-39
  nonlinear, 39-40
  unweighted, 38
  weighted, 38-39
Lifetime, 16 [Pg.279]


See other pages where Regression least squares is mentioned: [Pg.523] [Pg.849] [Pg.39] [Pg.41] [Pg.44] [Pg.49] [Pg.51] [Pg.51] [Pg.52] [Pg.72] [Pg.73] [Pg.250] [Pg.442] [Pg.445] [Pg.99] [Pg.183] [Pg.203] [Pg.361]
See also in source #XX -- [ Pg.217 ]

See also in source #XX -- [ Pg.233 , Pg.234 , Pg.235 , Pg.236 , Pg.237 , Pg.238 , Pg.239 , Pg.240 , Pg.241 , Pg.242 , Pg.243 , Pg.244 ]

See also in source #XX -- [ Pg.2 , Pg.456 ]

See also in source #XX -- [ Pg.314 ]

See also in source #XX -- [ Pg.48 ]

See also in source #XX -- [ Pg.425 ]




SEARCH



Analytical methods partial least squares regression

Chapter 5 Partial Least-Squares Regression

Classical least-squares regression

Classical least-squares regression method

Complex Non-Linear Regression Least-Squares (CNRLS) for the Analysis of Impedance Data

Complex non-linear regression least-squares

Complex non-linear regression least-squares (CNRLS)

Equations least squares regression line

Inverse least-squares regression

Least Squares Regression with an Explanatory Variable

Least median squares regression

Least squares linear regression analysis

Least squares linear regression (continued)

Least squares methods regression

Least squares regression basic algorithm

Least squares regression line

Least-Squares Minimization (Regression Analysis)

Least-squares linear regression

Least-squares linear regression analysis of variable temperature

Linear least-squares regression analysis

Linear least-squares regression analysis kinetic data

Linear least-squares regression model

Method of least squares regression

Moving window partial least-squares regression

Multiple linear least squares regression

Multiple linear least squares regression (MLLSR)

Multiple linear regression and partial least squares

Multiple linear regression inverse least squares model

Multiple linear regression. Least squares fitting of response surface models

Multivariate least squares regression

Non-linear least-squares regression

Nonlinear least squares regression analysis

Nonlinear least-squares regression

Nonlinear least-squares regression analysis kinetic data

Nonmatrix Solutions to the Linear, Least-Squares Regression Problem

Numerical Curve Fitting The Method of Least Squares (Regression)

Ordinary least squares regression independent variables

Ordinary least squares regression values, responses

Ordinary least-squares linear regression coefficients

Ordinary least-squares regression analysis

PLS, partial least squares regression

Partial Least Squares regression

Partial least square regression modeling

Partial least squares regression Subject

Partial least squares regression coefficients

Partial least squares regression models

Partial least squares regression, analytical

Partial least squares-discriminant analysis vectors, regression

Partial least-squares regression analysis

Partial least-squares regression method

Partial least-squares technique regression model

Performing least squares regression

Principal Component Regression and Partial Least Squares

Regression (Least-Squares Classification)

Regression analysis classical least squares

Regression analysis inverse least squares

Regression analysis linear least squares method

Regression analysis nonlinear least squares method

Regression analysis, least-squares

Regression least squares loss function

Regression matrix least squares

Regression on principal components and partial least squares

Regression ordinary least squares

Reweighted least squares regression

Simple least squares regression

Simple linear least squares regression

Simple linear least squares regression (SLLSR)

Statistical Formulas Used in Linear Regression (Least Squares) Analyses

Statistical analysis least-square regression

The Method of Least Squares (Regression)

The Method of Least Squares and Simple Linear Regression

Trend Evaluation with Ordinary Least Squares Regression

Univariate least-squares regression

Unweighted least squares regression analysis

Useful Formulae for Ordinary, Least-Squares Regression

Weighted least-squares regression
