Big Chemical Encyclopedia


Regression ridge

You may notice that ridge regression is a straightforward statistical counterpart of the regularization methods discussed in Section 1.7. [Pg.179]

Example 3.5.1 Analysis of the rate coefficient of an acid-catalysed reaction by ridge regression [Pg.179]

We assume that the reaction considered in Example 3.2 is not only acid-catalysed but also base-catalysed. Then its rate coefficient is of the form ... [Pg.179]

In this system [A-] = [NO H Q-]. Table 3.1 includes the data we need, since the concentration [OH-] can easily be obtained from the ionic product [H+][OH-] = 10^-14 (mol/l)^2 of water. Fitting (3.64) to the data of Table ... [Pg.179]
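The [OH-] computation mentioned in the excerpt can be sketched as follows (a minimal sketch, assuming the usual ionic product Kw = 1e-14 (mol/l)^2, valid near 25 °C; the sample [H+] value is made up for illustration):

```python
# Hydroxide concentration from the ionic product of water.
# Assumes Kw = 1e-14 (mol/l)^2, which holds near 25 degrees C.
KW = 1e-14  # ionic product of water, (mol/l)^2

def hydroxide_conc(h_conc):
    """Return [OH-] in mol/l for a given [H+] in mol/l."""
    return KW / h_conc

# Example: an acidic solution with [H+] = 1e-3 mol/l.
print(hydroxide_conc(1e-3))
```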

MULTIVARIABLE LINEAR REGRESSION METHOD OF LEAST SQUARES [Pg.180]

When collinearity is severe, regression procedures must be modified. Two ways to do this are (1) rescaling the data, and (2) using ridge regression. [Pg.222]

Rescaling of the data should be performed particularly when some predictor variable values have large ranges relative to the other predictor variables. For example, the model y = b0 + b1x1 + b2x2 + ... + bkxk rescaled is ... [Pg.222]

Once the data have been rescaled, perform the regression analysis and check again for collinearity. If it is present, move to ridge regression. [Pg.222]
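The rescaling step can be sketched as follows (a minimal pure-Python version; the two predictor columns are made up to illustrate a small range versus a large range):

```python
import statistics

def standardize(column):
    """Center a predictor column and scale it to unit standard deviation."""
    mean = statistics.mean(column)
    sd = statistics.stdev(column)
    return [(x - mean) / sd for x in column]

x1 = [1.0, 2.0, 3.0, 4.0, 5.0]            # predictor with a small range
x2 = [100.0, 220.0, 310.0, 390.0, 480.0]  # predictor with a large range
z1, z2 = standardize(x1), standardize(x2)

# After rescaling, both columns are on a comparable scale,
# so no single predictor dominates the normal equations numerically.
print(round(statistics.mean(z1), 10), round(statistics.stdev(z2), 10))
```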

Ridge regression is also used extensively to remedy multicollinearity between the Xi predictor variables. It does this by modifying the least-squares method of computing the coefficients with the addition of a biasing component. [Pg.222]
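For two centered predictors, the biased normal equations (X'X + dI)b = X'y can be written out and solved explicitly; a minimal sketch (the data are made up, with x2 nearly equal to x1 to force collinearity):

```python
def ridge_2d(x1, x2, y, d):
    """Solve (X'X + d*I) b = X'y for two centered predictors (no intercept)."""
    s11 = sum(a * a for a in x1) + d          # diagonal gets the bias d
    s22 = sum(a * a for a in x2) + d
    s12 = sum(a * b for a, b in zip(x1, x2))  # off-diagonal is untouched
    g1 = sum(a * b for a, b in zip(x1, y))
    g2 = sum(a * b for a, b in zip(x2, y))
    det = s11 * s22 - s12 * s12               # 2x2 determinant
    return ((s22 * g1 - s12 * g2) / det,      # Cramer's rule
            (s11 * g2 - s12 * g1) / det)

# Nearly collinear predictors: x2 is x1 plus tiny noise.
x1 = [-2.0, -1.0, 0.0, 1.0, 2.0]
x2 = [-2.01, -0.99, 0.02, 1.01, 1.97]
y = [-4.1, -1.9, 0.1, 2.0, 3.9]

b_ols = ridge_2d(x1, x2, y, 0.0)  # ordinary least squares: wild coefficients
b_rr = ridge_2d(x1, x2, y, 0.1)   # ridge with d = 0.1: shrunken, stable ones
print(b_ols, b_rr)
```

With d = 0 the near-singular determinant inflates the coefficients; a small d > 0 shrinks them toward values near (1, 1), which is the bias-for-variance trade the excerpt describes.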

The sampling distribution of a biased bi estimator is usually much narrower and, thus, is a better predictor of βi, even though biased. [Pg.223]

RR was developed by Hoerl and Kennard. It is a technique that was specifically designed to overcome the ill-conditioning of the matrix X'X. [Pg.311]

RR is also referred to as a shrinkage regression technique (see, for example, Frank and Friedman). We will come back to this point when we have considered PCR, PLS, and CR, which are also shrinkage methods. [Pg.312]

Much of the literature devoted to RR centers on the mean squared error of the estimates β, MSE(β), which may be written as ... [Pg.312]
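The excerpt's equation is truncated; a standard form of this decomposition (stated here from general knowledge of Hoerl and Kennard's treatment, with λj the eigenvalues of X'X, σ² the error variance, and d the ridge parameter) splits the MSE into a variance and a squared-bias term:

```latex
\operatorname{MSE}\bigl(\hat{\boldsymbol{\beta}}(d)\bigr)
  = \underbrace{\sigma^{2}\sum_{j}\frac{\lambda_{j}}{(\lambda_{j}+d)^{2}}}_{\text{variance}}
  \;+\;
  \underbrace{d^{2}\,\boldsymbol{\beta}^{\top}\bigl(\mathbf{X}^{\top}\mathbf{X}+d\,\mathbf{I}\bigr)^{-2}\boldsymbol{\beta}}_{\text{squared bias}}
```

At d = 0 this reduces to the OLS variance; as d grows, the variance term falls while the bias term rises, which is why an intermediate d can lower the total MSE.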

Unfortunately, before RR can be applied, a value of the ridge parameter d must be found. One method that is relatively simple to apply is to perform RR for a range of values of d in the interval (0, 1) and to then select as the ... [Pg.312]
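The scan over d (the basis of the ridge trace) can be sketched for a single centered predictor, where the ridge coefficient has the closed form b(d) = Sxy / (Sxx + d); the data below are made up for illustration:

```python
# Ridge trace for one centered predictor: b(d) = Sxy / (Sxx + d).
x = [-2.0, -1.0, 0.0, 1.0, 2.0]
y = [-3.9, -2.1, 0.0, 2.0, 4.0]
sxx = sum(a * a for a in x)                 # Sxx = sum of x^2
sxy = sum(a * b for a, b in zip(x, y))      # Sxy = sum of x*y

# Evaluate the coefficient over a grid of d values in [0, 1].
trace = [(d / 10.0, sxy / (sxx + d / 10.0)) for d in range(0, 11)]
for d, b in trace:
    print(f"d = {d:.1f}  b = {b:.4f}")
```

Plotting b against d gives the ridge trace; one then picks the smallest d beyond which the coefficients have visibly stabilized.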

Because of the subjectivity attached to the ridge trace approach for finding the best value for d, many automatic alternative methods have been offered in the literature. The two that appear more frequently are given in Eqs. [18] and [19]. [Pg.313]


Some methods that partly cope with the above-mentioned problem have been proposed in the literature. The subject has been treated in areas like chemometrics, econometrics, etc., giving rise, for example, to the methods Partial Least Squares (PLS), Ridge Regression (RR), and Principal Component Regression (PCR) [2]. In this work we have chosen to illustrate the multivariable approach using PCR as our regression tool, mainly because it has a relatively easy interpretation. The basic idea of PCR is described below. [Pg.888]

Hoerl, A. E., Kennard, R. W. Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 1970, 12, 55-67. [Pg.499]

Simple OLS, robust regression; PLS, PCR, multiple OLS, robust regression, Ridge regression, Lasso regression; PLS2, CCA ... [Pg.119]

This approximation can not only be used in the context of multiple OLS, but also for methods where the estimated values are obtained via a relation ŷ = Hy with H depending only on the x-variables (e.g., in Ridge regression, Section 4.8.2). [Pg.143]
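For ridge regression the matrix H in question has an explicit closed form (stated here from general knowledge, with d the ridge parameter):

```latex
\hat{\mathbf{y}} = \mathbf{H}\,\mathbf{y},
\qquad
\mathbf{H} = \mathbf{X}\bigl(\mathbf{X}^{\top}\mathbf{X} + d\,\mathbf{I}\bigr)^{-1}\mathbf{X}^{\top}
```

so H indeed depends only on the x-variables (and on d), never on y; setting d = 0 recovers the ordinary least-squares hat matrix.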

There is also a link between Ridge regression and PCR (Section 4.6). PCR finds new regressor variables, the principal components of the x-variables, and they can be ordered according to decreasing variance. Since the first few PCs cover the most important information of the x-variables, they are often considered to be most useful for the prediction of the y-variable (although this might not be necessarily true—see... [Pg.181]

FIGURE 4.38 Ridge regression for the PAC data set. The optimal ridge parameter λR = 4.3 is evaluated using repeated 10-fold CV. The resulting (average) predictions (in black) versus the measured y-values are shown at two different scales because of severe prediction errors for two objects. [Pg.194]

The results are shown under SEP0 2 in the last column of Table 4.3. Ridge regression leads to the best model. [Pg.199]

T. Fearn, Misuse of ridge regression in the calibration of near-infrared reflectance instruments, Appl. Statistics, 32, 73-79 (1983). [Pg.436]

PLS (similar to ridge regression) trades bias for variance in case of calculating fewer components (latent variables) than the number of predictor variables. [Pg.275]

In this paper the PLS method was introduced as a new tool for calculating statistical receptor models. It was compared with the two most popular methods currently applied to aerosol data: the Chemical Mass Balance Model and Target Transformation Factor Analysis. The characteristics of the PLS solution were discussed and its advantages over the other methods were pointed out. PLS is especially useful when both the predictor and response variables are measured with noise and there is high correlation in both blocks. It has been proved in several other chemical applications that its performance is equal to or better than multiple, stepwise, principal component, and ridge regression. Our goal was to create a basis for its environmental chemical application. [Pg.295]

In Example 3.5.1 we used ridge regression to confirm that the simpler model... [Pg.213]

A.E. Hoerl and R.W. Kennard, Ridge regression: Biased estimation for nonorthogonal problems, Technometrics, 12 (1970) 55-67. [Pg.218]

D. W. Marquardt, Generalized inverses, ridge regression, biased linear estimation and nonlinear estimation. Technometrics, 12, 1970, 591-612. [Pg.179]

Fearn, T., Misuse of Ridge Regression in the Calibration of Near-Infrared Reflectance Instruments, Appl. Statistics 1983, 32, 73-79. [Pg.327]

The regression analysis of multicollinear data is described in several papers, e.g. [MANDEL, 1985; HWANG and WINEFORDNER, 1988]. HWANG and WINEFORDNER [1988] also discuss the principle of ridge regression, which is, essentially, the addition of a small contribution to the diagonal of the correlation matrix. The method of partial least squares (PLS) described in Section 5.7.2 is one approach to solving this problem. [Pg.197]





