Regression methods, linearity assumptions

Prediction of the Y-data of unknown samples is based on a regression method in which the X-data are correlated with the Y-data. The multivariate methods usually used for such a calibration are principal component regression (PCR) and partial least squares (PLS) regression. Both methods are based on the assumption of linearity and can deal with collinear data. The problem of collinearity is solved in the same way as in the construction of a PCA plot: the X-variables are combined into latent variables, the score vectors. These vectors are orthogonal to each other, and hence independent, so they can be used to build a calibration model. [Pg.7]
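A minimal sketch of both calibration approaches, using scikit-learn on synthetic collinear data; the data, component counts, and variable names are illustrative assumptions, not taken from the source.

```python
# Sketch: PCR and PLS calibration on synthetic, collinear X-data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_samples, n_vars = 50, 20

# Build strongly collinear X: 20 measured variables driven by 3 latent factors.
latent = rng.normal(size=(n_samples, 3))
X = latent @ rng.normal(size=(3, n_vars)) + 0.05 * rng.normal(size=(n_samples, n_vars))
y = latent @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=n_samples)

# PCR: compress X into orthogonal score vectors, then regress y on the scores.
pcr = make_pipeline(PCA(n_components=3), LinearRegression())
pcr.fit(X, y)

# PLS: latent variables chosen to maximize covariance between X-scores and y.
pls = PLSRegression(n_components=3)
pls.fit(X, y)

print("PCR R^2:", pcr.score(X, y))
print("PLS R^2:", pls.score(X, y))
```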

Compared with linear and nonlinear regression methods, the advantage of an ANN is its ability to model a nonlinear function without any prior assumption about the form of that function, and the trained ANN can then be used to predict unknown samples. ANNs have therefore been widely used in the data processing of SAR studies. If an ANN is used on its own, however, the predictions are sometimes unreliable: experimental results indicate that some test samples predicted by the ANN to be optimal are in fact not optimal. This is a typical example of so-called overfitting, which makes the predictions of a trained ANN insufficiently reliable. Since the data files in many practical problems have strong noise and a non-uniform distribution of sample points, the overfitting problem can lead to even more serious errors in such applications. [Pg.195]
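The overfitting behaviour described above can be reproduced with a short sketch; the synthetic data, network size, and train/test split are assumptions chosen only to make the effect visible.

```python
# Sketch: a neural network overfitting noisy data. The training error keeps
# falling while the error on held-out samples stays much worse.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X).ravel() + 0.3 * rng.normal(size=80)  # strong noise in the data

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=1)

# A deliberately oversized network trained on few samples.
net = MLPRegressor(hidden_layer_sizes=(200, 200), max_iter=5000, random_state=1)
net.fit(X_train, y_train)

print("train R^2:", net.score(X_train, y_train))  # high: the net fits the noise
print("test  R^2:", net.score(X_test, y_test))    # lower: prediction is unreliable
```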

Multiple linear regression is strictly a parametric supervised learning technique. A parametric technique is one which assumes that the variables conform to some distribution (often the Gaussian distribution); the properties of the distribution are assumed in the underlying statistical method. A non-parametric technique does not rely on the assumption of any particular distribution. A supervised learning method is one which uses information about the dependent variable to derive the model; an unsupervised learning method does not. Thus cluster analysis, principal components analysis and factor analysis are all examples of unsupervised learning techniques. [Pg.719]
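The distinction can be made concrete with a brief sketch on assumed synthetic data: the supervised method consumes the dependent variable y, while the unsupervised methods look only at the structure of X.

```python
# Sketch: supervised vs. unsupervised learning on the same X-matrix.
import numpy as np
from sklearn.linear_model import LinearRegression   # supervised: uses y
from sklearn.decomposition import PCA               # unsupervised: ignores y
from sklearn.cluster import KMeans                  # unsupervised: ignores y

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 4))
y = X @ np.array([2.0, -1.0, 0.0, 0.5]) + rng.normal(size=60)

# Supervised: the dependent variable y drives the model.
mlr = LinearRegression().fit(X, y)

# Unsupervised: only the X-data are used; y never enters.
scores = PCA(n_components=2).fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=2).fit_predict(X)
```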

In either case, reaching this point indicates whether or not the drug is beneficial and provides at least a qualitative endpoint. Last observation carried forward (LOCF), a standard method of data analysis, carries each patient's last data point forward week by week. Random regression models can instead estimate what would happen at a later time point, assuming that patients change in a linear fashion; improvement, however, often levels off. Creating data points on the basis of questionable assumptions can therefore introduce substantial bias. [Pg.24]
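A minimal sketch of LOCF imputation with pandas; the weekly scores below are invented purely for illustration.

```python
# Sketch: last observation carried forward (LOCF) across weekly assessments.
import numpy as np
import pandas as pd

# One row per patient, one column per assessment week; NaN = dropped out.
scores = pd.DataFrame(
    {"week1": [20.0, 22.0, 18.0],
     "week2": [15.0, np.nan, 16.0],
     "week3": [np.nan, np.nan, 14.0]},
    index=["patient_a", "patient_b", "patient_c"],
)

# LOCF: carry each patient's last observed value forward to later weeks.
locf = scores.ffill(axis=1)
print(locf)
```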

Another assumption, which becomes apparent when one carefully examines the model (Equation 8.7), is that all of the model error (ε) is in the dependent variable (y); there is no provision in the model for errors in the independent variable (x). In PAC, this is equivalent to saying that there is error only in the reference method and none in the on-line analyzer responses. Although this is obviously not strictly true, practical experience over the years has shown that linear regression can be very effective in analytical chemistry applications. [Pg.235]
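The assumption can be illustrated with a short sketch (synthetic data, names assumed): ordinary least squares attributes all scatter to y and minimizes vertical residuals only.

```python
# Sketch: the standard least-squares assumption that all error sits in y.
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 30)                  # x (e.g., analyzer response) assumed error-free
y = 2.0 + 0.5 * x + rng.normal(0, 0.3, 30)  # all noise attributed to y (reference method)

# Ordinary least squares minimizes vertical (y-direction) residuals only.
slope, intercept = np.polyfit(x, y, 1)
print(f"y = {intercept:.2f} + {slope:.2f} x")
```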

The assumptions of a multiple regression analysis are identical to those of simple linear regression, except that there are now p independent variables. To obtain the regression coefficient estimates b by the method of least squares, we again have to minimize... [Pg.136]
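For reference, the standard least-squares criterion being minimized can be written as follows (the notation is assumed, since the source text is truncated at this point):

```latex
% Least-squares criterion for multiple regression with p independent variables.
\min_{b_0, b_1, \dots, b_p} \;
\sum_{i=1}^{n} \Bigl( y_i - b_0 - \sum_{j=1}^{p} b_j x_{ij} \Bigr)^{2}
```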

As stated before, linear models are used to move towards the optimum, so the significance of the regression coefficients is a prerequisite for successful application of the steepest ascent method. Linear models therefore include as many factors as possible, and full factorial experiments are even replicated with increased factor variation intervals. [Pg.366]
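A minimal sketch of the resulting steepest-ascent step, with invented coefficients standing in for a fitted first-order model:

```python
# Sketch: steepest-ascent path from a fitted first-order (linear) model.
import numpy as np

# Suppose a replicated full factorial gave this fitted model in coded factors:
# y = 60 + 4.2*x1 - 1.5*x2 + 0.8*x3   (coefficients invented for illustration)
b = np.array([4.2, -1.5, 0.8])

# The direction of steepest ascent is proportional to the coefficient vector.
direction = b / np.linalg.norm(b)

# Successive trial points along the path, starting from the design center.
center = np.zeros(3)
for step in range(1, 4):
    print(f"step {step}:", center + step * direction)
```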

