Big Chemical Encyclopedia


Linear regression method

Microscopic dissociation constants of 3-hydroxy-α-(methylamino)methylbenzenemethanol have been calculated from spectrophotometric titration data (c = 3.8 × 10⁻ M, ionic strength = 0.16, buffer system H3BO3/KOH) by application of a spectral deconvolution method. The results found (pKa = 9.48, pKb = 9.71, pKc = 10.12, pKd = 9.88) are in good agreement with those obtained from the conventional linear regression method (pKa = 9.45, pKb = 9.77, pKc = 10.14, pKd = 9.81)."... [Pg.232]

Once a linear relationship has been shown to have a high probability by the value of the correlation coefficient (r), then the best straight line through the data points has to be estimated. This can often be done by visual inspection of the calibration graph but in many cases it is far better practice to evaluate the best straight line by linear regression (the method of least squares). [Pg.145]
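The least-squares line and the correlation coefficient described above can be computed directly. A minimal sketch with numpy, using made-up calibration data (the x and y values are invented for illustration):

```python
import numpy as np

# Hypothetical calibration data: analyte concentration vs. instrument response.
x = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])   # concentration
y = np.array([0.1, 2.1, 3.9, 6.2, 7.8, 10.1])   # response

n = len(x)
# Least-squares slope and intercept (the "best straight line").
slope = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
intercept = (np.sum(y) - slope * np.sum(x)) / n

# Correlation coefficient r, checked first to judge whether a line is justified.
r = np.corrcoef(x, y)[0, 1]
```

In practice `np.polyfit(x, y, 1)` performs the same least-squares fit in one call; the explicit formulas above simply show what it computes.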

Additionally, Breiman et al. [23] developed a methodology known as classification and regression trees (CART), in which the data set is split repeatedly and a binary tree is grown. The way the tree is built leads to the selection of boundaries parallel to certain variable axes. With highly correlated data this is not necessarily the best solution, and non-linear methods or methods based on latent variables have been proposed to perform the splitting. A combination between PLS (as a feature reduction method; see Sections 33.2.8 and 33.3) and CART was described by... [Pg.227]
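A single CART-style split can be sketched as follows: for each variable, every candidate threshold is tried, and the one minimizing the summed squared error of the two resulting groups is kept. This is a toy illustration of why the boundaries come out parallel to the variable axes, not Breiman's full algorithm; the data are invented:

```python
import numpy as np

def best_split(X, y):
    """Find the axis-parallel split (variable, threshold) that minimizes
    the total within-group sum of squared errors, CART-style."""
    best = (None, None, np.inf)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:        # candidate thresholds
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            sse = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
            if sse < best[2]:
                best = (j, t, sse)
    return best

# Toy data: the response depends only on the first variable.
X = np.array([[1.0, 5.0], [2.0, 1.0], [8.0, 4.0], [9.0, 2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
var, thr, sse = best_split(X, y)   # splits on variable 0
```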

Typical robust regression methods are linear methods. [Pg.146]

PLS and PCR are linear methods (although nonlinear versions exist) and therefore the final latent variable that predicts the modeled property, y, is a linear combination of the original variables, just as in OLS (Equation 4.1). In general, the resulting regression coefficients are different when applying OLS, PCR, and PLS, and the prediction performances of the models are different. [Pg.165]
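The PCR case can be sketched in a few lines: project the centered descriptor matrix onto its first k principal components (obtained here via SVD), regress y on the scores, and map the coefficients back to the original variables. A minimal illustration on synthetic data; with k equal to the full rank it reproduces the OLS solution:

```python
import numpy as np

def pcr_fit(X, y, k):
    """Principal component regression sketch: OLS on the first k
    principal-component scores of the centered X."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    T = Xc @ Vt[:k].T                            # scores (latent variables)
    b, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
    coef = Vt[:k].T @ b                          # back to original variables
    return coef, y.mean() - X.mean(axis=0) @ coef

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 3.0         # exact linear relation
coef, intercept = pcr_fit(X, y, k=3)             # k = full rank -> OLS solution
```

Choosing k smaller than the rank is where PCR differs from OLS: the regression is then restricted to the dominant directions of variance.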

In a paper that addresses both these topics, Gordon et al. [11] explain how they followed a corn mixture fermented by Fusarium moniliforme spores. They followed the concentrations of starch, lipids, and protein throughout the reaction. The amounts of Fusarium and even corn were also measured. A multiple linear regression (MLR) method was satisfactory, with standard errors of prediction (SEP) for the constituents being 0.37% for starch, 4.57% for lipid, 4.62% for protein, 2.38% for Fusarium, and 0.16% for corn. It may be inferred from the data that PLS or PCA (principal components analysis) might have given more accurate results. [Pg.387]

Differences in calibration graph results were found in amount and amount-interval estimations when three common data sets for the chemical pesticide fenvalerate were analyzed by the individual methods of three researchers. The methods differed in how constant variance was achieved, by weighting or by transforming response values. Linear single and multiple curve functions and cubic spline functions were used to fit the data. Amount differences were found between three hand-plotted methods and between the hand-plotted and three different statistical regression-line methods. Significant differences in the calculated amount-interval estimates were found with the cubic spline function, owing to its limited scope of inference. Smaller differences were produced by the use of local versus global variance estimators and a simple Bonferroni adjustment. [Pg.183]
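The weighting mentioned above can be sketched as weighted least squares: each observation is scaled by the square root of its weight and an ordinary least-squares problem is then solved. The data and the 1/x² weighting below are invented for illustration (1/x² is a common choice when the relative, rather than absolute, error is roughly constant):

```python
import numpy as np

def weighted_line_fit(x, y, w):
    """Weighted least squares for y = a + b*x: scale rows by sqrt(w),
    then solve the resulting ordinary least-squares problem."""
    sw = np.sqrt(w)
    A = np.column_stack([np.ones_like(x), x]) * sw[:, None]
    b, *_ = np.linalg.lstsq(A, y * sw, rcond=None)
    return b                                    # (intercept, slope)

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.0, 2.9, 4.2])
w = 1.0 / x**2                                  # hypothetical variance model
intercept, slope = weighted_line_fit(x, y, w)
```

With all weights equal to one, the function reduces to the ordinary unweighted fit.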

Regression generally means the fitting of mathematical equations to experimental data (3). Nonlinear regression, unlike linear regression, encompasses methods which are not limited to fitting equations linear in the coefficients (e.g., simple polynomial forms). [Pg.203]
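Fitting an equation that is not linear in its coefficients, e.g. y = a·exp(b·x), requires an iterative method. A toy damped Gauss-Newton implementation on synthetic, noise-free data (the parameter values and starting guesses are made up):

```python
import numpy as np

def gauss_newton_exp(x, y, a, b, n_iter=50):
    """Fit y = a*exp(b*x) by damped Gauss-Newton: linearize the model
    around the current (a, b), solve for the step by linear least
    squares, and halve the step if it fails to reduce the residual norm."""
    for _ in range(n_iter):
        r = y - a * np.exp(b * x)                       # current residuals
        J = np.column_stack([np.exp(b * x),             # d f / d a
                             a * x * np.exp(b * x)])    # d f / d b
        step, *_ = np.linalg.lstsq(J, r, rcond=None)
        t = 1.0
        while (np.linalg.norm(y - (a + t * step[0]) * np.exp((b + t * step[1]) * x))
               > np.linalg.norm(r)) and t > 1e-8:
            t /= 2                                      # damping
        a, b = a + t * step[0], b + t * step[1]
    return a, b

# Synthetic noise-free data with made-up parameters a = 2.0, b = 1.5.
x = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(1.5 * x)
a_fit, b_fit = gauss_newton_exp(x, y, 1.0, 1.0)
```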

Current methods for supervised pattern recognition are numerous. Typical linear methods are linear discriminant analysis (LDA), based on distance calculation; soft independent modeling of class analogy (SIMCA), which emphasizes similarities within a class; and PLS discriminant analysis (PLS-DA), which performs regression between spectra and class memberships. More advanced methods are based on nonlinear techniques, such as neural networks. A further distinction is between parametric and nonparametric computations. In parametric techniques such as LDA, statistical parameters of the normal sample distribution are used in the decision rules. Such restrictions do not influence nonparametric methods such as SIMCA, which perform more efficiently on NIR data collections. [Pg.398]
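The distance-based decision in LDA can be sketched for the two-class case: project samples onto the Fisher direction w = S⁻¹(μ₁ − μ₀), with S a pooled within-class covariance, and assign each sample to the class whose projected mean is nearer. The two toy classes below are invented for illustration:

```python
import numpy as np

def lda_direction(X0, X1):
    """Fisher discriminant direction for two classes:
    w = pooled_within_class_covariance^-1 @ (mean1 - mean0)."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    S = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)   # pooled scatter
    return np.linalg.solve(S, m1 - m0)

def classify(x, w, X0, X1):
    """Assign x to the class whose projected mean is nearer (0 or 1)."""
    p = x @ w
    return int(abs(p - X1.mean(axis=0) @ w) < abs(p - X0.mean(axis=0) @ w))

# Two well-separated toy classes.
rng = np.random.default_rng(2)
X0 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(20, 2))
X1 = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(20, 2))
w = lda_direction(X0, X1)
```

The normal-distribution assumption enters through the use of class means and a pooled covariance, which is the "parametric" restriction noted above.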

Although ANNs can, in theory, model any relation between predictors and predictands, it has been found that common regression methods such as PLS can outperform ANN solutions when linear or slightly nonlinear problems are considered [1-5]. In fact, although ANNs can model linear relationships, they require a long training time, since a nonlinear technique is being applied to linear data. Although, ideally, for a perfectly linear and noise-free data set the ANN performance tends asymptotically towards that of the linear model, in practical situations ANNs can only reach a performance qualitatively similar to that of linear methods. It therefore seems unreasonable to apply them before simpler alternatives have been considered. [Pg.264]

The multiple linear regression (MLR) method was historically the first and, until now, the most popular method used for building QSPR models. In MLR, a property is represented as a weighted linear combination of descriptor values, Y = XA, where Y is a column vector of the property to be predicted, X is a matrix of descriptor values, and A is a column vector of adjustable coefficients calculated as A = (X^T X)^-1 X^T Y. The latter equation can be applied only if the matrix X^T X can be inverted, which requires linear independence of the descriptors (the "multicollinearity problem"). If this is not the case, special techniques (e.g., singular value decomposition (SVD) [26]) should be applied. [Pg.325]
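The normal-equation solution above, with an SVD-based fallback for the collinear case, can be sketched as follows; the pseudoinverse returns the minimum-norm least-squares solution when X^T X is singular. The toy descriptor matrix is invented for illustration:

```python
import numpy as np

def mlr_coefficients(X, y):
    """A = (X^T X)^-1 X^T Y when X^T X is invertible; otherwise fall
    back to the SVD-based pseudoinverse (minimum-norm least squares)."""
    XtX = X.T @ X
    try:
        return np.linalg.solve(XtX, X.T @ y)
    except np.linalg.LinAlgError:               # singular: collinear descriptors
        return np.linalg.pinv(X) @ y

# Toy descriptor matrix (rows = compounds, columns = descriptors).
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0], [4.0, 1.0]])
y = X @ np.array([0.5, -1.0])                   # exact linear property
A = mlr_coefficients(X, y)
```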

E436 Hadberg, A. (1988). Linear regression in method comparison: serum sodium measured on Kodak Ektachem 700 by direct potentiometry and on Technicon SMA II by flame photometry. Scand. J. Clin. Lab. Invest. 48, Suppl. 190, 230, Abstr. P30-21-BRD 3I. [Pg.295]

When a linear relationship is observed between two variables, the correlation is quantified by a method such as linear least-squares regression. This method determines the equation for the best straight line that fits the experimental data. [Pg.326]

Panaye A, Fan BT, Doucet JP, Yao XJ, Zhang RS, Liu MC, et al. Quantitative structure-toxicity relationships (QSTRs): a comparative study of various nonlinear methods (general regression neural network, radial basis function neural network and support vector machine) in predicting toxicity of nitro- and cyano-aromatics to Tetrahymena pyriformis. SAR QSAR Environ Res 2006;17:75-91. [Pg.235]

A feature of this group of methods was an attempt by the authors to structurally restrict the explored temperature dependency of vapor pressure to the theoretically derived form. This motivation is clearly justified by the nonlinear dependency of vapor pressure on temperature, which cannot easily be captured with a purely linear regression approach. The use of nonlinear methods, however, can solve this problem. For example, neural networks predicted saturated vapor pressure for 352 hydrocarbons with an RMSE of 0.12, compared to an RMSE of 0.25 for a linear method with the same descriptors [112]. [Pg.258]

Vyazovkin and Lesnikovich [42] have emphasized that the majority of NIK methods involve linearization of the appropriate rate equation, usually through a logarithmic transformation, which distorts the Gaussian distribution of errors. Thus non-linear methods are preferable [89]. Militky and Sest [90] and Madarasz et al. [91] have outlined routine procedures for non-linear regression analysis of equation (5.5) above by transforming the relationship ... [Pg.162]
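The logarithmic linearization in question can be sketched for a generic exponential law y = a·exp(b·x): on noise-free data the transform ln y = ln a + b·x recovers the parameters exactly, but on noisy data it implicitly re-weights the errors, which is the distortion noted above. The parameter values below are made up:

```python
import numpy as np

# Noise-free data from y = a*exp(b*x) with made-up parameters.
a_true, b_true = 2.0, 1.5
x = np.linspace(0.0, 2.0, 10)
y = a_true * np.exp(b_true * x)

# Logarithmic linearization: ln y = ln a + b*x, then ordinary
# linear regression on the transformed data.
b_lin, log_a = np.polyfit(x, np.log(y), 1)
a_lin = np.exp(log_a)
```

With measurement noise added to y, the same transform would weight small-y points disproportionately, which is why direct non-linear fitting is preferred.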


See other pages where Linear regression method is mentioned: [Pg.287]    [Pg.379]    [Pg.33]    [Pg.199]    [Pg.512]    [Pg.143]    [Pg.1462]    [Pg.1474]    [Pg.391]    [Pg.775]    [Pg.287]    [Pg.114]    [Pg.213]    [Pg.92]    [Pg.857]    [Pg.33]    [Pg.92]    [Pg.185]    [Pg.857]    [Pg.26]    [Pg.119]    [Pg.53]    [Pg.354]    [Pg.358]    [Pg.38]    [Pg.870]    [Pg.584]   





Analytical methods multiple linear regression

Linear methods

Linear regression

Linear regression, forecasting method

Linearized methods

Numerical methods linear regression

Quantitative structure-activity relationship linear regression methods

Regression analysis linear least squares method

Regression methods

Regression methods, assumptions linear

The Method of Least Squares and Simple Linear Regression

© 2024 chempedia.info