Big Chemical Encyclopedia

Chemical substances, components, reactions, process design ...


Regression estimation

A first evaluation of the data can be done by running nonparametric statistical estimation techniques such as the Nadaraya-Watson kernel regression estimate [2]. These techniques have the advantage of being relatively cost-free in terms of assumptions, but they do not provide any possibility of interpreting the outcome and are not at all reliable when extrapolating. The fact that these techniques do not require a lot of assumptions makes them... [Pg.72]
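As an illustration, a minimal Nadaraya-Watson estimator can be sketched in a few lines of numpy. This is a sketch under stated assumptions: the Gaussian kernel, the bandwidth value, and the noisy-sine test data are illustrative choices, not taken from [2].

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    Each prediction is a locally weighted average of the training
    responses; no functional form is assumed for the regression curve.
    """
    # scaled distances between every query point and every training point
    d = (x_query[:, None] - x_train[None, :]) / bandwidth
    w = np.exp(-0.5 * d**2)               # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)  # weighted average per query point

# illustrative data: a noisy sine curve
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 6.0, 80))
y = np.sin(x) + rng.normal(0.0, 0.2, 80)
y_hat = nadaraya_watson(x, y, x, bandwidth=0.4)
```

Because each prediction is a convex combination of observed responses, the estimate can never leave the range of the training data, which is one way of seeing why such estimators are unreliable when extrapolating.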

Firstly, the kriging estimator is optimal only for the least-squares criterion. Other criteria are known which yield no more complicated estimators, such as minimization of the mean absolute deviation (MAD), E|P(x) - P*(x)|, yielding median-type regression estimates. [Pg.110]

The least-squares approach can become very unreliable if outliers are present in the data (see Section 4.4). In this case, it is more advisable to minimize another function of the errors, which results in more robust regression estimates. Although with the OLS approach Equations 4.22 and 4.23 can always be applied, it is advisable to use the following assumptions for obtaining reliable estimates ... [Pg.135]
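One common choice of such a function is the Huber loss, which is quadratic for small residuals and linear for large ones. The sketch below is an illustration, not the specific method of the excerpt: the Huber tuning constant, the iteratively-reweighted-least-squares scheme, and the contaminated toy data are all assumptions made here.

```python
import numpy as np

def huber_irls(X, y, delta=1.345, n_iter=50):
    """Robust straight-line regression by iteratively reweighted least
    squares with Huber weights: residuals larger than delta (in robust
    scale units) are down-weighted, so outliers lose influence."""
    X1 = np.column_stack([np.ones(len(y)), X])    # add intercept column
    beta = np.linalg.lstsq(X1, y, rcond=None)[0]  # start from OLS
    for _ in range(n_iter):
        r = y - X1 @ beta
        scale = np.median(np.abs(r)) / 0.6745 + 1e-12  # MAD-based scale
        u = np.abs(r) / scale
        w = np.where(u <= delta, 1.0, delta / u)       # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X1 * sw[:, None], y * sw, rcond=None)[0]
    return beta

# illustrative data: y = 1 + 2x with a few gross outliers
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 50)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.3, 50)
y[45:] += 25.0                                    # contaminate 5 points
beta_ols = np.linalg.lstsq(np.column_stack([np.ones(50), x]), y, rcond=None)[0]
beta_rob = huber_irls(x, y)
```

On this contaminated data the OLS slope is pulled well away from 2 by the five outliers, while the robust fit stays close to the true line.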

In Mitchell's work, unequal variance of the response data was compensated for by weighting the data by the variance at each level. The regression parameters and the confidence band around the regression line were estimated by least squares ( ). The overall level of uncertainty, α, was divided between the variation in response values and the variance in the regression estimation. His overall α was 0.05. The prediction interval was estimated around a single response determination. [Pg.184]

M. C. Denham, Choosing the number of factors in partial least squares regression: estimating and minimizing the mean squared error of prediction, J. Chemom., 14 (2000) 351-361. [Pg.238]

[Billings and Voon, 1986] Billings, S. A. and Voon, W. S. F. (1986). A prediction-error and stepwise-regression estimation algorithm for non-linear systems. Int. J. Control, 44(3):803-822. [Pg.252]

This is a very popular method because it allows us to compute the regression estimates explicitly as β̂ = (XᵀX)⁻¹Xᵀy (where the design matrix X is enlarged with a column of ones for the intercept term and y = (y₁, ..., yₙ)ᵀ) and, moreover, the least-squares method is optimal if the errors are normally distributed. [Pg.177]
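As a minimal numerical sketch of that explicit formula (the small data set below is made up for illustration):

```python
import numpy as np

# illustrative data following y ≈ 2x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# design matrix enlarged with a column of ones for the intercept term
X = np.column_stack([np.ones_like(x), x])

# normal equations: solve (X'X) b = X'y rather than inverting X'X explicitly
beta = np.linalg.solve(X.T @ X, X.T @ y)
```

In practice `np.linalg.lstsq` is numerically preferable to forming XᵀX directly, but the explicit normal equations mirror the formula in the text; both give the intercept 0.14 and slope 1.96 for this data.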

In Rousseeuw et al. [43], it is proposed to use the MCD estimates for the center μ and the scatter matrix Σ of the joint (x, y) variables in Equation 6.11 to Equation 6.13. The resulting estimates are called MCD-regression estimates. They inherit the breakdown value of the MCD estimator. To obtain a better efficiency, the reweighted MCD estimates are used in Equation 6.11 to Equation 6.13, followed by a regression reweighting step. For any fit θ = (B, β₀), denote the corresponding q-dimensional residuals by rᵢ(θ) = yᵢ - Bᵀxᵢ - β₀. Then the residual distance of the ith case is defined as... [Pg.184]

This RSIMPLS approach yields bounded-influence functions for the weight vectors r and q and for the regression estimates [66]. Also, the breakdown value is inherited from the MCD estimator. Model calibration and validation are similar to the RPCR method and proceed as in Section 6.6.3. [Pg.204]

Pfaffenberger, R.C. and Dielman, T.E. (1990) A comparison of regression estimators when both multicollinearity and outliers are present, in Robust Regression Analysis and Applications (eds K.D. Lawrence and J.L. Arthur), Marcel Dekker, New York, pp. 243-270. [Pg.180]

Ignoring heteroskedasticity in an ordinary calibration experiment will not lead to major errors when measuring in the middle of the calibration line, since the estimated regression parameters are unbiased (it should be recalled that the regression estimators are then inefficient, but unbiased). However, the uncertainty of the results is much larger in the lower part of the calibration line when incorrect or no weighting factors have been applied to correct for heteroskedasticity. It should be stressed that extreme errors are introduced by using the calibration line to estimate the detection capability if heteroskedasticity is not corrected. [Pg.146]
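A weighted fit of this kind can be sketched as follows. The calibration data and the 1/x² variance model below are illustrative assumptions; in a real calibration the weights would come from the observed or modeled variance at each level.

```python
import numpy as np

def wls_line(x, y, w):
    """Weighted least-squares straight line: each point enters the
    normal equations with weight w_i, typically 1/variance_i."""
    X = np.column_stack([np.ones_like(x), x])
    XtW = X.T * w                      # equivalent to X'.W with W diagonal
    return np.linalg.solve(XtW @ X, XtW @ y)

# illustrative calibration data whose scatter grows with concentration
x = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
y = np.array([1.1, 2.0, 4.2, 9.7, 20.8])

beta_ols = wls_line(x, y, np.ones_like(x))  # unweighted fit
beta_wls = wls_line(x, y, 1.0 / x**2)       # weights from assumed variance ~ x^2
```

The weighted fit is pulled toward the low-concentration points, which is exactly the region where an unweighted calibration line is least trustworthy under heteroskedasticity.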

The second interpretation of E(s²) arises from considering the estimation of the main effect of a single factor Xⱼ, for example if only one factor appears to have a very large effect. The simple linear regression estimate of βⱼ from the model Y = β₀ + βⱼXⱼᵢ + ε ...

Holcomb et al. (2003) studied the method of using the simple linear regression estimates in more detail. They described it as a contrast-based method obtained by using Xᵀy from the full model, but these are the simple linear regression estimates multiplied by n. They tried several procedures for separating the active from the... [Pg.180]

Centralized Production of Hydrogen. Table 4. Descriptive statistics: value ranges used to generate regression estimates. [Pg.287]

Zweig MH, Kroll MH. Linear regression estimation of minimal detectable concentration. Thyrotropin as an example. Arch Pathol Lab Med 1997;121:948-55. [Pg.407]

Lemma. With ordinary least-squares regression on balanced data, taking arithmetic means over the covariates does not affect the results of nonlinear regression estimation under an additive (normal) error model. Similarly, taking geometric means over the covariates does not affect the results of nonlinear regression estimation under a multiplicative (lognormal) error model. [Pg.441]
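The additive-error case can be checked numerically. The sketch below uses linear least squares as the simplest instance of the lemma, with made-up balanced data: fitting on per-level arithmetic means reproduces the full-data estimates exactly.

```python
import numpy as np

rng = np.random.default_rng(2)
levels = np.array([1.0, 2.0, 3.0, 4.0])
x_full = np.repeat(levels, 3)                 # balanced: 3 replicates per level
y_full = 0.5 + 1.5 * x_full + rng.normal(0.0, 0.2, x_full.size)

# arithmetic means of the response at each covariate level
y_mean = y_full.reshape(4, 3).mean(axis=1)

X_full = np.column_stack([np.ones_like(x_full), x_full])
X_mean = np.column_stack([np.ones_like(levels), levels])

beta_full = np.linalg.lstsq(X_full, y_full, rcond=None)[0]  # fit on all replicates
beta_mean = np.linalg.lstsq(X_mean, y_mean, rcond=None)[0]  # fit on level means
```

The two estimates agree because, with equal replication, the residual sum of squares decomposes into a within-level part that does not depend on the parameters plus the sum of squares around the level means.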

Here, a is the population intercept and jS is the population slope of observed mass regressed on expected mass. The error term, e, is the residual added to the population line to obtain the observed mass. Least squares estimation (traditional regression estimation) will yield estimates of a and P, as well as an estimate for the standard deviation of e (RMSE). These estimates form the sample regression line for predicting observed mass (predicted mass or y) ... [Pg.35]

Ridge regression estimators are biased. The trade-off for stabilization and variance reduction in regression coefficient estimators is the bias in the estimators and the increase in the squared error. [Pg.78]
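A minimal sketch of the ridge estimator and its shrinkage behavior; the nearly collinear toy data below are an illustrative assumption.

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge estimate (X'X + lam*I)^{-1} X'y: the penalty lam stabilizes
    the solve when X'X is ill-conditioned, at the cost of biasing the
    coefficients toward zero."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# illustrative, nearly collinear predictors
rng = np.random.default_rng(3)
x1 = rng.normal(0.0, 1.0, 100)
x2 = x1 + rng.normal(0.0, 0.01, 100)          # almost a copy of x1
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(0.0, 0.5, 100)

beta_light = ridge(X, y, 0.01)                # mild penalty
beta_heavy = ridge(X, y, 10.0)                # strong penalty
```

As the penalty grows the coefficient vector shrinks monotonically toward zero; at zero penalty the formula reduces to OLS, which is unstable here because the two columns are nearly identical.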

LFER = Linear Free Energy Relationship; LRE = Linear Regression Estimation. [Pg.96]

In the following we will need an estimate of scale. Since the properties of the regression estimate do not depend markedly on the efficiency of this estimate, we will always use the resistant estimate... [Pg.38]

Grabowski & Vernon, 1989. 1970-79 (NCE approvals). Analysis of industry R&D expenditures and NCE production. R&D time profiles modified from regression estimates. NCEs: FDA. 125 million total R&D expenditures: PMA surveys. 1986. 9%. Implicit. Implicit. [Pg.49]

These assumptions were based in part on a regression estimate Thomas made in 1986 (421). [Pg.54]

However, when the number of replicates is small, as is usually the case, the estimated variance can be quite erroneous and unstable. Nonlinear regression estimates using this approach are more variable than their unweighted least-squares counterparts unless the number of replicates at each level is 10 or more. For this reason, this method cannot be recommended; the danger of unstable variance estimates can be avoided if a parametric residual variance model can be found. [Pg.132]

