
Standard Error of the Regression Coefficient

RSM determines the best estimate of the coefficients for the Taylor equation that describes the response. The estimated regression coefficient ... [Pg.176]


The final statistical values reported for Equation 7.3 are the standard errors of the regression coefficients. These allow us to assess the significance of the individual terms by computing the t statistic: the regression coefficient divided by its standard error ... [Pg.173]
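
As a minimal sketch of this calculation (no software is named in the excerpt; statsmodels and the synthetic data below are used purely for illustration):

```python
import numpy as np
import statsmodels.api as sm

# Synthetic two-factor data, for illustration only
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)
x2 = rng.uniform(-1, 1, 30)
y = 5.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(0, 0.3, 30)

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.OLS(y, X).fit()

# t statistic for each term: regression coefficient divided by its standard error
for name, b, se, t in zip(["const", "x1", "x2"], fit.params, fit.bse, fit.tvalues):
    print(f"{name}: b = {b:.4f}, SE = {se:.4f}, t = b/SE = {t:.2f}")
```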

The squared multiple correlation coefficient gives a measure of how well a regression model fits the data, and the F statistic gives a measure of the overall significance of the fit. What about the significance of individual terms? This can be assessed by calculating the standard error of the regression coefficients, a measure of how much of the dependent variable... [Pg.123]

After calculating the calibration coefficients, it is worthwhile to examine the errors in b and to establish confidence intervals. The standard error of each regression coefficient is computed according to ... [Pg.126]
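
The formula itself is not reproduced in the excerpt; the usual expression is SE(b_i) = s·√(c_ii), where s is the standard error of the estimate and c_ii is the i-th diagonal element of (XᵀX)⁻¹. A sketch under that assumption, with confidence intervals from the t distribution:

```python
import numpy as np
from scipy import stats

def coefficient_standard_errors(X, y, alpha=0.05):
    """Least-squares coefficients b, their standard errors, and (1 - alpha)
    confidence intervals, using SE(b_i) = s * sqrt(c_ii) with c_ii taken
    from the diagonal of (X^T X)^-1."""
    n, p = X.shape
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    s2 = resid @ resid / (n - p)              # residual variance
    C = np.linalg.inv(X.T @ X)                # unscaled covariance matrix
    se = np.sqrt(s2 * np.diag(C))             # standard error of each coefficient
    t_crit = stats.t.ppf(1 - alpha / 2, n - p)
    ci = np.column_stack([b - t_crit * se, b + t_crit * se])
    return b, se, ci
```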

Power Consumption. A previous report (2) showed the power consumption of the pellet mill as a function of throughput. Subsequent work has shown there were errors in these measurements and the result is hereby withdrawn. Later work, using processed office waste with added moisture (75% of the samples contained 10 - 30 wt%, 22% of the samples contained 30 - 50 wt%, and the remainder 10 wt%) showed that the relationship between power consumption P(kW), and feedrate to the densifier F (short tons/h, dry wt basis), could be represented by P = 50.5F + 21.7, for 63 determinations, 0.3 < F < 1.7 Mg/h, with a correlation coefficient r = 0.964 and a standard error of the regression line of 5.7. The intercept should be equal to the idling power of the machine which was separately measured as 24 kW. There was no discernible relationship between power consumption and moisture content. [Pg.135]
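
As a small usage sketch of the reported fit (the function name is illustrative, and the line is only meaningful inside the feedrate range covered by the 63 determinations):

```python
def pellet_mill_power_kw(feedrate_dry_tons_per_h: float) -> float:
    """Power draw predicted by the reported regression P = 50.5*F + 21.7."""
    return 50.5 * feedrate_dry_tons_per_h + 21.7

print(pellet_mill_power_kw(1.0))  # about 72 kW at a dry feedrate of 1 ton/h
```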

The successful application of multivariate statistical methods is highly dependent on the quality of prior processing; incomplete data reduction (e.g., the presence of isotopic peaks and adducts) may therefore lead to a multicollinearity problem. In this situation, many of the measured variables come from the same molecule and are therefore not independent of one another. This can cause the regression coefficient estimates to behave erratically in response to small changes in the data and can inflate the standard errors of the affected coefficients (Listgarten and Emili, 2005). [Pg.713]
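
A common way to screen for this problem is the variance inflation factor (VIF), which also appears later in this section. A minimal sketch (the function name and thresholds are illustrative, not from the cited work):

```python
import numpy as np

def variance_inflation_factors(X):
    """VIF for each column of the predictor matrix X (no intercept column).
    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j on
    all other columns; values well above ~5-10 usually signal collinearity."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    vifs = []
    for j in range(p):
        xj = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])       # intercept + other predictors
        coef, *_ = np.linalg.lstsq(A, xj, rcond=None)
        resid = xj - A @ coef
        r2 = 1.0 - (resid @ resid) / np.sum((xj - xj.mean()) ** 2)
        vifs.append(1.0 / (1.0 - r2) if r2 < 1.0 else np.inf)
    return np.array(vifs)
```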

Data from Dzvinchuk and Lozinskii (88ZOR2167). Determined by ¹H NMR at 25°C (in CHCl3 at 20°C). The values (Xt)0 were directly measured, rather than obtained from regression analysis. The standard errors of the slope are in the range 0.01-0.03. n = 8 (X = H, 3-NO2, 4-NO2, 4-Br, 4-Ph, 4-Me, 4-MeCONH, 4-MeO). Molar ratio. This is the coefficient of the Yukawa-Tsuno equation, not the correlation coefficient. The correlation coefficients lie in the range 0.997-0.999. [Pg.280]

Numbers in parentheses are the standard errors of the corresponding regression coefficients. [Pg.52]

The fitting process was carried out with a program based on a least-squares procedure [48] that allows us to calculate the best-fitting parameters of the equation defining the relation A(T) versus z, that is, Equation 4.25: specifically, a, T0, and ΔT. The regression coefficient and the standard errors were also calculated with the least-squares methodology. The calculated regression coefficients fluctuated between 0.98 and 0.99. The values calculated for the parameters T0 and ΔT, and the standard errors of the parameters, are reported in Table 4.9. [Pg.186]
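
The functional form of Equation 4.25 is not reproduced in the excerpt, so the sketch below uses a placeholder sigmoidal model A(T) = a / (1 + exp((T - T0)/ΔT)), with synthetic data, purely to illustrate how a, T0, and ΔT and their standard errors can be obtained from a least-squares fit:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(T, a, T0, dT):
    # Placeholder sigmoidal form standing in for Equation 4.25 (not reproduced here)
    return a / (1.0 + np.exp((T - T0) / dT))

# Synthetic data in place of the measured A(T) values
T = np.linspace(300.0, 500.0, 41)
A = model(T, 1.0, 400.0, 15.0) + np.random.default_rng(1).normal(0, 0.01, T.size)

popt, pcov = curve_fit(model, T, A, p0=[1.0, 400.0, 10.0])
perr = np.sqrt(np.diag(pcov))          # standard errors of a, T0, dT
for name, value, err in zip(["a", "T0", "dT"], popt, perr):
    print(f"{name} = {value:.3f} ± {err:.3f}")
```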

Statistical parameters, when available, indicating the significance of each descriptor's contribution to the final regression equation are listed under the corresponding term in the equation. These include the standard errors (written as ± values), the Student t-test values, and the VIF. The significance of the equation will be indicated by the sample size, n; the variance explained, r²; the standard error of the estimate, s; the Fisher index, F; and the cross-validated correlation coefficient, q². When known, outliers will be mentioned. The equations are followed by a discussion of the physical significance of the descriptor terms. [Pg.232]
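
A sketch of how these summary statistics can be computed for an ordinary least-squares model (the intercept column is assumed to be already included in X; q² is estimated here by leave-one-out cross-validation via the hat-matrix shortcut):

```python
import numpy as np

def regression_summary(X, y):
    """n, r2, s (standard error of the estimate), F, and leave-one-out q2
    for an OLS model whose design matrix X already contains the intercept."""
    n, p = X.shape
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    ss_res = resid @ resid
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    s = np.sqrt(ss_res / (n - p))                           # standard error of the estimate
    F = ((ss_tot - ss_res) / (p - 1)) / (ss_res / (n - p))  # Fisher index
    H = X @ np.linalg.pinv(X)                               # hat matrix
    press = np.sum((resid / (1.0 - np.diag(H))) ** 2)       # leave-one-out residuals
    q2 = 1.0 - press / ss_tot
    return {"n": n, "r2": r2, "s": s, "F": F, "q2": q2}
```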

In Equations 4 and 5, r is the multiple correlation coefficient, r² is the percent correlation, SE is the standard error of the equation (i.e., the error in the calculated values), and F is the ratio of the mean sum of squares removed by regression to the mean sum of squares of the error residuals not removed by regression. The F-values were routinely used in statistical tests to determine the goodness of fit of the above and following equations. The numbers in parentheses beneath the fit parameters in each equation denote the standard error in the respective parameters. [Pg.262]

The standard errors of the three regression coefficients are then given by s(b_i) = σr·√(c_ii) = ... [Pg.76]

To test the significance of each coefficient, we obtain the value of t as the ratio of the regression coefficient to its standard error and look up this value of t with degrees of freedom equal to those of the residual variance. For one of the coefficients, for example, t = 0.7685143/0.2709 = 2.837 with 135 degrees of freedom. Reference to Table I in the Appendix shows that this corresponds to a level of significance between 1% and 0.1%. [Pg.76]
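
The same lookup can be done numerically; reproducing the excerpt's worked numbers with scipy instead of the printed table:

```python
from scipy import stats

t_value = 0.7685143 / 0.2709                  # coefficient / its standard error
dof = 135
p_two_sided = 2 * stats.t.sf(abs(t_value), dof)
print(f"t = {t_value:.3f}, p = {p_two_sided:.4f}")  # about 0.005, i.e. between 1% and 0.1%
```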

A necessary condition for the validity of a regression model is that the multiple correlation coefficient is as close as possible to one and the standard error of the estimate, s, is small. However, this condition (fitting ability) is not sufficient for model validity, because models give a closer fit (smaller s and larger R²) the larger the number of parameters and variables they contain. Moreover, unfortunately, these parameters are not related to the capability of the model to make reliable predictions on future data. [Pg.461]
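
A small synthetic illustration of this point: fitting polynomials of increasing degree to the same (truly linear) data always improves the fit statistics, while the cross-validated q² typically deteriorates.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 20)
y = 1.0 + 2.0 * x + rng.normal(0, 0.2, x.size)   # linear data with noise

for degree in (1, 3, 6):
    X = np.vander(x, degree + 1)                 # polynomial design matrix
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    ss_res, ss_tot = resid @ resid, np.sum((y - y.mean()) ** 2)
    H = X @ np.linalg.pinv(X)                    # hat matrix for leave-one-out residuals
    press = np.sum((resid / (1.0 - np.diag(H))) ** 2)
    print(f"degree {degree}: R2 = {1 - ss_res/ss_tot:.3f}, q2 = {1 - press/ss_tot:.3f}")
```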

This method is the simplest and is well suited to implementation in computer programs or spreadsheets [9]. Various standard statistical criteria, e.g. the correlation coefficient, r; the standard error of the slope of the regression line, s_b; or the standard error of the estimate of g(α) from t, s, are used to quantify the deviation of a set of experimental points from the calculated regression line through them [10]. The inadequacies of r as an indicator of fit have been stressed [4,11]. The use of s_b is preferable in that its value is dependent upon the range of t used in the analysis. [Pg.143]

As you can see, cells E2 and F2 contain the slope and intercept of the least-squares line. Cells E3 and F3 are the respective standard deviations of the slope and intercept. Cell E4 contains the coefficient of determination (R²). The standard deviation about regression (s_r, the standard error of the estimate) is located in cell F4. The smaller the s_r value, the better the fit. The square of the standard error of the estimate is the mean square for the residuals (error). The value in cell E5 is the F statistic. Cell F5 contains the number of degrees of freedom associated with the error. [Pg.204]
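
For comparison, a sketch that computes the same quantities as the spreadsheet output described above for a straight-line fit (the arrangement mirrors the cell layout, but the function itself is illustrative):

```python
import numpy as np

def linest_like(x, y):
    """Straight-line fit statistics analogous to the spreadsheet cells described
    above: slope/intercept, their standard deviations, R^2, the standard error
    of the estimate s_r, the F statistic, and the residual degrees of freedom."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = x.size
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    dof = n - 2
    s_r = np.sqrt(resid @ resid / dof)                 # standard deviation about regression
    sxx = np.sum((x - x.mean()) ** 2)
    se_slope = s_r / np.sqrt(sxx)
    se_intercept = s_r * np.sqrt(np.sum(x ** 2) / (n * sxx))
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - (resid @ resid) / ss_tot
    F = (ss_tot - resid @ resid) / (s_r ** 2)          # one regression degree of freedom
    return {"slope": slope, "intercept": intercept,
            "se_slope": se_slope, "se_intercept": se_intercept,
            "R2": r2, "s_r": s_r, "F": F, "dof": dof}
```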

