Big Chemical Encyclopedia



Regression errors

It is often helpful to examine the regression errors for each data point in a calibration or validation set with respect to the leverage of each data point or its distance from the origin or from the centroid of the data set. In this context, errors can be considered as the difference between expected and predicted (concentration, or y-block) values for the regression, or, for PCA, PCR, or PLS, errors can instead be considered in terms of the magnitude of the spectral... [Pg.185]
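The leverage mentioned above can be computed directly from the calibration design. As a minimal sketch (with hypothetical data, not from the text), the leverage of each point is the corresponding diagonal element of the hat matrix H = X(XᵀX)⁻¹Xᵀ; points far from the centroid of the x-values have high leverage:

```python
import numpy as np

# Hypothetical single-variable calibration with an intercept term.
x = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 1.0])  # last point lies far from the rest
X = np.column_stack([np.ones_like(x), x])      # design matrix [1, x]
H = X @ np.linalg.inv(X.T @ X) @ X.T           # hat matrix
leverage = np.diag(H)                          # leverage of each data point
print(leverage.round(3))
```

The outlying standard at x = 1.0 receives the largest leverage, and the leverages sum to the number of fitted parameters (here 2).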

For a supercritical fluid (SCF) component, the pure component parameters were obtained by fitting P-v data on isotherms (300-380K). Preliminary data for these substances suggest that although the computed v is a weak function of temperature, exl is a constant within regression error. [Pg.90]

In the comparison of calibration methods, the results show that the non-parametric techniques decreased the regression error by approximately 50% over the parametric approaches. The sensor array used in these examples was... [Pg.311]

Analysis of Variance table with sources of variation: Regression, Error, and Total. [Pg.366]

Subroutine REGRES. REGRES is the main subroutine responsible for performing the regression. It solves for the parameters in nonlinear models where all the measured variables are subject to error and are related by one or two constraints. It uses subroutines FUNC, FUNDR, SUMSQ, and SYMINV. [Pg.217]

Understanding the distribution allows us to calculate the expected values of random variables that are normally and independently distributed. In least squares multiple regression, or in calibration work in general, there is a basic assumption that the error in the response variable is random and normally distributed, with a variance that follows a χ² distribution. [Pg.202]

The term on the left-hand side is a constant that depends only on the constituent values provided by the reference laboratory, not in any way on the calibration. The two terms on the right-hand side of the equation show how this constant value is apportioned between two quantities that are themselves summations, referred to as the sum of squares due to regression and the sum of squares due to error. The latter is the smallest value it can possibly take for the given data. [Pg.211]
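This partition of the total sum of squares can be verified numerically. A minimal sketch with illustrative data (not taken from the text): for an ordinary least-squares fit with an intercept, the total sum of squares about the mean splits exactly into the regression and error parts.

```python
import numpy as np

# Illustrative calibration data; SS_total = SS_regression + SS_error for OLS.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
slope, intercept = np.polyfit(x, y, 1)
y_hat = slope * x + intercept

ss_total = np.sum((y - y.mean()) ** 2)           # fixed by the reference values
ss_regression = np.sum((y_hat - y.mean()) ** 2)  # explained by the fit
ss_error = np.sum((y - y_hat) ** 2)              # minimized by least squares
print(ss_total, ss_regression + ss_error)
```

The two printed values agree to within floating-point error, confirming that least squares makes the error sum of squares as small as the data allow.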

The most commonly used form of linear regression is based on three assumptions (1) that any difference between the experimental data and the calculated regression line is due to indeterminate errors affecting the values of y, (2) that these indeterminate errors are normally distributed, and (3) that the indeterminate errors in y do not depend on the value of x. Because we assume that indeterminate errors are the same for all standards, each standard contributes equally in estimating the slope and y-intercept. For this reason the result is considered an unweighted linear regression. [Pg.119]
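Under these three assumptions, the unweighted slope and y-intercept have a simple closed form in which every standard carries equal weight. A minimal sketch (the data are hypothetical):

```python
# Unweighted least-squares line: all assumed error is in y, and every
# standard contributes equally to the slope and intercept.
def unweighted_linreg(x, y):
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Usage with hypothetical calibration standards:
slope, intercept = unweighted_linreg([0, 1, 2, 3], [0.1, 2.1, 4.1, 6.1])
print(slope, intercept)  # approximately 2.0 and 0.1
```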

Residual error in linear regression, where the filled circle shows the experimental value yi and the open circle shows the predicted value ŷi. [Pg.119]

Plot of the residual error in y as a function of x. The distribution of the residuals in (a) indicates that the regression model was appropriate for the data, and the distributions in (b) and (c) indicate that the model does not provide a good fit for the data. [Pg.124]

A linear regression analysis should not be accepted without evaluating the validity of the model on which the calculations were based. Perhaps the simplest way to evaluate a regression analysis is to calculate and plot the residual error for each value of x. The residual error for a single calibration standard, ri, is given as... [Pg.124]
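The residual for each standard is simply the difference between the measured and predicted responses, ri = yi − ŷi. A minimal sketch with illustrative calibration data (not from the text); a plot of these residuals against x should show no systematic trend if the straight-line model is appropriate:

```python
import numpy as np

# Illustrative calibration data: concentrations x and measured signals y.
x = np.array([0.0, 0.1, 0.2, 0.3, 0.4])
y = np.array([0.00, 12.36, 24.83, 35.91, 48.79])

slope, intercept = np.polyfit(x, y, 1)   # unweighted least-squares line
residuals = y - (slope * x + intercept)  # r_i = y_i - yhat_i
for xi, ri in zip(x, residuals):
    print(f"x = {xi:.1f}  residual = {ri:+.3f}")
```

For an ordinary least-squares fit with an intercept, the residuals always sum to zero, so only their pattern against x, not their average, carries diagnostic information.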

Weighted Linear Regression with Errors in Both x and y [Pg.127]

The regression models considered earlier apply only to functions containing a single independent variable. Analytical methods, however, are frequently subject to determinate sources of error due to interferents that contribute to the measured signal. In the presence of a single interferent, equations 5.1 and 5.2 become... [Pg.127]
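With an interferent present, the signal depends on two concentrations rather than one, and the coefficients can be estimated by multiple linear regression. A minimal sketch under assumed notation (kA, kI, and the blank are hypothetical names for the sensitivities and background, and the data are synthetic):

```python
import numpy as np

# Assumed model: S = kA*CA + kI*CI + blank, with analyte CA and interferent CI.
CA = np.array([0.0, 1.0, 2.0, 3.0, 4.0])        # analyte concentrations
CI = np.array([0.5, 0.4, 0.6, 0.5, 0.7])        # interferent concentrations
S = 2.0 * CA + 0.8 * CI + 0.1                   # noiseless synthetic signals

X = np.column_stack([CA, CI, np.ones_like(CA)])  # design matrix
coef, *_ = np.linalg.lstsq(X, S, rcond=None)     # least-squares solution
kA, kI, blank = coef
print(kA, kI, blank)
```

Because the synthetic signals are noiseless, the fit recovers the assumed values (approximately 2.0, 0.8, and 0.1) to within floating-point error.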

Nimura, Y.; Carr, M. R. Reduction of the Relative Error in the Standard Additions Method. Analyst 1990, 115, 1589-1595. The following paper discusses the importance of weighting experimental data when using linear regression: Karolczak, M. To Weight or Not to Weight? An Analyst's Dilemma. Curr. Separations 1995, 13, 98-104. [Pg.134]

Algorithms for performing a linear regression with errors in both X and y are discussed in... [Pg.134]


See other pages where Regression errors is mentioned: [Pg.650]    [Pg.214]    [Pg.54]    [Pg.200]    [Pg.413]    [Pg.78]    [Pg.802]    [Pg.2893]    [Pg.3493]    [Pg.206]    [Pg.686]    [Pg.688]    [Pg.491]    [Pg.496]    [Pg.497]    [Pg.70]    [Pg.93]    [Pg.118]    [Pg.119]    [Pg.120]    [Pg.121]    [Pg.124]    [Pg.127]    [Pg.133]    [Pg.777]    [Pg.779]
See also in source #XX -- [Pg.54]







Error-in-variables nonlinear regression

Error-in-variables regression

Linear regression residual error

Linear regression with errors

Regression Analysis in Context of Error Structure

Regression Errors and Tests of the Coefficients

Regression analysis error models

Regression measurements, error

Regression standard error

Standard Error of the Regression Coefficient

Standard error of the regression

Standard error, regression analysis

Unweighted Linear Regression with Errors in

Unweighted linear regression, with errors

Weighted Linear Regression with Errors in

Weighted linear regression with errors

© 2024 chempedia.info