Big Chemical Encyclopedia


Linear least-squares regression model


The correlation coefficient of the best linear least-squares regression model should lie between 0.98 and 1.00 (or, by stricter criteria, exceed 0.999), with the slope and intercept reported. However, there is no rule stating that the relationship between instrumental response and analyte concentration must be linear for a procedure to be valid. The preference for a linear relationship reflects the practical consideration that a linear relationship can be described accurately with fewer standards than a nonlinear one, and the subjective expectation that a linear relationship is more rugged than a more complicated, nonlinear one. [Pg.1127]
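As a concrete illustration of the acceptance criterion above, the following sketch computes the correlation coefficient together with the slope and intercept of the calibration line. All data values are invented for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical calibration data: analyte concentration vs. instrument response.
conc = [0.0, 1.0, 2.0, 3.0, 4.0]
resp = [0.02, 1.01, 1.98, 3.05, 3.99]

r = pearson_r(conc, resp)

# Slope and intercept of the calibration line, reported alongside r.
n = len(conc)
mx, my = sum(conc) / n, sum(resp) / n
slope = sum((a - mx) * (b - my) for a, b in zip(conc, resp)) / \
        sum((a - mx) ** 2 for a in conc)
intercept = my - slope * mx
```

For this nearly perfect calibration set, r exceeds 0.999 and would pass the criterion quoted above.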

In practice, spectral components almost invariably do overlap, and hence the simple linear least-squares regression approach is not feasible. It is, however, possible to construct a linear least-squares regression model for more than one component if Beer's law is modified to recognize that the measured absorbance is the sum of the absorbances of all the components in the mixture. The model then becomes a multivariate system. [Pg.207]
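The modified Beer's-law model described above can be sketched numerically: for a two-component mixture measured at three wavelengths, the absorbance at each wavelength is the sum of the two component absorbances, and the two concentrations follow from the least-squares normal equations. The absorptivities and concentrations below are hypothetical:

```python
# eps[i] = (eps1, eps2): hypothetical molar absorptivities of components 1 and 2
# at wavelength i (L mol^-1 cm^-1); path length assumed to be 1 cm.
eps = [(100.0, 10.0), (50.0, 60.0), (5.0, 120.0)]
c_true = (0.002, 0.001)  # mol/L, used only to synthesize the mixture spectrum

# Additive Beer's law: measured absorbance = sum of component absorbances.
A = [e1 * c_true[0] + e2 * c_true[1] for e1, e2 in eps]

# Normal equations (E^T E) c = E^T A for the two unknown concentrations.
s11 = sum(e1 * e1 for e1, _ in eps)
s12 = sum(e1 * e2 for e1, e2 in eps)
s22 = sum(e2 * e2 for _, e2 in eps)
b1 = sum(e1 * a for (e1, _), a in zip(eps, A))
b2 = sum(e2 * a for (_, e2), a in zip(eps, A))
det = s11 * s22 - s12 * s12
c1 = (s22 * b1 - s12 * b2) / det
c2 = (s11 * b2 - s12 * b1) / det
```

With noise-free synthetic data the least-squares solution recovers both concentrations exactly; with real overlapping spectra it gives the best fit in the least-squares sense.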

The experiments were carried out in random order and the responses analyzed with the program X-STAT (11), which runs on an IBM PC. The model was the standard quadratic polynomial, and the coefficients were determined by linear least-squares regression. [Pg.78]

FIGURE 5.6 Partial least-squares regression model showing the correlation between alanine (nmol/g cheese) predicted by GC-FID and FTIR. The model shows a high degree of linear correlation (r-value = 0.99) and a low estimated standard error of prediction (12.70 nmol/g cheese). [Pg.199]

Usually, linear models are preferable (an ordinary, i.e., unweighted, linear least-squares regression model is not appropriate in many cases, in which a weighted least-squares model should be applied instead), but, if necessary, nonlinear (e.g., second-order) models can be used [60]... [Pg.370]
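A minimal sketch of the weighted least-squares alternative mentioned above, using weights 1/σᵢ² on invented calibration data whose noise grows with concentration:

```python
def wls_line(x, y, sigma):
    """Weighted least-squares fit of y = a + b*x with weights 1/sigma^2."""
    w = [1.0 / s ** 2 for s in sigma]
    S = sum(w)
    Sx = sum(wi * xi for wi, xi in zip(w, x))
    Sy = sum(wi * yi for wi, yi in zip(w, y))
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    delta = S * Sxx - Sx * Sx
    b = (S * Sxy - Sx * Sy) / delta    # slope
    a = (Sxx * Sy - Sx * Sxy) / delta  # intercept
    return a, b

# Hypothetical calibration where the standard deviation of the response grows
# with concentration, so high-concentration points receive smaller weights.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 6.0, 8.0]
sigma = [0.1, 0.2, 0.3, 0.4]
a, b = wls_line(x, y, sigma)
```

With heteroscedastic noise the weighted fit keeps the low-concentration points from being swamped by scatter at the high end of the range.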

The Arrhenius plot generated from the cure kinetics parameters (Figure 3) for this system is essentially linear through the cure region. The excellent fit obtained with the linear least-squares regression over the temperature range of the cure reaction confirms the validity of the nth-order kinetic model used to describe the cure of the uncatalyzed gel coat resins. [Pg.382]
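The Arrhenius linearization behind such a plot can be sketched as follows: taking logarithms gives ln k = ln A − Ea/(RT), so regressing ln k on 1/T recovers Ea from the slope and A from the intercept. The rate constants below are synthesized from assumed values of Ea and A, not taken from the cited study:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

# Hypothetical rate constants synthesized from Ea = 80 kJ/mol, A = 1e10 s^-1.
Ea_true, A_true = 80000.0, 1e10
T = [350.0, 375.0, 400.0, 425.0]
k = [A_true * math.exp(-Ea_true / (R * t)) for t in T]

# Linearize: ln k = ln A - (Ea/R) * (1/T), then fit by ordinary least squares.
x = [1.0 / t for t in T]
y = [math.log(ki) for ki in k]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
intercept = my - slope * mx

Ea_fit = -slope * R          # recovered activation energy, J/mol
A_fit = math.exp(intercept)  # recovered pre-exponential factor
```

A straight line through the (1/T, ln k) points, as observed in the excerpt's Figure 3, is exactly what an nth-order model with Arrhenius temperature dependence predicts.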

Finally there is the problem of selecting the correct polynomial. Most spectroscopic systems tend to follow linear relationships, but this is by no means true 100% of the time. Some systems are quadratic (parabolic) or even exponential in nature. Thus, the form of the model equation is another variable (along with the spectral band) to be carefully considered when using least squares regression models. [Pg.98]
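One practical way to choose among candidate polynomials is to fit each by least squares and compare residual sums of squares. The sketch below (with invented, deliberately quadratic data) does this for first- and second-order models:

```python
def polyfit_ls(x, y, degree):
    """Least-squares polynomial fit via normal equations (Gaussian elimination).

    Returns coef where coef[i] multiplies x**i.
    """
    m = degree + 1
    A = [[sum(xi ** (i + j) for xi in x) for j in range(m)] for i in range(m)]
    b = [sum((xi ** i) * yi for xi, yi in zip(x, y)) for i in range(m)]
    # Forward elimination with partial pivoting.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    coef = [0.0] * m
    for r in range(m - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, m))) / A[r][r]
    return coef

def sse(x, y, coef):
    """Residual sum of squares of a polynomial fit."""
    return sum((yi - sum(c * xi ** i for i, c in enumerate(coef))) ** 2
               for xi, yi in zip(x, y))

# Hypothetical response that is genuinely quadratic in concentration.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.1, 1.2, 4.1, 9.2, 16.1]
sse_lin = sse(x, y, polyfit_ls(x, y, 1))
sse_quad = sse(x, y, polyfit_ls(x, y, 2))
```

Here the quadratic model leaves a much smaller residual than the straight line, signalling that the linear form is the wrong choice for this system.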

Partial least-squares path modeling with latent variables (PLS), a newer, general method of handling regression problems, is finding wide application in chemometrics. This method allows the relations between many blocks of data, i.e., data matrices, to be characterized (32–36). Linear and multiple regression techniques can be considered special cases of the PLS method. [Pg.426]

Partial least-squares regression (PLS). Partial least-squares regression applies to the simultaneous analysis of two sets of variables on the same objects. It allows the modeling of inter- and intra-block relationships from an X-block and a Y-block of variables in terms of a lower-dimensional table of latent variables [4]. The main purpose of the regression is to build a predictive model enabling the prediction of wanted characteristics (y) from measured spectra (X). In matrix notation we have the linear model with regression coefficients b... [Pg.544]
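A minimal PLS1 sketch with a single latent variable, producing regression coefficients b with ŷ = Xb. It omits the mean-centering and deflation steps a full implementation would include, and the data are invented:

```python
import math

def pls1_one_component(X, y):
    """One-latent-variable PLS1: returns coefficients b such that y ≈ X b."""
    n, p = len(X), len(X[0])
    # Weight vector w ∝ X^T y (the direction of maximum covariance with y).
    w = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    norm = math.sqrt(sum(wj * wj for wj in w))
    w = [wj / norm for wj in w]
    # Score vector t = X w, then regress y on t to get the loading q.
    t = [sum(X[i][j] * w[j] for j in range(p)) for i in range(n)]
    tt = sum(ti * ti for ti in t)
    q = sum(yi * ti for yi, ti in zip(y, t)) / tt
    # For a single component the regression vector is simply b = w q.
    return [wj * q for wj in w]

# Hypothetical spectra (rows = samples, columns = wavelengths) driven by a
# single underlying concentration direction.
X = [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]]
y = [10.0, 20.0, 30.0]
b = pls1_one_component(X, y)
y_hat = [sum(xi * bi for xi, bi in zip(row, b)) for row in X]
```

Because the X columns here are perfectly collinear, one latent variable already reproduces y exactly; real spectra need several components plus the deflation loop.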

The structure of such models can be exploited to reduce the dimensionality of the nonlinear parameter estimation problem, since the conditionally linear parameters, k1, can be obtained by linear least squares in one step and without the need for initial estimates. Further details are provided in Chapter 8, where we exploit the structure of the model either to reduce the dimensionality of the nonlinear regression problem or to arrive at consistent initial guesses for any iterative parameter search algorithm. [Pg.10]
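The one-step solution for a conditionally linear parameter can be sketched with a hypothetical model y = k1·exp(−k2·t): for any trial value of the nonlinear parameter k2, the optimal k1 follows directly from linear least squares, so only k2 needs a nonlinear search (here a simple grid):

```python
import math

def best_k1(k2, t, y):
    """For fixed nonlinear k2, the conditionally linear k1 follows from one
    linear least-squares step: k1 = sum(y*f)/sum(f*f) with f = exp(-k2*t)."""
    f = [math.exp(-k2 * ti) for ti in t]
    return sum(yi * fi for yi, fi in zip(y, f)) / sum(fi * fi for fi in f)

def sse(k1, k2, t, y):
    """Residual sum of squares of the model y = k1*exp(-k2*t)."""
    return sum((yi - k1 * math.exp(-k2 * ti)) ** 2 for yi, ti in zip(y, t))

# Hypothetical first-order decay data synthesized with k1 = 5.0, k2 = 0.3.
t = [0.0, 1.0, 2.0, 4.0, 8.0]
y = [5.0 * math.exp(-0.3 * ti) for ti in t]

# Only k2 requires a nonlinear search; k1 comes for free at each trial value.
k2_grid = [0.1 + 0.001 * i for i in range(400)]
k2_best = min(k2_grid, key=lambda k2: sse(best_k1(k2, t, y), k2, t, y))
k1_best = best_k1(k2_best, t, y)
```

The search is one-dimensional instead of two-dimensional, and no initial estimate of k1 is ever needed, which is exactly the advantage the excerpt describes.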

Kittrell et al. (1965a) also performed two types of estimation. First, the data at each isotherm were used separately; subsequently, all the data were regressed simultaneously. The regression of the isothermal data was also done with linear least squares by linearizing the model equation. In Tables 16.7 and 16.8 the reported parameter estimates are given together with the reported standard errors. Ayen and Peters (1962) have also reported values for the unknown parameters, and these are given in Table 16.9. [Pg.290]

When experimental data are to be fitted with a mathematical model, it is necessary to allow for the fact that the data contain errors. The engineer is interested in finding the parameters of the model as well as the uncertainty in their determination. In the simplest case, the model is a linear equation with only two parameters, and they are found by a least-squares minimization of the errors in fitting the data. Multiple regression is just linear least squares applied with more terms. Nonlinear regression allows the parameters of the model to enter in a nonlinear fashion. See Press et al. (1986) for a description of maximum likelihood as it applies to both linear and nonlinear least squares. [Pg.84]
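The simplest two-parameter case described above can be sketched as follows, including the standard errors that quantify the uncertainty in the parameters (data invented; equal, independent errors in y assumed):

```python
import math

def line_fit_with_errors(x, y):
    """Ordinary least squares for y = a + b*x, returning the estimates and
    their standard errors (assuming equal, independent errors in y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    s2 = sum(r * r for r in resid) / (n - 2)  # residual variance
    se_b = math.sqrt(s2 / sxx)
    se_a = math.sqrt(s2 * (1.0 / n + mx ** 2 / sxx))
    return a, b, se_a, se_b

# Hypothetical data with a small amount of scatter about y = 1 + 2x.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.1, 2.9, 5.1, 6.9, 9.1]
a, b, se_a, se_b = line_fit_with_errors(x, y)
```

Multiple regression extends the same minimization to more terms, and nonlinear regression replaces the closed-form solution with an iterative search.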

The methods used were those of Mitchell (1), Kurtz, Rosenberger, and Tamayo (2), and Wegscheider (3). Mitchell accounted for heteroscedastic error variance by using weighted least-squares regression, fitting a curve either to all or to part of the calibration range using either a linear or a quadratic model. Kurtz et al. achieved constant variance by a... [Pg.183]

What is the equivalent four-parameter linear model expressing y as a function of x1 and x2? Use matrix least squares (regression analysis) to fit this linear model to the data. How are the classical factor effects and the regression factor effects related? Draw the sums of squares and degrees of freedom tree. How many degrees of freedom are there for each of the sums of squares? [Pg.357]




© 2024 chempedia.info