Big Chemical Encyclopedia


Linear Regression Problems

A linear regression relationship for a quantity of concern Q can be written as  [Pg.43]

Bayesian Methods for Structural Dynamics and Civil Engineering [Pg.44]

The data T includes the measurements of x and the corresponding values for y. By assuming that the prediction errors in different records are statistically independent, the likelihood function is obtained  [Pg.44]

A smaller value of this function implies better fitting to the data. Two special but popular choices of prior distributions of the uncertain parameters are considered in the following sections. [Pg.44]

In the case of a uniform prior PDF of the coefficients, the optimal coefficient vector b can be obtained by minimizing J_g(b | T, C). This can be achieved by solving the linear algebraic equation dJ_g(b | T, C)/db = 0, and the updated coefficient vector is readily obtained  [Pg.44]
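With a uniform prior, minimizing the quadratic objective reduces to ordinary least squares, i.e. solving the normal equations X'X b = X'y. A minimal sketch for a straight-line model y = b0 + b1*x, with hypothetical data (the function name and values are illustrative, not from the source):

```python
def ols_line(x, y):
    """Solve the 2x2 normal equations X'X b = X'y for intercept b0 and slope b1."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    det = n * sxx - sx * sx          # determinant of X'X
    b0 = (sxx * sy - sx * sxy) / det
    b1 = (n * sxy - sx * sy) / det
    return b0, b1

x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.1, 2.9, 5.2, 7.1, 8.8]      # roughly y = 1 + 2x with small errors
b0, b1 = ols_line(x, y)
```

For a general design matrix the same stationarity condition dJ/db = 0 gives b = (X'X)^-1 X'y; the closed form above is just that solution written out for the two-parameter case.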


This chapter has two objectives. The first is to briefly review the essentials of linear regression and to present them in a form that is consistent with our notation and the approach followed in subsequent chapters addressing nonlinear regression problems. The second objective is to show that a large number of linear regression problems can now be handled with readily available software such as Microsoft Excel and SigmaPlot. [Pg.23]

Having the smoothed values of the state variables and their derivatives at each sampling point, we have essentially transformed the problem into a "usual" linear regression problem. The parameter vector is obtained by minimizing the following LS objective function... [Pg.117]

The above linear regression problem can be readily solved using any standard linear regression package. [Pg.119]

In general, a much larger number of parameters (wavelengths, frequencies, or factors) needs to be calculated in overlapping peak systems (some spectra or chromatograms) than in linear regression problems (p. 176)... [Pg.165]

To obtain the estimate (3.23) in a multivariate linear regression problem we... [Pg.178]

Once the kinetic parameters have been estimated, (3.60) becomes linear in the unknown parameters ΔH_r. Therefore, the errors between the total heat of reaction, computed via the detailed model, and the total heat, computed via each reduced model, can be minimized by resorting to the least squares solution of a linear regression problem, discussed in Sect. 3.4. The molar heats of reaction, included in the vector of parameters... [Pg.61]

Another approach that is different from the previous methods is the simplex method of minimization. It involves the formation of a simplex, a geometric shape with (m + 1) vertices, where m is the number of parameters on the WSS surface. The WSS is calculated at each vertex of the simplex and compared. The movement of the simplex across the WSS surface (toward the minimum) is controlled by a small number of rules. For example, the point with the highest WSS is reflected across the centroid (center of the simplex) to produce a new point. If this point has the lowest WSS, it is extended again. A point with a larger WSS causes the simplex to contract. By a series of such steps, the simplex moves across the WSS surface to approach the minimum value. Although the simplex method can be relatively slow, it has the advantage of computational simplicity that makes it useful for a variety of non-linear regression problems. [Pg.2764]
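The simplex moves described above (reflect, expand, contract, shrink) can be sketched in a few lines. This is a simplified Nelder-Mead variant, not the exact published algorithm: the acceptance rule compares the reflected point against the worst vertex rather than the second-worst, and the data, starting point, and step size are hypothetical.

```python
def nelder_mead(f, start, step=0.5, tol=1e-12, max_iter=1000):
    """Minimal simplex minimizer: (m+1)-vertex simplex walks the surface of f."""
    m = len(start)
    simplex = [list(start)]
    for i in range(m):                       # build the initial simplex
        v = list(start); v[i] += step; simplex.append(v)
    for _ in range(max_iter):
        simplex.sort(key=f)                  # best vertex first, worst last
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        centroid = [sum(v[i] for v in simplex[:-1]) / m for i in range(m)]
        refl = [2 * centroid[i] - worst[i] for i in range(m)]     # reflect worst
        if f(refl) < f(best):                # very good: try to extend further
            exp = [3 * centroid[i] - 2 * worst[i] for i in range(m)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(worst):             # some improvement: accept reflection
            simplex[-1] = refl
        else:                                # no improvement: contract inward
            contr = [0.5 * (centroid[i] + worst[i]) for i in range(m)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                            # still worse: shrink toward best
                simplex = [best] + [[0.5 * (v[i] + best[i]) for i in range(m)]
                                    for v in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]

# WSS surface for a straight-line model y = a + b*x with unit weights
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]                    # exactly y = 1 + 2x

def wss(p):
    a, b = p
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(xs, ys))

a_hat, b_hat = nelder_mead(wss, [0.0, 0.0])
```

On this quadratic WSS surface the simplex converges to (a, b) near (1, 2); for genuinely non-linear models the surface may have several local minima, which is where the method's derivative-free simplicity is most useful.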

When a model has been fitted and scores of new samples are required, the basic problem to solve is a simple multiple linear regression problem. The PARAFAC model is given... [Pg.118]

When a Tucker3 model is used for calculating the scores of a new sample, the loading parameters and the core parameters are usually considered fixed. This turns the problem into a multiple linear regression problem. Assume that the first mode is the sample mode and a new sample, Xnew, of size J x K is given. This sample matrix corresponds in size to one horizontal slice of the original array (Figure 6.4). The problem can be expressed... [Pg.123]

Influence measures may be calculated for objects and variables in a two-way array; in this section, only the main influence measure, leverage, is treated. Leverage was originally developed for multiple linear regression problems [Hoaglin & Welsch 1978]. The leverage for a matrix X (I × J) in the regression problem y = Xb + e_y is defined as  [Pg.171]
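Leverage is the diagonal of the hat matrix H = X (X'X)^-1 X'. For the special case of a straight-line model it has a well-known closed form, which makes the idea easy to see without matrix code; the data values below are hypothetical:

```python
def leverages(x):
    """Leverage h_ii = diag(X (X'X)^-1 X') for the model y = b0 + b1*x,
    using the closed form h_i = 1/n + (x_i - xbar)^2 / Sxx."""
    n = len(x)
    xbar = sum(x) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    return [1.0 / n + (xi - xbar) ** 2 / sxx for xi in x]

x = [1.0, 2.0, 3.0, 4.0, 10.0]   # 10.0 lies far from the bulk of the data
h = leverages(x)
```

Two properties worth checking: the leverages sum to the number of fitted parameters (here 2), and the point farthest from the mean of x carries the largest leverage, flagging it as potentially influential.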

Inference on the Expected Response Variables; Solution of Multiple Linear Regression Problems [Pg.14]

Procedure for Using SigmaPlot for Windows; Solution of Multiresponse Linear Regression Problems; Problems on Linear Regression [Pg.14]

VII.52 Formulate the linear regression problem for the following assumed orders of unknown processes ... [Pg.352]

It is evident that all these multiple linear regression problems defined by Equation 13.2 to Equation 13.5 can be written in a vector form ... [Pg.206]

A general linear regression problem can be represented by the following relationship ... [Pg.229]

In linear regression problems (Buzzi-Ferraris and Manenti, 2010b), the parameters b that minimize the residuals of the overdimensioned system... [Pg.112]

The Attic method is based on the idea of introducing one inequality constraint at a time, selecting each one from the most promising ones. Once a vertex of nv constraints is set up, several inequality constraints may be removed simultaneously when appropriate. From a certain point of view, this strategy is similar to the stepwise method of building the best model in a linear regression problem (Vol. 2 - Buzzi-Ferraris and Manenti, 2010b). A forward method is used to insert constraints and a backward method to remove them. [Pg.387]

These solutions are usually found by the same methods as those applied to multiple linear regression problems. Occasionally, however, if the various values of X occur at equal intervals and if the same number of observations are made on Y for each X, the calculations may be made easier through the use of orthogonal polynomial techniques. For details the reader is referred to Anderson and Houseman (1942) or Ostle (1954). [Pg.234]
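The orthogonal-polynomial shortcut mentioned above works because, for equally spaced x with equal replication, each polynomial coefficient can be computed independently as sum(y*Pk) / sum(Pk^2) instead of solving the full normal equations. A sketch for a quadratic fit, with hypothetical data (the construction of P2 assumes equal spacing, which makes it orthogonal to P0 and P1 by symmetry):

```python
def ortho_poly_fit(x, y):
    """Quadratic fit at equally spaced x via discrete orthogonal polynomials:
    P0 = 1, P1 = x - xbar, P2 = P1^2 - mean(P1^2). Each coefficient is
    obtained independently as sum(y*Pk) / sum(Pk^2); returns fitted values."""
    n = len(x)
    xbar = sum(x) / n
    d = [xi - xbar for xi in x]                      # P1 evaluated at each x
    c = sum(di * di for di in d) / n
    p2 = [di * di - c for di in d]                   # P2, orthogonal to P0, P1
    a0 = sum(y) / n
    a1 = sum(yi * di for yi, di in zip(y, d)) / sum(di * di for di in d)
    a2 = sum(yi * pi for yi, pi in zip(y, p2)) / sum(pi * pi for pi in p2)
    return [a0 + a1 * di + a2 * pi for di, pi in zip(d, p2)]

x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.0, 1.0, 4.0, 9.0, 16.0]   # exactly y = x**2
fit = ortho_poly_fit(x, y)
```

Because the polynomials are mutually orthogonal over the design points, adding a higher-order term never changes the lower-order coefficients already computed, which is the practical appeal noted by Anderson and Houseman (1942) and Ostle (1954).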




