Big Chemical Encyclopedia


Recursive regression

Pd40Cu40P20, Pd50Cu30P20, and Pd60Cu20P20 alloys were measured by resonant ultrasound spectroscopy (RUS). In this technique, the spectrum of mechanical resonances for a parallelepiped sample is measured and compared with a theoretical spectrum calculated for a given set of elastic constants. The true set of elastic constants is calculated by a recursive regression method that matches the two spectra [28,29]. [Pg.295]

In this chapter we discuss the principles of the Kalman filter with reference to a few examples from analytical chemistry. The discussion is divided into three parts. First, recursive regression is applied to estimate the parameters of a measurement equation without considering a systems equation. In the second part a systems equation is introduced, making it necessary to extend the recursive regression to a Kalman filter; finally, the adaptive Kalman filter is discussed. In the concluding section, the features of the Kalman filter are demonstrated on a few applications. [Pg.577]

Before we introduce the Kalman filter, we reformulate the least-squares algorithm discussed in Chapter 8 in a recursive way. By way of illustration, we consider a simple straight line model which is estimated by recursive regression. Firstly, the measurement model has to be specified, which describes the relationship between the independent variable x, e.g., the concentrations of a series of standard solutions, and the dependent variable, y, the measured response. If we assume a straight line model, any response is described by ... [Pg.577]
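The equation truncated above presumably takes the standard straight-line measurement-model form. A hedged reconstruction in conventional notation (the symbols are chosen here and may differ slightly from the book's own):

```latex
% Straight-line measurement model (reconstruction, standard notation)
y(j) = b_0 + b_1 x(j) + e(j)
     = \mathbf{x}^{T}(j)\,\mathbf{b} + e(j),
\qquad
\mathbf{x}(j) = \begin{pmatrix} 1 \\ x(j) \end{pmatrix},
\quad
\mathbf{b} = \begin{pmatrix} b_0 \\ b_1 \end{pmatrix}
```

Here e(j) is the experimental error in the j-th response, with variance r(j) as discussed below.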

The expression x^T(j)P(j-1)x(j) in eq. (41.4) represents the variance of the predictions, y(j), at the value x(j) of the independent variable, given the uncertainty in the regression parameters P(j). This expression is equivalent to eq. (10.9) for ordinary least-squares regression. The term r(j) is the variance of the experimental error in the response y(j). How to select the value of r(j) and its influence on the final result are discussed later. The expression between parentheses is a scalar. Therefore, the recursive least-squares method does not require the inversion of a matrix. Inspecting eqs. (41.3) and (41.4), we see that the variance-covariance matrix depends only on the design of the experiments given by x and on the variance of the experimental error given by r, which is in accordance with the ordinary least-squares procedure. [Pg.579]
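For reference, the recursive update that eqs. (41.3) and (41.4) describe has, in standard recursive least-squares notation, the following form (a sketch; indices and symbols are chosen here and may not match the book's exactly). Note that the denominator of the gain is the scalar mentioned in the text, which is why no matrix inversion is needed:

```latex
% Gain vector: scalar denominator, no matrix inversion
\mathbf{k}(j) = \frac{\mathbf{P}(j-1)\,\mathbf{x}(j)}
                     {\mathbf{x}^{T}(j)\,\mathbf{P}(j-1)\,\mathbf{x}(j) + r(j)}
% Parameter update driven by the innovation
\hat{\mathbf{b}}(j) = \hat{\mathbf{b}}(j-1)
    + \mathbf{k}(j)\,\bigl(y(j) - \mathbf{x}^{T}(j)\,\hat{\mathbf{b}}(j-1)\bigr)
% Variance-covariance update
\mathbf{P}(j) = \mathbf{P}(j-1)
    - \mathbf{k}(j)\,\mathbf{x}^{T}(j)\,\mathbf{P}(j-1)
```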

By way of illustration, the regression parameters of a straight line with slope = 1 and intercept = 0 are recursively estimated. The results are presented in Table 41.1. For each step of the estimation cycle, we included the values of the innovation, variance-covariance matrix, gain vector and estimated parameters. The variance of the experimental error of all observations y is 25 10 absorbance units, which corresponds to r = 25 10 au for all j. The recursive estimation is started with a high value (10 ) on the diagonal elements of P and a low value (1) on its off-diagonal elements. [Pg.580]
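The recursion of this worked example can be sketched in code. This is a minimal illustration, not the book's Table 41.1: the data, noise level, and initial values of P are assumptions chosen to mirror the setup described (true slope = 1, intercept = 0, a large value on the diagonal of P at the start).

```python
import numpy as np

def recursive_ls(xs, ys, r=25e-6, p0=1e6):
    """Recursive least-squares fit of y = b0 + b1*x (illustrative sketch)."""
    b = np.zeros(2)                 # parameter estimates [intercept, slope]
    P = np.eye(2) * p0              # start with a large variance-covariance matrix
    for xj, yj in zip(xs, ys):
        h = np.array([1.0, xj])     # measurement (design) vector
        innov = yj - h @ b          # innovation: measured minus predicted response
        s = h @ P @ h + r           # scalar innovation variance -> no matrix inversion
        k = P @ h / s               # gain vector
        b = b + k * innov           # update the parameter estimates
        P = P - np.outer(k, h @ P)  # update the variance-covariance matrix
    return b, P

# Simulated calibration data: true slope = 1, intercept = 0 (assumed noise level)
rng = np.random.default_rng(0)
xs = np.linspace(0.1, 1.0, 10)
ys = 1.0 * xs + rng.normal(0.0, 5e-3, xs.size)
b, P = recursive_ls(xs, ys)
```

After processing all ten observations, b converges toward (0, 1) and the diagonal of P shrinks from its large starting value, mirroring the behaviour tabulated in the text.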

[Figure: flowchart of modelling approaches — Linear Regression/PLS, Neural Network, Recursive Partitioning, SVM]

Recursive partitioning is a feature selection method. As such it shares the deficiencies of other feature selection methods such as stepwise or subset regression. The major deficiencies are as follows ... [Pg.324]

A number of reports also describe the prediction of mechanism-based inhibition (MBI) [17,18]. In this type of model, MBI is determined in part by spectral shift and inactivation kinetics. Jones et al. applied computational pharmacophores, recursive partitioning and logistic regression in attempts to predict metabolic intermediate complex (MIC) formation from structural inputs [17]. The development of models that accurately predict MIC formation will provide another tool to help reduce the overall risk of DDI [19]. [Pg.169]
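Recursive partitioning itself can be sketched compactly. The following is an illustrative CART-style classifier in pure NumPy, not the models used in the cited MIC-prediction work: the data, the Gini split criterion, and the depth limit are all assumptions made for demonstration.

```python
import numpy as np

def gini(labels):
    """Gini impurity of a set of 0/1 labels."""
    if labels.size == 0:
        return 0.0
    p = np.bincount(labels, minlength=2) / labels.size
    return 1.0 - np.sum(p ** 2)

def grow(X, y, depth=0, max_depth=3):
    """Recursively partition the data on the best axis-aligned split."""
    if depth == max_depth or np.unique(y).size == 1:
        return int(np.bincount(y, minlength=2).argmax())  # leaf: majority class
    best = None
    for j in range(X.shape[1]):                # try every feature...
        for t in np.unique(X[:, j]):           # ...and every observed threshold
            mask = X[:, j] <= t
            if mask.all() or not mask.any():   # skip degenerate splits
                continue
            score = (mask.sum() * gini(y[mask])
                     + (~mask).sum() * gini(y[~mask])) / y.size
            if best is None or score < best[0]:
                best = (score, j, t, mask)
    if best is None:                           # no usable split found
        return int(np.bincount(y, minlength=2).argmax())
    _, j, t, mask = best
    return (j, t, grow(X[mask], y[mask], depth + 1, max_depth),
                  grow(X[~mask], y[~mask], depth + 1, max_depth))

def predict(tree, x):
    """Descend the tree until a leaf (an int class label) is reached."""
    while isinstance(tree, tuple):
        j, t, left, right = tree
        tree = left if x[j] <= t else right
    return tree

# Toy two-descriptor data with a simple class boundary (assumed, for illustration)
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
tree = grow(X, y)
acc = np.mean([predict(tree, xi) == yi for xi, yi in zip(X, y)])
```

Each recursive call partitions the remaining samples on the single best split, which is exactly the greedy, tree-growing behaviour that makes recursive partitioning act as a feature-selection method.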

If γ1 equals 0, the model becomes partially recursive. The first equation becomes a regression which can be estimated by ordinary least squares. However, the second equation continues to fail the order condition. To see the problem, consider that even with the restriction, any linear combination of the two equations has the same variables as the original second equation. [Pg.72]

Given a set of experimental data, we look for the time profile of A(t) and b(t) parameters in (C.l). To perform this key operation in the procedure, it is necessary to estimate the model on-line at the same time as the input-output data are received [600]. Identification techniques that comply with this context are called recursive identification methods, since the measured input-output data are processed recursively (sequentially) as they become available. Other commonly used terms for such techniques are on-line or real-time identification, or sequential parameter estimation [352]. Using these techniques, it may be possible to investigate time variations in the process in a real-time context. However, tools for recursive estimation are available for discrete-time models. If the input r(t) is piecewise constant over time intervals (this condition is fulfilled in our context), then the conversion of (C.l) to a discrete-time model is possible without any approximation or additional hypothesis. Most common discrete-time models are difference equation descriptions, such as the Auto-Regression with eXtra inputs (ARX) model. The basic relationship is the linear difference equation ... [Pg.360]
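The recursive identification of such a difference-equation model can be sketched as follows. This is a minimal first-order ARX illustration, not the procedure of the cited work: the simulated system (a = 0.8, b = 0.5), the excitation signal, and the noise level are all assumed for demonstration.

```python
import numpy as np

def identify_arx(u, y, p0=1e6):
    """Recursive least-squares identification of y(t) = a*y(t-1) + b*u(t-1)."""
    theta = np.zeros(2)              # estimates of [a, b]
    P = np.eye(2) * p0               # large initial covariance
    for t in range(1, len(y)):
        phi = np.array([y[t - 1], u[t - 1]])  # regressor of past output and input
        s = phi @ P @ phi + 1.0               # scalar innovation variance
        k = P @ phi / s                       # gain vector
        theta = theta + k * (y[t] - phi @ theta)
        P = P - np.outer(k, phi @ P)
    return theta

# Simulate the assumed first-order system driven by a random input
rng = np.random.default_rng(1)
n = 500
u = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1] + rng.normal(0.0, 0.05)

theta = identify_arx(u, y)
```

Because the parameters are re-estimated at every new sample, the same loop can track slow time variations in the process, which is the real-time use case described above.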

Keywords soft-sensor, Just-In-Time modeling, recursive partial least squares regression, principal component analysis, estimation... [Pg.471]

Using this equation the absorbance measurements can be evaluated by linear regression. By this means one obtains the r eigenvalues using the quantities 5f. The same methods can be used to transform the recursion equation, eq. (5.18), into another one for the absorbances. [Pg.373]

Recursive partitioning, Bayesian classifier, logistic regression, k-nearest neighbor, support vector machine [Pg.325]

