Least squares recursive

The expression x^T(j)P(j-1)x(j) in eq. (41.4) represents the variance of the predictions, ŷ(j), at the value x(j) of the independent variable, given the uncertainty in the regression parameters P(j). This expression is equivalent to eq. (10.9) for ordinary least squares regression. The term r(j) is the variance of the experimental error in the response y(j). How to select the value of r(j) and its influence on the final result are discussed later. The expression between parentheses is a scalar. Therefore, the recursive least squares method does not require the inversion of a matrix. When inspecting eqs. (41.3) and (41.4), we can see that the variance-covariance matrix only depends on the design of the experiments, given by x, and on the variance of the experimental error, given by r, which is in accordance with the ordinary least-squares procedure. [Pg.579]
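The gain and covariance recursion described above can be written compactly. The following is a minimal sketch in Python (using NumPy) with the notation of the text: x(j) is the regressor, P the variance-covariance matrix of the estimates, and r(j) the variance of the experimental error. The function name and the initialization shown are illustrative assumptions, not the book's own listing.

import numpy as np

def rls_update(b, P, x, y, r):
    # b : current parameter estimates (n,)
    # P : variance-covariance matrix of the estimates (n, n)
    # x : regressor for the new observation (n,)
    # y : new measured response (scalar)
    # r : variance of the experimental error in y (scalar)
    denom = float(x @ P @ x) + r      # scalar: prediction variance + error variance
    k = P @ x / denom                 # gain vector; no matrix inversion needed
    e = y - float(x @ b)              # innovation (prediction error)
    b_new = b + k * e                 # updated parameter estimates
    P_new = P - np.outer(k, x) @ P    # updated variance-covariance matrix
    return b_new, P_new

# Typical use: start from b = 0 and P = (large number) * identity,
# then call rls_update once for every new observation (x(j), y(j)).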

P.C. Thijssen, L.J.P. Vogels, H.C. Smit and G. Kateman, Optimal selection of wavelengths in spectrophotometric multicomponent analysis using recursive least squares. Z. Anal. Chem., 320(1985) 531-540. [Pg.603]

Equations 13.14 to 13.16 constitute the well known recursive least squares (RLS) algorithm. It is the simplest and most widely used recursive estimation method. It should be noted that it is computationally very efficient as it does not require a matrix inversion at each sampling interval. Several researchers have introduced a variable forgetting factor to allow a more precise estimation of θ when the process is not "sensed" to change. [Pg.221]
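As a hedged illustration of how a forgetting factor enters the recursion, the sketch below uses a fixed factor lam; a variable forgetting factor, as mentioned above, would adjust lam on-line (for example, keeping it close to 1 while the prediction errors stay small). This is a generic sketch, not the specific form of equations 13.14 to 13.16 or of the cited variable-forgetting schemes.

import numpy as np

def rls_forgetting(theta, P, phi, y, lam=0.98):
    # One RLS step with forgetting factor lam (0 < lam <= 1); lam = 1 gives ordinary RLS.
    denom = lam + float(phi @ P @ phi)
    k = P @ phi / denom               # gain vector (still no matrix inversion)
    e = y - float(phi @ theta)        # prediction error
    theta = theta + k * e
    # Dividing by lam keeps P from shrinking to zero, so the estimator
    # retains enough gain to track parameters that drift over time.
    P = (P - np.outer(k, phi) @ P) / lam
    return theta, P, e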

Procedures on how to make inferences on the parameters and the response variables are introduced in Chapter 11. The design of experiments has a direct impact on the quality of the estimated parameters and is presented in Chapter 12. The emphasis is on sequential experimental design for parameter estimation and for model discrimination. Recursive least squares estimation, used for on-line data analysis, is briefly covered in Chapter 13. [Pg.448]

An alternative SPM framework for autocorrelated data is developed by monitoring variations in time series model parameters that are updated at each new measurement instant. Parameter change detection with recursive weighted least squares was used to detect changes in the parameters and the order of a time series model that describes stock prices in financial markets [263]. Here, recursive least squares is extended with adaptive forgetting. [Pg.27]

In this chapter we present very briefly the basic algorithm for recursive least squares estimation and some of its variations for single-input, single-output systems. These techniques are routinely used for on-line parameter estimation in data acquisition systems. They are presented in this chapter without proof, for the sake of completeness and with the aim of providing the reader with a quick overview. For a thorough presentation of the material the reader may consult any of the following references: Soderstrom et al. (1978), Ljung and Soderstrom (1983), Shanmugan and Breipohl (1988), and Wellstead and Zarrop (1991). The notation used in this chapter differs from the one we have used up to now; instead, we follow the notation typically encountered in the analysis and control of sampled-data systems. [Pg.239]

Structure of the identification model: ARX. Identification method: recursive least squares. Adaptation algorithm: parametric, decreasing gain. Te = 3 s. Delay D = 0... [Pg.43]
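To show how an ARX structure such as the one above feeds a recursive least squares estimator, here is a minimal sketch of building the regressor from past outputs and delayed inputs; the orders na and nb, the delay d, and the signal names are assumptions for illustration only.

import numpy as np

def arx_regressor(y, u, k, na=2, nb=2, d=0):
    # Regressor phi(k) for y(k) = -a1*y(k-1) - ... - a_na*y(k-na)
    #                            + b1*u(k-d-1) + ... + b_nb*u(k-d-nb)
    past_y = [-y[k - i] for i in range(1, na + 1)]
    past_u = [u[k - d - i] for i in range(1, nb + 1)]
    return np.array(past_y + past_u)

# At each sampling instant k the pair (phi, y[k]) is passed to the
# recursive least squares update sketched earlier in this entry.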

This chapter describes two new methods for obtaining frequency response and step response models from processes operating under relay feedback control. Both methods are based on the frequency sampling filter model structure and a recursive least squares estimator. [Pg.201]

Given the process input-output data generated from the relay experiment, the parameter vector θ can be estimated using a recursive algorithm. Here, we propose to use the recursive least squares algorithm (Goodwin and Sin, 1984) given as follows... [Pg.204]

The second type of model is called a recurrent model or N-step-ahead prediction model; the recursive least squares algorithm can be used to identify this type of model. Only after the pH has been predicted at a particular time can the next predicted value be calculated. In addi-...

The simplest identification procedure for a recurrent model is to assume fixed membership functions and use recursive least squares to identify the consequence part of the model. The model structure in this case is the one shown in Eqn. (29.25b). [Pg.409]
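Because the membership functions are fixed, the model output is linear in the consequence parameters, so each observation can be written as a weighted regressor and handed to the same recursive least squares update. The sketch below assumes first-order (Takagi-Sugeno-type) consequences and normalized firing strengths; it illustrates the idea only and is not the structure of Eqn. (29.25b) itself.

import numpy as np

def consequence_regressor(x, w):
    # x : input vector (n,)
    # w : normalized firing strengths of the rules (m,), summing to 1
    # Rule i consequence: y_i = a_i^T x + c_i, so the model output
    # y = sum_i w_i * (a_i^T x + c_i) is linear in all (a_i, c_i),
    # and those parameters can be estimated with recursive least squares.
    blocks = [w_i * np.append(x, 1.0) for w_i in w]
    return np.concatenate(blocks)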

F2912.m using recursive least squares for pH recurrent model development... [Pg.410]
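The M-file itself is not reproduced here. As a hedged sketch of the recurrent idea, in which each predicted value must be fed back before the next one can be computed, a generic N-step-ahead loop might look as follows; the one-step model predict_one, the signal names, and the horizon are illustrative assumptions, not the contents of F2912.m.

def n_step_ahead(theta, y_hist, u_future, predict_one, n):
    # theta       : identified model parameters
    # y_hist      : list of past (measured) outputs, e.g. pH values
    # u_future    : planned inputs over the prediction horizon
    # predict_one : one-step-ahead model, y(k+1) = predict_one(theta, y_hist, u)
    y_hist = list(y_hist)
    preds = []
    for k in range(n):
        y_next = predict_one(theta, y_hist, u_future[k])
        preds.append(y_next)
        y_hist.append(y_next)   # feed the prediction back for the next step
    return preds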

If the filter order m is not fixed but increases with the time index k instead, a recursive least-squares filter can be derived from the batch-type least squares [4].


See other pages where Least squares recursive is mentioned: [Pg.218]    [Pg.219]    [Pg.139]    [Pg.17]    [Pg.240]    [Pg.207]    [Pg.212]    [Pg.19]    [Pg.124]    [Pg.133]    [Pg.3921]    [Pg.439]    [Pg.1187]    [Pg.204]    [Pg.233]    [Pg.238]    [Pg.402]    [Pg.402]    [Pg.403]    [Pg.221]    [Pg.226]    [Pg.429]    [Pg.391]    [Pg.606]    [Pg.607]   







Recursion

Recursive

Recursive Extended Least Squares (RELS)

Recursive Generalized Least Squares (RGLS)

Recursive Least Squares (RLS)

Recursive Least squares modeling
