Functional estimation problem regression function

The formulation of the parameter estimation problem is as important as the actual solution of the problem (i.e., the determination of the unknown parameters). In formulating the parameter estimation problem we must answer two questions: (a) what type of mathematical model do we have, and (b) what type of objective function should we minimize? In this chapter we address both of these questions. Although the primary focus of this book is the treatment of mathematical models that are nonlinear with respect to the parameters (nonlinear regression), consideration will also be given to linear models (linear regression). [Pg.7]

The magnitude of I_emp(g) will be referred to as the empirical error. All regression algorithms, by minimizing the empirical risk I_emp(g), produce an estimate, g(x), which is the solution to the functional estimation problem. [Pg.151]
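
As a minimal illustration of the quantity being minimized, the sketch below computes an empirical squared-error risk over a training set; the data and the candidate function g are hypothetical, and the symbol I_emp simply mirrors the notation used above.

```python
import numpy as np

def empirical_risk(g, x, y):
    """Empirical squared-error risk I_emp(g) = (1/N) * sum_i (y_i - g(x_i))^2."""
    residuals = y - g(x)
    return np.mean(residuals ** 2)

# Hypothetical data and a hypothetical candidate regression function g(x)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.5 + 0.1 * rng.standard_normal(x.size)

g = lambda x: 2.1 * x + 0.4          # a candidate estimate g(x)
print(empirical_risk(g, x, y))       # the empirical error to be minimized
```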

A basic version of the hedonic price function for maize herbicides, P(z), was estimated by regressing PRICE on the four characteristics RELIA, PERSI, ACTION and TOX. A standard problem in such an estimation is that not all characteristics of importance for the herbicide price are likely to be included. In this study, the most obvious examples are characteristics that... [Pg.57]
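
A rough sketch of such a hedonic regression by ordinary least squares is shown below; the characteristic values and prices are hypothetical placeholders, not the data from the study.

```python
import numpy as np

# Hypothetical characteristics matrix: columns RELIA, PERSI, ACTION, TOX
# (the actual herbicide data from the study are not reproduced here)
X = np.array([
    [0.90, 30.0, 1.0, 2.0],
    [0.70, 45.0, 2.0, 3.0],
    [0.80, 20.0, 1.0, 1.0],
    [0.60, 60.0, 3.0, 4.0],
    [0.95, 25.0, 2.0, 2.0],
])
price = np.array([52.0, 48.0, 40.0, 55.0, 60.0])   # hypothetical PRICE values

# Add an intercept and estimate the hedonic coefficients by OLS
A = np.column_stack([np.ones(len(price)), X])
coef, *_ = np.linalg.lstsq(A, price, rcond=None)
print(dict(zip(["const", "RELIA", "PERSI", "ACTION", "TOX"], coef)))
```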

When working with regression-based techniques for process model identification, one of the challenging tasks is to determine the most appropriate process model structure. In a linear model context, this would be information such as the number of poles and zeros to be included in the transfer function description. If the structure of the system being identified is known in advance, then the problem reduces to a much simpler parameter estimation problem. [Pg.3]
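
One common way to compare candidate model structures is to fit each candidate by least squares and rank the fits with an information criterion. The sketch below uses ARX models of different orders as a simple stand-in for choosing the number of poles and zeros; the input/output data and the AIC-based comparison are hypothetical illustrations, not the procedure of any particular text.

```python
import numpy as np

def fit_arx(y, u, na, nb):
    """Least-squares fit of an ARX(na, nb) model:
    y[k] = -a1*y[k-1] - ... - a_na*y[k-na] + b1*u[k-1] + ... + b_nb*u[k-nb]."""
    n0 = max(na, nb)
    rows = []
    for k in range(n0, len(y)):
        rows.append(np.concatenate([-y[k - na:k][::-1], u[k - nb:k][::-1]]))
    Phi = np.array(rows)
    target = y[n0:]
    theta, *_ = np.linalg.lstsq(Phi, target, rcond=None)
    rss = np.sum((target - Phi @ theta) ** 2)
    N = len(target)
    aic = N * np.log(rss / N) + 2 * (na + nb)   # simple AIC for order comparison
    return theta, aic

# Hypothetical input/output data from a first-order process plus noise
rng = np.random.default_rng(1)
u = rng.standard_normal(200)
y = np.zeros(200)
for k in range(1, 200):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1] + 0.05 * rng.standard_normal()

for na, nb in [(1, 1), (2, 1), (2, 2)]:
    _, aic = fit_arx(y, u, na, nb)
    print(f"ARX({na},{nb}): AIC = {aic:.1f}")
```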

The unknown model parameters will be obtained by minimizing a suitable objective function. The objective function is a measure of the discrepancy, or departure, of the data from the model, i.e., the lack of fit (Bard, 1974; Seinfeld and Lapidus, 1974). Thus, our problem can also be viewed as an optimization problem, and one can in principle employ any of the variety of solution methods available for such problems (Edgar and Himmelblau, 1988; Gill et al., 1981; Reklaitis, 1983; Scales, 1985). Finally, it should be noted that engineers use the term parameter estimation, whereas statisticians use terms such as nonlinear or linear regression analysis to describe the subject presented in this book. [Pg.2]
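
A minimal sketch of treating parameter estimation as a generic optimization problem, here with SciPy's Nelder-Mead minimizer and a hypothetical two-parameter model (the model, data, and choice of optimizer are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data generated from a model y = k1 * (1 - exp(-k2 * x))
x = np.linspace(0.0, 10.0, 20)
y_obs = (2.0 * (1.0 - np.exp(-0.3 * x))
         + 0.02 * np.random.default_rng(2).standard_normal(x.size))

def objective(k):
    """Least-squares objective S(k): a measure of the lack of fit."""
    y_model = k[0] * (1.0 - np.exp(-k[1] * x))
    return np.sum((y_obs - y_model) ** 2)

# Any general-purpose optimizer can be used to minimize S(k)
result = minimize(objective, x0=[1.0, 1.0], method="Nelder-Mead")
print(result.x)   # estimated parameters k1, k2
```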

As seen in Chapter 2, a suitable measure of the discrepancy between a model and a set of data is the objective function, S(k), and hence the parameter values are obtained by minimizing this function. Therefore, the estimation of the parameters can be viewed as an optimization problem in which any of the available general-purpose optimization methods can be utilized. In particular, the Gauss-Newton method has been found to be the most efficient method for estimating parameters in nonlinear models (Bard, 1970). As we strongly believe that this is indeed the best method to use for nonlinear regression problems, the Gauss-Newton method is presented in detail in this chapter. It is assumed that the parameters are free to take any values. [Pg.49]
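
A bare-bones Gauss-Newton iteration for nonlinear least squares is sketched below; the exponential-rise model, its data, and the analytical sensitivity matrix are hypothetical, and the step-size control and convergence safeguards discussed in the chapter are omitted.

```python
import numpy as np

def gauss_newton(residual, jacobian, k0, max_iter=50, tol=1e-8):
    """Basic Gauss-Newton iteration for nonlinear least squares:
    solve (J^T J) dk = J^T r at each step, with r = y - f(x, k)."""
    k = np.asarray(k0, dtype=float)
    for _ in range(max_iter):
        r = residual(k)
        J = jacobian(k)
        step = np.linalg.solve(J.T @ J, J.T @ r)
        k = k + step
        if np.linalg.norm(step) < tol:
            break
    return k

# Hypothetical model y = k1 * (1 - exp(-k2 * x)) and simulated (noise-free) data
x = np.linspace(0.0, 10.0, 20)
k_true = np.array([2.0, 0.3])
y = k_true[0] * (1.0 - np.exp(-k_true[1] * x))

model = lambda k: k[0] * (1.0 - np.exp(-k[1] * x))
residual = lambda k: y - model(k)

def jacobian(k):
    # Sensitivity matrix df/dk, derived analytically for this model
    df_dk1 = 1.0 - np.exp(-k[1] * x)
    df_dk2 = k[0] * x * np.exp(-k[1] * x)
    return np.column_stack([df_dk1, df_dk2])

print(gauss_newton(residual, jacobian, k0=[1.0, 0.1]))
```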

Having the smoothed values of the state variables at each sampling point, and having estimated their time derivatives analytically, we have transformed the problem into a standard nonlinear regression problem for algebraic models. The parameter vector is obtained by minimizing the following LS objective function...
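
A minimal sketch of this short-cut, assuming a hypothetical first-order decay dC/dt = -kC: the measured state is smoothed with a spline, the time derivative is taken from the smoother, and the rate constant then follows from an ordinary (here even linear) least-squares regression on the algebraic rate expression.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical concentration-time data from a first-order decay dC/dt = -k*C
t = np.linspace(0.0, 10.0, 15)
C_obs = 1.0 * np.exp(-0.4 * t) + 0.01 * np.random.default_rng(3).standard_normal(t.size)

# Step 1: smooth the state variable and differentiate the smoother analytically
spline = UnivariateSpline(t, C_obs, k=3, s=1e-3)
C_smooth = spline(t)
dCdt = spline.derivative()(t)

# Step 2: the ODE model is now an algebraic relation dC/dt = -k*C,
# so k follows from a simple least-squares fit
k_hat = -np.sum(dCdt * C_smooth) / np.sum(C_smooth ** 2)
print(k_hat)
```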

This problem may be solved by linear regression using equations 3.4-11 (with n = 1) and 3.4-9 (with n = 2), which correspond to the relationships developed for first-order and second-order kinetics, respectively. However, here we illustrate the use of nonlinear regression applied directly to the differential equation 3.4-8, so as to avoid the use of particular linearized integrated forms. The method employs user-defined functions within the E-Z Solve software. The rate constants estimated for the first-order and second-order cases are 0.0441 and 0.0504 (in appropriate units), respectively (file ex3-8.msp shows how this is done in E-Z Solve). As indicated in Figure 3.9, there is little difference between the experimental data and the predictions from either the first- or second-order rate expression. This lack of sensitivity to the reaction order is common when fA < 0.5 (here, fA = 0.28). [Pg.59]
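
The E-Z Solve worksheet itself is not reproduced here; the sketch below shows the same idea, fitting k directly through numerical integration of the rate equation dcA/dt = -k cA^n for n = 1 and n = 2, using SciPy as a substitute and hypothetical concentration-time data rather than the data of the example.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Hypothetical concentration-time data (the original data set behind ex3-8.msp
# is not reproduced here)
t_data = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0])
cA_data = np.array([1.00, 0.80, 0.65, 0.53, 0.44, 0.36])

def cA_model(k, n):
    """Integrate dcA/dt = -k * cA**n and return cA at the measurement times."""
    sol = solve_ivp(lambda t, c: -k * c ** n, (t_data[0], t_data[-1]),
                    [cA_data[0]], t_eval=t_data, rtol=1e-8)
    return sol.y[0]

for n in (1, 2):
    fit = least_squares(lambda k: cA_data - cA_model(k[0], n), x0=[0.05])
    print(f"order n = {n}: k = {fit.x[0]:.4f}")
```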

Linear regression methods are convenient because they usually allow an interpretation, and methods like PLS or PCR avoid the problem of overfitting, especially if many parameters have to be estimated from only a few objects. However, the relation between a response variable and one or several predictor variables can in some situations be better described by a nonlinear function, such as the functional relation... [Pg.182]
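
A small sketch contrasting a linear fit with a nonlinear fit of an assumed exponential relation (the specific functional relation referred to in the text is not reproduced); the data are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data following a nonlinear (exponential) relation between
# a response y and a single predictor x
rng = np.random.default_rng(4)
x = np.linspace(0.0, 5.0, 30)
y = 1.5 * np.exp(0.6 * x) + 0.5 * rng.standard_normal(x.size)

# Linear fit (for comparison) and a nonlinear fit of y = a * exp(b * x)
lin = np.polyfit(x, y, 1)
(a, b), _ = curve_fit(lambda x, a, b: a * np.exp(b * x), x, y, p0=[1.0, 0.5])

print("linear slope/intercept:", lin)
print("nonlinear a, b:", a, b)
```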

In a strict sense, parameter estimation is the procedure of computing the estimates by locating the extremum point of an objective function. A further advantage of the least squares method is that this step is well supported by efficient numerical techniques. Its use is particularly simple if the response function (3.1) is linear in the parameters, since the estimates are then found by linear regression, without the iteration inherent in nonlinear optimization problems. [Pg.143]
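
A sketch of this special case: for a model that is linear in the parameters, the least-squares estimates follow from a single linear-regression step with no iteration. The model and data below are hypothetical.

```python
import numpy as np

# Hypothetical response y that is linear in the parameters k1, k2:
# y = k1 * x + k2 * x**2
x = np.linspace(0.0, 4.0, 25)
y = 1.2 * x + 0.3 * x ** 2 + 0.05 * np.random.default_rng(5).standard_normal(x.size)

# Because the model is linear in the parameters, the least-squares estimates
# are obtained directly by linear regression -- no iterative optimization needed
X = np.column_stack([x, x ** 2])
k_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(k_hat)
```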

To solve this problem, some assumptions should be made about the relationship between the error of the regression line and the concentration. As a rule, one assumes that the error of the regression line is proportional to the concentration. The variance function Var(X) is obtained by plotting the standard error vs. the concentration. The function is then estimated with the least-squares method as Var(X) = s^2 = (c + d·conc)^2. An alternative approach is described in the ISO 11483-2 standard, which uses an iterative procedure to estimate the variance function [18]. [Pg.145]
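
A simplified sketch of the proportional-error approach (not the ISO iterative procedure): the standard error is fitted as a linear function of concentration, the variance function is formed from it, and its reciprocal is used to weight the calibration fit. All numbers are hypothetical.

```python
import numpy as np

# Hypothetical calibration data: summary statistics at each concentration level
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
sd = np.array([0.05, 0.09, 0.22, 0.41, 0.83])     # standard error at each level
resp = np.array([1.02, 2.05, 4.95, 10.1, 19.8])   # mean response at each level

# Fit the standard error as a linear function of concentration: s = c + d * conc
d, c = np.polyfit(conc, sd, 1)
var = (c + d * conc) ** 2                          # variance function Var(X)

# Use 1/Var(X) as weights in a weighted least-squares calibration fit
w = 1.0 / var
slope, intercept = np.polyfit(conc, resp, 1, w=np.sqrt(w))
print("variance function coefficients:", c, d)
print("weighted calibration line:", slope, intercept)
```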

