
Nonlinear Regression Problems

In this section, nonlinear predictive model classes are considered; in contrast to the previous linear case, there is in general no closed-form solution for the updated model parameters. An efficient algorithm is introduced in this section to search for the updated parameters. In a fashion similar to Equation (2.99), a nonlinear regression formula takes the following general form  [Pg.47]

The random variable is again modeled as a zero-mean Gaussian random variable with variance σ². Given some measurements of x and y, the likelihood function takes the same form as in [Pg.47]


Given a particular value of the parameter vector n, the closed-form solution for the conditional optimal values of b and σ² can be obtained. For example, with the prior PDF used in Section 2.4.1.1, the conditional optimal value of b can be computed in a fashion similar to Equation (2.103)  [Pg.48]

Once the conditional optimal values of b and σ² are obtained, the value of the goodness-of-fit function can be computed by Equation (2.121); the normalizing constant in the posterior PDF does not affect the parametric identification results. By maximizing the goodness-of-fit function with respect to n, the updated model parameters can be obtained. Therefore, the closed-form solution of the conditional optimal parameters reduces the dimension of the original optimization problem from N + 2 to N only. [Pg.48]
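A minimal sketch of this dimension reduction follows, assuming a hypothetical model y = exp(-n·x) + b + ε with one nonlinear parameter n, a conditionally linear offset b, and Gaussian noise of variance σ²; all names and data values are illustrative, not the book's. Given n, the optimal b and σ² follow in closed form from the residuals, so the search runs over n alone:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical model: y = exp(-n * x) + b + noise, with one nonlinear
# parameter n, a conditionally linear offset b, and noise variance sigma^2.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 50)
y = np.exp(-0.7 * x) + 1.5 + rng.normal(0.0, 0.05, x.size)

def conditional_fit(n):
    """For a fixed nonlinear parameter n, the conditionally optimal b and
    sigma^2 follow in closed form from the residuals."""
    f = np.exp(-n * x)
    b = np.mean(y - f)            # conditionally optimal offset
    r = y - f - b
    sigma2 = np.mean(r ** 2)      # conditionally optimal noise variance
    return b, sigma2

def neg_log_goodness(n):
    # Up to additive constants, the profiled negative log-likelihood
    # depends on n only through the conditionally optimal sigma^2.
    _, sigma2 = conditional_fit(n)
    return 0.5 * x.size * np.log(sigma2)

# The original three-dimensional problem (n, b, sigma^2) is reduced to a
# one-dimensional search over n.
res = minimize_scalar(neg_log_goodness, bounds=(0.01, 5.0), method="bounded")
n_hat = res.x
b_hat, sigma2_hat = conditional_fit(n_hat)
print(n_hat, b_hat, sigma2_hat)
```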


The structure of such models can be exploited to reduce the dimensionality of the nonlinear parameter estimation problem, since the conditionally linear parameters, k1, can be obtained by linear least squares in one step and without the need for initial estimates. Further details are provided in Chapter 8, where we exploit the structure of the model either to reduce the dimensionality of the nonlinear regression problem or to arrive at consistent initial guesses for any iterative parameter search algorithm. [Pg.10]
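A brief sketch of this one-step elimination, under the assumption of a hypothetical separable model y = k1·exp(-k2·t) (the model form, names, and data below are illustrative): for any trial value of the nonlinear parameter k2, the conditionally linear k1 comes from linear least squares with no initial guess.

```python
import numpy as np

# Hypothetical separable model: y = k1 * exp(-k2 * t).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 5.0, 40)
y = 2.0 * np.exp(-0.8 * t) + rng.normal(0.0, 0.02, t.size)

def k1_given_k2(k2):
    g = np.exp(-k2 * t)        # regressor implied by the trial k2
    return g @ y / (g @ g)     # one-step linear least-squares solution

def sse(k2):
    k1 = k1_given_k2(k2)
    return np.sum((y - k1 * np.exp(-k2 * t)) ** 2)

# Search only over k2; the dimension of the nonlinear problem drops by one.
k2_grid = np.linspace(0.1, 2.0, 200)
k2_hat = k2_grid[np.argmin([sse(k2) for k2 in k2_grid])]
print(k2_hat, k1_given_k2(k2_hat))
```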

The objectives in this chapter are two. The first one is to briefly review the essentials of linear regression and to present them in a form that is consistent with our notation and approach followed in subsequent chapters addressing nonlinear regression problems. The second objective is to show that a large number of linear regression problems can now be handled with readily available software such as Microsoft Excel and SigmaPlot . [Pg.23]

As seen in Chapter 2, a suitable measure of the discrepancy between a model and a set of data is the objective function, S(k), and hence the parameter values are obtained by minimizing this function. Therefore, the estimation of the parameters can be viewed as an optimization problem whereby any of the available general-purpose optimization methods can be utilized. In particular, it was found that the Gauss-Newton method is the most efficient method for estimating parameters in nonlinear models (Bard, 1970). As we strongly believe that this is indeed the best method to use for nonlinear regression problems, the Gauss-Newton method is presented in detail in this chapter. It is assumed that the parameters are free to take any values. [Pg.49]
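A minimal sketch of a Gauss-Newton iteration, using a hypothetical two-parameter model y = k1·(1 - exp(-k2·x)) with an analytic Jacobian (model, data, and starting values are illustrative only):

```python
import numpy as np

# S(k) is the sum of squared residuals for the hypothetical model.
rng = np.random.default_rng(2)
x = np.linspace(0.1, 5.0, 30)
y = 3.0 * (1.0 - np.exp(-1.2 * x)) + rng.normal(0.0, 0.05, x.size)

def residuals(k):
    k1, k2 = k
    return y - k1 * (1.0 - np.exp(-k2 * x))

def jacobian(k):
    # Jacobian of the model predictions with respect to the parameters.
    k1, k2 = k
    e = np.exp(-k2 * x)
    return np.column_stack([1.0 - e, k1 * x * e])

k = np.array([1.0, 0.5])     # initial guess
for _ in range(50):
    r, J = residuals(k), jacobian(k)
    step = np.linalg.solve(J.T @ J, J.T @ r)   # Gauss-Newton normal equations
    k = k + step
    if np.linalg.norm(step) < 1e-10:
        break
print(k)
```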

Having the smoothed values of the state variables at each sampling point, and having estimated the time derivatives analytically, we have transformed the problem into a usual nonlinear regression problem for algebraic models. The parameter vector is obtained by minimizing the following LS objective function... [Pg.120]
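A sketch of this shortcut under an assumed first-order model dX/dt = -k·X (the model, the noisy data, and the spline smoothing settings below are all hypothetical): the state is smoothed with a spline, the spline is differentiated analytically, and the ODE parameter then follows from an ordinary algebraic least-squares fit.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical first-order decay sampled with noise.
rng = np.random.default_rng(3)
t = np.linspace(0.0, 3.0, 25)
X = np.exp(-0.9 * t) + rng.normal(0.0, 0.01, t.size)

spl = UnivariateSpline(t, X, k=4, s=len(t) * 0.01 ** 2)
X_smooth = spl(t)               # smoothed state values
dXdt = spl.derivative()(t)      # analytically differentiated spline

# Linear least squares for k in dX/dt = -k * X (minimizing the LS objective).
k_hat = -(X_smooth @ dXdt) / (X_smooth @ X_smooth)
print(k_hat)
```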

The authors describe the use of a Taylor expansion to eliminate the second- and higher-order terms under specific mathematical conditions, in order to make any function (i.e., our regression model) first-order (or linear). They introduce the use of the Jacobian matrix for solving nonlinear regression problems and describe the matrix mathematics in some detail (pp. 178-181). [Pg.165]
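The cited derivation is not reproduced here; as a practical stand-in when analytic derivatives are inconvenient, the Jacobian of the model predictions can be approximated by forward differences. The helper below is a hypothetical illustration, not the authors' code:

```python
import numpy as np

def numerical_jacobian(f, k, h=1e-7):
    """Forward-difference approximation of the Jacobian of model
    predictions f(k) with respect to the parameter vector k."""
    f0 = f(k)
    J = np.empty((f0.size, k.size))
    for j in range(k.size):
        kp = k.copy()
        kp[j] += h * max(1.0, abs(k[j]))   # scaled perturbation
        J[:, j] = (f(kp) - f0) / (kp[j] - k[j])
    return J

# Example with a hypothetical model evaluated at three x values.
x = np.array([1.0, 2.0, 3.0])
model = lambda k: k[0] * np.exp(-k[1] * x)
print(numerical_jacobian(model, np.array([2.0, 0.5])))
```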

Among the nonlinear methods there are, besides nonlinear least squares regression (that is, polynomial regression), the nonlinear PLS method, Alternating Conditional Expectations (ACE), SMART, and MARS. Moreover, some Artificial Neural Network techniques have been specifically designed for nonlinear regression problems, such as the back-propagation method. [Pg.126]

Kowalik, J., Morrison, J. F.: Analysis of Kinetic Data for Allosteric Enzyme Reactions as a Nonlinear Regression Problem. Math. Biosci. 2, 57 (1968). [Pg.71]

Table 3.5 Example of jackknifing a nonlinear regression problem.
In Chapter 2, Section 2.4, parametric identification was introduced for linear and nonlinear regression problems. In this section, the Bayesian model class selection is applied to these problems. In order to smooth the presentation, some of the equations from Section 2.4 are repeated in this section. [Pg.229]

The training of the ME network may be performed using a maximum likelihood parameter estimator. For the class of nonlinear regression problems (which is our case), the objective is to map a set of training patterns (x, d). [Pg.840]

Unlike in linear regression, where exact results can be obtained under the stated assumptions, in nonlinear regression the results are only approximate. Furthermore, there do not exist nice matrix-based solutions for the various parameters. This section provides a convenient summary of the useful equations for nonlinear regression. In general, to compute the approximate confidence intervals for a nonlinear regression problem, the final grand Jacobian matrix J can be used in place of A (and Jᵀ in place of Aᵀ) in the linear regression formulae. [Pg.122]
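A hedged sketch of that substitution, using the standard linear-regression recipe cov(k) ≈ s²(JᵀJ)⁻¹ with the final Jacobian J; the function name and the small J, r, and k values below are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def approx_confidence_intervals(J, r, k_hat, alpha=0.05):
    """Approximate (1 - alpha) confidence intervals from the final
    Jacobian J (n points, p parameters), residuals r, and estimates k_hat."""
    n, p = J.shape
    s2 = (r @ r) / (n - p)               # residual variance estimate
    cov = s2 * np.linalg.inv(J.T @ J)    # approximate parameter covariance
    se = np.sqrt(np.diag(cov))           # standard errors
    t_crit = stats.t.ppf(1.0 - alpha / 2.0, n - p)
    return k_hat - t_crit * se, k_hat + t_crit * se

# Illustrative numbers only.
J = np.array([[1.0, 0.1], [1.0, 0.5], [1.0, 0.9], [1.0, 1.3]])
r = np.array([0.02, -0.01, 0.015, -0.02])
print(approx_confidence_intervals(J, r, np.array([2.0, -0.8])))
```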

A detailed example on solving the nonlinear regression problem is given in Sect. 7.8.2, "Nonlinear Regression Example for MATLAB", and Sect. 8.7.2, "Nonlinear Regression Example for Excel". [Pg.125]

In this case, however, it is not possible to solve the nonlinear Equation A10.38 analytically; a numerical algorithm, for example the Newton-Raphson method for the solution of nonlinear equations, must be applied (Appendix 1). A general way to solve the nonlinear regression problem of Equation A10.36 is to vary the value of a systematically by an optimum search method until the minimum is attained. This method is called nonlinear regression, and it is illustrated in Figure A10.6. [Pg.597]
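A minimal sketch of such a one-parameter search, assuming a hypothetical saturation model y = x/(a + x) in place of Equation A10.36 (the model and data are illustrative): the objective is evaluated over a and driven to its minimum by a bounded scalar search.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical data from y = x / (0.4 + x) plus noise.
rng = np.random.default_rng(4)
x = np.linspace(0.0, 2.0, 20)
y = x / (0.4 + x) + rng.normal(0.0, 0.01, x.size)

def sse(a):
    # Sum of squared residuals as a function of the single parameter a.
    return np.sum((y - x / (a + x)) ** 2)

res = minimize_scalar(sse, bounds=(0.01, 2.0), method="bounded")
print(res.x)   # value of a at the minimum of the objective
```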

Fortunately, powerful nonlinear regression programs are now available. These programs allow us to minimize the sum of the squares of the deviations in any variable we choose, linear or not. Moreover, some of the easier nonlinear regression problems can be solved with a simple spreadsheet. Let's illustrate the use of a spreadsheet to carry out nonlinear regression by reanalyzing the AIBN decomposition data in Table 6-5. [Pg.171]

With all nonlinear regression problems, it is advisable to check the final solution to be sure that a true minimum has been achieved. This is done by varying the values of k(363) and E by a small amount in both directions around the values determined by nonlinear regression. The results in Appendix 6-A show that Σᵢ[(kᵢ,theo − kᵢ,exp)/kᵢ,exp]² increases when k(363) and E are either increased or decreased slightly from the values determined by nonlinear regression. This is the behavior that would be expected if a minimum in Σᵢ[(kᵢ,theo − kᵢ,exp)/kᵢ,exp]² had, indeed, been found. [Pg.172]
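The Table 6-5 data are not reproduced here, so the sketch below fits synthetic rate constants generated from assumed values of k(363) and E, using a reparameterized Arrhenius form k(T) = k363·exp[-(E/R)(1/T - 1/363)] so that k(363) and E appear directly as the fitted parameters; it then performs the perturbation check described above. Everything except the check itself is an illustrative assumption.

```python
import numpy as np
from scipy.optimize import least_squares

R = 8.314  # J/(mol K)

# Synthetic stand-in for the Table 6-5 rate constants (assumed k363 and E).
rng = np.random.default_rng(5)
T = np.array([343.0, 353.0, 363.0, 373.0, 383.0])
k_exp = 1.0e-4 * np.exp(-(1.3e5 / R) * (1.0 / T - 1.0 / 363.0))
k_exp *= 1.0 + rng.normal(0.0, 0.02, T.size)

def k_theo(p, T):
    k363, E = p
    return k363 * np.exp(-(E / R) * (1.0 / T - 1.0 / 363.0))

def rel_residuals(p):
    # Relative deviations, so all temperatures carry comparable weight.
    return (k_theo(p, T) - k_exp) / k_exp

fit = least_squares(rel_residuals, x0=[1.0e-4, 1.0e5],
                    x_scale=[1.0e-4, 1.0e5])
k363_hat, E_hat = fit.x
sse_min = np.sum(rel_residuals(fit.x) ** 2)

# Check the minimum: nudging either parameter up or down should raise SSE.
for dk, dE in [(1.01, 1.0), (0.99, 1.0), (1.0, 1.01), (1.0, 0.99)]:
    sse = np.sum(rel_residuals([k363_hat * dk, E_hat * dE]) ** 2)
    print(dk, dE, sse > sse_min)   # expect True in every case
print(k363_hat, E_hat, sse_min)
```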

