
Nonlinear Hessian matrix

This form is convenient in that the active inequality constraints can now be replaced in the QP by all of the inequalities, with the result that the active set is determined directly from the QP solution. Finally, since second derivatives are often hard to calculate and a unique solution is desired for the QP problem, the Hessian matrix is approximated by a positive-definite matrix B, which is constructed by a quasi-Newton formula and requires only first-derivative information. Thus, the Newton-type derivation for (2) leads to a nonlinear programming algorithm based on the successive solution of the following QP subproblem ... [Pg.201]
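For reference, the standard SQP subproblem at an iterate x_k minimizes ∇f(x_k)ᵀd + ½ dᵀB_k d subject to the linearized constraints, and B_k is then updated from gradient information alone. The sketch below is a minimal illustration of the classical BFGS update of B (generic names and notation, not taken from the cited source):

import numpy as np

def bfgs_update(B, s, y):
    # B: current positive-definite Hessian approximation
    # s: step x_{k+1} - x_k
    # y: gradient difference, e.g. grad L(x_{k+1}) - grad L(x_k) for the Lagrangian L
    if s @ y <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return B                     # skip the update to preserve positive definiteness
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

Production SQP codes typically use a damped variant of this update (Powell's modification) so that B stays positive definite even when the curvature condition sᵀy > 0 fails.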

The methods differ in the determination of the step-length factor α_k at the kth iteration, since, due to nonlinearities, the direction of steepest descent is not necessarily the optimal one; it is optimal only for quadratic dependencies. Some methods therefore use the second-derivative matrix of the objective function with respect to the parameters, the Hessian matrix, to determine both the length and the direction of the parameter-improvement step ... [Pg.316]
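In generic notation (not taken from the cited source), the two update rules contrasted here are

\[
x_{k+1} = x_k - \alpha_k \,\nabla f(x_k) \quad\text{(steepest descent)},
\qquad
x_{k+1} = x_k - \alpha_k \,[\nabla^2 f(x_k)]^{-1}\,\nabla f(x_k) \quad\text{(Newton-type step)} .
\]

The Hessian-based step supplies both a direction and a natural step length (α_k = 1 near the minimum), whereas the steepest-descent direction still requires a separate choice of α_k.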


In multidimensional optimization problems, the matrix B can be a poor approximation of the Hessian and yet, provided it is positive definite, still guarantee a reduction in the merit function. Conversely, the matrix B involved in the solution of nonlinear systems must be a good estimate of the Jacobian. [Pg.247]
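The reason an inaccurate but positive-definite B is acceptable in optimization is that the search direction d_k = -B⁻¹∇f(x_k) is then always a descent direction (generic notation):

\[
\nabla f(x_k)^{\mathsf T} d_k \;=\; -\,\nabla f(x_k)^{\mathsf T} B^{-1} \nabla f(x_k) \;<\; 0
\qquad \text{whenever } B \succ 0 \text{ and } \nabla f(x_k) \neq 0 ,
\]

so a line search along d_k can always reduce the merit function. No analogous property protects a Newton-type correction -B⁻¹F(x) for a nonlinear system F(x) = 0 when B is a poor estimate of the Jacobian.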

In optimization problems, the Hessian is only occasionally ill-conditioned at the minimum of the function. In the solution of systems of nonlinear equations, by contrast, the Jacobian matrix may become singular where the gradient of the merit function approaches zero, that is, at the minimum of that same function. [Pg.254]
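A one-line argument behind the second observation (generic notation): the usual merit function for a system F(x) = 0 is

\[
\Phi(x) = \tfrac{1}{2}\,\lVert F(x)\rVert_2^{2}, \qquad \nabla\Phi(x) = J(x)^{\mathsf T} F(x),
\]

so if Φ attains a local minimum at a point where F(x) ≠ 0 (no root exists nearby), then J(x)ᵀF(x) = 0 with F(x) ≠ 0 forces the square Jacobian J(x) to be singular there, and Newton-type corrections become unreliable in that neighborhood.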

Large sparse problems with bound, linear equality and inequality, and nonlinear equality constraints. G is a BzzMatrixSparseSymmetricLocked object that provides the structure of the Hessian of the function (13.1). nH is the number of nonlinear equality constraints, HName is the name of the function where the nonlinear equality constraints are calculated, H is the name of the BzzMatrixSparseLocked matrix with the structure of the nonlinear equality constraints, and nlH is a BzzVectorInt whose elements indicate which variables are genuinely nonlinear in the system of nonlinear equality constraints. The matrices E and D are sparse (BzzMatrixSparseLocked) ... [Pg.443]
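In generic notation (E, D and the nonlinear equality constraints follow the excerpt; the bounds l, u and the right-hand sides e, d are assumed labels for illustration), the problem class described here has the form

\[
\min_{x} \; f(x)
\quad \text{s.t.} \quad
l \le x \le u, \qquad E\,x = e, \qquad D\,x \le d, \qquad h(x) = 0 ,
\]

where h collects the nH nonlinear equality constraints and the sparsity structures of the Hessian of f and of h are supplied through the locked sparse-matrix objects.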

Nonlinear CG methods form another popular type of optimization scheme for large-scale problems where memory and computational performance are important considerations. These methods were first developed in the 1960s by combining the linear CG method (an iterative technique for solving linear systems Ax = b, where A is an n x n matrix) with line-search techniques. The basic idea is that if f were a convex quadratic function, the resulting nonlinear CG method would reduce to solving the Newton equations (equation 27) for the constant, positive-definite Hessian H. [Pg.1151]
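A minimal sketch of the idea (Fletcher-Reeves variant with a simple backtracking line search; all names are illustrative, not from the cited source):

import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=200):
    # Fletcher-Reeves nonlinear conjugate gradients with an Armijo backtracking line search
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0.0:                     # safeguard: restart if d is not a descent direction
            d = -g
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5                     # shrink the step until sufficient decrease holds
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves coefficient
        d = -g_new + beta * d                # new conjugate direction
        x, g = x_new, g_new
    return x

# Example: a convex quadratic 0.5*x'Ax - b'x, on which the method behaves like linear CG
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = nonlinear_cg(lambda x: 0.5 * x @ (A @ x) - b @ x, lambda x: A @ x - b, np.zeros(2))

With exact line searches on a convex quadratic, the recursion reproduces the linear CG method and terminates in at most n iterations, which is the reduction to the Newton equations mentioned above.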

∇²S(θ_LS) is the Hessian of S(θ) evaluated at θ_LS. Following our numerical treatment of nonlinear least squares, we define the matrix H(θ_LS) as ... [Pg.391]
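The definition that follows in the source is not reproduced here. For orientation only, a common convention in nonlinear least squares (an assumption, not the cited text) is

\[
S(\theta) = \sum_{i=1}^{m} \bigl[y_i - f(x_i,\theta)\bigr]^2,
\qquad
\nabla^2 S(\theta) = 2\,J(\theta)^{\mathsf T} J(\theta) + 2\sum_{i=1}^{m} r_i(\theta)\,\nabla^2 r_i(\theta)
\;\approx\; 2\,J(\theta)^{\mathsf T} J(\theta),
\]

where r_i(θ) = y_i − f(x_i, θ) is the i-th residual and J is its Jacobian with respect to θ; the Gauss-Newton matrix JᵀJ is the usual surrogate for the exact Hessian.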

