
Optimization techniques: Hessian computation

Newton's method and quasi-Newton techniques make use of second-order derivative information. Newton's method is computationally expensive because it requires analytical first- and second-order derivative information, as well as a matrix inversion. Quasi-Newton methods rely on approximate second-order derivative information (the Hessian) or an approximate Hessian inverse. There are a number of variants of these techniques from various researchers; most quasi-Newton techniques attempt to find a Hessian matrix that is positive definite and well-conditioned at each iteration. Quasi-Newton methods are recognized as the most powerful unconstrained optimization methods currently available. [Pg.137]
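To illustrate how a quasi-Newton method maintains a positive-definite approximation without analytical second derivatives, here is a minimal Python sketch of one common variant, the BFGS inverse-Hessian update. The objective f, gradient grad_f, backtracking line search, and tolerances are illustrative assumptions, not details taken from the cited text.

```python
# Minimal sketch of a quasi-Newton iteration using the BFGS inverse-Hessian
# update; the objective, gradient, and line-search parameters below are
# illustrative assumptions, not details from the cited source.
import numpy as np

def bfgs(f, grad_f, x0, tol=1e-6, max_iter=100):
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                       # approximate inverse Hessian (positive definite)
    g = grad_f(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                      # quasi-Newton search direction
        t = 1.0                         # backtracking (Armijo) line search
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        x_new = x + t * p
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                  # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from a standard starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad_f = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                             200 * (x[1] - x[0]**2)])
print(bfgs(f, grad_f, [-1.2, 1.0]))     # converges toward [1.0, 1.0]
```

Skipping the update when the curvature term s·y is not positive is one simple way to keep the approximate inverse Hessian positive definite, in the spirit of the well-conditioning requirement mentioned above.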

Nonlinear CG methods form another popular type of optimization scheme for large-scale problems where memory and computational performance are important considerations. These methods were first developed in the 1960s by combining the linear CG method (an iterative technique for solving linear systems Ax = b, where A is an n x n matrix) with line-search techniques. The basic idea is that if f were a convex quadratic function, the resulting nonlinear CG method would reduce to solving the Newton equations (equation 27) for the constant, positive-definite Hessian H. [Pg.1151]
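The sketch below shows one way a nonlinear CG iteration can be written, using the Fletcher-Reeves coefficient with a backtracking line search; the function names, restart rule, and parameters are assumptions for illustration. The quadratic example at the end makes the connection to the linear system Ax = b explicit.

```python
# Minimal sketch of a nonlinear CG iteration (Fletcher-Reeves coefficient) with
# a backtracking line search; all names and parameters are illustrative
# assumptions.  On a convex quadratic with exact line searches, the same
# recursion is the linear CG method for Ax = b.
import numpy as np

def nonlinear_cg(f, grad_f, x0, tol=1e-6, max_iter=200):
    x = np.asarray(x0, dtype=float)
    g = grad_f(x)
    d = -g                              # start along steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                  # restart if d is not a descent direction
            d = -g
        t = 1.0                         # backtracking (Armijo) line search along d
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x = x + t * d
        g_new = grad_f(x)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves coefficient
        d = -g_new + beta * d               # new search direction
        g = g_new
    return x

# Example: quadratic f(x) = 0.5 x^T A x - b^T x, whose minimizer solves Ax = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad_f = lambda x: A @ x - b
print(nonlinear_cg(f, grad_f, np.zeros(2)))   # approaches np.linalg.solve(A, b)
```

Because only the current gradient and one direction vector need to be stored, this kind of iteration avoids forming or inverting a Hessian, which is why nonlinear CG is attractive for large-scale problems.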



