Big Chemical Encyclopedia


Gradient-based optimization method

The problem of local minima in function minimization or optimization has given rise to a variety of algorithms that seek global minima. Traditional gradient-based optimizers proceed by selecting promising search directions and then performing numerical one-dimensional optimization along these paths. Such methods are therefore inherently prone to finding local minima in the vicinity of their starting point, as illustrated in Fig. 5.4(b). This property is in fact desirable in... [Pg.122]
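The starting-point dependence described above can be demonstrated with a minimal sketch: plain gradient descent on a double-well function converges to whichever local minimum lies in the basin of the starting point. The function and step size here are illustrative choices, not from the source.

```python
def grad_descent(f_prime, x0, lr=0.01, steps=2000):
    """Plain gradient descent: repeatedly step along -f'(x) from x0."""
    x = x0
    for _ in range(steps):
        x -= lr * f_prime(x)
    return x

# Double-well function f(x) = x^4 - 4x^2 + x, with one minimum near
# x = -1.47 and another near x = +1.35; f'(x) = 4x^3 - 8x + 1.
fp = lambda x: 4 * x**3 - 8 * x + 1

left = grad_descent(fp, x0=-2.0)   # starts in the left basin
right = grad_descent(fp, x0=+2.0)  # starts in the right basin
```

Both runs use the identical update rule; only the starting point differs, and each finds the nearest local minimum rather than the global one.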

As shown in the multimedia CD-ROM that accompanies this book (see HYSYS -> Tutorials -> Process Design -> Multi-draw Tower Optimization and ASPEN -> Tutorials -> Process Design -> Multi-draw Tower Optimization), the solution of this NLP requires care, since the gradient-based SQP method is sen-... [Pg.638]

Gradient methods discussed above use a quadratic function (energy, gradient and approximate Hessian) to model the energy surface near the transition state. Distance-weighted interpolants provide a more flexible functional form that can interpolate arbitrarily spaced points with a smooth differentiable function. For a gradient-based optimization, the Shepard interpolation functions seem appropriate... [Pg.277]
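A distance-weighted (Shepard) interpolant of the kind mentioned above can be sketched in a few lines: each data point contributes to the interpolated value with a weight that decays with distance, so arbitrarily spaced points are joined by a smooth function that passes through the data. The 1-D form and the sample points below are illustrative; applications to energy surfaces interpolate energies and gradients at molecular geometries.

```python
def shepard(x, pts, vals, p=2):
    """Shepard interpolation: inverse-distance-weighted average of the data
    values; the exponent p controls how sharply weights decay with distance."""
    num = den = 0.0
    for xi, vi in zip(pts, vals):
        d = abs(x - xi)
        if d == 0.0:
            return vi          # interpolant passes exactly through the data
        w = d ** -p
        num += w * vi
        den += w
    return num / den

# Arbitrarily spaced sample points (hypothetical data, for illustration).
pts = [0.0, 1.0, 3.0]
vals = [1.0, 2.0, 0.5]
```

At any query point the result stays within the range of the data values, which is one reason this form is attractive for interpolating sampled surfaces.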

Many analytical optimal placement methods have been proposed, including methods based on the principles of optimal control theory, gradient-based search methods, and an analysis/redesign method. [Pg.36]

Within some programs, the ROMPn methods do not support analytic gradients. Thus, the fastest way to run the calculation is as a single point energy calculation with a geometry from another method. If a geometry optimization must be done at this level of theory, a non-gradient-based method such as the Fletcher-Powell optimization should be used. [Pg.229]
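When analytic gradients are unavailable, as in the ROMPn case above, a derivative-free method only needs energy evaluations. The sketch below uses compass (coordinate) search as a stand-in for such a method; the Fletcher-Powell algorithm itself is not reproduced here, and the quadratic test surface is a hypothetical stand-in for a single-point energy.

```python
def compass_search(f, x0, step=0.5, tol=1e-6):
    """Derivative-free compass search: probe +/- step along each coordinate,
    accept any improving move, and halve the step when none improves."""
    x = list(x0)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for s in (step, -step):
                trial = list(x)
                trial[i] += s
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5
    return x, fx

# Quadratic stand-in for an energy surface (illustrative only),
# with minimum at (1, -2).
f = lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2
xmin, fmin = compass_search(f, [0.0, 0.0])
```

Only function values are used, so the method works even where gradients cannot be computed analytically, at the cost of many more energy evaluations.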

The well-known Box-Wilson optimization method (Box and Wilson [1951]; Box [1954, 1957]; Box and Draper [1969]) is based on a linear model (Fig. 5.6). For a selected starting hyperplane, in the given case an area A0(x1,x2) described by a first-order polynomial with the starting point y0, the gradient grad[y0] is estimated. One then moves to the next area in the direction of steepest ascent (the gradient) by a step width h, in general... [Pg.141]
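The steepest-ascent move described above can be sketched directly: normalize the gradient estimated from the first-order fit and step a width h along it. The numerical values below are hypothetical, not the worked example from Fig. 5.6.

```python
def steepest_ascent_step(x, grad, h):
    """One Box-Wilson style move: a step of width h along the
    normalized gradient direction (steepest ascent)."""
    norm = sum(g * g for g in grad) ** 0.5
    return [xi + h * gi / norm for xi, gi in zip(x, grad)]

x0 = [1.0, 2.0]
g0 = [3.0, 4.0]   # gradient estimated from a first-order (planar) fit
x1 = steepest_ascent_step(x0, g0, h=0.5)
```

In the full method, a new first-order model is fitted around x1 and the process repeats until the linear model no longer describes the response adequately.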

Other Gradient-Based NLP Solvers In addition to SQP methods, a number of NLP solvers have been developed and adapted for large-scale problems. Generally these methods require more function evaluations than SQP methods, but they perform very well when interfaced to optimization modeling platforms, where function evaluations are cheap. All of these can be derived from the perspective of applying Newton steps to portions of the KKT conditions. [Pg.63]
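The "Newton steps on the KKT conditions" view can be made concrete for the simplest case, an equality-constrained quadratic program, where one Newton step is exact: solve the linear KKT system [[Q, Aᵀ], [A, 0]] [x; λ] = [-c; b]. This is a minimal sketch with a hypothetical 2-variable problem, not any particular solver's implementation.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (dense, pure Python)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def kkt_newton_step(Q, c, A, b):
    """Assemble and solve the KKT system for min 1/2 x'Qx + c'x s.t. Ax = b:
    stationarity Qx + c + A'lam = 0 plus feasibility Ax = b."""
    n, m = len(Q), len(A)
    K = [[0.0] * (n + m) for _ in range(n + m)]
    rhs = [0.0] * (n + m)
    for i in range(n):
        for j in range(n):
            K[i][j] = Q[i][j]
        for j in range(m):
            K[i][n + j] = A[j][i]        # A-transpose block
        rhs[i] = -c[i]
    for i in range(m):
        for j in range(n):
            K[n + i][j] = A[i][j]
        rhs[n + i] = b[i]
    sol = solve(K, rhs)
    return sol[:n], sol[n:]

# Illustrative QP: min x1^2 + x2^2  s.t.  x1 + x2 = 1  ->  x = (0.5, 0.5)
x, lam = kkt_newton_step([[2.0, 0.0], [0.0, 2.0]], [0.0, 0.0],
                         [[1.0, 1.0]], [1.0])
```

General NLP solvers differ mainly in which parts of the (nonlinear) KKT conditions they linearize per iteration and how they globalize the resulting Newton steps.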

BUSTER/TNT (Bricogne and Irvin, 1996) is another likelihood-based refinement package that excels especially in cases in which the model is still severely incomplete (Blanc et al., 2004; Tronrud et al., 1987). It uses atomic parameters but also has a novel solvent and missing-model envelope function. The optimization method is a preconditioned conjugate gradient as implemented in the TNT package (Tronrud et al., 1987), which had a faithful audience in the pre-likelihood era. [Pg.164]
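The conjugate gradient core of such a refinement engine can be sketched in its plain (unpreconditioned) form for a symmetric positive-definite linear system; the preconditioned variant used in TNT differs essentially in rescaling the residual at each iteration. The 2x2 system below is a hypothetical example, not refinement data.

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Plain conjugate gradient for a symmetric positive-definite Ax = b."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                 # residual b - A x for the zero starting point
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
```

In exact arithmetic the method converges in at most n iterations for an n-dimensional system, which is what makes it attractive for the very large parameter sets of crystallographic refinement.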

Optimization methods are generally based on the following typical algorithm [9]: choose an initial set of coordinates (variables) x0, calculate the energy E(x0) and gradient g(x0), then... [Pg.245]
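The loop above can be sketched as a generic minimizer: evaluate energy and gradient, take a step along the negative gradient, and stop when the gradient norm falls below a threshold. The fixed step length and the quadratic test energy are illustrative assumptions; real implementations use line searches or Hessian updates.

```python
def minimize(energy, grad, x0, lr=0.1, gtol=1e-6, max_iter=500):
    """Generic loop: evaluate E(x) and g(x), step along -g(x),
    and stop when the gradient norm drops below gtol."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < gtol:
            break
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x, energy(x)

# Hypothetical quadratic "energy" with minimum at (1, -0.5).
E = lambda v: (v[0] - 1.0) ** 2 + 2.0 * (v[1] + 0.5) ** 2
dE = lambda v: [2.0 * (v[0] - 1.0), 4.0 * (v[1] + 0.5)]
xmin, Emin = minimize(E, dE, [0.0, 0.0])
```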







© 2024 chempedia.info