
Gradient-based algorithms

Using optimization algorithms with these black-box models is challenging for at least two reasons. First, some of these models require significant CPU time; second, the derivatives needed by gradient-based algorithms usually cannot be estimated accurately, because most black-box models introduce noise (small sensitivity of some variables, termination criteria inside the algorithms, etc.). [Pg.551]
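
As a hedged illustration of the second point (not from the cited source), the short Python sketch below applies forward differences to a toy "black-box" function whose evaluations carry a small amount of numerical noise; as the step size shrinks, the noise overwhelms the difference quotient and the gradient estimate degrades.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_black_box(x, noise_level=1e-4):
    """Stand-in for an expensive simulator: a smooth function plus a small
    amount of 'noise' such as that produced by internal termination criteria."""
    return float(np.sum(np.asarray(x) ** 2) + noise_level * rng.standard_normal())

def forward_difference_gradient(f, x, h):
    """Estimate the gradient of f at x by forward differences with step h."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    grad = np.zeros_like(x)
    for i in range(len(x)):
        xh = x.copy()
        xh[i] += h
        grad[i] = (f(xh) - fx) / h
    return grad

x = np.array([1.0, -2.0])          # true gradient of sum(x^2) is (2, -4)
for h in (1e-2, 1e-6, 1e-10):
    print(f"h = {h:.0e}:", np.round(forward_difference_gradient(noisy_black_box, x, h), 3))
# As h shrinks, the ~1e-4 noise divided by h dominates, and the estimate becomes useless.
```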

Snyman, J.A. (2005). Practical Mathematical Optimization: An Introduction to Basic Optimization Theory and Classical and New Gradient-Based Algorithms, Springer, New York, 257 p. [Pg.40]

Pressure or density programming is the most popular of the gradient techniques in SFC. Density is the important parameter with respect to retention, but pressure is the physical property that is directly monitored by SFC instruments. If enough experimental density-volume-temperature data are available for the mobile phase, a computer-based algorithm can be used to generate specific density programs. Such data are available for only a few mobile phases, such as carbon dioxide and the n-... [Pg.830]
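
As a hedged sketch of such an algorithm, the Python snippet below interpolates instrument pressure setpoints from an illustrative, made-up (pressure, density) table for a CO2-like mobile phase at a fixed column temperature; a real density program would rely on tabulated or equation-of-state PVT data for the actual fluid.

```python
import numpy as np

# Hypothetical, illustrative (pressure, density) points at one temperature.
# Real programs would use measured or equation-of-state data for the mobile phase.
pressure_bar = np.array([80.0, 100.0, 120.0, 150.0, 200.0, 300.0])
density_g_ml = np.array([0.22, 0.50, 0.66, 0.75, 0.84, 0.93])

def pressure_for_density(target_density):
    """Interpolate the pressure setpoint that delivers a requested density."""
    return np.interp(target_density, density_g_ml, pressure_bar)

# A linear density program from 0.3 to 0.8 g/mL, translated into the
# pressure setpoints that the instrument actually controls.
density_program = np.linspace(0.3, 0.8, 6)
print(np.round(pressure_for_density(density_program), 1))
```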

An evolutionary algorithm is included in the current release of Frontline Systems' Premium Excel Solver (for current information, see www.frontsys.com). It is invoked by choosing Standard Evolutionary from the Solver dropdown list in the Solver Parameters dialog box. The other nonlinear solver is Standard GRG Nonlinear, which is the GRG2 solver described in Section 8.7. As discussed there, GRG2 is a gradient-based local solver, which finds the local solution nearest its starting point. The evolutionary solver is much less likely to stop at a local minimum, as we illustrate shortly. [Pg.403]
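
The same contrast between a gradient-based local solver and an evolutionary global solver can be reproduced outside Excel. The sketch below uses SciPy (not the Frontline solvers) on a standard multimodal test function, with BFGS standing in for the local solver and differential evolution for the evolutionary one.

```python
import numpy as np
from scipy.optimize import minimize, differential_evolution

def styblinski_tang(x):
    """Standard multimodal test function: one global and several local minima."""
    x = np.asarray(x)
    return 0.5 * np.sum(x**4 - 16.0 * x**2 + 5.0 * x)

bounds = [(-5.0, 5.0)] * 2

# Gradient-based local search: converges to the minimum nearest its start.
local = minimize(styblinski_tang, x0=[3.0, 3.0], method="BFGS")

# Evolutionary global search: much less likely to stop at a local minimum.
evo = differential_evolution(styblinski_tang, bounds, seed=1)

print("BFGS from (3, 3):      ", np.round(local.x, 3), round(float(local.fun), 2))
print("Differential evolution:", np.round(evo.x, 3), round(float(evo.fun), 2))
```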

Although this example results in fairly extreme conditions, it does point to the need to alter the gradient-based search to avoid well dewatering and algorithmic failure. Because the search trajectory of the optimization algorithm and the size and shape of the feasible region cannot be predicted, additional search procedures must be included for all unconfined aquifers. [Pg.35]
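
One common way to build such a safeguard into a gradient-based search is a penalty term that grows whenever a simulated head drops below the well bottom. The sketch below is a generic, hypothetical illustration (the head response, minimum head, and penalty weight are all made up), not the procedure used in the cited study.

```python
import numpy as np
from scipy.optimize import minimize

def head_at_wells(pumping_rates):
    """Hypothetical linear response of water-table heads to pumping; a real
    study would evaluate an aquifer simulation model here."""
    return 10.0 - 0.8 * np.asarray(pumping_rates)

def penalised_objective(pumping_rates, min_head=2.0, weight=1e3):
    """Maximise total pumping (minimise its negative) while penalising any head
    that falls below min_head, steering the gradient-based search away from
    dewatered, infeasible designs."""
    violation = np.maximum(0.0, min_head - head_at_wells(pumping_rates))
    return -np.sum(pumping_rates) + weight * np.sum(violation**2)

result = minimize(penalised_objective, x0=np.zeros(3), method="BFGS")
print(np.round(result.x, 3))  # each rate settles near the value where head reaches min_head
```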

A neural network is typically trained by variations of gradient descent-based algorithms that minimize an error function [77]. It is important that additional validation data be left untouched during ANN training, so as to have an objective measure of the model's generalization ability [78]. [Pg.360]
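
A minimal sketch of this setup, assuming a toy regression task and a single hidden layer trained by plain gradient descent (NumPy only), with the validation set used solely for monitoring:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy noisy 1-D regression data, split into training and validation sets;
# the validation set is never used to update the weights.
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X) + 0.1 * rng.standard_normal(X.shape)
X_train, y_train, X_val, y_val = X[:150], y[:150], X[150:], y[150:]

# One hidden layer of tanh units.
W1 = 0.5 * rng.standard_normal((1, 16)); b1 = np.zeros(16)
W2 = 0.5 * rng.standard_normal((16, 1)); b2 = np.zeros(1)
lr = 0.05

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

for epoch in range(2001):
    h, pred = forward(X_train)
    err = pred - y_train                    # d(MSE)/d(pred), up to a constant
    dW2 = h.T @ err / len(X_train); db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)          # back-propagate through tanh
    dW1 = X_train.T @ dh / len(X_train); db1 = dh.mean(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2
    if epoch % 500 == 0:
        val_mse = np.mean((forward(X_val)[1] - y_val) ** 2)
        print(epoch, "validation MSE:", round(float(val_mse), 4))
```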

Conventional gradient-based optimisation techniques are not effective for objective functions with multiple local minima and can become trapped in a local minimum. Particle swarm optimisation (PSO) is a recently developed optimisation technique that can cope with multiple local minima. This paper proposes using PSO and stacked neural networks to find the optimal control policy for batch processes. A standard PSO algorithm and three new PSO algorithms with local search were developed. To enhance the reliability of the obtained optimal control policy, an additional term is added to the optimisation objective function to penalise wide model prediction confidence bounds. [Pg.375]
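
A minimal, generic PSO sketch (not the algorithms developed in the cited paper) on a multimodal test function, showing the personal-best/global-best update that lets the swarm escape local minima:

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimiser: each particle is attracted toward its
    own best position and the swarm's global best."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.apply_along_axis(objective, 1, x)
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2,) + x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

# Rastrigin function: many local minima, global minimum at the origin.
rastrigin = lambda x: float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10))
print(pso(rastrigin, bounds=[(-5.12, 5.12)] * 2))
```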

All four PSO algorithms find the global optimal solutions, whereas the gradient-based optimisation algorithm from the MATLAB Optimisation Toolbox, fminunc, fails to do so when the initial values are not close to the global optima. [Pg.377]

