
Optimizing General Functions Finding Minima

The simple-minded approach for minimizing a function is to step one variable at a time until the function has reached a minimum, and then switch to another variable. This requires only the ability to calculate the function value for a given set of variables. As the variables are not independent, however, several cycles through the whole set are necessary for finding a minimum. This is impractical for more than five to ten variables, and may not work anyway. [Pg.383]
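A minimal sketch of this one-variable-at-a-time search is given below; the coupled quadratic test function, its coupling constant, and the convergence tolerance are illustrative assumptions, not taken from the text above:

```python
# One-variable-at-a-time minimization: optimize each coordinate in turn,
# cycling through the whole set until nothing moves any more.
import numpy as np
from scipy.optimize import minimize_scalar

def f(x):
    # Illustrative test function; the cross term couples the variables.
    return x[0]**2 + x[1]**2 + 1.8 * x[0] * x[1]

x = np.array([1.0, 1.0])
for cycle in range(200):
    x_old = x.copy()
    for i in range(len(x)):
        # 1-D minimization along coordinate i, all other variables frozen
        def along_axis(t, i=i):
            trial = x.copy()
            trial[i] = t
            return f(trial)
        x[i] = minimize_scalar(along_axis).x
    if np.max(np.abs(x - x_old)) < 1e-8:   # whole-cycle convergence test
        break

print(f"converged after {cycle + 1} cycles: x = {x}")
```

Because the cross term makes the variables mutually dependent, the loop needs many full cycles through the set before it settles at the minimum, which is the slow behaviour the excerpt describes.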

The Simplex method represents a more efficient approach, using only function values to construct an irregular polyhedron in parameter space and moving this polyhedron towards the minimum, while allowing its size to contract or expand to improve the convergence. It is better than the simple-minded one-variable-at-a-time approach, but becomes too slow for many-dimensional functions. [Pg.383]
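For comparison, a hedged sketch using the simplex (Nelder-Mead) implementation in SciPy; the Rosenbrock test function and starting point are illustrative choices:

```python
# Simplex (Nelder-Mead) minimization: only function values are used; the
# method maintains and deforms a polyhedron of n+1 points in parameter space.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

res = minimize(rosenbrock, x0=np.array([-1.2, 1.0]), method="Nelder-Mead")
print(res.x)      # should be close to the minimum at (1, 1)
print(res.nfev)   # number of function evaluations - this count is what
                  # grows prohibitively as the dimension increases
```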

Since optimization problems in computational chemistry tend to have many variables, essentially all commonly used methods assume that at least the first derivative of the function with respect to all variables, the gradient g, can be calculated analytically (i.e. directly, and not as a numerical differentiation by stepping the variables). Some methods also assume that the second derivative matrix, the Hessian H, can be calculated. [Pg.383]
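As a sketch of how an analytic gradient and Hessian enter such a method, the example below feeds both to a trust-region Newton minimizer in SciPy; the test function itself is an arbitrary choice, not from the text:

```python
# Gradient-based minimization with analytic first and second derivatives.
import numpy as np
from scipy.optimize import minimize

def f(x):
    return x[0]**4 + x[1]**2 + x[0] * x[1]

def gradient(x):
    # Analytic first derivatives: the gradient g
    return np.array([4.0 * x[0]**3 + x[1],
                     2.0 * x[1] + x[0]])

def hessian(x):
    # Analytic second derivatives: the Hessian H
    return np.array([[12.0 * x[0]**2, 1.0],
                     [1.0,            2.0]])

res = minimize(f, x0=[1.0, 1.0], jac=gradient, hess=hessian,
               method="trust-ncg")
print(res.x, res.fun)
```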

It should be noted that the target function and its derivative(s) are calculated with a finite precision, which depends on the computational implementation. A stationary point can therefore not be located exactly; the gradient can only be reduced to a certain value. Below this value, the numerical inaccuracies due to the finite precision will swamp the true functional behaviour. In practice, the optimization is considered converged if the gradient is reduced below a suitable cutoff value, or if the function change between two iterations becomes sufficiently small. Both these criteria may in some cases lead to problems, as a function with a very flat surface in a certain region may meet the criteria without containing a stationary point. [Pg.383]
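A minimal illustration of both convergence tests inside a hand-written steepest-descent loop; the step size, the two cutoffs, and the test function are assumptions chosen for the sketch:

```python
# Convergence criteria: stop when the gradient norm falls below a cutoff,
# or when the function change between two iterations becomes small.
import numpy as np

def f(x):
    return 0.5 * float(x @ x)

def gradient(x):
    return x

x = np.array([3.0, -4.0])
f_old = f(x)
g_tol, f_tol = 1e-6, 1e-12
for it in range(10000):
    g = gradient(x)
    if np.linalg.norm(g) < g_tol:        # criterion 1: small gradient
        break
    x = x - 0.1 * g                      # fixed-step steepest descent
    f_new = f(x)
    if abs(f_new - f_old) < f_tol:       # criterion 2: small function change
        break
    f_old = f_new

# Caveat from the text: on a very flat surface both tests can be satisfied
# far away from any true stationary point.
print(it, x)
```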

There are three classes of commonly used optimization methods for finding minima, each with its own advantages and disadvantages. [Pg.383]


Many problems in computational chemistry can be formulated as an optimization of a multidimensional function f. Optimization is a general term for finding stationary points of a function, i.e. points where the first derivative is zero. In the majority of cases the desired stationary point is a minimum, i.e. all the second derivatives should be positive. In some cases the desired point is a first-order saddle point, i.e. the second derivative is negative in one, and positive in all other, directions. Some examples ... [Pg.316]
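The classification of a stationary point follows directly from the eigenvalues of the Hessian, as the following sketch shows; the matrix below is an illustrative example, not taken from the text:

```python
# Classify a stationary point from the Hessian eigenvalues:
# all positive -> minimum; exactly one negative -> first-order saddle point.
import numpy as np

H = np.array([[ 2.0, 0.3,  0.0],
              [ 0.3, 1.5,  0.0],
              [ 0.0, 0.0, -0.8]])     # symmetric Hessian at a stationary point

eigenvalues = np.linalg.eigvalsh(H)   # eigvalsh is for symmetric matrices
n_negative = int(np.sum(eigenvalues < 0.0))

if n_negative == 0:
    kind = "minimum"
elif n_negative == 1:
    kind = "first-order saddle point"
else:
    kind = f"saddle point of order {n_negative}"
print(kind, eigenvalues)
```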

The free energy w[f] must now be varied with respect to the location f as well as with respect to the transformation coefficients a0, aj, j = 1, ..., N. The details are given in Ref. 107 and have been reviewed in Ref. 49. The final result is that the frequency λ and collective coupling parameter C are expressed in the continuum limit as functions of a generalized barrier frequency λ‡. One then remains with a minimization problem for the free energy as a function of two variables - the location f and λ‡. Details on the numerical minimization may be found in Refs. 68, 93. For a parabolic barrier one readily finds that the minimum is such that f = 0 and that λ = λ‡. In other words, in the parabolic barrier limit, optimal planar VTST reduces to the well-known Kramers-Grote-Hynes expression for the rate. [Pg.13]

The multiple-minimum problem is a severe handicap of many large-scale optimization applications. The state of the art today is such that for reasonably small problems (30 variables or less) suitable algorithms exist for finding all local minima of linear and nonlinear functions. For larger problems, however, many trials are generally required to find local minima, and finding the global minimum cannot be ensured. These features have prompted research in conformational-search techniques independent of, or in combination with, minimization.26... [Pg.16]
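A minimal sketch of the multi-start strategy alluded to above: repeated local minimizations from random starting points, where the best result found is only a candidate for the global minimum, never a guarantee. The one-dimensional test function and the number of restarts are assumptions:

```python
# Multi-start local minimization: many trials, keep the lowest minimum found.
# Even so, the global minimum is not guaranteed - only made more likely.
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Illustrative function with many local minima
    return np.sin(3.0 * x[0]) + (0.1 * x[0])**2

rng = np.random.default_rng(0)
best = None
for _ in range(20):                         # 20 random restarts
    x0 = rng.uniform(-10.0, 10.0, size=1)
    res = minimize(f, x0, method="BFGS")
    if best is None or res.fun < best.fun:
        best = res

print(best.x, best.fun)
```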

An optimal control strategy for batch processes using particle swarm optimisation (PSO) and stacked neural networks is presented in this paper. Stacked neural networks are used to improve model generalisation capability, as well as to provide model prediction confidence bounds. In order to improve the reliability of the calculated optimal control policy, an additional term is introduced in the optimisation objective function to penalise wide model prediction confidence bounds. PSO can cope with multiple local minima and can generally find the global minimum. Application to a simulated fed-batch process demonstrates that the proposed technique is very effective. [Pg.375]
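The abstract gives no implementation details, so the following is only a generic PSO sketch on a standard multi-minimum test function; the swarm size, inertia weight w, and acceleration constants c1, c2 are common textbook defaults, not values from the paper, and the stacked-network objective is not reproduced:

```python
# Generic particle swarm optimisation (PSO) on the Rastrigin function,
# a surface with very many local minima.
import numpy as np

def f(x):
    return 10.0 * x.size + float(np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x)))

rng = np.random.default_rng(1)
n_particles, dim = 30, 2
pos = rng.uniform(-5.12, 5.12, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()                           # per-particle best positions
pbest_f = np.array([f(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()     # swarm-wide best position

w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and acceleration terms
for _ in range(200):
    r1 = rng.random((n_particles, dim))
    r2 = rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    fvals = np.array([f(p) for p in pos])
    improved = fvals < pbest_f
    pbest[improved] = pos[improved]
    pbest_f[improved] = fvals[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print(gbest, f(gbest))   # best point found; near (0, 0) if the run succeeds
```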

Optimal control problems involving multiple integrals are constrained by partial differential equations. A general theory similar to Pontryagin's minimum principle is not available to handle these problems. To find the necessary conditions for the minimum in these problems, we assume that the variations of the involved integrals are weakly continuous and find the equations that eliminate the variation of the augmented objective functional. [Pg.178]

