Big Chemical Encyclopedia


Optimization/minimization unconstrained

Banga et al. [in State of the Art in Global Optimization, C. Floudas and P. Pardalos (eds.), Kluwer, Dordrecht, p. 563 (1996)]. All these methods require only objective function values for unconstrained minimization, and numerous studies have applied them to a wide range of process problems. Moreover, many of these methods include heuristics that prevent premature termination (e.g., directional flexibility in the complex search, random restarts, and random direction generation). Fig. 3-58 illustrates the performance of a pattern search method and a random search method on an unconstrained problem. [Pg.65]
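The two direct-search strategies named above can be sketched in a few lines; both use only objective function values, never derivatives. The test function and parameter values below are illustrative stand-ins (the problem behind Fig. 3-58 is not reproduced in this excerpt):

```python
import numpy as np

def f(x):
    # Simple quadratic bowl with minimum at (3, -1); illustrative only
    return (x[0] - 3.0)**2 + (x[1] + 1.0)**2

def random_search(f, x0, n_iter=20000, step=0.5, seed=0):
    """Pure random search: accept a random trial point only if it improves f."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    fx = f(x)
    for _ in range(n_iter):
        trial = x + rng.uniform(-step, step, size=x.shape)
        ft = f(trial)
        if ft < fx:
            x, fx = trial, ft
    return x, fx

def pattern_search(f, x0, step=0.5, tol=1e-8):
    """Coordinate pattern search: probe +/- step along each axis and keep
    any improving move; when no probe improves f, halve the step."""
    x = np.asarray(x0, float)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (+step, -step):
                trial = x.copy()
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5
    return x, fx

x_rs, f_rs = random_search(f, [-1.2, 1.0])
x_ps, f_ps = pattern_search(f, [-1.2, 1.0])
```

The step-halving in the pattern search and the fixed random step in the random search are the simplest choices; the heuristics mentioned in the text (random restarts, directional flexibility) would be layered on top of loops like these.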

The condition number of the Hessian matrix of the objective function is an important measure of difficulty in unconstrained optimization. By definition, the smallest a condition number can be is 1.0. A condition number of 10^5 is moderately large, 10^9 is large, and 10^14 is extremely large. Recall that, if Newton's method is used to minimize a function f, the Newton search direction s is found by solving the linear equations... [Pg.287]
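The Newton step referred to here solves the linear system H s = -grad f. A minimal NumPy sketch, using an illustrative ill-conditioned quadratic (the Hessian values are assumptions, chosen so that the condition number equals 10^5, "moderately large" in the scale above):

```python
import numpy as np

# Quadratic model f(x) = 0.5 x^T H x - b^T x with an ill-conditioned Hessian
# (values illustrative; cond(H) = 1e5)
H = np.array([[1.0, 0.0],
              [0.0, 1.0e5]])
b = np.array([1.0, 1.0])

kappa = np.linalg.cond(H)       # ratio of largest to smallest singular value

x = np.zeros(2)                 # current iterate
grad = H @ x - b                # gradient of f at x
s = np.linalg.solve(H, -grad)   # Newton direction: solve H s = -grad

# For a quadratic, one full Newton step lands on the minimizer:
x_new = x + s
```

Note how ill-conditioning shows up directly in the step: the component along the stiff direction of H is five orders of magnitude smaller than the other.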

Another useful program (E04HAA) provides constrained optimization with bounds on each parameter, using a sequential penalty function technique that effectively operates through cycles of unconstrained minimization. [Pg.157]

The majority of minimization routines are designed for unconstrained optimization, in which the algorithm is free to select any parameter values it wishes. Only a minority can handle constrained optimization. [Pg.214]

There is no need for orbital constraints to enforce the fully symmetric nature of the two inner orbitals or the symmetry relations between the three pairs of valence orbitals. The fully symmetric SC solution corresponds to a proper minimum in the unconstrained SC optimization space. It has been verified to be stable against symmetry-breaking perturbations, including the admixture of π basis functions into the orbitals, in the sense that energy minimization from such a perturbed initial guess spontaneously restores the orbitals to purely σ character and to full symmetry, converging back onto the unperturbed solution. [Pg.292]

Sargent, R.W.H., and D.J. Sebastian, "Numerical Experience with Algorithms for Unconstrained Minimization," in F.A. Lootsma (Ed.), Numerical Methods for Nonlinear Optimization, Academic Press, 1972, pp. 45-68. [Pg.53]

K. Takayama, H. Imaizumi, N. Nambu, and T. Nagai, Mathematical optimization of formulation of indomethacin/polyvinylpolypyrrolidone/methyl cellulose solid dispersions by the sequential unconstrained minimization technique, Chem. Pharm. Bull., 33, 292-300 (1985). [Pg.257]

Lagrange multipliers for, 2553-2554; and nonsmooth optimization, 2562; quadratic programming problems, 2555, 2562; separable programming problems, 2556-2558; sequential unconstrained minimization techniques for, 2560-2562; successive linear programming, 2562; successive quadratic programming, 2562; Constraint(s)... [Pg.2714]

Some well-known stochastic methods for solving SOO problems are simulated annealing (SA), GAs, DE and particle swarm optimization (PSO). These were initially proposed and developed for optimization problems with bounds only [that is, unconstrained problems without Equations (4.7) and (4.8)]. Subsequently, they were extended to constrained problems by incorporating a strategy for handling constraints. One relatively simple and popular strategy is the penalty function, which involves modifying the objective function (Equation 4.5) by the addition (in the case of minimization) of a term which depends on constraint violation. For example, see Equation (4.9)... [Pg.109]
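Equation (4.9) is not reproduced in this excerpt, but the generic quadratic-penalty form of the strategy is easy to sketch. The objective, constraint, and penalty weight below are illustrative assumptions, not taken from the text:

```python
def objective(x):
    # Illustrative objective (stands in for Equation (4.5));
    # unconstrained minimum at (2, 1)
    return (x[0] - 2.0)**2 + (x[1] - 1.0)**2

def violation(x):
    # Inequality constraint g(x) = x0 + x1 - 2 <= 0;
    # the violation is max(0, g(x))
    return max(0.0, x[0] + x[1] - 2.0)

def penalized(x, r=1000.0):
    """Penalty-function reformulation: add r * violation^2 to the objective,
    so any bounds-only stochastic method (SA, GA, DE, PSO) can minimize it
    as if the problem were unconstrained."""
    return objective(x) + r * violation(x)**2
```

With a large penalty weight r, the unconstrained minimum of the original objective, (2, 1), becomes strongly disfavored because it violates the constraint, while the feasible point (1.5, 0.5), the constrained optimum of this example, is left untouched.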

Using what-if analysis, demand forecasters can shape unconstrained demand based on current sales and marketing activities as well as external factors affecting demand, such as weather, special events, and economic conditions, in order to optimize volume and revenue while minimizing marketing investment. Figure 3.7 illustrates the four key steps in the market-driven demand management process. [Pg.127]

Once we select a merit function, it can be minimized using the unconstrained optimization methods (see Chapter 3). The gradient method is just one of the methods available; it exploits the fact that the function changes most rapidly in the direction of the gradient. With reference to the merit function (7.12), the gradient in x is given by... [Pg.244]
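The merit function (7.12) is not reproduced in this excerpt; the sketch below instead uses a generic least-squares merit m(x) = 0.5 ||A x - b||^2 (an assumption) to show the gradient method in its simplest fixed-step form:

```python
import numpy as np

# Illustrative linear model; the merit measures the residual of A x = b
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
b = np.array([2.0, 3.0])

def merit(x):
    # Least-squares merit: m(x) = 0.5 * ||A x - b||^2
    return 0.5 * np.sum((A @ x - b)**2)

def merit_grad(x):
    # Gradient of the least-squares merit: A^T (A x - b)
    return A.T @ (A @ x - b)

x = np.zeros(2)
alpha = 0.2                       # fixed step length (assumed small enough)
for _ in range(200):
    x = x - alpha * merit_grad(x)  # step against the gradient
```

For this problem the iteration converges to the solution of A x = b, namely (1, 3); in practice the step length would be chosen by a line search rather than fixed in advance.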

The penalty function is often used as the function to be minimized not only in unconstrained optimization programs but also as the check function in programs that use other constrained optimization strategies. [Pg.426]

Newton and Leibniz. The foundations of the calculus of variations were laid by Bernoulli, Euler, Lagrange and Weierstrass. The optimization of constrained problems, which involves the addition of unknown multipliers, became known by the name of its inventor, Lagrange. Cauchy made the first application of the steepest descent method to solve unconstrained minimization problems. In spite of these early contributions, very little progress was made until the middle of the 20th century, when high-speed digital computers made the implementation of optimization procedures possible and stimulated further research into new methods. [Pg.425]

Optimization of the objective function on its own, allowing the optimization variables to take any values, is called unconstrained optimization. Usually, we have additional information that has to be considered, such as the physical bounds on the reaction rates (limited from above by collision rate theory and from below by zero) and the uncertainties in the experimental observations. Minimization of equation (1) or (2) simultaneously with additional equalities or inequalities applied to the optimization variables is referred to as constrained optimization. [Pg.246]
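A bound constraint of the kind described here can be handled with a projected gradient step: take an unconstrained descent step, then clip the variables back into the feasible box. The objective, bounds, and step size below are illustrative assumptions (a single rate constant k, bounded below by 0 and above by an assumed collision-rate ceiling of 5.0):

```python
import numpy as np

# Bounds on the optimization variable: a rate constant limited from below
# by 0 and from above by a collision-rate ceiling (value illustrative)
lower = np.array([0.0])
upper = np.array([5.0])

def f(k):
    # Illustrative least-squares objective whose unconstrained minimum (k = 7)
    # lies outside the feasible box
    return (k[0] - 7.0)**2

def grad(k):
    return np.array([2.0 * (k[0] - 7.0)])

k = np.array([1.0])
for _ in range(500):
    k = k - 0.1 * grad(k)          # unconstrained gradient step
    k = np.clip(k, lower, upper)   # project back into the feasible box
```

Because the unconstrained minimizer violates the upper bound, the constrained solution sits on that bound: the iteration converges to k = 5 with the constraint active.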

Nonlinear optimization is one of the crucial topics in the numerical treatment of chemical engineering problems. Numerical optimization deals with the problems of solving systems of nonlinear equations or minimizing nonlinear functionals (subject to side conditions). In this article we present a new method for unconstrained minimization which is suitable for large-scale as well as ill-conditioned problems. The method is based on a true multidimensional modeling of the objective function in each iteration step. The scheme allows the incorporation of more given or known information into the search than common line search methods do. [Pg.183]

