Big Chemical Encyclopedia



Optimum search

Undoubtedly all of these parameters can influence both the rate and the certainty of optimum search. In this respect the importance of the size of the experimental region has already been discussed in our previous studies [23,24]. [Pg.308]

This variant was proposed by Floudas et al. (1989) and denoted the Global Optimum Search (GOS); it was applied to a continuous as well as a 0-1 set Y. It uses the same assumption as the one in v2-GBD but in addition assumes that... [Pg.135]

The Global Optimum Search (GOS) aimed at exploiting and invoking special structure for nonconvex, nonseparable problems of the type (6.2). [Pg.136]

A. Aggarwal and C. A. Floudas. A decomposition approach for global optimum search in QP, NLP and MINLP problems. Ann. Oper. Res., 25:119, 1990b. [Pg.435]

Let us restrict ourselves to putting the numerous variants of sequential optimum search into some sort of order. Most of the methods mentioned in the following review are described by BUNDAY [1984a], who also gives BASIC programs; details of the simplex method and its programming may be found in BUNDAY [1984b]. [Pg.91]

Floudas, C. A., and Aggarwal, A. A Decomposition Strategy for Global Optimum Search in the Pooling Problem, ORSA J. Comput. 2, 225-235 (1990). [Pg.241]

With all the necessary ingredients in place, the task is now to derive a reliable force field. In an automated refinement, the first step is to define in machine-readable form what constitutes a good force field. Following that, the parameters are varied, randomly or systematically (15,42). For each new parameter set, the entire data set is recalculated to yield the quality of the new force field. The best force field so far is retained and used as the basis for new trial parameter sets. The task is a standard one in nonlinear numerical optimization; many efficient procedures exist for selection of the optimum search direction (43). Only one recipe will be covered here, a combination of Newton-Raphson and Simplex methods that has been successfully employed in several recent parameterization efforts (11,19,20,28,44). [Pg.19]
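
The vary-recalculate-retain loop described above can be sketched with a simple random-perturbation search; a minimal illustration only, since the passage's Newton-Raphson/Simplex combination is more elaborate, and both the function names and the toy quality function below are invented for the example:

```python
import random

def refine_force_field(objective, params, n_trials=2000, scale=0.1, seed=0):
    """Keep-the-best random refinement loop: perturb the current best
    parameter set, re-evaluate the quality function over the data set,
    and retain any improvement. (Illustrative sketch, not a specific
    force-field package's algorithm.)"""
    rng = random.Random(seed)
    best, best_score = list(params), objective(params)
    for _ in range(n_trials):
        trial = [p + rng.gauss(0, scale) for p in best]
        score = objective(trial)
        if score < best_score:          # lower score = better force field
            best, best_score = trial, score
    return best, best_score

# toy "quality function": squared error against target parameters (1, 1)
best, best_score = refine_force_field(lambda p: sum((x - 1.0)**2 for x in p),
                                      [0.0, 0.0])
```

A systematic method such as Newton-Raphson converges far faster near the optimum; the random variant is shown only because it makes the retain-the-best logic explicit.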

In this case, however, it is not possible to solve the nonlinear Equation A10.38 analytically; a numerical algorithm, for example the Newton-Raphson method for the solution of nonlinear equations, should be applied (Appendix 1). A general way to solve the nonlinear regression problem, Equation A10.36, is to vary the value of a systematically by an optimum search method until the minimum is attained. This method is called nonlinear regression, and it is illustrated in Figure A10.6. [Pg.597]
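
A minimal Newton-Raphson iteration for a single nonlinear equation, as referenced above (a generic sketch, not the book's Appendix 1 algorithm; the example equation is invented):

```python
def newton_raphson(f, dfdx, x0, tol=1e-10, max_iter=100):
    """Solve f(x) = 0 by Newton-Raphson iteration: x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:     # converged when the update is negligible
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# illustrative root: x^2 - 2 = 0, i.e. sqrt(2)
root = newton_raphson(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

Convergence is quadratic near a simple root, but the method needs a derivative and a reasonable starting guess.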

Various optimum search methods exist for the minimization of objective functions, which can be used for the estimation of kinetic constants [3], for example, the Fibonacci method, the golden section method, the Newton-Raphson method, the Levenberg-Marquardt method, and the simplex method. Recently, even genetic algorithms have been... [Pg.598]
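
Of the one-dimensional methods listed, the golden section method is easy to sketch; a minimal version, assuming only that the objective is unimodal on the bracketing interval (the example function is invented):

```python
import math

def golden_section_minimize(f, a, b, tol=1e-8):
    """Golden-section search for the minimum of a unimodal f on [a, b]."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0   # 1/phi, about 0.618
    while b - a > tol:
        c = b - invphi * (b - a)            # interior probe points
        d = a + invphi * (b - a)
        if f(c) < f(d):
            b = d                           # minimum lies in [a, d]
        else:
            a = c                           # minimum lies in [c, b]
    return (a + b) / 2.0

# minimum of (x - 3)^2 on [0, 10] is at x = 3
xmin = golden_section_minimize(lambda x: (x - 3)**2, 0.0, 10.0)
```

Each iteration shrinks the bracket by the constant factor 0.618 without requiring derivatives, which is why it suits noisy kinetic objective functions.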

The multitude of search techniques to be considered makes it nearly impossible to work out an optimum search strategy right away. The fields Title, Index Terms and Keywords should be checked for their spelling or synonyms before the documents are displayed. The format DISPLAY SCAN is a free format, displaying the contents of the fields Title, Keywords and Index Terms in an arbitrary order. The command... [Pg.191]

Mountain-climbing analogy to using a searching algorithm to find the optimum response for a response surface. The path on the left leads to the global optimum, and the path on the right leads to a local optimum. [Pg.668]

Example of a false optimum for a one-factor-at-a-time searching algorithm. [Pg.671]

Find the optimum response for the response surface in Figure 14.7 using the fixed-sized simplex searching algorithm. Use (0, 0) for the initial factor levels, and set the step size for each factor to 1.0. [Pg.672]
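
One possible Python sketch of the fixed-sized simplex search on a two-factor surface, using the exercise's starting point (0, 0) and step size 1.0; the reflection rule follows the usual textbook scheme, and the quadratic response surface below is invented for illustration (it is not Figure 14.7):

```python
def fixed_size_simplex(response, start=(0.0, 0.0), step=1.0, n_moves=100):
    """Fixed-sized simplex search for a maximum of a two-factor response.
    The worst vertex is reflected through the midpoint of the other two,
    so the simplex keeps its size and shape as it crawls uphill."""
    verts = [list(start),
             [start[0] + step, start[1]],
             [start[0], start[1] + step]]
    last = None
    for _ in range(n_moves):
        scores = [response(a, b) for a, b in verts]
        order = sorted(range(3), key=scores.__getitem__)
        worst = order[0]
        if worst == last:                # don't undo the previous reflection
            worst = order[1]
        cx = sum(verts[i][0] for i in range(3) if i != worst) / 2
        cy = sum(verts[i][1] for i in range(3) if i != worst) / 2
        verts[worst] = [2 * cx - verts[worst][0], 2 * cy - verts[worst][1]]
        last = worst
    best = max(verts, key=lambda v: response(v[0], v[1]))
    return tuple(best), response(best[0], best[1])

# invented quadratic surface with its maximum (100) at A = 4, B = 6
pos, val = fixed_size_simplex(lambda a, b: 100 - (a - 4)**2 - (b - 6)**2)
```

Because the simplex cannot shrink, it eventually circles the optimum at a distance set by the step size; variable-sized variants contract to refine the answer.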

For each of the following equations, determine the optimum response using the one-factor-at-a-time searching algorithm. Begin the search at (0, 0) with factor A, and use a step size of 1 for both factors. The boundary conditions for each response surface are 0 < A < 10 and 0 < B < 10. Continue the search through as many cycles as necessary until the optimum response is found. Compare your optimum response for each equation with the true optimum. [Pg.700]
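
The one-factor-at-a-time algorithm in this exercise can be sketched as follows; a minimal version assuming a two-factor surface with the stated bounds, and an invented quadratic response (not one of the exercise's equations):

```python
def one_factor_at_a_time(response, start, step=1.0, lo=0.0, hi=10.0,
                         max_cycles=50):
    """Optimize factor A with B held fixed, then B with A held fixed,
    cycling until no single-factor move improves the response."""
    a, b = start
    best = response(a, b)
    for _ in range(max_cycles):
        improved = False
        for axis in (0, 1):              # axis 0 = factor A, axis 1 = B
            while True:
                moved = False
                for delta in (step, -step):
                    na = a + delta if axis == 0 else a
                    nb = b + delta if axis == 1 else b
                    if lo <= na <= hi and lo <= nb <= hi:
                        if response(na, nb) > best:
                            a, b, best = na, nb, response(na, nb)
                            moved = improved = True
                            break
                if not moved:            # this factor is locally optimal
                    break
        if not improved:                 # a full cycle gave no gain: stop
            break
    return (a, b), best

# invented surface whose true maximum (100) lies at A = 3, B = 7
pos, best = one_factor_at_a_time(lambda a, b: 100 - (a - 3)**2 - (b - 7)**2,
                                 (0, 0))
```

On surfaces whose factor effects interact (a ridge running diagonally), this search can stall at a false optimum, which is the point of the comparison asked for in the exercise.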

Once a direction is established for the next point in the space of the variables of optimization (whether by random search, by systematic evaluation of gradients, or by any other methods of making perturbations), it is possible to take a jump in the direction of the improvement much greater than the size of the perturbations. This could speed up the process of finding the optimum and reduce computer time. If such a leap is successful, the next iteration may take a bigger leap, and so on, until the improvement stops. Then one could reverse the direction and decrease the size of the step until the optimum is found. [Pg.79]
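
The leap-and-reverse strategy described above can be sketched in one dimension; a minimal interpretation, with invented names and growth/shrink factors, not the source's exact procedure:

```python
def accelerated_step_search(f, x, direction, step=0.1,
                            grow=2.0, shrink=0.5, tol=1e-6, max_iter=10000):
    """Minimize f along a direction: keep leaping further while the
    objective improves; on failure, reverse the direction and shrink
    the step, stopping once the step falls below `tol`."""
    fx = f(x)
    for _ in range(max_iter):
        if step <= tol:
            break
        x_new = x + step * direction
        f_new = f(x_new)
        if f_new < fx:              # success: accept and take a bigger leap
            x, fx = x_new, f_new
            step *= grow
        else:                       # failure: reverse and shrink the step
            direction = -direction
            step *= shrink
    return x, fx

# one-dimensional illustration: minimize (x - 2)^2 starting at x = 0
xopt, fopt = accelerated_step_search(lambda x: (x - 2)**2, 0.0, 1.0)
```

The doubling-on-success rule is what lets the search cross a large region in few function evaluations, at the cost of overshooting near the optimum and having to backtrack.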

Most strategies limit themselves to finding a local minimum point in the vicinity of the starting point for the search. Such a strategy will find the global optimum only if the problem has a single minimum point or a set of connected minimum points. A convex problem has only a global optimum. [Pg.485]

Once the objective and the constraints have been set, a mathematical model of the process can be subjected to a search strategy to find the optimum. Simple calculus is adequate for some problems, or Lagrange multipliers can be used for constrained extrema. When a full plant simulation can be made, various alternatives can be put through the computer. Such an operation is called flowsheeting. A chapter is devoted to this topic by Edgar and Himmelblau (Optimization of Chemical Processes, McGraw-Hill, 1988), where they list a number of commercially available software packages for this purpose, one of the first of which was Flowtran. [Pg.705]









