
Rosenbrock algorithm

Maple's stiff solver (Rosenbrock algorithm) is used by setting the option stiff = true ... [Pg.477]
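
Maple's internal implementation is not shown in the excerpt above. As a sketch of what a Rosenbrock-type stiff integrator does, the simplest member of the family (the linearly implicit Euler method) replaces the nonlinear solve of implicit Euler with a single linear solve against the Jacobian; the Python demonstration below, on a made-up stiff test problem, is illustrative only:

```python
import numpy as np

def rosenbrock_euler_step(f, jac, y, h):
    """One step of the simplest Rosenbrock (linearly implicit Euler) method:
    solve (I - h*J) k = h*f(y), then advance y_new = y + k.
    The Jacobian J enters only a linear solve, giving stability on stiff
    problems without Newton iteration."""
    J = jac(y)
    I = np.eye(y.size)
    k = np.linalg.solve(I - h * J, h * f(y))
    return y + k

# Stiff scalar test y' = -1000*y: explicit Euler with h = 0.01 diverges,
# but the linearly implicit step decays toward 0 like the exact solution.
f = lambda y: -1000.0 * y
jac = lambda y: np.array([[-1000.0]])
y = np.array([1.0])
for _ in range(10):
    y = rosenbrock_euler_step(f, jac, y, 0.01)
print(y)
```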

It is noted that the Rosenbrock function, given by the next equation, has been used to test the performance of various algorithms, including modified Newton's and conjugate gradient methods (Scales, 1986)... [Pg.77]
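
The equation itself is not reproduced in this excerpt. The standard two-variable form of the Rosenbrock function is

f(x1, x2) = 100 (x2 - x1^2)^2 + (1 - x1)^2,

whose minimum, f = 0 at (1, 1), lies at the bottom of a narrow, curved valley that many optimizers find difficult to follow.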

On the other hand, the optimal control problem with a discretized control profile can be treated as a nonlinear program. The earliest studies come under the heading of control vector parameterization (Rosenbrock and Storey, 1966), with a representation of U(t) as a polynomial or piecewise constant function. Here the model is solved repeatedly in an inner loop while the parameters representing U(t) are updated on the outside. While hill-climbing algorithms were used initially, recent efficient and sophisticated optimization methods require techniques for accurate gradient calculation from the DAE model. [Pg.218]
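
As an illustration of this inner/outer structure (a hypothetical toy problem, not the formulation in the cited work), the sketch below parameterizes u(t) as piecewise constant, simulates the model in an inner loop with SciPy, and lets a derivative-free optimizer update the control parameters on the outside:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

# Hypothetical toy problem: steer x' = -x + u from x(0) = 1 toward x(1) = 0
# with a small penalty on control effort; u(t) is piecewise constant on N
# intervals (the "control vector parameterization").
N, T = 5, 1.0
edges = np.linspace(0.0, T, N + 1)

def objective(u_params):
    # Inner loop: integrate the model across each constant-u segment.
    x, effort = 1.0, 0.0
    for k in range(N):
        sol = solve_ivp(lambda t, s: -s + u_params[k],
                        (edges[k], edges[k + 1]), [x])
        x = sol.y[0, -1]
        effort += u_params[k] ** 2 * (edges[k + 1] - edges[k])
    return x ** 2 + 0.1 * effort       # terminal error + control penalty

# Outer loop: a derivative-free (hill-climbing style) optimizer updates
# the parameters representing U(t).
result = minimize(objective, np.zeros(N), method="Nelder-Mead")
print(result.x)                        # optimized control level per interval
```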

Both the development and the optimization of simplex methods are still continuing. Several functions have been designed to test the performance of simplex algorithms; one example is the famous ROSENBROCK valley. Other test functions have been reported by ABERG and GUSTAVSSON [1982]. Most analytical applications of simplex optimization are found in atomic spectroscopy [SNEDDON, 1990] and chromatography [BERRIDGE, 1990]... [Pg.92]
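
As an illustration (using SciPy, which is not mentioned in the source), a simplex search can be run directly on the ROSENBROCK valley:

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    """The ROSENBROCK 'banana' valley, a standard test of simplex methods."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

# Simplex (Nelder-Mead) search from the customary starting point (-1.2, 1);
# the narrow curved valley makes progress toward the minimum at (1, 1) slow.
result = minimize(rosenbrock, x0=np.array([-1.2, 1.0]), method="Nelder-Mead")
print(result.x, result.fun)   # approaches [1. 1.] with f near 0
```

SciPy also ships this test function ready-made as scipy.optimize.rosen.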

The pattern searches implement the method of Hooke and Jeeves using a line search and the method of Rosenbrock [6]. The Hooke and Jeeves algorithm that was used in this work can be thought of in two separate phases: an exploratory search and a pattern search. The exploratory search successively perturbs each parameter and tests the resulting performance. The pattern search steps along the (x_{k+1} - x_k) direction, which is the direction between the last two points selected by the exploratory search. When both positive and negative perturbations of the parameters do not result in enhanced performance, the perturbation size is decreased. When the perturbation size is less than an arbitrary termination factor, e, the algorithm stops. [Pg.197]
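
A minimal sketch of the two phases described above (parameter names and tolerances are illustrative, not those of the cited work):

```python
import numpy as np

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, eps=1e-6, max_iter=10_000):
    """Hooke-Jeeves direct search: an exploratory search that perturbs each
    parameter in turn, plus a pattern move along the direction between the
    last two base points."""
    x = np.asarray(x0, float)

    def explore(base, fb, h):
        # Exploratory phase: perturb each parameter by +/- h, keeping any
        # perturbation that improves the objective.
        pt, fp = base.copy(), fb
        for i in range(pt.size):
            for delta in (h, -h):
                trial = pt.copy()
                trial[i] += delta
                ft = f(trial)
                if ft < fp:
                    pt, fp = trial, ft
                    break
        return pt, fp

    fx = f(x)
    for _ in range(max_iter):
        new, fnew = explore(x, fx, step)
        if fnew < fx:
            # Pattern phase: step along (x_{k+1} - x_k) and explore there.
            pattern = new + (new - x)
            x, fx = new, fnew
            cand, fcand = explore(pattern, f(pattern), step)
            if fcand < fx:
                x, fx = cand, fcand
        else:
            step *= shrink        # both signs failed: shrink the perturbation
            if step < eps:
                break             # perturbation below termination factor eps
    return x, fx

# e.g. hooke_jeeves(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2, [0.0, 0.0])
```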

Other methods of multidimensional search without using derivatives include Rosenbrock's method (1960) and the simplex method of Spendley et al. (1962), which was later modified by Nelder and Mead (1965). Although it has the same name, this simplex method is not the same algorithm as that used for linear programming; it is a polytope algorithm that requires only function evaluations and no smoothness assumptions. [Pg.2550]

Below, we describe four algorithms that can handle small- and medium-dimension problems, even in the presence of relatively narrow valleys, without using any gradient or Hessian information: the Rosenbrock method (1960), the Hooke-Jeeves method (1961), the Simplex method (Spendley et al., 1962; Nelder and Mead, 1965), and the Optnov method (Buzzi-Ferraris, 1967). Note that their current structure is slightly different from the original one. [Pg.87]

For the complex benchmark functions, simulations showed that the algorithm was able to reach smaller values than GAs, PSO, and SGA, obtaining good results with a basic set of values: a population of only 10 elements and a maximum of 10 iterations per experiment. The exception was Rosenbrock's valley function, for which a genetic algorithm with a population of 150 individuals and 200 generations obtained better results than the CRA. [Pg.57]

In contrast with algorithms using univariate search, the Rosenbrock method is a so-called acceleration method, which makes the direction and/or the distance (in this case both) of the parameter jumps dependent on the degree of success of the previous parameter jumps. With p parameters, the algorithm proceeds as follows ... [Pg.288]
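
The enumerated steps are not reproduced in this excerpt. A minimal sketch of the classic Rosenbrock (1960) iteration, with the customary expansion factor 3 and reverse-and-halve factor -0.5, is:

```python
import numpy as np

def rosenbrock_method(f, x0, step=0.1, alpha=3.0, beta=-0.5,
                      tol=1e-8, max_stages=100):
    """Rosenbrock's rotating-directions search: expand a step after success,
    reverse and shrink it after failure, then re-orient the direction set
    along the progress made (the 'acceleration')."""
    x = np.asarray(x0, float)
    p = x.size
    D = np.eye(p)                   # orthonormal search directions (rows)
    fx = f(x)
    for _ in range(max_stages):
        s = np.full(p, step)        # per-direction step lengths
        ok = np.zeros(p, bool)      # direction succeeded at least once
        bad = np.zeros(p, bool)     # direction failed at least once
        x_start = x.copy()
        # Cycle until every direction has seen both a success and a failure.
        while not np.all(ok & bad):
            for i in range(p):
                trial = x + s[i] * D[i]
                ft = f(trial)
                if ft <= fx:        # success: accept point, expand step
                    x, fx = trial, ft
                    s[i] *= alpha
                    ok[i] = True
                else:               # failure: reverse and shrink step
                    s[i] *= beta
                    bad[i] = True
            if np.max(np.abs(s)) < tol:
                return x, fx
        move = x - x_start
        if np.linalg.norm(move) < tol:
            return x, fx
        # Rotate the basis so the first direction follows the net progress,
        # orthonormalizing the rest (Gram-Schmidt via QR).
        lam = D @ move              # net signed progress along each direction
        A = np.array([(lam[i:, None] * D[i:]).sum(axis=0) for i in range(p)])
        Q, _ = np.linalg.qr(A.T)
        D = Q.T
    return x, fx

# e.g. rosenbrock_method(lambda x: 100*(x[1]-x[0]**2)**2 + (1-x[0])**2,
#                        [-1.2, 1.0])
```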

Fig. 9.2 illustrates the Rosenbrock method for a model with two parameters. Because the algorithm attempts to search along the axis of a valley of the objective function S, many evaluations of S can be omitted compared with, for example, an algorithm using univariate search... [Pg.289]

More rapid non-derivative methods, in which the minimum of the objective function is searched for in the direction of decreasing values, include the Probe algorithm and the Rosenbrock method. The increased speed, however, is achieved at the expense of reliable convergence. [Pg.241]

