Big Chemical Encyclopedia


Rosenbrock's function

Edgar and Himmelblau (1988) demonstrate the use of the method for a function of two variables. Nelder and Mead (1965) presented the method for a function of N variables as a flow diagram. They demonstrated its use by applying it to minimize Rosenbrock's function (Equation 5.22) as well as to the following functions ... [Pg.81]

Will Newton's method minimize Rosenbrock's function... [Pg.216]
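One way to probe this question numerically is to take full Newton steps on the two-variable Rosenbrock function from the classic starting point (-1.2, 1). The sketch below is illustrative only and is not reproduced from the cited text; it assumes the standard form of the function and uses its analytic gradient and Hessian.

    import numpy as np

    def f(x):
        """Standard two-variable Rosenbrock function."""
        return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

    def grad(x):
        return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                          200.0 * (x[1] - x[0]**2)])

    def hess(x):
        return np.array([[1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
                         [-400.0 * x[0],                          200.0]])

    x = np.array([-1.2, 1.0])                       # classic starting point
    for k in range(50):
        step = np.linalg.solve(hess(x), -grad(x))   # full Newton step
        x = x + step
        if np.linalg.norm(step) < 1e-10:
            break
    print(k, x, f(x))                               # ends near the minimizer (1, 1)

Whether such unmodified Newton steps succeed depends on the Hessian remaining nonsingular and useful along the path, which is the point of the exercise.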

Rosenbrock's function is often used as a minimization test problem, because its minimum lies at the base of a banana-shaped valley and can be difficult to locate. This function is defined for even integers n as the sum... [Pg.51]

The contour plot of Rosenbrock's function for n = 2 is shown in Figure 14. The minimum point is (1, 1), where f(1, 1) = 0. The gradient components of this function are given by... [Pg.51]
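The equations themselves are not reproduced in this excerpt; as a point of reference, a minimal sketch of the usual two-variable form and its gradient components, assuming the conventional scaling constant of 100, is:

    def rosenbrock(x1, x2):
        """Usual two-variable Rosenbrock function; f(1, 1) = 0."""
        return 100.0 * (x2 - x1**2)**2 + (1.0 - x1)**2

    def rosenbrock_grad(x1, x2):
        """Gradient components for n = 2."""
        df_dx1 = -400.0 * x1 * (x2 - x1**2) - 2.0 * (1.0 - x1)
        df_dx2 = 200.0 * (x2 - x1**2)
        return df_dx1, df_dx2

    print(rosenbrock(1.0, 1.0))        # 0.0 at the minimum (1, 1)
    print(rosenbrock_grad(1.0, 1.0))   # both components vanish at the minimum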

An illustration of simulated annealing optimization is presented in Figures 1-5. In this two-dimensional example the objective function is a modified Rosenbrock's function ... [Pg.7]
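The specific modified function used in that example is not reproduced here; the sketch below simply illustrates the basic simulated-annealing loop on the standard two-variable Rosenbrock function, with an arbitrary geometric cooling schedule.

    import math
    import random

    def rosenbrock(x1, x2):
        return 100.0 * (x2 - x1**2)**2 + (1.0 - x1)**2

    random.seed(0)
    x = [-1.5, 2.0]                       # arbitrary starting point
    fx = rosenbrock(*x)
    T, cooling, step = 10.0, 0.999, 0.1   # illustrative annealing schedule

    for _ in range(20000):
        cand = [x[0] + random.uniform(-step, step),
                x[1] + random.uniform(-step, step)]
        fc = rosenbrock(*cand)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if fc < fx or random.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
        T *= cooling                      # geometric cooling

    print(x, fx)                          # should settle near the minimum at (1, 1)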

Figure 2. Biased random walk with a random step size – Rosenbrock's function.
For example, let us consider the well-known Rosenbrock's function problem ... [Pg.89]

List the relative advantages and disadvantages (there can be more than one) of the following methods for a two-variable optimization problem such as Rosenbrock's banana function (see Fig. P6.19)... [Pg.217]

Other methods of multidimensional search without using derivatives include Rosenbrock's method (1960) and the simplex method of Spendley et al. (1962), which was later modified by Nelder and Mead (1965). Although it has the same name, this simplex method is not the same algorithm as that used for linear programming; it is a polytope algorithm that requires only function evaluations and makes no smoothness assumptions. [Pg.2550]
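For a concrete point of comparison (not taken from the cited references), SciPy ships an implementation of this derivative-free polytope algorithm; a minimal sketch applying it to the Rosenbrock test function:

    from scipy.optimize import minimize, rosen

    # derivative-free polytope (Nelder-Mead) search from the classic start (-1.2, 1)
    result = minimize(rosen, x0=[-1.2, 1.0], method="Nelder-Mead",
                      options={"xatol": 1e-8, "fatol": 1e-8})
    print(result.x, result.fun, result.nfev)   # near (1, 1), using function values only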

For the complex benchmark functions, simulations showed that the algorithm was able to reach smaller values than GAs, PSO, and SGA, obtaining good results with a basic set of values: a population of only 10 elements and a maximum of 10 iterations per experiment. The exception was Rosenbrock's valley function, for which a genetic algorithm with a population of 150 individuals and 200 generations obtained better results than the CRA. [Pg.57]

It is noted that the Rosenbrock function given by the next equation has been used to test the performance of various algorithms, including modified Newton's and conjugate gradient methods (Scales, 1986)... [Pg.77]
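As an illustration of that kind of test (a sketch, not the procedure from Scales, 1986), the same function and its derivatives are available in SciPy and can be handed to a conjugate gradient routine and a Newton-type routine:

    from scipy.optimize import minimize, rosen, rosen_der, rosen_hess

    x0 = [-1.2, 1.0]
    cg = minimize(rosen, x0, method="CG", jac=rosen_der)
    newton = minimize(rosen, x0, method="Newton-CG", jac=rosen_der, hess=rosen_hess)
    print("CG:       ", cg.x, cg.nfev)          # both reach (1, 1); evaluation counts differ
    print("Newton-CG:", newton.x, newton.nfev)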

An alternative, called semi-implicit methods in such texts as [351], avoids the problems, and some of the variants are L-stable (see Chap. 14 for an explanation of this term), a desirable property. This class of methods was devised by Rosenbrock in 1962 [474]. There are two strong points about this set of formulae. One is that the constants in the implicit set of equations for the k's are chosen such that each k can be evaluated explicitly by an easy rearrangement of its equation. The other is that the method lends itself ideally to nonlinear functions without requiring iteration, because the linearization is, in a sense, already built in. This is explained below. [Pg.68]
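As a minimal sketch of the idea (the simplest one-stage, first-order member of this family, not the specific formulae from [474]), each step requires only one linear solve with the Jacobian, even when the right-hand side f is nonlinear:

    import numpy as np

    def rosenbrock_step(f, jac, y, h, gamma=1.0):
        """One linearly implicit (Rosenbrock-type) step of size h.

        Solves (I - h*gamma*J) k1 = h*f(y) for k1, then returns y + k1.
        Only a linear solve is needed; no Newton iteration, even for nonlinear f.
        gamma = 1 gives the semi-implicit (linearly implicit) Euler scheme.
        """
        n = len(y)
        J = jac(y)
        k1 = np.linalg.solve(np.eye(n) - h * gamma * J, h * f(y))
        return y + k1

    # stiff linear test problem y' = -1000 y, y(0) = 1
    f = lambda y: -1000.0 * y
    jac = lambda y: np.array([[-1000.0]])
    y = np.array([1.0])
    for _ in range(10):
        y = rosenbrock_step(f, jac, y, h=0.1)
    print(y)   # decays smoothly toward 0, where an explicit step of this size would blow up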

Fig. 9.2 illustrates the Rosenbrock method for a model with two parameters. Because the algorithm attempts to search along the axis of a valley of the objective function S, many evaluations of S can be omitted compared with, for example, the algorithm with univariate... [Pg.289]
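A heavily simplified sketch of that search idea follows. The step-expansion factor 3, the contraction-and-reversal factor -0.5, and the Gram-Schmidt-style rotation of the axes toward the direction of accumulated progress are the conventional ingredients of Rosenbrock's rotating-coordinates method, not details taken from Fig. 9.2.

    import numpy as np

    def rosenbrock_search(S, x0, step=0.1, expand=3.0, contract=-0.5,
                          n_stages=50, sweeps=30, tol=1e-10):
        """Simplified Rosenbrock-style rotating-direction search (illustrative only)."""
        x = np.asarray(x0, dtype=float)
        n = len(x)
        dirs = np.eye(n)                       # start with the coordinate axes
        steps = np.full(n, step)
        best = S(x)
        for _ in range(n_stages):
            x_stage = x.copy()
            for _ in range(sweeps):            # trial moves along each direction
                for i in range(n):
                    trial = x + steps[i] * dirs[i]
                    f_trial = S(trial)
                    if f_trial < best:         # success: keep point, expand step
                        x, best = trial, f_trial
                        steps[i] *= expand
                    else:                      # failure: shrink and reverse step
                        steps[i] *= contract
            progress = x - x_stage
            if np.linalg.norm(progress) < tol:
                break
            # rotate the axes so the first one points along the overall progress
            q, _ = np.linalg.qr(np.column_stack([progress] + list(dirs[1:])))
            dirs = q.T
            steps[:] = step
        return x, best

    S = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
    x_min, S_min = rosenbrock_search(S, [-1.2, 1.0])
    print(x_min, S_min)                        # works its way along the curved valley toward (1, 1)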


See other pages where Rosenbrock's function is mentioned: [Pg.196]    [Pg.203]    [Pg.133]    [Pg.7]    [Pg.12]    [Pg.137]    [Pg.139]    [Pg.100]    [Pg.101]    [Pg.118]    [Pg.68]    [Pg.168]



