
One-dimensional minimization

The classes implemented for one-dimensional minimization in the BzzMath library are described below. [Pg.62]

The BzzMinimizationMono class is designed to solve one-dimensional minimization problems for unimodal functions. This class can be employed when the function is continuous and defined on the entire interval of uncertainty. [Pg.62]

The BzzMinimizationMono class does not use parallel computing; the BzzMinimizationMonoMP class does, and it is valid for all the cases handled by the previous class. A multiprocessor machine, an adequate compiler, and OpenMP directives are required to make proper use of shared memory. [Pg.62]

These classes do not use parallel computing and will be discussed in detail in Chapter 5. [Pg.62]

The BzzMinimizationMono class has numerous constructors. The most important are ... [Pg.63]
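
Since the constructor list itself is elided above, the following C++ sketch only illustrates the typical calling pattern for such a class. The header name, the (tLeft, tRight, function) constructor, the operator() solve call, and the GetTSolution/GetYSolution accessors are assumptions modeled on BzzMath naming style, not taken from the library documentation.

```cpp
// Hypothetical usage sketch for BzzMinimizationMono; all names below that
// are not in the text (header, constructor signature, accessors) are assumed.
#include <cstdio>
#include "BzzMath.hpp"

// A unimodal, continuous test function on the interval of uncertainty [0, 4].
double fun(double t)
{
    return (t - 1.5) * (t - 1.5) + 2.;
}

int main()
{
    BzzMinimizationMono m(0., 4., fun); // interval of uncertainty and objective
    m();                                // run the one-dimensional minimization
    std::printf("tMin = %g  fMin = %g\n", m.GetTSolution(), m.GetYSolution());
    return 0;
}
```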


Another difference from steepest descent is that a one-dimensional minimization is performed in each search direction. A line minimization is made along a direction h_j until a minimum... [Pg.304]

In the global region the Newton or quasi-Newton step may not be satisfactory. It may, for example, increase rather than decrease the function to be minimized. Although the step must then be rejected, we may still use it to provide a direction for a one-dimensional minimization of the function. We then carry out a search along the Newton step until an acceptable reduction in the function is obtained, and the result of this line search becomes our next step. [Pg.311]
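
A minimal sketch of such a search along the Newton step, here using simple step-halving with a sufficient-decrease (Armijo) test as the acceptance criterion; the constants c1 and rho are illustrative choices, not values prescribed by the source.

```cpp
#include <vector>

// Backtracking line search along a (quasi-)Newton direction p: shrink the
// step lambda until f(x + lambda*p) shows a sufficient decrease relative to
// the directional derivative g'p (Armijo test). f, g, x, p are assumed to be
// supplied by the surrounding optimizer.
double backtrackingLineSearch(
    double (*f)(const std::vector<double>&),
    const std::vector<double>& x,
    const std::vector<double>& g,   // gradient at x
    const std::vector<double>& p)   // search direction (e.g., Newton step)
{
    const double c1 = 1.e-4;   // sufficient-decrease constant (illustrative)
    const double rho = 0.5;    // step-halving factor (illustrative)
    double lambda = 1.;        // try the full Newton step first

    double f0 = f(x);
    double slope = 0.;         // directional derivative g'p (negative if descent)
    for (std::size_t i = 0; i < x.size(); ++i)
        slope += g[i] * p[i];

    std::vector<double> xNew(x.size());
    for (int iter = 0; iter < 50; ++iter)
    {
        for (std::size_t i = 0; i < x.size(); ++i)
            xNew[i] = x[i] + lambda * p[i];
        if (f(xNew) <= f0 + c1 * lambda * slope)  // acceptable reduction
            break;
        lambda *= rho;                            // reject, shorten the step
    }
    return lambda;
}
```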

A line search consists of an approximate one-dimensional minimization of the objective function along the computed direction p. This produces an acceptable step λ and a new iterate x_k + λp. Function and gradient evaluations of the objective function are required in each line search iteration. In contrast, the trust region strategy approximately minimizes a local quadratic model of the function using current Hessian information. An optimal step that lies within... [Pg.21]

The line search is essentially an approximate one-dimensional minimization problem. It is usually performed by safeguarded polynomial interpolation [5,6,54-56]. That is, in a typical line search iteration, cubic interpolation is performed in a region of λ that ensures that the minimum of f along p has been bracketed. The minimum of that polynomial then provides a new candidate for λ. If the search directions are properly scaled, the initial trial point λ = 1 produces a first reasonable trial move from x_k. A simple illustration of such a first line search step is shown in Figure 9. The minimized one-dimensional function at the current point x_k is defined by f(λ) = f(x_k + λp). The vectors corresponding to different values of λ are set by x(λ) = x_k + λp. [Pg.22]
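
For illustration, here is one safeguarded interpolation step, using a quadratic model through three bracketing points rather than the cubic polynomial mentioned in the text; the safeguard margin of 10% of the bracket is an illustrative choice.

```cpp
#include <algorithm>
#include <cmath>

// One safeguarded interpolation step for the line function f(lambda):
// fit a parabola through three bracketing points a < b < c with
// f(b) < f(a) and f(b) < f(c), and return its minimizer, pushed back
// inside the bracket if the model step falls outside it. Cubic
// interpolation follows the same pattern with one more data value.
double quadraticStep(double a, double fa,
                     double b, double fb,
                     double c, double fc)
{
    // Vertex of the parabola through (a,fa), (b,fb), (c,fc).
    double num = (b - a) * (b - a) * (fb - fc) - (b - c) * (b - c) * (fb - fa);
    double den = (b - a) * (fb - fc) - (b - c) * (fb - fa);
    double lambda = (std::fabs(den) > 1.e-30) ? b - 0.5 * num / den : b;

    // Safeguard: keep the new candidate strictly inside the bracket.
    const double margin = 0.1 * (c - a);
    lambda = std::max(a + margin, std::min(c - margin, lambda));
    return lambda;
}
```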

As noted in the introduction, energy-only methods are generally much less efficient than gradient-based techniques. The simplex method [9] (not identical with the similarly named method used in linear programming) was used quite widely before the introduction of analytical energy gradients. The intuitively most obvious method is a sequential optimization of the variables (sequential univariate search). As the optimization of one variable affects the minimum of the others, the whole cycle has to be repeated after all variables have been optimized. A one-dimensional minimization is usually carried out by finding the... [Pg.2333]
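
A minimal sketch of this sequential univariate search follows, assuming the inner one-dimensional minimizer is a plain golden-section search on a fixed local interval around the current value (an illustrative choice; the method elided in the excerpt may differ).

```cpp
#include <cmath>
#include <vector>

// Golden-section search on coordinate i of x over [a, b], with the other
// coordinates held fixed; returns the located minimizer of f along x_i.
static double goldenSection1D(double (*f)(const std::vector<double>&),
                              std::vector<double>& x, std::size_t i,
                              double a, double b, double tol)
{
    const double r = 0.5 * (std::sqrt(5.) - 1.);   // golden ratio factor
    double c = b - r * (b - a), d = a + r * (b - a);
    while (b - a > tol)
    {
        x[i] = c; double fc = f(x);
        x[i] = d; double fd = f(x);
        if (fc < fd) { b = d; d = c; c = b - r * (b - a); }
        else         { a = c; c = d; d = a + r * (b - a); }
    }
    return 0.5 * (a + b);
}

// Sequential univariate search: minimize along each coordinate in turn and
// repeat the whole cycle until no coordinate moves more than a tolerance,
// since optimizing one variable shifts the minimum of the others.
void sequentialUnivariateSearch(double (*f)(const std::vector<double>&),
                                std::vector<double>& x, double tol)
{
    bool moved = true;
    while (moved)
    {
        moved = false;
        for (std::size_t i = 0; i < x.size(); ++i)
        {
            double xOld = x[i];
            x[i] = goldenSection1D(f, x, i, xOld - 1., xOld + 1., tol);
            if (std::fabs(x[i] - xOld) > tol) moved = true;
        }
    }
}
```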

It is possible and helpful to use parallel computing (for one-dimensional minimization). [Pg.44]

Therefore, the multidimensional minimization problem is significantly harder to solve than one-dimensional minimization. [Pg.85]

It might be more interesting to use the dogleg method by considering it from the second point of view: the piecewise dogleg path is used for a one-dimensional minimization, and d is not given a priori according to validity considerations on the quadratic model, but is used as an optimization variable. [Pg.125]
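
A sketch of this second point of view: parameterize the standard piecewise-linear dogleg path from the origin through the Cauchy point to the Newton point, and expose the quadratic model along it as a function of a single path parameter t, so that any one-dimensional minimizer can treat the step (and hence d) as the optimization variable. The names sC and sN are hypothetical placeholders for the Cauchy and Newton steps.

```cpp
#include <vector>

// Evaluate the quadratic model m(s) = g's + 0.5 s'B s along the dogleg path
// s(t), t in [0, 2]: segment 1 (t in [0,1]) goes from 0 to the Cauchy point
// sC, segment 2 (t in [1,2]) from sC to the Newton point sN. Applying a 1D
// minimizer to modelAlongDogleg(t) treats the step length along the path as
// the optimization variable instead of fixing the radius d in advance.
double modelAlongDogleg(double t,
                        const std::vector<double>& g,
                        const std::vector<std::vector<double>>& B,
                        const std::vector<double>& sC,
                        const std::vector<double>& sN)
{
    std::size_t n = g.size();
    std::vector<double> s(n);
    for (std::size_t i = 0; i < n; ++i)
        s[i] = (t <= 1.) ? t * sC[i]
                         : sC[i] + (t - 1.) * (sN[i] - sC[i]);

    double m = 0.;
    for (std::size_t i = 0; i < n; ++i)
    {
        m += g[i] * s[i];                 // linear term g's
        double Bs = 0.;
        for (std::size_t j = 0; j < n; ++j)
            Bs += B[i][j] * s[j];
        m += 0.5 * s[i] * Bs;             // quadratic term 0.5 s'B s
    }
    return m;
}
```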

These classes are designed to perform the one-dimensional minimization of a function in the following cases ... [Pg.187]

The function φ(x2) = min over x1 of F(x1, x2), to be minimized with respect to x2, is also multimodal, and its plot is shown in Figure 5.9. Its minimization, therefore, requires a robust algorithm for one-dimensional minimization. [Pg.203]
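
A minimal sketch of one robust strategy for such a multimodal one-dimensional problem (an assumption for illustration; the source does not specify the algorithm used): a coarse scan of the interval to locate the most promising basin, followed by golden-section refinement inside it.

```cpp
#include <algorithm>
#include <cmath>

// Robust 1D minimization of a possibly multimodal function phi on [a, b]:
// scan the interval on a coarse grid to find the best local basin, then
// refine inside that basin with golden-section search. The grid size and
// the choice of golden-section refinement are illustrative.
double minimizeMultimodal(double (*phi)(double), double a, double b,
                          int nGrid, double tol)
{
    // Coarse scan: locate the grid point with the lowest function value.
    double h = (b - a) / nGrid, xBest = a, fBest = phi(a);
    for (int k = 1; k <= nGrid; ++k)
    {
        double x = a + k * h, fx = phi(x);
        if (fx < fBest) { fBest = fx; xBest = x; }
    }

    // Refine by golden section inside the two grid cells bracketing xBest.
    double lo = std::max(a, xBest - h), hi = std::min(b, xBest + h);
    const double r = 0.5 * (std::sqrt(5.) - 1.);
    double c = hi - r * (hi - lo), d = lo + r * (hi - lo);
    while (hi - lo > tol)
    {
        if (phi(c) < phi(d)) { hi = d; d = c; c = hi - r * (hi - lo); }
        else                 { lo = c; c = d; d = lo + r * (hi - lo); }
    }
    return 0.5 * (lo + hi);
}
```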

The step parameter t_k in (3) is chosen by an approximate one-dimensional minimization, t_k ≈ argmin over t > 0 of φ_line(t). Numerical practice shows that this one-dimensional problem does not have to be solved very accurately, but it has to fulfill certain weak step conditions, which ensure that the descent (1) is proportional to the step and the gradient (2). [Pg.184]
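
The source does not state these weak step conditions explicitly; a common concrete choice (an assumption here) is the pair of Wolfe conditions on the step t_k along the search direction d_k, with constants 0 < c_1 < c_2 < 1:

```latex
% Weak (Wolfe) step conditions for the line step t_k along direction d_k.
% The first (sufficient decrease) makes the descent proportional to the
% step and the gradient; the second (curvature) rules out overly short steps.
f(x_k + t_k d_k) \le f(x_k) + c_1\, t_k\, \nabla f(x_k)^{\mathsf{T}} d_k,
\qquad
\nabla f(x_k + t_k d_k)^{\mathsf{T}} d_k \ge c_2\, \nabla f(x_k)^{\mathsf{T}} d_k.
```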

A one-dimensional minimization process, a component of many nonlinear optimization methods, performed via quadratic or cubic interpolation in combination with bracketing strategies. [Pg.1143]





