
Quadratic line search

In the case of a nonlinear operator, it is preferable to use a steepest descent algorithm with a quadratic line search. It can be summarized as follows ... [Pg.131]

Note that one can consider a quadratic line search similar to the one outlined for... [Pg.138]

Fig. 5.8 The minimum in a line search may be found more effectively by fitting an analytical function such as a quadratic to the initial set of three points (1, 2 and 3). A better estimate of the minimum can then be found by fitting a new function to the points 1, 2 and 4 and finding its minimum. (Figure adapted from Press W H, B P Flannery,...)
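To make the construction in the caption concrete, the sketch below (Python, with illustrative names; not taken from the cited source) fits a parabola through three bracketing points along the search direction and returns the abscissa of its minimum, which plays the role of point 4 in the figure.

```python
import numpy as np

def quadratic_fit_minimum(x1, f1, x2, f2, x3, f3):
    """Fit f(x) ~ a*x**2 + b*x + c through three points and return the
    abscissa of the parabola's minimum.

    Assumes the three points bracket a minimum, so that a > 0.
    """
    # Solve the 3x3 Vandermonde system for the parabola coefficients.
    A = np.array([[x1**2, x1, 1.0],
                  [x2**2, x2, 1.0],
                  [x3**2, x3, 1.0]])
    a, b, _c = np.linalg.solve(A, np.array([f1, f2, f3]))
    if a <= 0.0:
        raise ValueError("points do not bracket a minimum")
    return -b / (2.0 * a)   # vertex of the parabola

# Example: points 1, 2, 3 bracket the minimum of f(x) = (x - 1.3)**2.
f = lambda x: (x - 1.3)**2
x4 = quadratic_fit_minimum(0.0, f(0.0), 1.0, f(1.0), 2.0, f(2.0))
# x4 ~ 1.3; a refined fit would then use points 1, 2 and 4, as in the caption.
```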
This formula is exact for a quadratic function, but for real problems a line search may be desirable. This line search is performed along the vector x_{k+1} - x_k. It may not be necessary to locate the minimum in the direction of the line search very accurately, at the expense of a few more steps of the quasi-Newton algorithm. For quantum mechanics calculations the additional energy evaluations required by the line search may prove more expensive than using the more approximate approach. An effective compromise is to fit a function to the energy and gradient at the current point x_k and at the point x_{k+1} and determine the minimum in the fitted function. [Pg.287]
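The compromise described above, fitting to the energy and gradient at the two end points of the step, amounts to a cubic (Hermite) interpolation along the search direction. A minimal sketch, with hypothetical names and not tied to any particular electronic-structure code:

```python
import numpy as np

def cubic_hermite_min(f0, g0, f1, g1):
    """Return the fractional step alpha in (0, 1) minimizing the cubic that
    matches function values f0, f1 and directional derivatives g0, g1 at the
    end points alpha = 0 and alpha = 1 of the step x_{k+1} - x_k.

    Falls back to alpha = 0.5 when the cubic has no interior minimum.
    """
    c = 3.0 * (f1 - f0) - 2.0 * g0 - g1
    d = g1 + g0 - 2.0 * (f1 - f0)
    if abs(d) < 1e-12:                        # cubic degenerates to a quadratic
        return -g0 / (2.0 * c) if c > 0.0 else 0.5
    disc = c * c - 3.0 * d * g0
    if disc < 0.0:
        return 0.5
    alpha = (-c + np.sqrt(disc)) / (3.0 * d)  # root with positive curvature
    return alpha if 0.0 < alpha < 1.0 else 0.5
```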

Let ||.|| denote the Euclidean norm and define y_k = g_{k+1} - g_k. Table I provides a chronological list of some choices for the CG update parameter. If the objective function is a strongly convex quadratic, then in theory, with an exact line search, all seven choices for the update parameter in Table I are equivalent. For a nonquadratic objective functional J (the ordinary situation in optimal control calculations), each choice for the update parameter leads to a different performance. A detailed discussion of the various CG methods is beyond the scope of this chapter. The reader is referred to Ref. [194] for a survey of CG methods. Here we only mention briefly that despite the strong convergence theory that has been developed for the Fletcher-Reeves method [195],... [Pg.83]
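For concreteness, here is a small illustrative sketch (function names are assumptions, not from the cited chapter) of two classical choices of the CG update parameter; on a strongly convex quadratic with exact line searches they coincide, as noted above.

```python
import numpy as np

def beta_fletcher_reeves(g_new, g_old):
    """Fletcher-Reeves update: ||g_{k+1}||^2 / ||g_k||^2."""
    return np.dot(g_new, g_new) / np.dot(g_old, g_old)

def beta_polak_ribiere(g_new, g_old):
    """Polak-Ribiere update: g_{k+1}^T y_k / ||g_k||^2, with y_k = g_{k+1} - g_k."""
    y = g_new - g_old
    return np.dot(g_new, y) / np.dot(g_old, g_old)

# The new search direction is then d_{k+1} = -g_{k+1} + beta * d_k.
```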

Essentially we need to perform a simple line search along the direction of Δk^(j+1). The simplest way to do this is by approximating the objective function by a quadratic along this direction. Namely,... [Pg.140]

Another class of methods of unidimensional minimization locates a point x close to x*, the value of the independent variable corresponding to the minimum of f(x), by extrapolation and interpolation using polynomial approximations as models of f(x). Both quadratic and cubic approximations have been proposed, using function values only and using both function and derivative values. In functions where f′(x) is continuous, these methods are much more efficient than other methods and are now widely used to do line searches within multivariable optimizers. [Pg.166]

In doing the line search we can minimize a quadratic approximation in a given search direction. This means that to compute the value of α in the relation x^{k+1} = x^k + α s^k, we must minimize [Pg.195]
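For a quadratic model along the direction s^k, the minimizing step length has the closed form α = -(∇f(x^k)^T s^k) / ((s^k)^T H(x^k) s^k). A minimal sketch, assuming the Hessian (or an approximation to it) is available; the function name is illustrative only:

```python
import numpy as np

def exact_step_quadratic(grad, H, s):
    """Step length minimizing the quadratic model along direction s:
    alpha = -(grad . s) / (s^T H s)."""
    curvature = s @ H @ s
    if curvature <= 0.0:
        raise ValueError("direction has non-positive curvature in the model")
    return -(grad @ s) / curvature
```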

Difficulty 3 can be ameliorated by using (properly chosen) finite difference approximations as substitutes for derivatives. To overcome difficulty 4, two classes of methods exist to modify the pure Newton's method so that it is guaranteed to converge to a local minimum from an arbitrary starting point. The first of these, called trust region methods, minimizes the quadratic approximation, Equation (6.10), within an elliptical region whose size is adjusted so that the objective improves at each iteration; see Section 6.3.2. The second class, line search methods, modifies the pure Newton's method in two ways: (1) instead of taking a step size of one, a line search is used, and (2) if the Hessian matrix H(x^k) is not positive-definite, it is replaced by a positive-definite matrix that is close to H(x^k). This is motivated by the easily verified fact that, if H(x^k) is positive-definite, the Newton direction... [Pg.202]
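As an illustration of the second class, the sketch below combines a simple identity-shift modification of the Hessian with a backtracking line search in place of the unit Newton step. Both safeguards are common choices but are assumptions here, not details taken from the text.

```python
import numpy as np

def modified_newton_step(x, f, grad, hess, beta=0.5, c1=1e-4, tau=1e-3):
    """One damped-Newton iteration: shift the Hessian until it is
    positive-definite, then backtrack from the full step until the
    sufficient-decrease (Armijo) condition holds."""
    g, H = grad(x), hess(x)
    n = len(x)
    # Shift H by a multiple of the identity until a Cholesky factorization succeeds.
    shift = 0.0
    while True:
        try:
            np.linalg.cholesky(H + shift * np.eye(n))
            break
        except np.linalg.LinAlgError:
            shift = max(2.0 * shift, tau)
    p = np.linalg.solve(H + shift * np.eye(n), -g)   # descent direction
    # Backtracking line search starting from the unit Newton step.
    alpha, fx = 1.0, f(x)
    while f(x + alpha * p) > fx + c1 * alpha * (g @ p):
        alpha *= beta
    return x + alpha * p
```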

Trust regions. The name trust region refers to the region in which the quadratic model can be trusted to represent f(x) reasonably well. In the unidimensional line search, the search direction is retained but the step length is reduced if the Newton step proves to be unsatisfactory. In the trust region approach, a shorter step length is selected and then the search direction determined. Refer to Dennis and Schnabel (1996) and Section 8.5.1 for details. [Pg.206]

If the BFGS algorithm is applied to a positive-definite quadratic function of n variables and the line search is exact, it will minimize the function in at most n iterations (Dennis and Schnabel, 1996, Chapter 9). This is also true for some other updating formulas. For nonquadratic functions, a good BFGS code usually requires more iterations than a comparable Newton implementation and may not be as accurate. Each BFGS iteration is generally faster, however, because second derivatives are not required and the system of linear equations (6.15) need not be solved. [Pg.208]
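For reference, a hedged sketch of the BFGS update itself, written in the inverse-Hessian form so that no linear system needs to be solved; s_k is the accepted step and y_k the corresponding gradient change from one line-search iteration. The function name is illustrative, not from the cited reference.

```python
import numpy as np

def bfgs_update_inverse(Hinv, s, y):
    """BFGS update of the inverse Hessian approximation:
    H_{k+1}^{-1} = (I - rho s y^T) H_k^{-1} (I - rho y s^T) + rho s s^T,
    with rho = 1 / (y^T s). Requires y^T s > 0, which a line search
    satisfying the Wolfe conditions guarantees."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ Hinv @ V.T + rho * np.outer(s, s)
```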

For process optimization problems, the sparse approach has been further developed in studies by Kumar and Lucia (1987), Lucia and Kumar (1988), and Lucia and Xu (1990). Here they formulated a large-scale approach that incorporates indefinite quasi-Newton updates and can be tailored to specific process optimization problems. In the last study they also develop a sparse quadratic programming approach based on indefinite matrix factorizations due to Bunch and Parlett (1971). Also, a trust region strategy is substituted for the line search step mentioned above. This approach was successfully applied to the optimization of several complex distillation column models with up to 200 variables. [Pg.203]

Quadratic convergence means that eventually the number of correct figures in x_c doubles at each step, clearly a desirable property. Close to x*, Newton's method, Eq. (3.9), shows quadratic convergence, while quasi-Newton methods, Eq. (3.8), show superlinear convergence. The RF step, Eq. (3.20), converges quadratically when the exact Hessian is used. Steepest descent with exact line search converges linearly for minimization. [Pg.310]

Global strategies for minimization are needed whenever the current estimate of the minimizer is so far from x* that the local model is not a good approximation to f(x) in the neighborhood of x. Three methods are considered in this section: the quadratic model with line search, trust region (restricted second-order) minimization, and rational function (augmented Hessian) minimization. [Pg.311]

A line search consists of an approximate one-dimensional minimization of the objective function along the computed direction p. This produces an acceptable step λ and a new iterate x_k + λp. Function and gradient evaluations of the objective function are required in each line search iteration. In contrast, the trust region strategy minimizes approximately a local quadratic model of the function using current Hessian information. An optimal step that lies within... [Pg.21]

First, when H_k is not positive-definite, the search direction may not exist or may not be a descent direction. Strategies to produce a related positive-definite matrix H_k, or alternative search directions, become necessary. Second, far away from x*, the quadratic approximation of expression [34] may be poor, and the Newton direction must be adjusted. A line search, for example, can dampen (scale) the Newton direction when it exists, ensuring sufficient decrease and guaranteeing uniform progress toward a solution. These adjustments lead to the following modified Newton framework (using a line search). [Pg.37]

Before starting another iteration, any trust-region values CHMAX(i) that were active in the quadratic programming solution may be enlarged according to the line search results. Then control is returned to subroutine GR1 for a calculation of matrix A at the latest line search point. [Pg.105]

The strategy described in Section 6.4 for constrained minimization of the sum of squares S(θ) is readily adapted to the multiresponse objective function S(θ) of Eq. (7.2-16). The quadratic programming subroutine GRQP and the line search subroutine GRS2 are used, with the following changes ... [Pg.152]

This method has been found to reduce the overall computational effort in approaching the optimum by up to 30% compared with the approach of accepting the first point that satisfies an Armijo condition on the merit function, while allowing step reductions of up to a factor of 10 at each line search iteration and using a quadratic interpolation formula to estimate the step reduction. [Pg.341]
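A sketch of the kind of step-reduction scheme alluded to above: backtracking on the Armijo sufficient-decrease test, with each reduced step chosen by quadratic interpolation and each reduction limited to a factor of 10. The constants and function names are illustrative assumptions, not the cited implementation.

```python
def armijo_backtrack_quadratic(phi, phi0, dphi0, alpha=1.0, c1=1e-4,
                               max_reduction=10.0):
    """Backtracking on phi(alpha) = f(x + alpha*p) with the Armijo test,
    where phi0 = phi(0) and dphi0 = phi'(0) < 0. Each reduced step is the
    minimizer of a quadratic interpolant, limited to a factor-of-
    max_reduction reduction per iteration."""
    f_trial = phi(alpha)
    while f_trial > phi0 + c1 * alpha * dphi0:
        # Minimizer of the quadratic matching phi(0), phi'(0) and phi(alpha).
        alpha_q = -dphi0 * alpha**2 / (2.0 * (f_trial - phi0 - dphi0 * alpha))
        # Safeguard: never shrink the step by more than max_reduction at once.
        alpha = max(alpha_q, alpha / max_reduction)
        f_trial = phi(alpha)
    return alpha
```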

Other line search methods that involve only function evaluations, that is, no derivative calculations, are the dichotomous search, the Fibonacci search (Kiefer 1957), and the quadratic fit line search. The Fibonacci search is the most efficient derivative-free line search technique in the sense that it requires the fewest function evaluations to attain a prescribed degree of accuracy. The quadratic fit method... [Pg.2548]

Quasi-Newton Methods In some sense, quasi-Newton methods are an attempt to combine the best features of the steepest descent method with those of Newton's method. Recall that the steepest descent method performs well during early iterations and always decreases the value of the function, whereas Newton's method performs well near the optimum but requires second-order derivative information. Quasi-Newton methods are designed to start like the steepest descent method and finish like Newton's method while using only first-order derivative information. The basic idea was originally proposed by Davidon (1959) and subsequently developed by Fletcher and Powell (1963). An additional feature of quasi-Newton methods is that the minimum of a convex quadratic function can be found in at most n iterations if exact line searches are used. The basic... [Pg.2551]

