
Orthogonal search directions

From one viewpoint, the search direction of steepest descent can be interpreted as being orthogonal to a linear approximation of (tangent to) the objective function at point x; examine Figure 6.9a. Now suppose we make a quadratic approximation of f(x) at x ... [Pg.197]
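
A hedged sketch of the two local models invoked here (our notation; the source's Figure 6.9a and equation numbering are not reproduced): the linear model yields the steepest-descent direction, while minimizing the quadratic model yields a Newton-type step.

```latex
% Linear model at x_k  =>  steepest-descent direction
f(\mathbf{x}_k+\mathbf{p}) \approx f(\mathbf{x}_k) + \nabla f(\mathbf{x}_k)^{\mathsf T}\mathbf{p}
\;\Longrightarrow\; \mathbf{d}_k = -\nabla f(\mathbf{x}_k)

% Quadratic model at x_k (H is the Hessian)  =>  Newton-type step
f(\mathbf{x}_k+\mathbf{p}) \approx f(\mathbf{x}_k) + \nabla f(\mathbf{x}_k)^{\mathsf T}\mathbf{p}
  + \tfrac12\,\mathbf{p}^{\mathsf T}\mathbf{H}(\mathbf{x}_k)\,\mathbf{p}
\;\Longrightarrow\; \mathbf{p}_k = -\mathbf{H}(\mathbf{x}_k)^{-1}\,\nabla f(\mathbf{x}_k)
```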

At the minimum along the kth search direction, the old and the new directions should be orthogonal, i.e.,... [Pg.146]
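
The elided condition is the standard exact-line-search optimality condition; a hedged restatement in our notation:

```latex
% Line minimum along s_k: the new gradient has no component along s_k
\nabla f(\mathbf{x}_{k+1})^{\mathsf T}\,\mathbf{s}_k = 0 .
% For steepest descent, s_{k+1} = -\nabla f(x_{k+1}), so the old and
% new directions are orthogonal:  s_{k+1}^T s_k = 0 .
```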

Figure 6.1 Search for the minimum of the Gibbs function in a two-component space (n1 and n2 are mole numbers) with the mass conservation constraints Bn = q. The search direction is the projection of the gradient onto the constraint subspace. The minimum is attained when the gradient is orthogonal to the constraint direction, which is the geometrical expression of the Lagrange multiplier methods.
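
A hedged sketch of the projected-gradient direction described in this caption, assuming the constraint matrix B has full row rank:

```latex
% Orthogonal projector onto the constraint subspace (null space of B):
\mathbf{P} = \mathbf{I} - \mathbf{B}^{\mathsf T}\big(\mathbf{B}\mathbf{B}^{\mathsf T}\big)^{-1}\mathbf{B}
% Feasible descent direction for the Gibbs function G(n):
\mathbf{d} = -\,\mathbf{P}\,\nabla G(\mathbf{n})
% At the constrained minimum P \nabla G = 0, i.e. \nabla G = B^T \lambda
% for some multipliers \lambda -- the Lagrange-multiplier statement above.
```
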
In our two-dimensional space, these two search directions are perpendicular to one another. Saying this in more general mathematical terms, the two search directions are orthogonal. This is not a coincidence that occurs just for the specific example we have defined; it is a general property of steepest descent methods, provided that the line search problem defined by Eq. (3.18) is solved optimally. [Pg.72]
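
A minimal numerical check of this property (our own illustration, not from the source): for a quadratic f(x) = ½ xᵀAx − bᵀx the exact line-search step has the closed form α = (gᵀg)/(gᵀAg), and successive directions come out orthogonal to machine precision.

```python
import numpy as np

# Steepest descent with an exact line search on f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])
x = np.array([4.0, -3.0])                # arbitrary starting point

d_prev = None
for k in range(5):
    g = A @ x - b                        # gradient of f at x
    d = -g                               # steepest-descent direction
    alpha = (g @ g) / (g @ (A @ g))      # exact minimizer along d
    if d_prev is not None:
        # successive search directions are (numerically) orthogonal:
        print(f"k={k}: d_k . d_(k-1) = {d @ d_prev:.2e}")
    x = x + alpha * d
    d_prev = d
```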

In generating a third iterate for the conjugate-gradient method, we now estimate the search direction by the gradient ∇E(x₂) but insist that the search direction is orthogonal to both d₀ and d₁. This idea is then repeated for subsequent iterations. [Pg.72]

We cannot continue this process indefinitely because in an N-dimensional space we can only make a vector orthogonal to at most (N − 1) other vectors. So to make this a well-defined algorithm, we have to restart the process of defining search directions after some number of iterations less than N. [Pg.73]
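
A compact sketch of this restarting scheme (our own illustration in the Fletcher–Reeves flavor; the function names, fixed step size, and restart rule are assumptions, not the source's algorithm):

```python
import numpy as np

def cg_minimize(grad, x0, n_restart=None, iters=200, alpha=0.1):
    """Nonlinear conjugate gradient with periodic restart (sketch).

    grad      : callable returning the gradient of E at x
    n_restart : restart interval; defaults to the dimension N
    alpha     : fixed step size standing in for a proper line search
    """
    x = np.asarray(x0, dtype=float)
    n_restart = n_restart or x.size        # restart after < N steps
    g = grad(x)
    d = -g
    for k in range(1, iters + 1):
        x = x + alpha * d                  # (a line search belongs here)
        g_new = grad(x)
        if k % n_restart == 0:
            d = -g_new                     # restart: pure steepest descent
        else:
            beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
            d = -g_new + beta * d
        g = g_new
    return x

# Example: minimize E(x) = 0.5 x^T A x - b^T x via its gradient
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
print(cg_minimize(lambda x: A @ x - b, np.zeros(2)))   # ~ A^{-1} b
```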

Indirect methods can also be applied to problems with two or more decision variables. In the steepest descent method (also known as the gradient method), the search direction is along the negative of the gradient at point (x1, x2), i.e., orthogonal to the contours of f(x1, x2). A line search is then carried out to establish a new minimum point, where the gradient is re-evaluated. This procedure is repeated until the convergence criterion is met, as shown in Figure 1.15b. [Pg.32]
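
In symbols (a standard restatement of the procedure just described; the source's own equation numbers are not shown):

```latex
\mathbf{d}_k = -\nabla f(\mathbf{x}_k), \qquad
\alpha_k = \arg\min_{\alpha > 0} f(\mathbf{x}_k + \alpha\,\mathbf{d}_k), \qquad
\mathbf{x}_{k+1} = \mathbf{x}_k + \alpha_k\,\mathbf{d}_k ,
% repeated until, e.g., \|\nabla f(\mathbf{x}_{k+1})\| < \varepsilon .
```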

The basis used in this method is conjugate search directions and orthogonal residuals, which is equivalent to finding a minimum point along the search directions. [Pg.1097]

Moreover, it can be shown that the residuals are orthogonal when requiring conjugate search directions, i.e. (pₘ, Mpₘ₋₁) = … = 0 ... [Pg.1098]
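
A small numerical illustration (our construction): the classical conjugate-gradient recurrences for Mx = b, checking both properties each step. The names p (direction) and r (residual) mirror the excerpt.

```python
import numpy as np

# Classical CG iteration for M x = b (M symmetric positive definite),
# verifying direction conjugacy and residual orthogonality.
rng = np.random.default_rng(0)
Q = rng.standard_normal((5, 5))
M = Q @ Q.T + 5 * np.eye(5)              # make M SPD
b = rng.standard_normal(5)

x = np.zeros(5)
r = b - M @ x                            # initial residual
p = r.copy()                             # initial search direction
for m in range(4):
    Mp = M @ p
    alpha = (r @ r) / (p @ Mp)
    x = x + alpha * p
    r_new = r - alpha * Mp
    beta = (r_new @ r_new) / (r @ r)
    p_new = r_new + beta * p
    # conjugacy of directions and orthogonality of residuals:
    print(f"(p_{m+1}, M p_{m}) = {p_new @ Mp:.1e},  "
          f"(r_{m+1}, r_{m}) = {r_new @ r:.1e}")
    r, p = r_new, p_new
```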

However, since the expanded system is indefinite, the minimization argument of the CG method becomes ineffective. The CG method is therefore modified by replacing the orthogonal sequence of residuals with two mutually orthogonal sequences. In addition, the conjugacy constraint on the search directions is replaced by a corresponding conjugacy constraint on the two mutual sequences of search directions. [Pg.1100]
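
This modification is the biconjugate-gradient (BiCG) idea; for an indefinite or nonsymmetric system one can call a stock implementation rather than hand-rolling the two sequences. Illustrative usage (the test matrix here is made up):

```python
import numpy as np
from scipy.sparse.linalg import bicg

# Nonsymmetric test system -- BiCG maintains the two mutually orthogonal
# residual sequences internally, as described above.
A = np.array([[4.0, 1.0, 0.0],
              [2.0, 5.0, 1.0],
              [0.0, 1.0, 3.0]])
b = np.array([1.0, 2.0, 3.0])

x, info = bicg(A, b)          # info == 0 signals successful convergence
print(x, info)
```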

The gradient at the minimum point obtained from the line search will be perpendicular to the previous direction. Thus, when the line search method is used to locate the minimum along the gradient, the next direction in the steepest descents algorithm will be orthogonal to the previous direction (i.e. gₖ·sₖ₋₁ = 0). ...

P∇V(k) is therefore the direction of constrained minimization. As in the case of Lagrange multipliers, no progress can be made and the search will stop when the (k + 1)th minimization direction P∇V(k+1) is orthogonal to the kth minimization direction P∇V(k), i.e., when the inner product of these vectors becomes less than an arbitrarily small value. [Pg.334]
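
The stopping test, in symbols (our hedged reconstruction of the criterion just described, with ε an arbitrarily small tolerance):

```latex
\Big|\,\big(\mathbf{P}\nabla V^{(k+1)}\big)^{\mathsf T}\big(\mathbf{P}\nabla V^{(k)}\big)\,\Big| < \varepsilon
```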

We continue by deriving the next set of components by maximizing the initial problem (Equation 4.67). This maximum is searched for in a direction orthogonal to t₁, and searching in the orthogonal complement is conveniently done by deflation of X. The deflated matrix X is... [Pg.171]
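
The elided deflation formula is standard in PCA/PLS-type algorithms; a hedged NumPy sketch (t1 is the first score vector, p1 the corresponding loading, names ours):

```python
import numpy as np

def deflate(X, t1):
    """Remove from X the rank-one component along the score vector t1,
    so later components are searched in the orthogonal complement of t1.
    Standard deflation step (illustrative; notation assumed):
        p1 = X^T t1 / (t1^T t1),   X_deflated = X - t1 p1^T
    """
    t1 = t1.reshape(-1, 1)                 # column vector
    p1 = X.T @ t1 / np.vdot(t1, t1)        # loading vector
    return X - t1 @ p1.T

# The columns of the deflated matrix are orthogonal to t1:
X = np.random.default_rng(1).standard_normal((6, 3))
t1 = X[:, [0]] + 0.1                       # stand-in score vector
print(np.abs(deflate(X, t1).T @ t1).max())  # ~ 0
```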

It can be demonstrated that, if we apply the steepest descent method with an exact line search, successive gradient directions are mutually orthogonal
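
The demonstration is one line (our hedged restatement): with φ(α) = f(xₖ − αgₖ) and αₖ its exact minimizer,

```latex
0 = \varphi'(\alpha_k)
  = -\nabla f\!\big(\mathbf{x}_k - \alpha_k \mathbf{g}_k\big)^{\mathsf T}\mathbf{g}_k
  = -\,\mathbf{g}_{k+1}^{\mathsf T}\,\mathbf{g}_k ,
```

so each new gradient is orthogonal to its predecessor.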

