Big Chemical Encyclopedia


Quadratic Approximation optimization

The simplest smooth function which has a local minimum is a quadratic. Such a function has only one, easily determinable stationary point. It is thus not surprising that most optimization methods try to model the unknown function with a local quadratic approximation, in the form of equation (B3.5.1). [Pg.2333]
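The appeal of the quadratic model is that its single stationary point can be written down directly from the gradient and Hessian, as in a Newton step. A minimal one-dimensional sketch (the test function below is a hypothetical example, not from the source):

```python
def newton_step(x0, grad, hess):
    """Jump to the stationary point of the local quadratic model at x0."""
    return x0 - grad(x0) / hess(x0)

# For f(x) = (x - 3)**2 the quadratic model is exact, so a single step
# from any starting point lands on the minimizer x = 3.
x_min = newton_step(10.0, lambda x: 2.0 * (x - 3.0), lambda x: 2.0)
```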

Parker, A. L., "Chemical Process Optimization by Flowsheet Simulation and Quadratic Approximation Programming", Ph.D. Thesis in Chemical Engineering, U. of Wisconsin, 1979. [Pg.35]

Therefore, lines of constant objective function are approximately quadratic, as shown in Figure 9.6, and we use H as the A matrix. This quadratic approximation using the Hessian matrix evaluated at the optimum is accurate if we are in the neighborhood of the optimal parameter values. We can also obtain order-of-magnitude confidence intervals using the relation... [Pg.279]
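The relation itself is elided in the excerpt; as a hedged sketch only, one common convention scales the interval half-width with the square root of the inverse-Hessian diagonal (all names and the factor t below are illustrative assumptions, not the source's formula):

```python
import math

def halfwidths(hess_inv_diag, s2, t=2.0):
    """Order-of-magnitude half-widths: t * sqrt(s2 * (H^-1)_ii)."""
    return [t * math.sqrt(s2 * d) for d in hess_inv_diag]

# Two parameters: a poorly determined one (large inverse-Hessian entry)
# gets a wide interval, a well determined one a narrow interval.
hw = halfwidths([4.0, 0.25], s2=1.0)
```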

The GGN method is based on a linearization of Fi and the constraints in (2.3) around the current estimate Wk, but yields a quadratic approximation of the optimization problem (rather than just a linear one). Accordingly, the method is superior to both derivative-free and higher-order optimization techniques in this context. Furthermore, it does not converge to minima which are statistically not significant, i.e., minima with large residuals [4, 7]. [Pg.145]
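A one-parameter sketch of the Gauss-Newton idea: linearize the residuals around the current estimate and solve the resulting quadratic least-squares subproblem (the model and data below are invented for illustration):

```python
def gauss_newton_step(theta, residuals, jac):
    """One GN step for a scalar parameter: solve (J^T J) d = -J^T r."""
    r, j = residuals(theta), jac(theta)
    jtj = sum(ji * ji for ji in j)
    jtr = sum(ji * ri for ji, ri in zip(j, r))
    return theta - jtr / jtj

# Fit y = theta * x to three points; the residuals are linear in theta,
# so the quadratic subproblem is exact and one step converges (theta = 2).
xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]
theta = gauss_newton_step(0.0,
                          lambda t: [t * x - y for x, y in zip(xs, ys)],
                          lambda t: xs)
```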

A great number of studies indicate that quadratic approximation methods, which are characterized by recursively solving a sequence of quadratic subproblems, are among the most efficient and reliable nonlinear programming algorithms presently available. This class of methods combines the most efficient characteristics of different optimization techniques (see, e.g., [19]). For equality-constrained problems, the general nonlinear constrained optimization problem can be formulated by an... [Pg.396]
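The core of each such quadratic subproblem is an equality-constrained quadratic program. A closed-form sketch for the special case of an identity Hessian (a simplification chosen here for brevity, not the general formulation referenced above):

```python
def eq_qp_identity(g, a, b):
    """Minimize 0.5*||x||^2 + g.x subject to a.x = b via the KKT system:
    stationarity x + g = lam * a, feasibility a.x = b."""
    aa = sum(ai * ai for ai in a)
    ag = sum(ai * gi for ai, gi in zip(a, g))
    lam = (b + ag) / aa                      # Lagrange multiplier
    return [lam * ai - gi for ai, gi in zip(a, g)]

x = eq_qp_identity([1.0, 0.0], [1.0, 1.0], 2.0)
```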

The next step in a quasi-Newton optimization is the update of the Hessian, from Hᵢ to Hᵢ₊₁. We wish to adjust the Hessian in the quadratic approximation represented by equation (1) so that it fits the gradient gᵢ at the current point xᵢ and the gradient gᵢ₋₁ at the previous point. This requirement leads to... [Pg.1139]
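In one dimension this requirement reduces to the secant condition: the updated Hessian must reproduce the observed change in gradient between the two points. A sketch (the multidimensional BFGS/DFP update formulas generalize this; the example function is invented):

```python
def secant_hessian(x_cur, x_prev, g_cur, g_prev):
    """1-D update satisfying H * (x_cur - x_prev) = g_cur - g_prev."""
    return (g_cur - g_prev) / (x_cur - x_prev)

# For f(x) = (x - 3)**2 the gradients at x = 0 (g = -6) and x = 1
# (g = -4) recover the true Hessian H = 2 exactly.
h = secant_hessian(1.0, 0.0, -4.0, -6.0)
```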

The idea in trust-region methods - the origin of the quadratic optimization subproblem in step 2 above - is to determine the vector s on the basis of the size of the region within which the quadratic functional approximation can be trusted (i.e., is reasonable). The quality of the quadratic approximation can be assessed from the following ratio... [Pg.1147]
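A sketch of that ratio and of the radius-update rule conventionally built on it (the 0.25/0.75 thresholds and scaling factors below are common textbook choices, not taken from the source):

```python
def trust_ratio(f_x, f_new, m_zero, m_step):
    """Actual reduction over predicted reduction; near 1 means the
    quadratic model is trustworthy over the step taken."""
    return (f_x - f_new) / (m_zero - m_step)

def update_radius(ratio, radius):
    if ratio < 0.25:
        return radius / 4.0      # model poor: shrink the trust region
    if ratio > 0.75:
        return 2.0 * radius      # model good: expand it
    return radius                # otherwise keep the radius

r = trust_ratio(10.0, 6.0, 10.0, 5.0)   # predicted 5, achieved 4
```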

CPR = conjugate peak refinement; GDIIS = geometry direct inversion in the iterative subspace; GE = gradient extremal; LST = linear synchronous transit; LTP = line then plane; LUP = locally updated planes; NR = Newton-Raphson; P-RFO = partitioned rational function optimization; QA = quadratic approximation; QST = quadratic synchronous transit; SPW = self-penalty walk; STQN = synchronous transit-guided quasi-Newton; TRIM = trust radius image minimization; TS = transition structure. [Pg.3114]

A quadratic approximation to the convolution integral has proved to be the best method for optimizing the results. A Levenberg-Marquardt algorithm performing a least-squares minimization is used. [Pg.1125]
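A minimal one-parameter sketch of the Levenberg-Marquardt ingredient: a Gauss-Newton least-squares step damped by a parameter that interpolates toward gradient descent (data and damping values are illustrative assumptions):

```python
def lm_step(theta, residuals, jac, damping):
    """Damped scalar least-squares step: d = -J^T r / (J^T J + damping)."""
    r, j = residuals(theta), jac(theta)
    jtj = sum(ji * ji for ji in j)
    jtr = sum(ji * ri for ji, ri in zip(j, r))
    return theta - jtr / (jtj + damping)

# Fit y = theta * x; the exact fit is theta = 3.
xs, ys = [1.0, 2.0], [3.0, 6.0]
res = lambda t: [t * x - y for x, y in zip(xs, ys)]
jac = lambda t: xs
t_full = lm_step(0.0, res, jac, damping=0.0)   # undamped = Gauss-Newton
t_damp = lm_step(0.0, res, jac, damping=5.0)   # damping shortens the step
```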

It is further noted that, in principle, the quadratic approximation of Eq. 22 improves as the base values approach the solution; hence, the accuracy and stability of the optimal estimates should be carefully checked, either by complete correlation with the experimental data or by repeating the procedure with new base values. [Pg.44]

The preceding equations are valid for all types of parameter variation and lead to efficient methods of optimizing a variational wavefunction. Thus, if the parameter values p0 do not give a stationary point in parameter space, but one sufficiently close for a quadratic approximation to be good, we may seek such a point (p = p0 + d, say) by inserting D + dD in (2.4.19) and requiring dD = 0 for the variation around the new point p. This variation, to first order, is... [Pg.45]

In simple relaxation (the fixed approximate Hessian method), the step does not depend on the iteration history. More sophisticated optimization techniques use information gathered during previous steps to improve the estimate of the minimizer, usually by invoking a quadratic model of the energy surface. These methods can be divided into two classes: variable metric methods and interpolation methods. [Pg.2336]

Optimization of Unconstrained Objective. Assume the objective function F is a function of independent variables u_i, i = 1, ..., r. A computer program, given the values for the independent variables, can calculate F and its derivatives with respect to each u_i. Assume that F is well approximated as an as-yet-unknown quadratic function in u. [Pg.485]

Another class of methods of unidimensional minimization locates a point x̄ near x*, the value of the independent variable corresponding to the minimum of f(x), by extrapolation and interpolation using polynomial approximations as models of f(x). Both quadratic and cubic approximations have been proposed, using function values only or using both function and derivative values. For functions where f′(x) is continuous, these methods are much more efficient than other methods and are now widely used to do line searches within multivariable optimizers. [Pg.166]
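A sketch of quadratic interpolation using function values only: fit a parabola through three bracketing points and return the abscissa of its vertex (the three-point formula below is the standard one; the test function is invented):

```python
def quad_vertex(x1, f1, x2, f2, x3, f3):
    """Abscissa of the vertex of the parabola through three points."""
    num = (x2 - x1) ** 2 * (f2 - f3) - (x2 - x3) ** 2 * (f2 - f1)
    den = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
    return x2 - 0.5 * num / den

# f(x) = (x - 3)**2 is itself quadratic, so one interpolation is exact:
# the points (0, 9), (2, 1), (5, 4) yield the minimizer x = 3.
x_star = quad_vertex(0.0, 9.0, 2.0, 1.0, 5.0, 4.0)
```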

Since both spot price and quantity are modeled as variables, the resulting optimization problem of maximizing turnover is quadratic. In the following, we show how a linear approximation of the turnover function can be achieved (see also Habla 2006). This approach is based on the concavity property of the turnover function and the limited region of sales quantity flexibility to be considered. Approximation parameters are determined in a preprocessing phase based on the sales input and control data. The preprocessing is structured in two phases as shown in table 25 ... [Pg.162]
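A hedged sketch of the linearization idea: with a linear price-quantity relation, turnover p(q)·q is a concave quadratic, so chords between precomputed breakpoints give a piecewise linear under-approximation that is exact at the breakpoints (the price function and breakpoints below are invented, not the parameters of Habla 2006):

```python
def turnover(q, p0=10.0, slope=-0.5):
    """Concave quadratic turnover for a linear price-quantity function."""
    return (p0 + slope * q) * q

def pwl_turnover(q, bps=(0.0, 4.0, 8.0, 12.0)):
    """Interpolate turnover linearly between preprocessing breakpoints."""
    for lo, hi in zip(bps, bps[1:]):
        if lo <= q <= hi:
            w = (q - lo) / (hi - lo)
            return (1.0 - w) * turnover(lo) + w * turnover(hi)
    raise ValueError("quantity outside the modeled flexibility region")

exact_at_bp = pwl_turnover(4.0)   # matches turnover(4.0) at a breakpoint
under_mid = pwl_turnover(2.0)     # chord lies below the concave curve
```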

A piecewise linear turnover approximation supports effective and accurate decision making on sales turnover based on price-quantity functions and elasticity as an alternative to exact quadratic optimization. [Pg.257]


See other pages where Quadratic Approximation optimization is mentioned: [Pg.2334]    [Pg.188]    [Pg.51]    [Pg.205]    [Pg.68]    [Pg.69]    [Pg.520]    [Pg.221]    [Pg.51]    [Pg.2334]    [Pg.52]    [Pg.128]    [Pg.38]    [Pg.443]    [Pg.69]    [Pg.25]    [Pg.2335]    [Pg.2341]    [Pg.26]    [Pg.80]    [Pg.321]    [Pg.194]    [Pg.95]    [Pg.850]    [Pg.116]    [Pg.543]    [Pg.206]    [Pg.169]    [Pg.70]   
See also in sourсe #XX -- [ Pg.320 , Pg.335 ]






© 2024 chempedia.info