Successive quadratic programming algorithm

Fan, Y., S. Sarkar, and L. Lasdon. Experiments with Successive Quadratic Programming Algorithms. J. Optim. Theory Appl. 56 (3), 359-383 (March 1988). [Pg.328]

Ternet, D. J. and L. T. Biegler. Recent Improvements to a Multiplier-free Reduced Hessian Successive Quadratic Programming Algorithm. Comput. Chem. Eng. 22, 963 (1998). [Pg.329]

Note that there are n + m equations in the n + m unknowns x and λ. In Section 8.6 we describe an important class of NLP algorithms called successive quadratic programming (SQP), which solve (8.17)-(8.18) by a variant of Newton's method. [Pg.271]
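
For an equality-constrained problem, min f(x) subject to h_i(x) = 0, i = 1, ..., m, the system in question is conventionally the first-order stationarity conditions of the Lagrangian L(x, λ) = f(x) + Σ_i λ_i h_i(x). The LaTeX form below is the standard textbook statement, given here for orientation rather than as a transcription of Eqs. (8.17)-(8.18) themselves:

\nabla_x L(x,\lambda) = \nabla f(x) + \sum_{i=1}^{m} \lambda_i \nabla h_i(x) = 0 \qquad (n \text{ equations})
h_i(x) = 0, \quad i = 1,\dots,m \qquad (m \text{ equations})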

Successive quadratic programming (SQP) methods solve a sequence of quadratic programming approximations to a nonlinear programming problem. Quadratic programs (QPs) have a quadratic objective function and linear constraints, and efficient procedures exist for solving them; see Section 8.3. As in SLP, the linear constraints are linearizations of the actual constraints about the selected point. The objective is a quadratic approximation to the Lagrangian function, and the algorithm is simply Newton's method applied to the Kuhn-Tucker conditions (KTC) of the problem. [Pg.302]
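
At the current iterate (x_k, λ_k), the QP subproblem described above is commonly written as follows; the notation, with equality constraints h and inequality constraints g, is ours rather than a quotation from Section 8.3:

\min_{d} \; \nabla f(x_k)^T d + \tfrac{1}{2}\, d^T \nabla_{xx}^2 L(x_k,\lambda_k)\, d
\text{subject to}\; h_i(x_k) + \nabla h_i(x_k)^T d = 0, \quad i = 1,\dots,m
\text{and}\; g_j(x_k) + \nabla g_j(x_k)^T d \le 0, \quad j = 1,\dots,p

The minimizer d serves as the search direction, and the multipliers of the QP provide updated estimates of λ.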

This constrained nonlinear optimisation problem can be solved using a Successive Quadratic Programming (SQP) algorithm. In SQP, at each iteration of the optimisation a quadratic program (QP) is formed from a local quadratic approximation to the objective function and a linear approximation to the nonlinear constraints. The resulting QP is solved to determine the search direction, and a step length along this direction then gives the next values of the decision variables. See Chen (1988) for further details. [Pg.138]
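
As a minimal illustration of this iteration-by-iteration idea (not the problem treated by Chen (1988)), the sketch below solves a small constrained NLP with SciPy's SLSQP routine, an SQP-type solver; the objective, constraints, and starting point are invented for the example.

# Minimal SQP-style example: the solver internally builds and solves QP
# subproblems of the kind described above.  Problem data are illustrative.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # f(x) = (x1 - 1)^2 + (x2 - 2.5)^2
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

constraints = [
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - 3.0},  # h(x) = 0
    {"type": "ineq", "fun": lambda x: x[0] - 0.5},          # g(x) >= 0
]

x0 = np.array([2.0, 0.0])
result = minimize(objective, x0, method="SLSQP", constraints=constraints)
print(result.x, result.fun)   # optimum near (0.75, 2.25)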

The computational effort in evaluating the Hessian matrix is significant, and quasi-Newton approximations have been used to reduce this effort. The Wilson-Han-Powell method is an enhancement of successive quadratic programming in which the Hessian matrix of the Lagrangian is replaced by a quasi-Newton update formula such as the BFGS algorithm. Consequently, only first partial derivative information is required, and this is obtained from finite-difference approximations of the Lagrangian function. [Pg.2447]
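
A minimal sketch of the two ingredients mentioned here, assuming a user-supplied Lagrangian function lagrangian(x, lam); this is not the Wilson-Han-Powell code itself, and the function names and difference step are illustrative.

# Forward-difference gradient of the Lagrangian and a standard BFGS update
# of the Hessian approximation B (only first-derivative information needed).
import numpy as np

def fd_gradient(lagrangian, x, lam, h=1e-6):
    # Forward-difference approximation of grad_x L(x, lam).
    g = np.zeros_like(x, dtype=float)
    base = lagrangian(x, lam)
    for i in range(x.size):
        xp = x.astype(float).copy()
        xp[i] += h
        g[i] = (lagrangian(xp, lam) - base) / h
    return g

def bfgs_update(B, s, y, tol=1e-12):
    # s = x_new - x_old, y = grad_x L(x_new, lam) - grad_x L(x_old, lam).
    Bs = B @ s
    sBs = s @ Bs
    ys = y @ s
    if ys > tol and sBs > tol:   # skip the update when curvature is poor
        B = B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / ys
    return B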

The description of the optimisation techniques is outside the scope of this work. An excellent introductory tutorial can be found in the book by Biegler, Grossmann and Westerberg (1997); as a general reference we recommend the work of Himmelblau and Edgar (1988). Here we note only that the most efficient algorithm in flowsheet optimisation is based on Successive Quadratic Programming (SQP). [Pg.108]

