Big Chemical Encyclopedia


Optimization techniques quadratic functions

For well-posed quadratic objective functions the contours always form a convex region; for more general nonlinear functions, they do not (see the next section for an example). It is helpful to construct contour plots to assist in analyzing the performance of multivariable optimization techniques when applied to problems of two or three dimensions. Most computer libraries have contour plotting routines to generate the desired figures. [Pg.134]
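A minimal NumPy sketch of the point above (the matrix A and vector b are hypothetical, not from the cited source): a quadratic objective has convex elliptical contours exactly when its Hessian is positive definite, which can be checked from the eigenvalues; the grid Z is what a contour-plotting routine would draw.

```python
import numpy as np

# A well-posed quadratic objective f(x) = 0.5 x^T A x + b^T x has convex,
# elliptical contours exactly when A is positive definite.
A = np.array([[4.0, 1.0], [1.0, 2.0]])   # hypothetical Hessian
b = np.array([-1.0, 0.5])

def f(x):
    return 0.5 * x @ A @ x + b @ x

# Positive eigenvalues confirm convexity (elliptical contours).
eigvals = np.linalg.eigvalsh(A)
print(eigvals.min() > 0)  # True

# Evaluate f on a grid; a contour routine (e.g. matplotlib's contour)
# would draw the level sets of Z.
xs = np.linspace(-2.0, 2.0, 50)
X, Y = np.meshgrid(xs, xs)
Z = 0.5 * (A[0, 0]*X**2 + 2*A[0, 1]*X*Y + A[1, 1]*Y**2) + b[0]*X + b[1]*Y
```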

Another simple optimization technique is to select n fixed search directions (usually the coordinate axes) for an objective function of n variables. Then f(x) is minimized in each search direction sequentially using a one-dimensional search. This method is effective for a quadratic function of the form... [Pg.185]
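A sketch of this sequential univariate search on a hypothetical two-variable quadratic; for a quadratic, the exact one-dimensional minimum along coordinate axis i has a closed-form step, so each sweep is cheap.

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])  # hypothetical SPD Hessian
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x   # quadratic objective

x = np.zeros(2)
for sweep in range(50):
    for i in range(2):            # fixed search directions = coordinate axes
        g = A @ x - b             # gradient at the current point
        x[i] -= g[i] / A[i, i]    # exact 1-D minimum along axis i

x_star = np.linalg.solve(A, b)    # true minimizer for comparison
print(np.allclose(x, x_star))
```

For a positive-definite quadratic this cyclic coordinate search is exactly the Gauss-Seidel iteration, which is why it converges to the true minimizer here.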

The application of the standard nonlinear programming techniques of constrained optimization to the analysis of the mean and variance response surfaces has been investigated by Del Castillo and Montgomery [34]. These techniques are appropriate since both the primary and secondary responses are usually quadratic functions. [Pg.40]

With the feasible path approach the optimization algorithm automatically performs case studies by varying input data. There are several drawbacks. First, the process equations (32c) have to be solved every time the performance index is evaluated, and efficient gradient-based optimization techniques can only be used with great difficulty because derivatives can only be evaluated by perturbing the entire flowsheet with respect to the decision variables, which is very time consuming. Second, process units are often described by discrete and discontinuous relations or by functions that may be nondifferentiable at certain points. To overcome these problems, quadratic module models can be... [Pg.104]

In this way, such a partial quadratic description is recursively used in a network of connected neurons to build the general mathematical relation of the input and output variables given in equation (4). The coefficients a_i in equation (5) are calculated using regression techniques. It can be seen that a tree of polynomials is constructed using the quadratic form given in equation (5). In this way, the coefficients of each quadratic function Q are obtained to fit the output optimally over the whole set of input-output data pairs, that is... [Pg.13]
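Assuming equation (5) has the usual two-input quadratic form Q(x1, x2) = a0 + a1 x1 + a2 x2 + a3 x1^2 + a4 x2^2 + a5 x1 x2, the regression step for a single partial description can be sketched as an ordinary least-squares fit (the data below are synthetic, not from the cited work):

```python
import numpy as np

rng = np.random.default_rng(0)
x1, x2 = rng.uniform(-1, 1, (2, 200))
# Synthetic, noise-free output generated from known coefficients.
y = 1.0 + 2.0*x1 - 0.5*x2 + 0.3*x1**2 + 0.7*x2**2 - 1.2*x1*x2

# Design matrix for the partial quadratic description Q(x1, x2).
Phi = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
a, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # regression step
print(np.round(a, 3))   # recovers the generating coefficients
```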

As a numerical example, a simple frame as illustrated in Fig. 2 is selected to demonstrate the efficiency of the suggested approach in reliability-based optimization. A sequential quadratic optimization technique is selected as the numerical procedure [13], which appears most efficient, especially with respect to the low number of function calls and good convergence properties. [Pg.58]
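For a quadratic objective with linear equality constraints, a single sequential-quadratic-programming step is exact: it amounts to solving the KKT system. A small sketch with hypothetical numbers (not the frame problem of the cited paper):

```python
import numpy as np

# Minimize 0.5 x^T H x + g^T x  subject to  A x = b.
H = np.array([[2.0, 0.0], [0.0, 4.0]])   # hypothetical Hessian
g = np.array([-2.0, -4.0])
A = np.array([[1.0, 1.0]])               # constraint: x1 + x2 = 1
b = np.array([1.0])

# KKT system:  [[H, A^T], [A, 0]] [x; lam] = [-g; b]
K = np.block([[H, A.T], [A, np.zeros((1, 1))]])
rhs = np.concatenate([-g, b])
sol = np.linalg.solve(K, rhs)
x, lam = sol[:2], sol[2:]
print(x)   # constrained minimizer
```

For nonlinear problems an SQP method repeats this step, re-linearizing the constraints and updating the Hessian (or an approximation to it) at each iterate.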

In this section we first introduce the definition of the ε-insensitive loss function, then show that the same quadratic optimization technique that was used in Section 2.3 for constructing approximations to indicator functions provides an approximation to real-valued functions, covering both the linear and nonlinear cases. [Pg.44]
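The ε-insensitive loss itself is simple to state: errors inside a tube of half-width ε cost nothing, and larger errors are penalized linearly. A short sketch (the values are illustrative):

```python
import numpy as np

def eps_insensitive(y, f, eps=0.1):
    """Vapnik's epsilon-insensitive loss: zero inside the eps tube,
    linear outside it."""
    return np.maximum(np.abs(y - f) - eps, 0.0)

# Two predictions inside the tube, one outside.
loss = eps_insensitive(np.array([1.0, 1.05, 2.0]),
                       np.array([1.0, 1.0, 1.0]))
print(loss)
```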

Nonlinear CG methods form another popular type of optimization scheme for large-scale problems where memory and computational performance are important considerations. These methods were first developed in the 1960s by combining the linear CG method (an iterative technique for solving linear systems Ax = b where A is an n x n matrix) with line-search techniques. The basic idea is that if f were a convex quadratic function, the resulting nonlinear CG method would reduce to solving the Newton equations (equation 27) for the constant and positive-definite Hessian H. [Pg.1151]
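A sketch of the underlying linear CG method on a hypothetical 3 x 3 system: for a convex quadratic it reaches the exact minimizer in at most n iterations. The Fletcher-Reeves beta shown is one common nonlinear CG choice, and it reduces to linear CG in this quadratic case.

```python
import numpy as np

# For f(x) = 0.5 x^T A x - b^T x with A SPD, CG minimizes f
# (equivalently, solves A x = b) in at most n iterations.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

x = np.zeros(3)
r = b - A @ x          # residual = negative gradient
p = r.copy()           # first search direction
for k in range(3):     # n = 3 iterations suffice
    alpha = (r @ r) / (p @ A @ p)      # exact line-search step
    x += alpha * p
    r_new = r - alpha * (A @ p)
    beta = (r_new @ r_new) / (r @ r)   # Fletcher-Reeves update
    p = r_new + beta * p
    r = r_new

print(np.allclose(A @ x, b))
```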

Minimization of S(k) can be accomplished by using almost any technique available from optimization theory; however, since each objective function evaluation requires the integration of the state equations, the use of quadratically convergent algorithms is highly recommended. The Gauss-Newton method is the most appropriate one for ODE models (Bard, 1970) and it is presented in detail below. [Pg.85]
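A minimal Gauss-Newton sketch on a one-parameter model. The exponential below is a closed-form stand-in for the ODE y' = -k y, y(0) = 1, so no numerical integration is needed; the data are synthetic and noise-free.

```python
import numpy as np

# Least-squares objective S(k) = sum_i (y_i - exp(-k t_i))^2.
t = np.linspace(0.1, 2.0, 10)
k_true = 1.5
y_data = np.exp(-k_true * t)      # synthetic measurements

k = 0.5                           # initial guess
for it in range(20):
    y_model = np.exp(-k * t)
    r = y_data - y_model          # residual vector
    J = t * y_model               # dr/dk = t * exp(-k t)
    k = k - (J @ r) / (J @ J)     # Gauss-Newton update
print(round(k, 6))
```

On this zero-residual problem the iteration shows the quadratic convergence that motivates the recommendation above: a handful of iterations reach machine precision.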

The statistical models determined by factorial design can be used as simplified models with the SQP technique. In this work the results obtained through this approach are compared with the results obtained using a rigorous model of the process. Costa et al. (5) determined quadratic models for productivity and % yield as functions of the significant input variables. These equations were used to evaluate productivity and % yield, and the SQP technique was used to determine the optimal values for S0, tr, R, and r. The optimization problem is postulated as follows ... [Pg.491]

Solution methods for optimization problems that involve only continuous variables can be divided into two broad classes: derivative-free methods (e.g., pattern search and stochastic search methods) and derivative-based methods (e.g., barrier function techniques and sequential quadratic programming). Because the optimization problems of concern in RTO are typically of reasonably large scale and must be solved on-line in relatively small amounts of time, and derivative-free methods generally have much higher computational requirements than derivative-based methods, the solvers contained in most RTO systems use derivative-based techniques. Note that in these solvers the first derivatives are evaluated analytically and the second derivatives are approximated by various updating techniques (e.g., the BFGS update). [Pg.2594]
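As a sketch of the updating idea mentioned above, one BFGS update of a Hessian approximation B from a step s and gradient change y; here y is generated from a hypothetical quadratic (y = A s), so the curvature condition y^T s > 0 holds automatically. The updated matrix satisfies the secant condition B_new s = y.

```python
import numpy as np

A = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 4.0]])   # true Hessian of a hypothetical quadratic
s = np.array([1.0, -0.5, 2.0])    # step x_{k+1} - x_k
y = A @ s                         # gradient change along that step

B = np.eye(3)                     # current Hessian approximation
Bs = B @ s
# BFGS update: B+ = B - (B s s^T B)/(s^T B s) + (y y^T)/(y^T s)
B_new = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
print(np.allclose(B_new @ s, y))  # secant condition holds
```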

A non-linear programming (NLP) technique is used to solve the problems resulting from synthesis optimisation. This NLP approach involves transforming the general optimal control problem, which is of infinite dimension (the control variables are time-dependent), into a finite-dimensional NLP problem by means of control vector parameterisation. According to this parameterisation technique, the control variables are restricted to a predefined form of temporal variation, often referred to as a basis function: Lagrange polynomials (piecewise constant, piecewise linear) or exponential-based functions. A successive quadratic programming method is then applied to solve the resultant NLP. [Pg.642]
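Control vector parameterisation with a piecewise-constant basis can be sketched in a few lines; the horizon, interval count, and parameter values below are hypothetical. The infinite-dimensional control u(t) is replaced by the finite decision vector (u_0, ..., u_{N-1}) that the NLP solver then optimizes.

```python
import numpy as np

tf, N = 10.0, 5                                  # horizon and interval count
u_params = np.array([1.0, 0.8, 0.5, 0.3, 0.0])   # hypothetical decision vector

def u(t):
    """Piecewise-constant control: u(t) = u_k on the k-th subinterval."""
    k = min(int(t / (tf / N)), N - 1)
    return u_params[k]

print(u(0.0), u(4.5), u(9.99))
```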

We now turn to methods for first-order saddle points. As already noted, saddle points present no problems in the local region provided the exact Hessian is calculated at each step. The problem with saddle point optimizations is that in the global region of the search, there are no simple criteria that allow us to select the step unambiguously. Thus, whereas for minimization methods it is often possible to give a proof of convergence with no significant restrictions on the function to be minimized, no such proofs are known for saddle-point methods, except, of course, for quadratic surfaces. Nevertheless, over the years several useful techniques have been developed for the determination of saddle points. We here discuss some of these techniques with no pretence at completeness. [Pg.128]
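The first remark above can be illustrated directly: on a quadratic surface with the exact (indefinite) Hessian, a single Newton step lands exactly on the first-order saddle point. A small sketch with hypothetical numbers:

```python
import numpy as np

# Quadratic surface f(x) = 0.5 x^T H x + c^T x with an indefinite Hessian:
# one negative eigenvalue means the stationary point is a first-order saddle.
H = np.array([[2.0, 0.0], [0.0, -1.0]])
c = np.array([-2.0, 1.0])
grad = lambda x: H @ x + c

x0 = np.array([3.0, -2.0])                 # arbitrary starting point
x1 = x0 - np.linalg.solve(H, grad(x0))     # single Newton step
print(x1, np.linalg.eigvalsh(H))           # stationary point; eigenvalue signs
```

On non-quadratic surfaces this only works in the local region; far from the saddle, step-selection criteria are needed, which is the difficulty described above.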

The introduction of inequality constraints results in a constrained optimization problem that can be solved numerically using linear or quadratic programming techniques (Edgar et al., 2001). As an example, consider the addition of inequality constraints to the MPC design problem in the previous section. Suppose that it is desired to calculate the M-step control policy ΔU(k) that minimizes the quadratic objective function J in Eq. 20-54, while satisfying the constraints in Eqs. 20-59, 20-60, and 20-61. The output predictions are made using the step-response model in Eq. 20-36. This MPC... [Pg.399]
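A drastically simplified analogue (not the multi-step formulation of the cited text): with a single control move, a scalar gain, and box constraints, the constrained minimizer of the quadratic objective is simply the unconstrained minimizer projected onto the bounds. All numbers are hypothetical.

```python
# Minimize J(u) = (y_sp - a*u)**2 + lam*u**2  subject to  u_min <= u <= u_max.
a, lam = 2.0, 0.1        # hypothetical gain and move-suppression weight
y_sp = 5.0               # set point
u_min, u_max = 0.0, 2.0  # input inequality constraints

u_unc = a * y_sp / (a**2 + lam)     # unconstrained minimizer (dJ/du = 0)
u = min(max(u_unc, u_min), u_max)   # project onto the feasible interval
print(u_unc, u)                     # unconstrained optimum is clipped to u_max
```

In the genuine M-step problem the objective couples all the moves in ΔU(k), so a quadratic programming solver is required rather than this one-dimensional projection.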

Because the set-point calculations are repeated as often as every minute, the steady-state optimization problem must be solved quickly and reliably. If the optimization problem is based on a linear process model, linear inequality constraints, and either a linear or a quadratic cost function, the linear and quadratic programming techniques discussed in Chapter 19 can be employed. [Pg.399]




