Searching procedures

Note that if B11 is zero, then T13 and T23 are also zero, so Equation (5.81) reduces to the specially orthotropic plate solution, Equation (5.65), if D11 = D22. Because T11, T12, and T22 are functions of both m and n, no simple conclusion can be drawn about the value of n at buckling, as could be done for specially orthotropic laminated plates, where n was determined to be one. Instead, Equation (5.81) is a complicated function of both m and n. At this point, recall the discussion in Section 3.5.3 about the difference between finding a minimum of a function of discrete variables versus a function of continuous variables. We have already seen that plates buckle with a small number of buckles. Consequently, the lowest buckling load must be found from Equation (5.81) by a searching procedure due to Jones involving integer values of m and n [5-20], and not by equating to zero the first partial derivatives of N with respect to m and n. [Pg.308]
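To make the integer search concrete, here is a minimal Python sketch. Equation (5.81) itself is not reproduced in this excerpt, so the simply supported specially orthotropic closed form (an Equation (5.65)-type expression for uniaxial compression) is used only as a stand-in objective; the stiffness values and plate dimensions in the usage line are invented for illustration.

```python
import math
from itertools import product

def buckling_load(m, n, D11, D12, D22, D66, a, b):
    """Stand-in objective: uniaxial buckling load N_x of a simply supported
    specially orthotropic plate for m and n half-waves (Eq. (5.65)-type form).
    In practice Equation (5.81) would be evaluated here instead."""
    R = a / b  # plate aspect ratio
    return (math.pi ** 2 / (a ** 2 * m ** 2)) * (
        D11 * m ** 4
        + 2.0 * (D12 + 2.0 * D66) * (m * n * R) ** 2
        + D22 * (n * R) ** 4
    )

def lowest_buckling_load(D11, D12, D22, D66, a, b, m_max=10, n_max=10):
    """Searching procedure: evaluate the load at integer (m, n) pairs and keep
    the smallest value.  Plates buckle with only a few half-waves, so a small
    grid suffices; setting dN/dm = dN/dn = 0 would give non-integer m and n."""
    return min(
        (buckling_load(m, n, D11, D12, D22, D66, a, b), m, n)
        for m, n in product(range(1, m_max + 1), range(1, n_max + 1))
    )

# Hypothetical stiffnesses and dimensions, purely for illustration:
N_cr, m_cr, n_cr = lowest_buckling_load(D11=1.0e3, D12=3.0e2, D22=5.0e2,
                                        D66=2.0e2, a=1.0, b=0.5)
```

With these invented stiffnesses the minimum falls at m = 2, n = 1; the continuous stationarity conditions would instead point to a non-integer m, which is why the search over integer values is needed.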

This mathematical optimization procedure is a rational process because the slope (or derivative) enables us to know which way to go and how far to go. In contrast, in the search procedure we just arbitrarily choose some values of x at which to evaluate the function. Those arbitrary choices are much like what people do in most design situations. They are simply searching in a rather crude manner for the solution to the problem, and they will not achieve the solution precisely. With mathematical optimization, our hope is both to speed up that process and to get a more precisely optimum solution. [Pg.430]
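As a minimal illustration of that contrast (the objective, the candidate points, and the step size below are all made up, not taken from the text), compare a handful of arbitrarily chosen evaluation points with a derivative-driven descent on the same one-variable function:

```python
def f(x):            # illustrative objective, not from the text
    return (x - 3.0) ** 2 + 1.0

def dfdx(x):         # its derivative: the slope says which way and how far to go
    return 2.0 * (x - 3.0)

# Crude search: a few arbitrarily chosen values of x, keep the best one found.
candidates = [0.0, 1.5, 2.0, 4.0, 5.5]
x_search = min(candidates, key=f)            # close-ish, but not the optimum

# Rational procedure: step downhill, with the step size set by the slope.
x = 0.0
for _ in range(50):
    x -= 0.1 * dfdx(x)
x_gradient = x                               # converges to ~3.0, the true optimum

print(x_search, x_gradient)
```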

In each of these situations it is possible to list the appropriate literature reference. The search procedure has been designed in an interactive mode, so that even an individual who has never used it may obtain an explanation for a particular problem, but the experienced searcher can bypass the longer interactive search routine. This mode of operation is in use only at the National Institutes of Health, and a simpler procedure is used in Evanston Hospital ... [Pg.282]

Search procedure, S. The search procedures explore the modified mapping models, ψ(X) = f(X), in order to generate and identify a set of final solutions, X*, that look particularly promising according to the corresponding estimated performance scores, ψ(X*). [Pg.109]
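The following fragment is only a schematic sketch of that role of S, under the assumption that candidate decision vectors, an estimated mapping model f, and an estimated score ψ are supplied by the other elements of the methodology; the function names, the higher-is-better ranking convention, and the toy surrogate in the usage lines are all placeholders.

```python
from typing import Callable, List, Sequence, Tuple

Vector = Tuple[float, ...]

def search_candidates(
    candidates: Sequence[Vector],          # candidate decision vectors X
    f_hat: Callable[[Vector], float],      # estimated mapping model f(X)
    psi_hat: Callable[[float], float],     # estimated performance score psi
    keep: int = 5,
) -> List[Vector]:
    """Rank the candidates by their estimated score psi(f(X)) and return the
    `keep` most promising ones as the final solution set X*."""
    return sorted(candidates, key=lambda x: psi_hat(f_hat(x)), reverse=True)[:keep]

# Hypothetical usage: a grid of candidates, a made-up surrogate model, and a
# score that rewards predicted responses close to a target of 1.0.
best = search_candidates(
    candidates=[(a / 10.0, b / 10.0) for a in range(11) for b in range(11)],
    f_hat=lambda x: x[0] * (1.0 - x[1]) + 0.5 * x[1],
    psi_hat=lambda v: -abs(v - 1.0),
)
```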

The preceding set of characteristics and properties of the estimators makes our type of mapping procedures, f, particularly appealing for the kinds of systems that we are especially interested in studying, i.e., manufacturing systems where considerable amounts of data records are available, with poorly understood behavior, and for which neither accurate first-principles quantitative models exist nor adequate functional-form choices for empirical models can be made a priori. In other situations and application contexts that are substantially different from the above, while much can still be gained by adopting the same problem statements, solution formats, and performance criteria, other mapping and search procedures (statistical, optimization theory) may be more efficient. [Pg.109]

A solution space consisting of hyperrectangles defined in the decision space, X, is a basic characteristic common to all the learning methodologies that will be described in subsequent sections. The same does not happen with the specific performance criteria ψ, mapping models f, and search procedures S, which obviously depend on the particular nature of the systems under analysis and on the type of the corresponding performance metric, y. [Pg.109]

Thus, they share exactly the same solution and performance-criteria spaces. Furthermore, since their role is simply to estimate y for a given x, no search procedures S are attached to classical pattern recognition techniques. Consequently, the only element that differs from one classification procedure to another is the particular mapping procedure f that is used to estimate y(x) and/or p(y = j | x). The available set of (x, y) data records is used to build f, either through the construction of approximations to the decision boundaries that separate zones in the decision space leading to different y values (Fig. 2a), or through the construction of approximations to the conditional probability functions, p(y = j | x). [Pg.111]
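A minimal sketch of the two alternatives, on made-up one-dimensional (x, y) records: a hard decision boundary chosen to separate the classes, versus a crude logistic estimate of p(y = 1 | x). Both the data and the fitting details are invented for illustration.

```python
import math

# Made-up one-dimensional (x, y) records: y = 1 ("good") occurs at larger x.
records = [(0.2, 0), (0.5, 0), (0.9, 0), (1.1, 1), (1.6, 1), (2.0, 1)]

# (a) Decision-boundary view: choose the cut that best separates the classes.
def boundary(records):
    xs = sorted(x for x, _ in records)
    cuts = [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    accuracy = lambda c: sum((x > c) == y for x, y in records) / len(records)
    return max(cuts, key=accuracy)

# (b) Conditional-probability view: a crude logistic estimate of p(y = 1 | x),
#     fitted with a few gradient-ascent steps on the log-likelihood.
def fit_logistic(records, steps=2000, lr=0.5):
    w, b = 0.0, 0.0
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in records:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (y - p) * x
            gb += (y - p)
        w += lr * gw / len(records)
        b += lr * gb / len(records)
    return lambda x: 1.0 / (1.0 + math.exp(-(w * x + b)))

print(boundary(records))             # a hard cut in the decision space (about 1.0)
print(fit_logistic(records)(1.0))    # an estimated probability p(y = 1 | x = 1.0)
```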

The search procedure, S, used to uncover promising hyperrectangles in the decision space, X, associated with a desired y value (e.g., y = "good"), is based on symbolic inductive learning algorithms, and leads to the identification of a final number of promising solutions, X*, such as the ones in Fig. 2b. It is described in the following subsection. [Pg.112]

In order to introduce the search procedure, S, we start by showing how classification decision trees lead to the definition of a set of hyperrectangles, and how they can be constructed from a set of (x, y) data records. [Pg.112]
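As a sketch of that construction (assuming scikit-learn is available; the feature ranges, labels, and tree depth are invented), the leaves of a fitted classification tree can be read off directly as axis-aligned hyperrectangles in the decision space:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Made-up (x, y) records: two decision variables, binary quality label.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = ((X[:, 0] > 0.6) & (X[:, 1] < 0.4)).astype(int)       # a "good" corner region

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)

def leaf_hyperrectangles(tree, lows, highs):
    """Walk the fitted tree and return one (low, high, predicted class) box per
    leaf: because every split is axis-aligned, each leaf is a hyperrectangle."""
    t = tree.tree_
    boxes = []
    def recurse(node, lo, hi):
        if t.children_left[node] == -1:                     # leaf node
            boxes.append((lo, hi, int(np.argmax(t.value[node]))))
            return
        f, thr = t.feature[node], t.threshold[node]
        lo_left, hi_left = list(lo), list(hi)
        hi_left[f] = min(hi_left[f], thr)                   # x_f <= threshold
        lo_right, hi_right = list(lo), list(hi)
        lo_right[f] = max(lo_right[f], thr)                 # x_f >  threshold
        recurse(t.children_left[node], lo_left, hi_left)
        recurse(t.children_right[node], lo_right, hi_right)
    recurse(0, list(lows), list(highs))
    return boxes

for lo, hi, cls in leaf_hyperrectangles(tree, [0.0, 0.0], [1.0, 1.0]):
    print(f"class {cls}: x1 in [{lo[0]:.2f}, {hi[0]:.2f}], "
          f"x2 in [{lo[1]:.2f}, {hi[1]:.2f}]")
```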

In the previous paragraphs we defined the solution format, performance criterion ψ, mapping procedure f, and performance metric y that characterize our learning methodology for systems with a quantitative metric y. Here we will assemble all these pieces together and briefly discuss the search procedure, S (further details can be found in Saraiva... [Pg.124]

To identify this set of final feasible solutions, X*, with low scores, we developed a greedy search procedure, S (Saraiva and Stephanopoulos, 1992c), that has resulted, within an acceptable computation time, in almost-optimal solutions for all the cases studied so far, while avoiding the combinatorial explosion with the number of (x, y) pairs of an exhaustive enumeration/evaluation of all feasible alternatives. The algorithm starts by partitioning the decision space into a number of isovolu-... [Pg.125]
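The published greedy algorithm is only summarized in this excerpt, so the fragment below is a schematic interpretation under stated assumptions rather than the authors' procedure: partition a two-dimensional decision space into equal cells, score each cell by the average y of the records it contains (lower being better), and grow a box from the best cell one face at a time while the box-averaged score does not deteriorate. The grid resolution, expansion rule, and simulated records are all invented.

```python
import numpy as np

def cell_scores(x, y, bins):
    """Average y of the (x, y) records falling in each cell of a regular
    bins-by-bins partition of a 2-D decision space (empty cells get +inf)."""
    idx = []
    for d in range(2):
        lo = x[:, d].min()
        span = x[:, d].max() - lo + 1e-12
        idx.append(np.clip(((x[:, d] - lo) / span * bins).astype(int), 0, bins - 1))
    total = np.zeros((bins, bins))
    count = np.zeros((bins, bins))
    np.add.at(total, (idx[0], idx[1]), y)
    np.add.at(count, (idx[0], idx[1]), 1)
    return np.where(count > 0, total / np.maximum(count, 1), np.inf)

def greedy_box(scores):
    """Greedy growth of a hyperrectangle of cells: start from the lowest-score
    cell and extend the box one face at a time while the mean score inside it
    does not get worse.  This examines only a handful of moves instead of
    enumerating every possible box of cells."""
    i, j = np.unravel_index(np.argmin(scores), scores.shape)
    lo, hi = [i, j], [i, j]                      # inclusive cell-index bounds
    box_mean = lambda lo, hi: scores[lo[0]:hi[0] + 1, lo[1]:hi[1] + 1].mean()
    improved = True
    while improved:
        improved = False
        for d, step in ((0, -1), (0, +1), (1, -1), (1, +1)):
            new_lo, new_hi = lo[:], hi[:]
            if step < 0 and new_lo[d] > 0:
                new_lo[d] -= 1
            elif step > 0 and new_hi[d] < scores.shape[d] - 1:
                new_hi[d] += 1
            else:
                continue
            if box_mean(new_lo, new_hi) <= box_mean(lo, hi):
                lo, hi, improved = new_lo, new_hi, True
    return lo, hi

# Hypothetical usage on simulated records in [0, 1]^2 where lower y is better:
rng = np.random.default_rng(1)
x = rng.uniform(size=(500, 2))
y = (x[:, 0] - 0.2) ** 2 + (x[:, 1] - 0.7) ** 2 + 0.05 * rng.normal(size=500)
print(greedy_box(cell_scores(x, y, bins=10)))    # cell-index bounds of a low-score box
```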

To support the application of the learning methodology, f(x) was used to generate 500 (x, z, w) records of simulated operational data, transformed by Eq. (26) into an equivalent number of (x, y) pairs. Finally, the following constraints were imposed on the search procedure, S ... [Pg.127]

However, conflicts between the fulfillment of different objectives and aspiration levels may prevent any feasible zone of the decision space from leading to satisfactory joint performances. If the search procedure fails to uncover at least one feasible final solution, X*, consistent with y, a number of options are available to the decisionmaker to try to overcome this impasse. Namely, the decisionmaker can revise the initial problem definition by either... [Pg.133]

Such revisions to the problem statement in order to overcome unsuccessful applications of the search procedure may have to be repeated a... [Pg.133]

After going through the complete search procedure, given the perceived ideal shown above, the following final solution, X*, was selected ... [Pg.136]

First, we discuss the problem statements and key features of the learning architecture that are specific to complex systems. This is followed by a brief presentation of the search procedures that are used to build a final solution. The section ends with a summary of the application of the learning architecture to the analysis of a Kraft pulp mill. [Pg.138]

Our search procedures represent a departure from the above type of paradigm. Rather than simply accepting and implementing a decision policy found by DUg that optimizes an overall measure of performance, the infimal subsystems and the corresponding plant personnel play an active role in the construction and validation of solutions. One tries to build a consensus decision policy, Xpp, validated by all subsystems, DUk, k = 1, ..., K, as well as by the whole plant, DUg, and only when that consensus has been reached does one move toward implementation. Within this context, the upper-level decision unit, DUg, assumes a coordination role. [Pg.143]

Two different search procedures (bottom-up and top-down) can be followed to build active decision policies, Xpp. [Pg.145]
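The two procedures themselves are not reproduced in this excerpt, so the following fragment is only a schematic sketch of the consensus loop described above; the validator callbacks, the revision step, and the iteration limit are hypothetical placeholders.

```python
from typing import Callable, Optional, Sequence

def build_consensus_policy(
    candidate: dict,
    subsystem_validators: Sequence[Callable[[dict], bool]],  # one per infimal DU_k
    plant_validator: Callable[[dict], bool],                 # the upper-level DU
    revise: Callable[[dict, int], dict],                     # hypothetical revision step
    max_rounds: int = 20,
) -> Optional[dict]:
    """Iterate until every subsystem decision unit and the plant-level unit
    accept the candidate policy; the upper level only coordinates and never
    imposes a policy.  Returns None if no consensus is reached in time."""
    for _ in range(max_rounds):
        rejecting = [k for k, ok_fn in enumerate(subsystem_validators)
                     if not ok_fn(candidate)]
        if not rejecting and plant_validator(candidate):
            return candidate              # consensus reached: move to implementation
        # Ask for a revised candidate, steered by the first rejecting subsystem
        # (or by the plant level, signalled here with -1).
        candidate = revise(candidate, rejecting[0] if rejecting else -1)
    return None
```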

