Search evolutional algorithm

Having formally defined the branching structure, we must now make explicit the mechanisms by which we can eliminate subsets of the solution space from further consideration. Ibaraki (1978) identified three major mechanisms for controlling the evolution of branch-and-bound search algorithms by eliminating potential solutions through... [Pg.280]

The term evolutionary algorithm (EA) refers to a class of population-based metaheuristic (probabilistic) optimization algorithms that imitate Darwinian evolution ("survival of the fittest"). However, the biological terms are used as metaphors rather than in their exact meaning. The population of individuals denotes a set of solution candidates, i.e. points of the solution space. Each individual represents a point in the search space, which is coded in the individual's representation (genome). The fitness of an individual is usually defined on the basis of the value of the objective function and determines its chances to stay in the population and to be used to generate new solution points. [Pg.202]
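To make these metaphors concrete, here is a minimal sketch of such an algorithm for a generic real-valued minimization problem. The sphere objective, the population size, the truncation-style selection, and the Gaussian mutation scale are illustrative assumptions, not details taken from the source quoted above.

```python
import random

def sphere(x):
    """Illustrative objective (assumption): smaller is better, optimum at the origin."""
    return sum(xi * xi for xi in x)

def evolve(objective, dim=5, pop_size=30, generations=100, mutation_scale=0.1):
    # Population: a set of candidate points in the search space (the "individuals").
    population = [[random.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # Fitness is derived from the objective value; here, lower is fitter.
        ranked = sorted(population, key=objective)
        parents = ranked[: pop_size // 2]                       # "survival of the fittest"
        offspring = []
        while len(parents) + len(offspring) < pop_size:
            a, b = random.sample(parents, 2)
            child = [(ai + bi) / 2.0 for ai, bi in zip(a, b)]   # recombination
            child = [ci + random.gauss(0.0, mutation_scale) for ci in child]  # mutation
            offspring.append(child)
        population = parents + offspring
    return min(population, key=objective)

best = evolve(sphere)
print(best, sphere(best))
```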

Landscape models are much more abstract than the laboratory technique-based models. As extensive as the theory of evolution and optimization on fitness landscapes has become, there is still little work on matching a search algorithm to landscape properties. Moreover, much of this work is based on landscape properties that are presently very difficult to measure with any statistical significance for molecular landscapes. For these reasons, and for reasons of limited space, the landscape search results are explained in much less detail than the laboratory-based techniques. This section is divided into four parts: (i) definitions of terms used in fitness landscape studies and caveats about their misuse; (ii) a review of models for fitness landscapes; (iii) results from studies of search on fitness landscapes; and (iv) conclusions from these results. [Pg.124]

Any theoretical study of applied molecular evolution needs information on the fitnesses of the molecules in the search space, as it is not possible to characterize the performance of search algorithms without knowing properties of the landscape being searched [63]. Since ideal sequence-to-structure or sequence-to-function models are not yet available, it is necessary to use approximations to these relationships or to make assumptions about their functional form. To this end, a large variety of models have been developed, ranging from randomly choosing affinities from a probability distribution to detailed biophysical descriptions of sequence-structure prediction. These models are often used to study protein folding, the immune system, and molecular evolution (the study of macromolecule evolution and the reconstruction of evolutionary histories), but they can also be used to study applied molecular evolution [4,39,53,64-67]. A number of these models are reviewed below. [Pg.126]
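As a sketch of the simplest model class mentioned above (affinities drawn at random from a probability distribution), the following assigns each sequence an independent, memoized random affinity. The 20-letter amino-acid alphabet, the sequence length, and the log-normal distribution are illustrative assumptions only.

```python
import random

ALPHABET = "ACDEFGHIKLMNPQRSTVWY"  # 20 amino acids (illustrative choice)

class RandomAffinityLandscape:
    """Uncorrelated model: each sequence receives an affinity drawn independently
    from a fixed probability distribution, memoized so repeated queries agree."""

    def __init__(self, seed=0):
        self._rng = random.Random(seed)
        self._cache = {}

    def affinity(self, sequence):
        if sequence not in self._cache:
            # Log-normal draw (assumption): most sequences bind weakly, a few strongly.
            self._cache[sequence] = self._rng.lognormvariate(0.0, 1.0)
        return self._cache[sequence]

landscape = RandomAffinityLandscape()
sequence = "".join(random.choice(ALPHABET) for _ in range(10))
print(sequence, landscape.affinity(sequence))
```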

D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, Reading, MA, 1989; H. P. Schwefel, Evolution and Optimum Seeking, Wiley, New York, 1995. [Pg.327]

EAs repeatedly carve up old members of the population to create fresh solutions. As in natural selection, competition within the population is essential; otherwise its evolution would be unpredictable and undirected: the algorithm would be as likely to retain poor solutions as promising ones, and would make a lengthy and probably unproductive random walk over the search surface. Since individuals in the current population have evolved from those created in past generations, they reflect some of the lessons learned during previous attempts at a solution. It is in this fashion that the algorithm learns about a problem. [Pg.17]
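The competition described above is typically enforced by a selection operator. The sketch below shows tournament selection as one illustrative possibility (the tournament size and the toy fitness function are assumptions, not taken from the source): because better individuals win more tournaments, retention is biased toward promising solutions instead of drifting randomly.

```python
import random

def tournament_select(population, fitness, k=3):
    """Pick k individuals at random and return the fittest of them. Better
    solutions win more tournaments, so retention is biased toward promising
    points rather than a random walk over the search surface."""
    contenders = random.sample(population, k)
    return max(contenders, key=fitness)

# Illustrative usage: select parents from a population of numbers,
# with fitness taken to be the value itself.
population = [random.uniform(0.0, 1.0) for _ in range(20)]
parents = [tournament_select(population, fitness=lambda x: x) for _ in range(10)]
```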

The third major class of search methods are genetic algorithms (GAs), which are widely used for docking purposes. GAs are stochastic optimization methods inspired by the concepts of evolution (172-174). The optimization problem is generally formulated in the lan-... [Pg.298]
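As an illustration of how a docking problem might be cast in such terms (a generic sketch, not the encoding used by any specific program cited here), a candidate ligand pose can be represented as a chromosome of real-valued genes for translation, orientation, and rotatable-bond torsions:

```python
import math
import random
from dataclasses import dataclass, field

@dataclass
class Pose:
    """Illustrative chromosome for a docking GA: one candidate ligand pose."""
    translation: tuple                              # x, y, z offset in the binding site (angstroms)
    orientation: tuple                              # Euler angles (radians)
    torsions: list = field(default_factory=list)    # rotatable-bond angles (radians)

def random_pose(n_torsions, box=10.0):
    """Sample a random pose; a GA would evolve a population of these,
    scoring each with a docking scoring function (not shown here)."""
    return Pose(
        translation=tuple(random.uniform(-box, box) for _ in range(3)),
        orientation=tuple(random.uniform(0.0, math.tau) for _ in range(3)),
        torsions=[random.uniform(0.0, math.tau) for _ in range(n_torsions)],
    )

population = [random_pose(n_torsions=4) for _ in range(50)]
```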

The time an optimization algorithm needs to solve the problem is determined by the number of function evaluations it performs before arriving at the answer. The micro-GA (Section 5.1) evaluated the profit 250 times, once for each individual in each generation. The GA-SQP, however, required just 31 function evaluations and, even so, achieved a better objective function value than the isolated micro-GA. The evolution of both algorithms (isolated GA and GA-SQP) during the search, as a function of objective function evaluations, is shown in Figure 3. [Pg.488]
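The sketch below illustrates the general idea of counting objective evaluations and following a coarse evolutionary stage with a gradient-based local refinement. The toy objective, the deliberately simplified mutation-only evolutionary stage, and the choice of SciPy's SLSQP solver are assumptions for illustration and do not reproduce the micro-GA/GA-SQP setup reported above.

```python
import random
from scipy.optimize import minimize  # gradient-based local solver (SLSQP)

calls = 0

def objective(x):
    """Toy objective (assumption) wrapped with an evaluation counter."""
    global calls
    calls += 1
    return sum((xi - 1.0) ** 2 for xi in x)

# Coarse global stage: a deliberately simplified, mutation-only evolutionary search.
population = [[random.uniform(-5.0, 5.0) for _ in range(4)] for _ in range(20)]
for _ in range(5):
    population.sort(key=objective)                  # 20 evaluations per generation
    survivors = population[:10]
    population = survivors + [
        [pi + random.gauss(0.0, 0.5) for pi in p] for p in survivors
    ]
best = min(population, key=objective)

# Local refinement stage: polish the best individual with SLSQP, which typically
# needs far fewer evaluations than the evolutionary stage (its finite-difference
# gradient calls are also counted here).
result = minimize(objective, best, method="SLSQP")
print("best value:", result.fun, "objective evaluations:", calls)
```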

