Variations of the Basic Rule

The basic backpropagation algorithm described above is, in practice, often very slow to converge. Moreover, just as Hopfield nets can sometimes get stuck in undesired spurious attractor states (i.e. local minima; see section 10.6.5), so too can multilayer perceptrons get trapped in some undesired local minimum state. This is an unfortunate artifact that plagues all energy (or cost-function) minimization schemes.

Add Some Inertia

Another way is to add a so-called momentum term to our previous expression for calculating the weight changes (see equations 10.59 and ...).
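The referenced equations are not reproduced in this excerpt, so the following Python sketch shows only the generic momentum update, in which a fraction alpha of the previous weight change is carried over as "inertia"; the function name, the toy quadratic error, and the values eta = 0.1 and alpha = 0.9 are illustrative assumptions rather than values from the text.

import numpy as np

def momentum_update(w, grad, prev_dw, eta=0.1, alpha=0.9):
    # Delta w(t) = -eta * dE/dw + alpha * Delta w(t-1);
    # the alpha term carries part of the previous change forward ("inertia").
    dw = -eta * grad + alpha * prev_dw
    return w + dw, dw

target = np.array([1.0, -2.0, 0.5])    # toy problem: E(w) = ||w - target||^2
w, prev_dw = np.zeros(3), np.zeros(3)
for _ in range(200):
    grad = 2.0 * (w - target)          # gradient of the toy quadratic error
    w, prev_dw = momentum_update(w, grad, prev_dw)

Because successive changes accumulate, the update tends to keep moving through small dips and flat regions of the error surface where plain gradient descent slows or stalls, which is the motivation for adding the term.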

Finally, we mention an approach based on using a different energy (or difference) function. The quadratic form used above (equation 10.58) is simple to use, but it is not the only form possible. In fact, we could go through the same derivation steps as above using any function f(O_i, S_i) of the net's actual output O_i and desired output S_i, as long as it is differentiable and is minimised by O_i = S_i. One interesting form, which has a natural interpretation in terms of learning the probabilities of a set of hypotheses represented by the output neurons, has been suggested by Hopfield [hopf87] and Baum and Wilczek [baum88b].
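The expression referred to in the next paragraph is not reproduced in this excerpt. A standard statement of the Hopfield/Baum-Wilczek entropic error function, reconstructed here so as to be consistent with the probability interpretation that follows, is

E = \sum_{i} \left[ \frac{1+S_i}{2} \ln\!\left(\frac{1+S_i}{1+O_i}\right) + \frac{1-S_i}{2} \ln\!\left(\frac{1-S_i}{1-O_i}\right) \right],

which is differentiable and vanishes exactly when O_i = S_i for every output neuron i.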

In this expression, the term (1 + O_i)/2 is interpreted as the probability that the hypothesis represented by neuron i is true (with O_i = +1 meaning true and O_i = -1 meaning false), and the term (1 + S_i)/2 is interpreted as the corresponding desired probability. With these interpretations, f effectively yields the relative entropy between these two sets of probability measures.
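As a concrete illustration, the sketch below evaluates this relative-entropy error for outputs and targets in the interval (-1, 1); the function name and the clipping tolerance eps are illustrative assumptions, not from the text.

import numpy as np

def entropic_error(O, S, eps=1e-12):
    # Map each value x in (-1, 1) to the probability (1 + x)/2, then sum the
    # relative entropy of the desired vs. actual probability distributions.
    p = (1.0 + np.asarray(S, dtype=float)) / 2.0   # desired probabilities (1 + S_i)/2
    q = (1.0 + np.asarray(O, dtype=float)) / 2.0   # actual probabilities  (1 + O_i)/2
    q = np.clip(q, eps, 1.0 - eps)                 # keep the logs finite at O_i = +/-1
    return float(np.sum(p * np.log((p + eps) / q)
                        + (1.0 - p) * np.log((1.0 - p + eps) / (1.0 - q))))

S = np.array([1.0, -1.0, 1.0])
print(entropic_error(S, S))                            # ~0: minimised when O_i = S_i
print(entropic_error(np.array([0.2, -0.5, 0.9]), S))   # > 0 otherwise

As required of any candidate difference function, this error is differentiable in the outputs and reaches its minimum of zero only when the actual outputs equal the desired ones.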

