Hebbian learning rule

The so-called Hebbian learning rule (named in honour of the Canadian neuropsychologist Donald Hebb, who proposed it in 1949) specifies how much the weight of the connection between two neurons should increase or decrease as a function of their activation. The rule states that the weight should increase when the two interconnected neurons are active simultaneously, and decrease otherwise. The rule is used infrequently nowadays because it works well only when all the input patterns are orthogonal or uncorrelated. [Pg.257]
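A minimal numerical sketch of the rule (the function name, the use of NumPy, and the learning rate eta are illustrative assumptions, not from the source):

import numpy as np

def hebbian_update(W, x, eta=0.1):
    """One Hebbian step for an activation vector x with entries +/-1.
    w_ij increases when units i and j agree in sign (are co-active)
    and decreases when they disagree; eta is an illustrative rate."""
    dW = eta * np.outer(x, x)
    np.fill_diagonal(dW, 0.0)  # no self-connections
    return W + dW

With +/-1 activations the product x_i * x_j is positive exactly when the two units agree, which reproduces the increase/decrease behaviour described above.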

To test this idea, and to establish how the system needs to be organized in terms of connectivity, both initially and after learning, we introduced a Hebbian learning rule for the synapses from KCs to eKCs ... [Pg.13]

There has been a steady development of neuronal analogs over the past 50 years. An important early model was proposed in 1943 by McCulloch and Pitts [23]. They described the neuron as a logical processing unit, and the influence of their model set the mathematical tone of what is being done today. Adaptation or learning is a major focus of neural net research. The development of a learning rule that could be used for neural models was pioneered by Hebb, who proposed the famous Hebbian model for synaptic modification [24]. Since then, many alternative quantitative interpretations of synaptic modification have been developed [15-22]. [Pg.3]

The paper is organised as follows. Sections 2, 3 and 4 introduce EDAs, Walsh functions and Hopfield networks respectively. Sections 5 and 6 describe a Hopfield EDA (HEDA) and present two learning rules: one based on a standard Hebbian update and one designed to improve network capacity. Section 7 describes a set of experiments and an analysis of network size, capacity and the time taken during learning. Section 8 analyses the weights of a HEDA and Sect. 9 offers some conclusions and discusses future work. [Pg.251]

Storkey [22] introduced a new learning rule for Hopfield networks that increased the capacity of a network compared to using the Hebbian rule. The new weight update rule is ... [Pg.258]
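The excerpt truncates the rule itself. For reference, the update published by Storkey is commonly written as

$$w_{ij}^{\nu} = w_{ij}^{\nu-1} + \frac{1}{n}\xi_i^{\nu}\xi_j^{\nu} - \frac{1}{n}\xi_i^{\nu}h_{ji}^{\nu} - \frac{1}{n}h_{ij}^{\nu}\xi_j^{\nu}, \qquad h_{ij}^{\nu} = \sum_{k \neq i,j} w_{ik}^{\nu-1}\xi_k^{\nu},$$

where $\xi^{\nu}$ is the $\nu$-th pattern, $n$ is the number of units, and $h_{ij}^{\nu}$ is the local field at unit $i$ with the contributions of units $i$ and $j$ excluded.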

This set of experiments compares the capacity of a normally trained Hopfield network with the search capacity of a HEDA. We compare two learning rules (Hebbian and Storkey). The experiments are repeated many times, all using randomly generated target patterns in which each element has an equal probability of being +1 or -1. [Pg.260]
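A sketch of the Hebbian-trained Hopfield baseline in this setup (the sizes, seed, and stability check are illustrative; the HEDA itself is not reproduced here):

import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 10                              # illustrative network size and pattern count
targets = rng.choice([-1, 1], size=(p, n))  # each element +1 or -1 with equal probability

W = targets.T @ targets / n                 # standard Hebbian weight matrix
np.fill_diagonal(W, 0.0)                    # no self-connections

# A target pattern counts as stored if it is a fixed point of the network update.
stable = np.all(np.sign(targets @ W) == targets, axis=1)
print(f"{stable.sum()} of {p} patterns stored as fixed points")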

Fig. 9. Number of attractors correctly discovered from the same random set by a HEDA with a standard linear learning rule (square markers and solid error bars) and a HEDA trained with the log-Hebbian rule. In both cases, the target function contained n/(2 ln n) attractors and the graph shows how many of them were found.
Equation 10.49 embodies Hinton et al.'s Boltzmann Machine learning scheme. Notice that it consists of two different parts. The first part, $\langle s_i s_j \rangle_{\text{clamped}}$, is essentially the same as the Hebb rule used in Hopfield's net (equation 10.19), and reinforces the connections that lead from input to output. The second part, $\langle s_i s_j \rangle_{\text{free}}$, can be likened to a Hebbian unlearning, whereby poor associations are effectively unlearned. [Pg.535]
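The excerpt does not reproduce equation 10.49 itself; the Boltzmann Machine weight update it describes is commonly written as

$$\Delta w_{ij} = \eta \left( \langle s_i s_j \rangle_{\text{clamped}} - \langle s_i s_j \rangle_{\text{free}} \right),$$

where $\eta$ is a learning rate, the first average is taken with the visible units clamped to the training data, and the second with the network running freely.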

