
Grossberg outstar layer

Hybrid networks combine the features of two or more types of ANN, the idea being to highlight the strengths and minimize the weaknesses of the different networks. Examples are the Hamming network, which has a perceptron-like and a Hopfield-like layer, and the counterpropagation network, which has a Kohonen layer and a Grossberg outstar layer. [Pg.87]

The counterpropagation network was originally proposed by Hecht-Nielsen (1987). In this section a modified feedforward version, as described by Zurada (1992), is discussed. This network, which is shown in Fig. 19.25, requires a number of hidden neurons equal to the number of input patterns or, more exactly, to the number of input clusters. The first layer is the Kohonen layer, with unipolar neurons; in this layer only one neuron, the winner, can be active. The second is the Grossberg outstar layer. The Kohonen layer can be trained in the unsupervised mode, but that need not be the case. When binary input patterns are considered, the input weights must be exactly equal to the input patterns. In this case,... [Pg.2050]
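To make the architecture concrete, here is a minimal sketch of one forward pass through such a network in Python/NumPy. The function and variable names, the array shapes, and the use of a plain dot-product similarity for the winner-take-all step are illustrative assumptions, not details taken from Hecht-Nielsen or Zurada.

```python
import numpy as np

def counterprop_forward(x, kohonen_w, grossberg_w):
    """One forward pass through a feedforward counterpropagation network.

    x           : input pattern, shape (n_inputs,)
    kohonen_w   : Kohonen-layer weights, shape (n_hidden, n_inputs);
                  for binary patterns each row equals one stored input pattern
    grossberg_w : Grossberg outstar weights, shape (n_outputs, n_hidden)
    """
    # Kohonen layer: winner-take-all. Only the neuron whose weight vector
    # best matches the input is active; its unipolar output is 1, all others 0.
    scores = kohonen_w @ x
    hidden = np.zeros(kohonen_w.shape[0])
    hidden[np.argmax(scores)] = 1.0

    # Grossberg outstar layer: the winner recalls the output pattern stored
    # in its column of the outstar weight matrix.
    return grossberg_w @ hidden
```

With one hidden neuron per input cluster, the network behaves like a lookup table: the Kohonen layer selects the cluster and the outstar layer reproduces the output pattern associated with it.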

This rule can be used in a Hopfield network, in the middle layer of the BSB network, and in the outer layer of a counterpropagation network. In the latter case, it is equivalent to the so-called Grossberg outstar learning rule. C1 is usually set to 0.1 or less, and C2 is usually set to zero. [Pg.83]
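The excerpt does not reproduce the rule itself, so the sketch below shows one common way the outstar update for the counterpropagation output layer can be written, interpreting C1 as the learning constant and C2 as an optional weight-decay term; that interpretation, and all names and shapes, are assumptions for illustration.

```python
import numpy as np

def outstar_update(grossberg_w, hidden, target, c1=0.1, c2=0.0):
    """One Grossberg outstar update of the counterpropagation output layer.

    grossberg_w : outstar weights, shape (n_outputs, n_hidden)
    hidden      : Kohonen activations with a single winner set to 1, shape (n_hidden,)
    target      : desired output pattern, shape (n_outputs,)
    c1, c2      : learning constants; the text suggests c1 <= 0.1 and c2 = 0
    """
    # Only the weights fanning out from the winning hidden neuron move toward
    # the desired output; the c2 term (zero by default) decays all weights.
    current = grossberg_w @ hidden  # output currently recalled by the winner
    delta = c1 * np.outer(target - current, hidden) - c2 * grossberg_w
    return grossberg_w + delta
```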

