Big Chemical Encyclopedia


Heteroassociative networks

With respect to associations of patterns, we distinguish between auto- and heteroassociation. In autoassociation, the numbers of input and output neurons are equal; heteroassociative networks have different numbers of neurons in the input and output layers. Pattern associations can be used, for example, to learn character or image combinations or spectra-structure relationships. [Pg.311]
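The size distinction can be illustrated with a minimal Hebbian (outer-product) associator. This is only a sketch with made-up bipolar patterns, not a network from the text: four input neurons are mapped to two output neurons, so the association is heteroassociative.

```python
import numpy as np

# Two bipolar input patterns (4 input neurons) paired with two
# bipolar output patterns (2 output neurons): a heteroassociation.
inputs = np.array([[1, -1,  1, -1],
                   [1,  1, -1, -1]])
targets = np.array([[ 1, -1],
                    [-1,  1]])

# Hebbian outer-product rule: W accumulates target * input^T over all pairs.
W = targets.T @ inputs          # shape (2, 4): 4 inputs -> 2 outputs

# Recall: thresholding W @ x recovers the paired output pattern.
recalled = np.sign(W @ inputs[0])   # recovers targets[0] = [1, -1]
```

Because the two input patterns are orthogonal, recall is exact; with correlated patterns the outer-product rule produces crosstalk.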

Heteroassociative networks termed counterpropagation networks can accept continuous inputs and outputs. They are two-layer ANNs (disregarding the... [Pg.94]
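Although the excerpt is truncated, the forward pass of a counterpropagation network can be sketched: a Kohonen layer selects the prototype nearest the input, and the Grossberg layer emits the continuous output vector stored for that winner. All data here are hypothetical toy values, not from the source.

```python
import numpy as np

# Kohonen-layer prototype vectors and the Grossberg-layer output
# vector stored for each prototype (made-up values).
prototypes = np.array([[0.0, 0.0],
                       [1.0, 1.0]])
outputs = np.array([[0.0],
                    [1.0]])

def forward(x):
    """Winner-take-all recall: nearest prototype's stored output."""
    winner = np.argmin(np.linalg.norm(prototypes - np.asarray(x), axis=1))
    return outputs[winner]

forward([0.9, 0.8])   # nearest prototype is [1, 1], so the output is [1.0]
```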

Fig. 10 A possible molecular packing and hydrogen bond scheme for (a) the heteroassembly formed from an equimolar mixture of 14a and 15a and (b) the homoassembly from 16a. (a, b) Top view of a layered structure composed of linear polymolecular arrays (a reversed Hoogsteen base pair configuration is employed here for the thymine-adenine heteroassociation). (c) Front view showing the 2-D complementary and 1-D amide hydrogen bond network. (d) Side view of the polymolecular arrays. In (d), the one-dimensional amide hydrogen bond chain contributes to the stabilization of the base stacking and the formation of complementary hydrogen bonds. Reprinted with permission from J Am Chem Soc 2001, 123, 5947
We have made some assumptions about how our example network functions. Many types of ANN operate as we have assumed, but some do not, and we now indicate these differences. The ANNs just described are heteroassociative because the desired outputs differ from the inputs. When the desired outputs are the same as the inputs for all the training vectors, the network is autoassociative. This circumstance naturally requires that the number of input PEs equal the number of output PEs. Some types of network—for example, backpropagation—may be configured as either hetero- or autoassociative, whereas other types must be heteroassociative, and still others must be autoassociative. [Pg.62]
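The autoassociative case, where outputs equal inputs, can be sketched with an outer-product memory that restores a corrupted copy of a stored pattern. The pattern is invented for illustration; the point is that the weight matrix is square because input and output PEs coincide.

```python
import numpy as np

x = np.array([1, -1, 1, -1, 1, -1])   # stored bipolar pattern

# Autoassociative outer-product memory: inputs and outputs are the
# same 6 PEs, so W is square; self-connections are zeroed.
W = np.outer(x, x)
np.fill_diagonal(W, 0)

noisy = x.copy()
noisy[0] = -noisy[0]                  # corrupt one component
restored = np.sign(W @ noisy)         # recovers the stored pattern x
```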

Two-layer feedforward/feedback ANNs are heteroassociative. They can store input and output vectors and are useful in recalling an output vector when presented with a noisy or incomplete version of its corresponding input vector. They are also useful for classification problems. Typically, every feedforward connection between two PEs is accompanied by a feedback connection between the same two PEs. Both connections have weights, and these weights are usually different from each other. Examples are the adaptive resonance theory and bidirectional associative memory networks. [Pg.86]
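A bidirectional associative memory can be sketched to show the feedforward/feedback recall the paragraph describes: activity bounces between the two layers, with the feedback path here taken as the transpose of the forward weights. The pattern pairs are toy values chosen for this illustration, not data from the source.

```python
import numpy as np

# Two stored pattern pairs: 6-PE input layer, 3-PE output layer.
X = np.array([[ 1, 1, 1,  1, -1, -1],
              [-1, 1, 1,  1, -1,  1]])
X[0] = [1, 1, 1, -1, -1, -1]          # make the two input patterns orthogonal
Y = np.array([[ 1, -1, 1],
              [-1,  1, 1]])

W = Y.T @ X                           # forward weights; W.T is the feedback path

def recall(x, steps=5):
    """Bounce activity between the layers until it stabilizes."""
    for _ in range(steps):
        y = np.sign(W @ x)            # forward pass
        x = np.sign(W.T @ y)          # feedback pass
    return x, y

noisy = X[0].copy()
noisy[0] = -noisy[0]                  # present a corrupted input vector
x_rec, y_rec = recall(noisy)          # settles to the stored pair (X[0], Y[0])
```

Note that in a real BAM the forward and feedback weights need not be transposes of each other, as the paragraph says: each connection pair carries its own weight.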

Perceptron networks are feedforward, heteroassociative (or possibly autoassociative) networks that accept continuous inputs. Within the last five years there have been no chemical applications of perceptrons; applications before that time are now largely outmoded by the advent of more powerful ANNs. We mention them briefly for three reasons: they have historical significance, they are ubiquitous in neural network texts, and you will find papers that claim to use perceptrons but in actuality do not. [Pg.98]
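For the historical record the paragraph invokes, the classic perceptron error-correction rule can be sketched on a made-up, linearly separable toy problem (the AND function with bipolar targets):

```python
import numpy as np

# Toy training set: learn AND with bipolar targets.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, -1, -1, 1])

w = np.zeros(2)
b = 0.0
lr = 0.1
for _ in range(20):                        # epochs
    for xi, ti in zip(X, y):
        out = 1 if xi @ w + b >= 0 else -1
        if out != ti:                      # update weights only on error
            w += lr * ti * xi
            b += lr * ti

predictions = [1 if xi @ w + b >= 0 else -1 for xi in X]
```

The perceptron convergence theorem guarantees this loop finds a separating hyperplane for separable data; on non-separable problems it cycles forever, which is one reason perceptrons were superseded.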

At this stage in tackling a problem, neither you nor anyone else can choose with complete certainty the type of network that will work. You probably can, however, make a list of several network types that appear to be viable candidates. Unless your objective is to develop a completely new use of a network type, make sure the ANNs on your list have been used to solve problems in the same general category as yours. For example, if yours is a mapping problem, your list should contain ANNs that have successfully solved mapping problems. Make sure the ANNs are auto- or heteroassociative, as required, and that they... [Pg.101]

The results show that the path of figure 2f is better than the path of figure 2e. The selected paths should be saved because they are implicit non-linear functions. The paths can be saved as a look-up table, as a heteroassociative neural network memory (Fausett, 1994), or as fuzzy curve expressions such as the Takagi and Sugeno method (TSM) (Takagi and Sugeno, 1985). Look-up tables are the most convenient method and are used for path saving in this example (Step 5). [Pg.197]
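The look-up-table idea can be sketched as follows; the function standing in for the selected path is hypothetical, since the source does not give its form. The path is tabulated once on a grid, and later queries are answered by interpolating in the table instead of re-evaluating the implicit non-linear function.

```python
import numpy as np

def path(x):
    """Hypothetical stand-in for the selected non-linear path."""
    return np.sin(x) + 0.5 * x

# Step 5 analogue: tabulate the path once, offline, on a grid.
xs = np.linspace(0.0, 5.0, 51)     # grid with spacing 0.1
table = path(xs)

def lookup(x):
    """Recall the saved path by linear interpolation in the table."""
    return np.interp(x, xs, table)

# Between grid points the table answer stays close to the true path.
err = abs(lookup(2.34) - path(2.34))
```

With grid spacing h, linear interpolation of a smooth function is accurate to about h²/8 times the largest second derivative, so here the error is well under 0.01; a finer grid trades memory for accuracy.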


See other pages where Heteroassociative networks is mentioned: [Pg.96] [Pg.96] [Pg.97] [Pg.99]
See also in source #XX -- [Pg.62, Pg.86, Pg.93, Pg.96]



