
Backpropagation networks

Neural networks are nowadays applied predominantly to classification tasks. Here, three kinds of networks are tested. First, the backpropagation network is used, because it is the most robust and common network. The other two networks considered in this study have architectures specially adapted to classification tasks. The Learning Vector Quantization (LVQ) network consists of a neuronal structure that represents the LVQ learning strategy. The Fuzzy Adaptive Resonance Theory (Fuzzy-ART) network is a sophisticated network with a very complex structure but high performance on classification tasks. Overviews of this extensive subject are given in [2] and [6]. [Pg.463]

In a backpropagation network, each neuron of a layer is connected to each neuron in the previous and in the next layer. Connections that skip a layer are forbidden in this architecture. [Pg.464]
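A minimal sketch (assumed, not from the source) of this connectivity: each weight matrix links one layer only to the layer directly before it, so no connection spans over a layer.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, weights, biases):
    """Propagate an input vector through a fully connected feedforward net.

    weights[k] has shape (units in layer k+1, units in layer k), so each
    layer receives input only from the layer directly before it -- there
    are no weight matrices between non-adjacent layers.
    """
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)   # activation of the next layer
    return a

# Hypothetical 2-3-1 architecture (cf. Figure 6b further down this page):
rng = np.random.default_rng(0)
weights = [rng.normal(size=(3, 2)), rng.normal(size=(1, 3))]
biases = [np.zeros(3), np.zeros(1)]
print(forward(np.array([0.5, -1.0]), weights, biases))
```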

The three network structures introduced above were trained with the training data set and tested with the test data set. The backpropagation network reaches its best classification result after 70000 training iterations ... [Pg.465]
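As an illustration of this train-and-test protocol, the following sketch (assumed; toy data and a far shorter run than the 70000 iterations quoted above, not the study's actual setup) trains a small backpropagation network and reports the classification rate on a held-out test set.

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Hypothetical two-class data; the study's data set is not reproduced here.
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)
X_train, X_test, y_train, y_test = X[:150], X[150:], y[:150], y[150:]

W1, b1 = rng.normal(size=(2, 3)), np.zeros(3)   # 2-3-1 network
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)
lr = 0.5

for it in range(5000):
    h = sigmoid(X_train @ W1 + b1)          # hidden layer
    out = sigmoid(h @ W2 + b2)              # output layer
    # backpropagate the squared-error gradient layer by layer
    d_out = (out - y_train) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X_train); b2 -= lr * d_out.mean(0)
    W1 -= lr * X_train.T @ d_h / len(X_train); b1 -= lr * d_h.mean(0)

pred = sigmoid(sigmoid(X_test @ W1 + b1) @ W2 + b2) > 0.5
print("test classification rate:", (pred == (y_test > 0.5)).mean())
```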

Table 2 Classification results of the Backpropagation Network in percent...
In many cases, structure elucidation with artificial neural networks is limited to backpropagation networks [113] and is therefore performed in a supervised manner... [Pg.536]

Numeric-to-numeric transformations are used as empirical mathematical models where the adaptive characteristics of neural networks learn to map between numeric sets of input-output data. In these modeling applications, neural networks are used as an alternative to traditional schemes based on regression of plant data. Backpropagation networks have been widely used for this purpose. [Pg.509]
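A minimal sketch of such a numeric-to-numeric mapping, assuming scikit-learn is available; the process inputs and the plant response below are hypothetical placeholders, not data from the source.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(300, 2))        # hypothetical process inputs
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1]     # hypothetical plant response

# A small backpropagation network as an empirical input-output model
model = MLPRegressor(hidden_layer_sizes=(10,), solver="sgd",
                     learning_rate_init=0.05, max_iter=5000, random_state=0)
model.fit(X, y)
print(model.predict([[0.2, 0.8]]))
```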

Both cases can be dealt with by both supervised and unsupervised variants of networks. The architecture and the training of supervised networks for spectra interpretation are similar to those used for calibration. The input vector consists of a set of spectral features yi(zi) (e.g., intensities at selected wavelengths zi). The output vector contains information on the presence and absence of certain structure elements and groups fixed by learning rules (Fig. 8.24). Various types of ANN models may be used for spectra interpretation, mainly Adaptive Bidirectional Associative Memory (BAM) and Backpropagation Networks (BPN). The correlation... [Pg.273]
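The following schematic sketch (assumed, with random placeholder data) mirrors that encoding: intensities at selected wavelengths form the input vector, and each output neuron flags the presence or absence of one structural group.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
n_wavelengths, n_groups = 50, 3                  # hypothetical sizes
X = rng.uniform(size=(120, n_wavelengths))       # yi(zi): spectral features
Y = rng.integers(0, 2, size=(120, n_groups))     # e.g. flags for three groups

# Multi-label setup: one output neuron per structure element
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
clf.fit(X, Y)
print(clf.predict(X[:1]))   # e.g. [[1 0 1]]: first and third group present
```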

Touretzky and Pomerleau (8) examined hidden-cell activity in a backpropagation network which had been trained to classify computer-generated pictures of road conditions and to serve as a navigator for an autonomous land vehicle. Analogous to the phoneme-recognition results, their analysis indicated that the twenty-nine hidden units in the network had learned to respond to component features of the depicted roads. [Pg.66]

California Scientific Software's BrainMaker is a low-cost MS-DOS-based program for constructing multi-layer backpropagation networks based on several kinds of transfer functions. It comes with a set of trained networks whose tasks range from shape recognition to text-to-speech conversion. [Pg.69]

Figure 6 Principal structure of a backpropagation network: a) information processing of a single neuron; b) example structure of a network with two input neurons, one hidden layer of three neurons, and an output layer consisting of one output neuron.
Figure 7 Schematic illustration of the use of a backpropagation network for the prediction of protein secondary structures. Figure 7 Schematic illustration of the use of a backpropagation network for the prediction of protein secondary structures.
Figure 8.12 Multilayer perceptron as basis for a backpropagation network. Figure 8.12 Multilayer perceptron as basis for a backpropagation network.
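To make the single-neuron processing of Figure 6a concrete, here is a minimal sketch (assumed, not taken from the figure itself): the neuron forms a weighted sum of its inputs plus a bias and passes the result through a sigmoid transfer function.

```python
import numpy as np

def neuron(inputs, weights, bias):
    net = np.dot(weights, inputs) + bias     # summation step
    return 1.0 / (1.0 + np.exp(-net))        # sigmoid transfer function

# Hypothetical inputs and weights for illustration only
print(neuron(np.array([0.5, -1.0]), np.array([0.8, 0.3]), bias=0.1))
```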
Before training a backpropagation network, the following settings are required ... [Pg.317]
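The list of required settings is truncated in this excerpt; purely as an illustration, the hypothetical configuration below collects choices a backpropagation network typically needs fixed before training starts.

```python
# Hypothetical pre-training configuration (illustrative, not the source's list)
config = {
    "architecture": (2, 3, 1),       # input, hidden, output neurons (cf. Figure 6b)
    "transfer_function": "sigmoid",  # nonlinearity applied in each neuron
    "learning_rate": 0.1,            # step size of the weight updates
    "momentum": 0.9,                 # optional smoothing of successive updates
    "weight_init": "small_random",   # starting point for gradient descent
    "max_iterations": 70_000,        # cf. the stopping point quoted above
    "error_goal": 1e-3,              # alternative stopping criterion
}
```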

... (Elizondo and Fiesler, 1997). However, fully connected FNNs are still dominant because designing partially connected FNNs is complicated and usually data-dependent. In FNNs, backpropagation networks can have more than one hidden layer, whereas ELM networks and radial basis function networks each have only one hidden layer. [Pg.28]

Figure 5 A simple backpropagation network. The label to the right of each layer gives the layer name and the summation function, transfer function, output function, and learning rule, respectively, for the layer in a typical network. The input layer is fully connected to the middle layer, which in turn is fully connected to the output layer.
Finally, we briefly mention several other ANNs that have seen limited chemical applications. A connectionist hyperprism ANN has been used in the analysis of ion mobility spectra. This network shares characteristics of Kohonen and backpropagation networks. The DYSTAL network has been successfully used to classify orange juice as either adulterated or unadulterated [200]. A learning vector quantizer (LVQ) network has been used to identify multiple analytes from optical sensor array data. A wavelet ANN has been applied to the inclusion of β-cyclodextrin with benzene derivatives, and a... [Pg.100]


