Big Chemical Encyclopedia


Back-propagation learning

Let us start with a classic example. We had a dataset of 31 steroids. The spatial autocorrelation vector (more about autocorrelation vectors can be found in Chapter 8) served as the set of molecular descriptors. The task was to model the Corticosteroid-Binding Globulin (CBG) affinity of the steroids. A feed-forward multilayer neural network trained with the back-propagation learning rule was employed as the learning method. The dataset itself was available in electronic form. More details can be found in Ref. [2]. [Pg.206]

Now, one may ask, what if we are going to use feed-forward neural networks with the back-propagation learning rule? Then, obviously, SVD can be used as a data transformation technique. PCA and SVD are often used as synonyms. Below we shall use PCA in the classical context and SVD in the case when it is applied to the data matrix before training any neural network, i.e., Kohonen's Self-Organizing Maps or Counter-Propagation Neural Networks. [Pg.217]
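As a minimal sketch of this preprocessing step, the data matrix can be decomposed with SVD and the leading score vectors fed to the network in place of the raw descriptors. The 31 × 12 descriptor matrix below is random stand-in data, and the choice of k = 5 retained components is an arbitrary assumption for illustration:

```python
import numpy as np

# Hypothetical descriptor matrix: 31 samples x 12 molecular descriptors
# (random stand-in values, not the steroid data from the text)
rng = np.random.default_rng(0)
X = rng.normal(size=(31, 12))

# Center the columns, then decompose the data matrix with SVD
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Keep the first k singular directions as transformed network inputs
k = 5
X_reduced = Xc @ Vt[:k].T  # score matrix, shape (31, k)

# Equivalently, U[:, :k] * s[:k] yields the same scores
assert np.allclose(X_reduced, U[:, :k] * s[:k])
```

The reduced matrix `X_reduced` would then replace `X` as the input to whichever network is being trained, decorrelating the inputs and cutting their dimensionality in one step.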

A detailed mathematical explanation of the adaptation of the weights is given, e.g., in Ref. [10]. The original publication of back-propagation learning is to be found in Ref. [13]. ... [Pg.463]

The general principle behind most commonly used back-propagation learning methods is the delta rule, by which an objective function involving squares of the output errors from the network is minimized. The delta rule requires that the sigmoidal function used at each neuron be continuously differentiable. This method identifies an error associated with each neuron for each iteration involving a cause-effect pattern. Therefore, the error for each neuron in the output layer can be represented as ... [Pg.7]
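The output-layer error the excerpt alludes to can be sketched in code. Assuming sigmoidal output neurons and a squared-error objective, the standard delta for output neuron j is (t_j − y_j)·y_j·(1 − y_j), where y_j(1 − y_j) is the sigmoid derivative; the function and weight-update step below are a hedged illustration of that rule, not the specific formulation elided in the original text:

```python
import numpy as np

def sigmoid(x):
    """Continuously differentiable activation required by the delta rule."""
    return 1.0 / (1.0 + np.exp(-x))

def output_deltas(targets, outputs):
    # Delta rule for sigmoidal output neurons:
    #   delta_j = (t_j - y_j) * y_j * (1 - y_j)
    # where y_j * (1 - y_j) is the derivative of the sigmoid.
    return (targets - outputs) * outputs * (1.0 - outputs)

def update_output_weights(W, hidden, targets, outputs, lr=0.1):
    """One gradient-descent step on the hidden-to-output weights."""
    deltas = output_deltas(targets, outputs)
    return W + lr * np.outer(deltas, hidden)
```

Note that the delta vanishes when a neuron's output matches its target, so converged patterns stop contributing weight changes.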

In the following sections, the basic components of a neuron, how it works and, finally, how a set of neurons is connected to yield an ANN are presented. A short description of the most common rules by which ANNs learn is also given (focused on the error back-propagation learning scheme). We also concentrate on how ANNs can be applied to perform regression tasks and, finally, a review of published papers dealing with applications to atomic spectrometry, most of them reported recently, is presented. [Pg.250]

Luo [86] proposed a kind of neural cluster structure embedded in neural networks. The ANN is based on the error back-propagation learning... [Pg.274]

Luo proposed a kind of neural cluster structure embedded in neural networks. The ANN was based on the error back-propagation learning algorithm. The predictive ability of the neural cluster structure was compared with that of common neural net structures. A comparison of predictability with four neural networks was presented and they were applied to correct for matrix effects in XRF. [Pg.403]

As is shown in Figure 1, a BP neural network is a multi-layer feedforward network with error back-propagation learning, which consists of an input layer, an output layer and several hidden layers. Each layer includes a set of neurons interconnected by weights (Han et al. 2009). [Pg.857]
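The layered structure described above can be sketched as a forward pass through a stack of weight matrices. The layer sizes (3 inputs, one hidden layer of 4 neurons, 2 outputs) and random weights below are illustrative assumptions, not the architecture from Figure 1:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, weights, biases):
    """Propagate an input vector through input -> hidden layer(s) -> output."""
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)  # each layer: weighted sum, then activation
    return a

# Toy network: 3 inputs, one hidden layer of 4 neurons, 2 outputs
rng = np.random.default_rng(1)
weights = [rng.normal(size=(4, 3)), rng.normal(size=(2, 4))]
biases = [np.zeros(4), np.zeros(2)]

y = forward(np.array([0.5, -1.0, 2.0]), weights, biases)
```

During back-propagation learning, the errors computed at the output layer are passed backwards through the same weight matrices to adjust every layer's weights.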

Affolter and Clerc used the connectivity-based encoded structures as input to a two-layer feed-forward network that was trained by the back-propagation learning algorithm. In these experiments correlation coefficients of up to 0.8 for the prediction and up to 0.99 for the recall were achieved. Figure 8... [Pg.1304]



© 2024 chempedia.info