Big Chemical Encyclopedia


Counter-propagation Network

Kohonen network Conceptual clustering Principal Component Analysis (PCA) Decision trees Partial Least Squares (PLS) Multiple Linear Regression (MLR) Counter-propagation networks Back-propagation networks Genetic algorithms (GA)... [Pg.442]

Besides the artificial neural networks mentioned above, there are various other types of neural networks. This chapter, however, will confine itself to the three most important types used in chemoinformatics: Kohonen networks, counter-propagation networks, and back-propagation networks. [Pg.455]

A counter-propagation network is a method for supervised learning which can be used for prediction. It has a two-layer architecture where each neuron in the upper layer, the Kohonen layer, has a corresponding neuron in the lower layer, the output layer (see Figure 9-21). A trained counter-propagation network can be used as a look-up table: a neuron in one layer is used as a pointer to the other layer. [Pg.459]

The architecture of a counter-propagation network resembles that of a Kohonen network, but in addition to the cubic Kohonen layer (input layer) it has a second layer, the output layer. Thus, an input object consists of two parts: the m-dimensional input vector (just as for a Kohonen network) plus a second k-dimensional vector with the properties for the object. [Pg.459]
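The two-layer architecture and look-up behavior described above can be sketched in a few lines of code. This is a minimal illustration, not the exact algorithm of the text: the class name is invented, the learning rate is fixed, and the neighborhood function of a real Kohonen-style training run is omitted.

```python
import numpy as np

class CounterPropagationNet:
    """Minimal counter-propagation sketch: a Kohonen (input) layer on top,
    an output layer below, one output neuron per Kohonen neuron."""

    def __init__(self, n_neurons, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        # Kohonen layer: one m-dimensional weight vector per neuron
        self.w_in = rng.random((n_neurons, n_in))
        # Output layer: one k-dimensional property vector per neuron
        self.w_out = rng.random((n_neurons, n_out))

    def winner(self, x):
        # Competitive step: neuron whose input weights are closest to x
        return np.argmin(np.linalg.norm(self.w_in - x, axis=1))

    def train(self, X, Y, epochs=100, eta=0.3):
        for _ in range(epochs):
            for x, y in zip(X, Y):
                c = self.winner(x)
                # Adapt both layers of the winning neuron toward the object
                self.w_in[c] += eta * (x - self.w_in[c])
                self.w_out[c] += eta * (y - self.w_out[c])

    def predict(self, x):
        # Look-up: the winning Kohonen neuron points to its output neuron
        return self.w_out[self.winner(x)]
```

After training, `predict` is exactly the look-up table of the text: the input vector selects a neuron in the Kohonen layer, and that neuron's position points to the stored property vector in the output layer.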

It provides unsupervised (Kohonen network) and supervised (counter-propagation network) learning techniques with planar and toroidal topology of the network. [Pg.461]

The usage of a neural network varies depending on the aim and especially on the network type. This tutorial covers two applications: on the one hand, the usage of a Kohonen network for classification, and on the other hand, the prediction of object properties with a counter-propagation network... [Pg.463]

Counter-propagation network: this network also needs the input dimension. It gives the columns that are used for the upper layer of the network. [Pg.464]

Rather than making this statement, one should consider first whether the representation of the Y-variable is appropriate. What we did here was to take categorical information as a quantitative value. So if we have, for instance, a vector of class 1 and one of class 9 falling into the same neuron, the weights of the output layer will be adapted to a value between 1 and 9, which does not make much sense. Thus, it is necessary to choose another representation, with one layer for each biological activity. The architecture of such a counter-propagation network is shown in Figure 10.1-11. Each of the nine layers in the output block corresponds to a different MOA. [Pg.509]
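The representation change argued for above amounts to one-hot encoding: each of the nine MOA classes gets its own output layer instead of a single quantitative value. A minimal sketch (the function name is illustrative, not from the source):

```python
import numpy as np

def moa_to_output_vector(label, n_classes=9):
    """Map a class label (1..n_classes) to a one-hot vector,
    i.e. one output layer per mode of action (MOA)."""
    y = np.zeros(n_classes)
    y[label - 1] = 1.0
    return y
```

With this encoding, a neuron hit by one object of class 1 and one of class 9 adapts toward [0.5, 0, ..., 0, 0.5], which correctly expresses ambiguity between the two MOAs rather than producing a spurious intermediate class value around 5.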

However, these results illustrate that the use of a counter-propagation network can lead to new insights when several biological activities are given. Furthermore, a CPG network can also be applied for studying selectivity between different biological activities. [Pg.511]

If more than one property is relevant, then we have an X-matrix and a corresponding Y-matrix. If the properties are highly correlated, a combined treatment of all properties is advisable; otherwise each property can be handled separately as described above. The method most often used for a joint evaluation of X and Y is PLS (then sometimes called PLS2); a nonlinear alternative is a Kohonen counter-propagation network. [Pg.47]

More than 50 different types of neural network exist. Certain networks are more efficient in optimization; others perform better in data modeling, and so forth. According to Basheer (2000), the most popular neural networks today are the Hopfield networks, the Adaptive Resonance Theory (ART) networks, the Kohonen networks, the counter-propagation networks, the Radial Basis Function (RBF) networks, the back-propagation networks and recurrent networks. [Pg.361]

Now, one may ask: what if we are going to use Feed-Forward Neural Networks with the Back-Propagation learning rule? Then, obviously, SVD can be used as a data transformation technique. PCA and SVD are often used as synonyms. Below we shall use PCA in the classical context and SVD in the case when it is applied to the data matrix before training any neural network, i.e., Kohonen's Self-Organizing Maps or Counter-Propagation Neural Networks. [Pg.217]

To understand neural networks, especially Kohonen, counter-propagation and back-propagation networks, and their applications... [Pg.439]

Supervised learning strategies are applied in counter-propagation and in back-propagation neural networks (see Sections 9.5.5 and 9.5.7)... [Pg.455]

Kohonen network Counter-propagation Back-propagation... [Pg.465]

Association deals with the extraction of relationships among members of a data set. The methods applied for association range from rather simple ones, e.g., correlation analysis, to more sophisticated methods like counter-propagation or back-propagation neural networks (see Sections 9.5.5 and 9.5.7). [Pg.473]

A counter-propagation neural network is a method for supervised learning which can be used for predictions. [Pg.481]

The objective of this study is to show how data sets of compounds for which different biological activities have been determined can be studied. It will be shown how the use of a counter-propagation neural network can lead to new insights [46]. The emphasis in this example is placed on the comparison of different network architectures and not on quantitative results. [Pg.508]

The data set was then sent into a counter-propagation (CPG) network consisting of 13 x 9 neurons with 10 layers (one for each descriptor) in the input block and one layer in the output block (Figure 10.1-9), with the output layer taking nine different values corresponding to the nine different MOA. [Pg.508]
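The dimensions in this setup can be made concrete with a small sketch. The shapes (13 x 9 grid, 10 input layers, one output layer) are taken from the text; the weight initialization and winner rule below are generic assumptions, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Input block: 13 x 9 grid of neurons, one 10-dimensional weight
# vector per neuron (one layer per descriptor)
input_block = rng.random((13, 9, 10))
# Output block: a single layer whose values correspond to the MOA classes
output_block = rng.random((13, 9))

def winning_neuron(x):
    # Competitive step: grid position with the smallest Euclidean
    # distance between the input weights and the descriptor vector x
    d = np.linalg.norm(input_block - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

x = rng.random(10)                    # one compound: 10 descriptor values
row, col = winning_neuron(x)
predicted_value = output_block[row, col]
```

Mapping each compound to its winning grid position in this way also yields the kind of 2-D map on which clustering of the MOA classes can be inspected visually.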

Wu, C. & Shivakumar, S. (1994). Back-propagation and counter-propagation neural networks for phylogenetic classification of ribosomal RNA sequences. Nucleic Acids Res 22, 4291-9. [Pg.102]

Wu, C. H., Chen, H. L. & Chen, S. C. (1997). Counter-propagation neural networks for molecular sequence classification: Supervised LVQ and dynamic node allocation. Applied Intelligence 7, 27-38. [Pg.102]

Neural network learning algorithms: BP = Back-Propagation; Delta = Delta Rule; QP = Quick-Propagation; RP = Rprop; ART = Adaptive Resonance Theory; CP = Counter-Propagation. [Pg.104]

In the following study, selected descriptors were used for counter-propagation neural network modeling. Figures 2a and b show the positions of molecules in a Kohonen network. The two molecules and their neighbors are indicated. The predictions for two molecules with unknown mutagenicity... [Pg.98]

CP-ANN Counter-Propagation Kohonen Artificial Neural Networks... [Pg.686]

Vracko M, Mills D, Basak SC. Structure-mutagenicity modelling using counter propagation neural networks. Environ Toxicol Pharmacol 2004;16(1-2):25-36. [Pg.200]



