Big Chemical Encyclopedia


Neural Kohonen

This reaction data set of 626 reactions was used as a training data set to produce a knowledge base. Before this data set can be used as input to a Kohonen neural network, each reaction must be coded as a vector characterizing the reaction event. Six physicochemical effects were calculated for each of the five bonds at the reaction center of the starting materials by the PETRA program system (see Section 7.1.4). As the example in Figure 10.3-3 shows, the physicochemical effects of the two regioisomeric products are different. [Pg.546]
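The encoding described above (six physicochemical effects for each of five reaction-centre bonds) yields a fixed-length 30-element vector per reaction. A minimal sketch, with hypothetical placeholder values standing in for the actual PETRA output:

```python
import numpy as np

# Hypothetical stand-in for PETRA output: one row per bond at the
# reaction centre, one column per calculated physicochemical effect.
N_BONDS, N_EFFECTS = 5, 6
effects = np.arange(N_BONDS * N_EFFECTS, dtype=float).reshape(N_BONDS, N_EFFECTS)

# Flatten row-wise into a single reaction vector suitable as network input.
reaction_vector = effects.flatten()
assert reaction_vector.shape == (30,)
```

In practice each row would hold the six effect values computed for one bond, so every reaction in the 626-reaction set maps to a vector of identical length.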

This format was developed in our group and is used fruitfully in SONNIA, software for producing Kohonen Self-Organizing Maps (KSOM) and Counter-Propagation (CPG) neural networks for chemical applications [6]. This file format is ASCII-based, contains the entire information about the patterns, and usually comes with the extension ".dat". [Pg.209]

Now, one may ask: what if we are going to use feed-forward neural networks with the back-propagation learning rule? Then, obviously, SVD can be used as a data transformation technique. PCA and SVD are often used as synonyms. Below we shall use PCA in the classical context and SVD in the case when it is applied to the data matrix before training any neural network, i.e., Kohonen's self-organizing maps or counter-propagation neural networks. [Pg.217]
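A minimal sketch of SVD used in this way, truncating the data matrix to its leading components before any network sees the data (the matrix sizes and the number of retained components are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))   # 100 patterns, 20 raw descriptors
Xc = X - X.mean(axis=0)              # centre the columns, as in classical PCA

# Truncated SVD: keep the k leading components as the new input variables.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
scores = U[:, :k] * s[:k]            # transformed data fed to the network

assert scores.shape == (100, k)
```

The `scores` matrix, rather than the raw descriptors, would then be presented to the Kohonen or counter-propagation network during training.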

Figure 8-1J. Training of a Kohonen neural network with a chirality code. The number of weights in a neuron is the same as the number of elements in the chirality code vector. When a chirality code is presented to the network, the neuron with weights most similar to the chirality code is excited (this is the winning or central neuron) (see Section 9.5.3).
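The winner selection described in that caption can be sketched as a nearest-weight search: the winning neuron is the one whose weight vector has the smallest Euclidean distance to the presented code (the toy weights below are illustrative):

```python
import numpy as np

def winning_neuron(weights, x):
    """Return the index of the neuron whose weight vector is closest to x.
    weights: (n_neurons, n_features); x: (n_features,)."""
    dists = np.linalg.norm(weights - x, axis=1)
    return int(np.argmin(dists))

weights = np.array([[0.0, 0.0],
                    [1.0, 1.0],
                    [0.9, 1.1]])
print(winning_neuron(weights, np.array([1.0, 1.0])))  # → 1
```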
To understand neural networks, especially Kohonen, counter-propagation and back-propagation networks, and their applications... [Pg.439]

Kohonen networks, also known as self-organizing maps (SOMs), belong to the large group of methods called artificial neural networks. Artificial neural networks (ANNs) are techniques which process information in a way that is motivated by the functionality of biological nervous systems. For a more detailed description see Section 9.5. [Pg.441]

The Kohonen network is a neural network which uses an unsupervised learning strategy. See Section 9.5.3 for a more detailed description. [Pg.455]
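A minimal sketch of this unsupervised strategy, for a one-dimensional map with a Gaussian neighbourhood: each input pulls the winning neuron and its map neighbours toward itself, with no class labels involved. All sizes, rates, and the fixed neighbourhood width are illustrative simplifications:

```python
import numpy as np

def train_som(data, n_neurons=10, epochs=20, lr=0.5, sigma=2.0, seed=0):
    """Train a 1-D Kohonen map on unlabelled data."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((n_neurons, data.shape[1]))
    positions = np.arange(n_neurons)          # neuron coordinates on the map
    for _ in range(epochs):
        for x in data:
            winner = np.argmin(np.linalg.norm(w - x, axis=1))
            # Neighbourhood is measured on the map grid, not in input space.
            h = np.exp(-((positions - winner) ** 2) / (2 * sigma ** 2))
            w += lr * h[:, None] * (x - w)    # pull winner and neighbours to x
    return w

# Two well-separated clusters; the map weights move onto the data.
data = np.vstack([np.zeros((20, 2)), np.ones((20, 2))])
w = train_som(data)
```

A production implementation would also decay `lr` and `sigma` over the epochs; they are held fixed here only to keep the sketch short.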

Besides the artificial neural networks mentioned above, there are various other types of neural networks. This chapter, however, will confine itself to the three most important types used in chemoinformatics: Kohonen networks, counter-propagation networks, and back-propagation networks. [Pg.455]

The usage of a neural network varies depending on the aim and especially on the network type. This tutorial covers two applications: on the one hand, the usage of a Kohonen network for classification, and on the other, the prediction of object properties with a counter-propagation network. [Pg.463]
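For the classification use mentioned above, a trained Kohonen map can be turned into a classifier by labelling each neuron with the majority class of the training objects it wins, then assigning a new object the label of its winning neuron. A sketch assuming a pre-trained weight matrix (the toy weights and data are illustrative):

```python
import numpy as np

def label_map(weights, X, y):
    """Assign each neuron the majority class among training objects it wins."""
    dists = np.linalg.norm(weights[None, :, :] - X[:, None, :], axis=2)
    winners = np.argmin(dists, axis=1)        # winning neuron for each object
    labels = {}
    for n in range(len(weights)):
        hits = y[winners == n]
        if hits.size:
            labels[n] = int(np.bincount(hits).argmax())
    return labels

def classify(weights, labels, x):
    winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    return labels.get(winner)  # None for neurons no training object reached

weights = np.array([[0.0, 0.0], [1.0, 1.0]])   # assumed pre-trained map
X = np.array([[0.1, 0.0], [0.0, 0.2], [0.9, 1.0]])
y = np.array([0, 0, 1])
labels = label_map(weights, X, y)
print(classify(weights, labels, np.array([0.95, 0.9])))  # → 1
```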

The GA was then applied to select those descriptors which give the best classification of the structures when a Kohonen network is used. The objective function was based on the quality of the classification done by a neural network for the reduced descriptors. [Pg.472]

Figure 10.1-4. Distribution of compounds from two data sets in the same KNN (Kohonen's self-organizing neural network) map, using 18 topological descriptors as input, where 1 represents the 1588 compounds in the Merck data set (excluding those compounds that are also in the Huuskonen data set); 2 represents the 799 compounds in the Huuskonen data set (excluding those compounds that are also in the Merck data set); and 3 represents the overlapping part of the Huuskonen and Merck data sets.
The second main category of neural networks is the feed-forward type. In this type of network, the signals go in only one direction; there are no loops in the system, as shown in Fig. 3. The earliest neural network models were linear feed-forward. In 1972, two simultaneous articles independently proposed the same model for an associative memory, the linear associator: J. A. Anderson [17], a neurophysiologist, and Teuvo Kohonen [18], an electrical engineer, were unaware of each other's work. Today, the most commonly used neural networks are nonlinear feed-forward models. [Pg.4]
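The linear associator mentioned here stores input–output pairs in a single weight matrix built as a sum of outer products; when the stored input keys are orthonormal, presenting a key recalls its associate exactly. A minimal sketch:

```python
import numpy as np

# Store pattern pairs (x_i -> t_i) in one weight matrix: W = sum_i t_i x_i^T.
x1, x2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])   # orthonormal keys
t1, t2 = np.array([2.0, 3.0]), np.array([5.0, 7.0])   # associated outputs

W = np.outer(t1, x1) + np.outer(t2, x2)

# Recall: presenting a stored key reproduces its associate exactly.
print(W @ x1)  # → [2. 3.]
print(W @ x2)  # → [5. 7.]
```

With non-orthogonal keys the recall is only approximate, which is one reason later models moved to nonlinear feed-forward architectures.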

As described in the Introduction to this volume (Chapter 28), neural networks can be used to carry out certain tasks of supervised or unsupervised learning. In particular, Kohonen mapping is related to clustering. It will be explained in more detail in Chapter 44. [Pg.82]

T. Kohonen, Self-Organization and Associative Memory. Springer-Verlag, Heidelberg, 1989.
W.J. Melssen, J.R.M. Smits, L.M.C. Buydens and G. Kateman, Using artificial neural networks for solving chemical problems. II. Kohonen self-organizing feature maps and Hopfield networks. Chemom. Intell. Lab. Syst., 23 (1994) 267-291. [Pg.698]

X. H. Song and P.K. Hopke, Kohonen neural network as a pattern-recognition method, based on weight interpretation. Anal. Chim. Acta, 334 (1996) 57-66. [Pg.698]

R. Goodacre, J. Pygall and D.B. Kell, Plant seed classification using pyrolysis mass spectrometry with unsupervised learning: the application of auto-associative and Kohonen artificial neural networks. Chemom. Intell. Lab. Syst., 33 (1996) 69-83. [Pg.698]

Anzali, S., Barnickel, G., Krug, M., Sadowski, J., Wagener, M. and Gasteiger, J. (1996) Evaluation of molecular surface properties using a Kohonen neural network. In Neural Networks in QSAR and Drug Design, Devillers, J. (Ed.), Academic Press, London. [Pg.79]

That the SOM is often called a Kohonen map indicates the degree to which Kohonen and his co-workers have helped to define the field. Papers by Kohonen provide a rapid route into work with SOMs, but Zupan and Gasteiger's book Neural Networks for Chemists: An Introduction offers a broader look at the techniques and should be helpful for anyone starting work in this area. [Pg.93]

Use of multivariate approaches to classification modelling based on cluster analysis, factor analysis, the SIMCA technique [98,99], and the Kohonen artificial neural network [100]. All these methods, though rarely implemented, lead to very good results not achievable with classical strategies (comparisons, amino acid ratios, flow charts); moreover, it is possible to know the confidence level of the classification carried out. [Pg.251]

R. Lleti, L.A. Sarabia, M.C. Ortiz, R. Todeschini, M.P. Colombini, Application of the Kohonen Artificial Neural Network in the Identification of Proteinaceous Binders in Samples of Panel Painting using Gas Chromatography-Mass Spectrometry, Analyst, 128, 281-286 (2003). [Pg.258]

Just as there are several varieties of evolutionary algorithm, so the neural network is available in several flavors. We shall consider feedforward networks and, briefly, Kohonen networks and growing cell structures; Hopfield networks, which we shall not cover in this chapter, also find some application in science [31]. [Pg.367]

Nonlinear methods can also be applied to represent the high-dimensional variable space in a space of smaller dimension (eventually in a two-dimensional plane); in general such a data transformation is called a mapping. Widely used in chemometrics are Kohonen maps (Section 3.8.3) as well as latent variables based on artificial neural networks (Section 4.8.3.4). These methods may be necessary if linear methods fail; however, they are more delicate to use properly and are less strictly defined than linear methods. [Pg.67]

Another widely employed type of ANN is represented by the Kohonen self-organizing maps (SOMs), used for unsupervised exploratory analysis, and by the counter-propagation (CP) neural networks, used for nonlinear regression and classification (Marini, 2009). These tools also require a considerable number of objects to build reliable models, and severe validation. [Pg.92]
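A counter-propagation network couples a Kohonen input layer with an output layer: each training object excites a winning neuron in the Kohonen layer (unsupervised part), and the corresponding output-layer weights are moved toward the target value (supervised part), so after training the output weights act as a lookup table for prediction. A minimal sketch, with illustrative sizes and rates and no rate decay:

```python
import numpy as np

def train_cpg(X, Y, n_neurons=5, epochs=30, lr=0.3, seed=0):
    """Train a toy counter-propagation network on paired (X, Y) data."""
    rng = np.random.default_rng(seed)
    w_in = X[rng.integers(0, len(X), n_neurons)].astype(float)  # Kohonen layer
    w_out = np.zeros((n_neurons, Y.shape[1]))                   # output layer
    for _ in range(epochs):
        for x, t in zip(X, Y):
            winner = np.argmin(np.linalg.norm(w_in - x, axis=1))
            w_in[winner] += lr * (x - w_in[winner])    # unsupervised update
            w_out[winner] += lr * (t - w_out[winner])  # supervised update
    return w_in, w_out

def predict(w_in, w_out, x):
    """Look up the output weights of the winning Kohonen neuron."""
    return w_out[np.argmin(np.linalg.norm(w_in - x, axis=1))]

# Two clusters with distinct property values: the network learns the mapping.
X = np.vstack([np.zeros((5, 2)), np.ones((5, 2))])
Y = np.array([[0.0]] * 5 + [[10.0]] * 5)
w_in, w_out = train_cpg(X, Y)
```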

Some of the pioneering studies published by several well-known authors in the chemometrics field [55] employed Kohonen neural networks to diagnose calibration problems related to the use of AAS spectral lines. As they focused on classifying potential calibration lines, they used Kohonen neural networks to perform a sort of pattern recognition. In general, Kohonen nets (which were outlined briefly in Section 5.4.1) are best suited to classification tasks, whereas error back-propagation feed-forward networks (BPNs) are preferred for calibration purposes [56]. [Pg.270]

Balbinot et al. [36] classified Antarctic algae by applying Kohonen neural networks to a data set composed of 14 elements determined by ICP-OES. [Pg.273]

Self-organizing ANNs (Kohonen neural nets) were employed for classifying different steels [88]. Twelve relevant elements were selected for data processing through ANNs. [Pg.275]

The utility of ANNs as a pattern recognition technique in the field of microbeam analysis was demonstrated by Ro and Linton [99]. Back-propagation neural networks were applied to laser microprobe mass spectra (LAMMS) to determine interparticle variations in molecular components. Self-organizing feature maps (Kohonen neural networks) were employed to extract information on molecular distributions within environmental microparticles imaged in cross-section using SIMS. [Pg.276]


