Big Chemical Encyclopedia


Kohonen Self-Organizing Neural Network

Figure 10.1-4. Distribution of compounds from two data sets in the same KNN (Kohonen's self-organizing neural network) map, using 18 topological descriptors as input descriptors, where 1 represents the 1588 compounds in the Merck data set (excluding those compounds that are also in the Huuskonen data set), 2 represents the 799 compounds in the Huuskonen data set (excluding those compounds that are also in the Merck data set), and 3 represents the overlap of the Huuskonen and Merck data sets.
This format was developed in our group and is used fruitfully in SONNIA, software for producing Kohonen Self-Organizing Maps (KSOM) and Counter-Propagation (CPG) neural networks for chemical applications [6]. This ASCII-based file format contains the entire information about the patterns and usually comes with the extension ".dat". [Pg.209]

T. Kohonen, Self-Organization and Associative Memory. Springer-Verlag, Heidelberg, 1989. W.J. Melssen, J.R.M. Smits, L.M.C. Buydens and G. Kateman, Using artificial neural networks for solving chemical problems. II. Kohonen self-organizing feature maps and Hopfield networks. Chemom. Intell. Lab. Syst., 23 (1994) 267-291. [Pg.698]

Another type of ANN widely employed is represented by the Kohonen self-organizing maps (SOMs), used for unsupervised exploratory analysis, and by the counterpropagation (CP) neural networks, used for nonlinear regression and classification (Marini, 2009). These tools also require a considerable number of objects to build reliable models, together with rigorous validation. [Pg.92]
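The counterpropagation idea mentioned above can be illustrated with a short sketch: a competitive (Kohonen-style) input layer selects a winning unit for each input, and an output layer stores target values associated with each unit. This is a minimal toy implementation written for this page, not the algorithm as given in any of the cited works; the function names `train_cpn` and `predict_cpn` and all parameter values are assumptions for illustration.

```python
import numpy as np

def train_cpn(X, y, units=10, epochs=50, lr_in=0.3, lr_out=0.3, seed=0):
    """Minimal counterpropagation sketch: a competitive input layer
    plus an output layer holding one target value per unit."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Win = X[rng.choice(n, units)].copy()   # input weights, seeded from data
    Wout = np.zeros(units)                 # one scalar output per unit
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Winner-take-all: unit whose weights are closest to the input.
            w = np.argmin(np.linalg.norm(Win - X[i], axis=1))
            Win[w] += lr_in * (X[i] - Win[w])     # move winner toward input
            Wout[w] += lr_out * (y[i] - Wout[w])  # move output toward target
    return Win, Wout

def predict_cpn(Win, Wout, x):
    """Prediction is the stored output of the winning unit."""
    return Wout[np.argmin(np.linalg.norm(Win - x, axis=1))]

# Toy classification task: two well-separated clusters labelled 0 and 1.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-3, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
y = np.array([0.0] * 50 + [1.0] * 50)
Win, Wout = train_cpn(X, y)
```

Because the output layer simply averages targets per winning unit, the model produces a piecewise-constant mapping, which is why a considerable number of training objects is needed for reliable models.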

There are literally dozens of kinds of neural network architectures in use. A simple taxonomy divides them into two types based on learning algorithms (supervised, unsupervised) and into subtypes based upon whether they are feed-forward or feedback type networks. In this chapter, two other commonly used architectures, radial basis functions and Kohonen self-organizing architectures, will be discussed. Additionally, variants of multilayer perceptrons that have enhanced statistical properties will be presented. [Pg.41]

Now, one may ask: what if we are going to use feed-forward neural networks with the back-propagation learning rule? Then, obviously, SVD can be used as a data transformation technique. PCA and SVD are often used as synonyms. Below we shall use PCA in the classical context and SVD in the case when it is applied to the data matrix before training any neural network, i.e., Kohonen's self-organizing maps or counter-propagation neural networks. [Pg.217]
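The SVD-as-preprocessing step described above can be sketched as follows. This is a generic NumPy illustration under assumed data (a random 100 x 18 descriptor matrix, echoing the 18 topological descriptors mentioned earlier on this page); the variable names and the choice of `k = 5` retained components are hypothetical.

```python
import numpy as np

# Hypothetical descriptor matrix: 100 compounds x 18 topological descriptors.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 18))

# Center the columns, then apply SVD directly to the data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Keep the first k right singular vectors; the projected scores are the
# transformed inputs that would be fed to a Kohonen or
# counter-propagation network.
k = 5
X_reduced = Xc @ Vt[:k].T   # scores, shape (100, 5)

# The same scores fall out of the SVD factors themselves.
assert np.allclose(X_reduced, U[:, :k] * s[:k])
```

On a centered matrix this is numerically identical to projecting onto the first k principal components, which is why PCA and SVD are so often used interchangeably.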

Kohonen networks, also known as self-organizing maps (SOMs), belong to the large group of methods called artificial neural networks. Artificial neural networks (ANNs) are techniques which process information in a way that is motivated by the functionality of biological nervous systems. For a more detailed description see Section 9.5. [Pg.441]

Another approach to representing data points in an n-dimensional measurement space involves an iterative technique known as the Kohonen neural network [41, 42] or self-organizing map (SOM). A Kohonen neural network consists of a layer of neurons arranged in a two-dimensional grid or... [Pg.345]
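The two-dimensional grid of neurons described above can be captured in a short training sketch: each neuron holds a weight vector of the input dimension, the best-matching unit (BMU) is found for each sample, and the BMU and its grid neighbors are pulled toward the sample. This is a minimal generic sketch, not the implementation from refs. [41, 42]; the function name `train_som`, the grid size, and the decay schedules are all illustrative assumptions.

```python
import numpy as np

def train_som(X, rows=5, cols=5, epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Kohonen SOM: a rows x cols grid of neurons, each with a
    weight vector of the input dimension."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(size=(rows, cols, d))
    # Grid coordinates of each neuron, shape (rows, cols, 2).
    grid = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols),
                                 indexing="ij"))
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                # learning-rate decay
        sigma = sigma0 * (1 - t / epochs) + 1e-9   # neighborhood shrinks
        for x in X[rng.permutation(n)]:
            # Best-matching unit: neuron whose weights are closest to x.
            dist = np.linalg.norm(W - x, axis=2)
            bmu = np.unravel_index(dist.argmin(), dist.shape)
            # Gaussian neighborhood on the grid around the BMU.
            g = np.exp(-np.sum((grid - bmu) ** 2, axis=2) / (2 * sigma**2))
            W += lr * g[..., None] * (x - W)
    return W

# Two well-separated clusters should map to different grid regions.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-3, 0.3, (50, 4)), rng.normal(3, 0.3, (50, 4))])
W = train_som(X)
```

After training, similar inputs activate nearby neurons, which is what makes the map useful for the unsupervised exploratory analysis discussed elsewhere on this page.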

Not all neural networks are the same; their connections, elemental functions, training methods and applications may differ in significant ways. The types of elements in a network and the connections between them are referred to as the network architecture. Commonly used elements in artificial neural networks will be presented in Chapter 2. The multilayer perceptron, one of the most commonly used architectures, is described in Chapter 3. Other architectures, such as radial basis function networks and self-organizing maps (SOMs) or Kohonen architectures, will be described in Chapter 4. [Pg.17]

Some historically important artificial neural networks are Hopfield Networks, Perceptron Networks and Adaline Networks, while the best-known are Backpropagation Artificial Neural Networks (BP-ANN), Kohonen Networks (K-ANN, or Self-Organizing Maps, SOM), Radial Basis Function Networks (RBFN), Probabilistic Neural Networks (PNN), Generalized Regression Neural Networks (GRNN), Learning Vector Quantization Networks (LVQ), and Adaptive Bidirectional Associative Memory (ABAM). [Pg.59]

Self-organizing maps (also called SOMs, Kohonen feature maps, or kmaps) are special kinds of artificial neural networks (ANNs) that are able to represent sets of descriptors in a low-dimensional map [114-116], and are increasingly applied for mapping of various molecular data in the fields of analytical chemistry and drug design [89, 117, 118]. [Pg.591]

A specialized method for similarity-based visualization of high-dimensional data is formed by self-organizing feature maps (SOMs). The data items are arranged on a two-dimensional plane with the aid of neural networks, especially Kohonen nets. Similarity between data items is represented by spatial closeness, while large distances indicate major dissimilarities [968]. At the authors' department, a system called MIDAS had already been developed which combines strategies for the creation of feature maps with the supervised generation of fuzzy terms from the maps [967]. [Pg.680]

A type of neural network that has proven successful in a series of applications is based on self-organizing maps (SOMs) or Kohonen neural networks [61]. Whereas most networks are designed for supervised learning tasks (i.e., the relationship between input and output must be known in the form of a mathematical model), Kohonen neural networks are designed primarily for unsupervised learning, where no prior knowledge about this relationship is necessary [62, 63]. [Pg.105]

Kohonen Neural Networks, or self-organizing maps (SOMs), are a type of ANN designed for unsupervised learning, where no prior knowledge about the relationship between inputs and outputs is necessary. [Pg.114]



© 2024 chempedia.info