
Adaptive resonance theory networks

To provide you with a solid basis for deciding whether or not a given ANN is appropriate for your intended use, we briefly describe in this section many of the types of ANNs that have appeared in the literature in the past few years. For each network we focus on strengths and weaknesses, some practical aspects of operation, and a literature review of the chemical applications. We do not delve into detailed mathematical descriptions of networks, since these can be found in any number of texts. In particular, we call attention to Ref. 19, which offers step-by-step developments of equations and detailed numerical examples for backpropagation, bidirectional associative memory, counterpropagation, Hopfield, and Kohonen self-organizing map networks. Adaptive resonance theory networks are reviewed in detail in Ref. 27. [Pg.88]

Artificial neural networks (ANNs) are good at classifying non-linearly separable data. There are at least 30 different types of ANNs, including multilayer perceptrons, radial basis functions, self-organizing maps, adaptive resonance theory networks and time-delay neural networks. Indeed, the majority of ATI applications discussed later employ ANNs - most commonly, MLP (multilayer perceptron), RBF (radial basis function) or SOM (self-organizing map). A detailed treatment of neural networks for ATI is beyond the scope of this chapter, and the reader is referred to the excellent introduction to ANNs in Haykin (1994) and to neural networks applied to pattern recognition in Looney (1997) and Bishop (2000). Classifiers for practical ATI systems are also described in other chapters of this volume. [Pg.90]
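
As a hedged illustration of the claim that ANNs handle non-linearly separable data, the sketch below trains a small multilayer perceptron on a synthetic two-moons data set. It is not taken from any of the works cited above; the scikit-learn API calls are standard, but the data set, layer size and other parameter choices are assumptions made only for this example.

```python
# Minimal sketch: an MLP separating non-linearly separable (two-moons) data.
# Assumes scikit-learn is available; all parameter values are illustrative only.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)
print("test accuracy:", mlp.score(X_test, y_test))  # typically well above 0.9
```

A linear classifier cannot separate the two interleaved half-moons, whereas the hidden layer lets the MLP learn a curved decision boundary.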

Neural networks are nowadays predominantly applied to classification tasks. Here, three kinds of networks are tested. First, the backpropagation network is used, since it is the most robust and most common network. The other two networks considered in this study have architectures specially adapted to classification tasks. The Learning Vector Quantization (LVQ) network consists of a neuronal structure that implements the LVQ learning strategy. The Fuzzy Adaptive Resonance Theory (Fuzzy-ART) network is a sophisticated network with a very complex structure but high performance on classification tasks. Overviews of this extensive subject are given in [2] and [6]. [Pg.463]
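
To make the Fuzzy-ART mechanics concrete, here is a minimal NumPy sketch of the standard Fuzzy-ART steps (complement coding, category choice, vigilance test, fast learning). It is a generic textbook-style sketch, not the implementation used in the study cited above; the function name, parameter values and toy data are assumptions of this example.

```python
import numpy as np

def fuzzy_art(patterns, rho=0.75, alpha=0.001, beta=1.0):
    """Cluster rows of `patterns` (values in [0, 1]) with a basic Fuzzy-ART scheme."""
    weights = []                      # one weight vector per committed category
    labels = []
    for a in patterns:
        I = np.concatenate([a, 1.0 - a])          # complement coding
        # choice function T_j = |I ^ w_j| / (alpha + |w_j|), ^ = element-wise min
        order = sorted(range(len(weights)),
                       key=lambda j: -np.minimum(I, weights[j]).sum()
                                      / (alpha + weights[j].sum()))
        chosen = None
        for j in order:
            match = np.minimum(I, weights[j]).sum() / I.sum()
            if match >= rho:                      # vigilance test passed: resonance
                chosen = j
                break
        if chosen is None:                        # no resonance: commit a new category
            weights.append(I.copy())
            chosen = len(weights) - 1
        else:                                     # fast learning when beta = 1
            weights[chosen] = beta * np.minimum(I, weights[chosen]) \
                              + (1.0 - beta) * weights[chosen]
        labels.append(chosen)
    return labels, weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = np.vstack([rng.uniform(0.0, 0.3, (20, 2)),
                      rng.uniform(0.7, 1.0, (20, 2))])
    labels, w = fuzzy_art(data, rho=0.6)
    print("categories found:", len(w))   # two well-separated clusters -> roughly 2
```

The vigilance parameter rho controls how finely the input space is split: larger values force tighter categories and more of them.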

D. Wienke and L.M.C. Buydens, Adaptive resonance theory neural networks — the ART of real-time pattern recognition in chemical process monitoring, Trends Anal. Chem., 99 (1995) 1-8. [Pg.699]

Neural network learning algorithms: BP = Back-Propagation; Delta = Delta Rule; QP = Quick-Propagation; RP = Rprop; ART = Adaptive Resonance Theory; CP = Counter-Propagation. [Pg.104]

Wienke, D., van den Broek, W., Melssen, W., Buydens, L., Feldhoff, R., Kantimm, T., Huth-Fehre, T., Quick, L., Winter, F. and Cammann, K. (1995) Comparison of an adaptive resonance theory-based neural network (ART-2A) against other classifiers for rapid sorting of post-consumer plastics by remote near-infrared spectroscopic sensing using an InGaAs diode array. Analytica Chimica Acta 317, 1-16. [Pg.75]

Two-layer feedforward/feedback ANNs are heteroassociative. They can store input and output vectors and are useful in recalling an output vector when presented with a noisy or incomplete version of its corresponding input vector. They are also useful for classification problems. Typically, every feedforward connection between two PEs is accompanied by a feedback connection between the same two PEs. Both connections have weights, and these weights are usually different from each other. Examples are the adaptive resonance theory and bidirectional associative memory networks. [Pg.86]
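
As an illustration of the heteroassociative recall described above, the following is a minimal sketch of a bidirectional associative memory with Hebbian (outer-product) weights. It is a generic textbook construction, not code from the cited sources; the function names, pattern sizes and example data are assumptions of this sketch.

```python
import numpy as np

def train_bam(x_patterns, y_patterns):
    """Build a BAM weight matrix from bipolar (+1/-1) pattern pairs via outer products."""
    return sum(np.outer(x, y) for x, y in zip(x_patterns, y_patterns))

def recall(W, x, steps=10):
    """Recall the stored y for a (possibly noisy) x by iterating forward/backward passes."""
    sign = lambda v: np.where(v >= 0, 1, -1)
    for _ in range(steps):
        y = sign(x @ W)       # feedforward pass through the weights
        x = sign(W @ y)       # feedback pass through the same weights
    return x, y

if __name__ == "__main__":
    X = [np.array([ 1, -1,  1, -1]), np.array([-1, -1,  1,  1])]
    Y = [np.array([ 1,  1, -1]),     np.array([-1,  1,  1])]
    W = train_bam(X, Y)
    noisy = np.array([1, 1, 1, -1])          # X[0] with one flipped bit
    x_rec, y_rec = recall(W, noisy)
    print("recalled y:", y_rec)              # matches Y[0] for these stored patterns
```

Note that, unlike the networks described in the excerpt, this toy BAM uses a single weight matrix traversed in both directions; in the general case the feedforward and feedback connections carry different weights.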

D. Wienke and L. Buydens, Trends Anal. Chem., 14, 398 (1995). Adaptive Resonance Theory Based Neural Networks—the ART of Real-Time Pattern Recognition in Chemical Process Monitoring. [Pg.129]

Wienke, D., et al., Comparison of an Adaptive Resonance Theory Based Neural Network (ART-2a) against Other Classifiers for Rapid Sorting of Post-Consumer Plastics by Remote Near-Infrared Spectroscopic Sensing Using an InGaAs Diode Array. Anal. Chim. Acta, 1995, 317: 1-16. [Pg.564]

Domine and co-workers utilized the family of Adaptive Resonance Theory (ART and ART 2-A) based artificial neural networks for unsupervised and supervised pattern recognition (142,143). The simplest ART network is a vec-... [Pg.352]
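
The excerpt above breaks off, but since it names ART 2-A, a much-simplified ART-2A-style sketch may help: inputs are normalized, the best-matching prototype is chosen by a dot product, a vigilance test decides between resonance and committing a new node, and a convex update adapts the winning prototype. This omits details of the published ART 2-A algorithm (for example the uncommitted-node choice term), and all names, parameter values and toy data are assumptions of this illustration.

```python
import numpy as np

def art2a_like(patterns, rho=0.9, eta=0.1):
    """Very simplified ART-2A-style clustering of non-negative row vectors."""
    unit = lambda v: v / (np.linalg.norm(v) + 1e-12)
    prototypes, labels = [], []
    for p in patterns:
        x = unit(np.asarray(p, dtype=float))
        if prototypes:
            scores = [float(x @ w) for w in prototypes]       # cosine-like match
            j = int(np.argmax(scores))
            if scores[j] >= rho:                              # resonance: adapt prototype
                prototypes[j] = unit((1.0 - eta) * prototypes[j] + eta * x)
                labels.append(j)
                continue
        prototypes.append(x)                                  # otherwise commit a new node
        labels.append(len(prototypes) - 1)
    return labels, prototypes

if __name__ == "__main__":
    spectra = np.array([[1.0, 0.1, 0.0],
                        [0.9, 0.2, 0.0],
                        [0.0, 0.1, 1.0],
                        [0.1, 0.0, 0.9]])
    labels, protos = art2a_like(spectra, rho=0.9)
    print(labels)          # the two pairs of similar rows share a label each
```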

More than 50 different types of neural network exist. Certain networks are more efficient for optimization; others perform better in data modeling, and so forth. According to Basheer (2000), the most popular neural networks today are the Hopfield networks, the Adaptive Resonance Theory (ART) networks, the Kohonen networks, the counterpropagation networks, the Radial Basis Function (RBF) networks, the backpropagation networks and recurrent networks. [Pg.361]

Figure 5.1. Main data mining approaches and supporting technologies (abbreviations: ILP — inductive logic programming, MLPs — multilayer perceptrons, RBF — radial basis function networks, ARTMAP — adaptive resonance theory mapping networks, DOOT — distributed object-oriented technologies, CORBA — common object request broker, RMI — remote method invocation).

A second variation on this theme was used in a processor based on the adaptive resonance theory [74]. This system was essentially an SVMM with a smart pixel array as the central weighting mask. The functionality of the smart pixel was basically the same as that for the pixel in Fig. 62. The only difference was that a pre-determined weight mask was loaded and stored on the smart pixel array. The array was then turned on and the photodetectors were illuminated with the input light. If the light was present at the photodetector and the corresponding pixel mask was selected, then the pixel modulator was turned off. This functionality provides a powerful technique for fast processing and learning in a neural network. [Pg.846]
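
The pixel rule described above (the modulator is turned off only when input light and a selected mask bit coincide) can be emulated in a few lines. The sketch below is purely an illustration of that logic, not a model of the optical hardware in Ref. 74; the array shapes and variable names are assumptions.

```python
import numpy as np

# Emulate the smart-pixel rule: a modulator stays on (transmits) unless its
# photodetector sees input light AND its stored mask bit is selected.
rng = np.random.default_rng(1)
mask = rng.integers(0, 2, size=(4, 4), dtype=bool)       # pre-loaded weight mask
light_in = rng.integers(0, 2, size=(4, 4), dtype=bool)   # input illumination pattern

modulator_on = ~(light_in & mask)   # off only where input light and mask coincide
print(modulator_on.astype(int))
```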

Now that all levels of the structural hierarchy within a fat crystal network are quantifiable (to various extents), as are the amounts of solid fat within the network (mainly by pulsed nuclear magnetic resonance), it is important to relate these quantifiable parameters to rheological indicators such as the shear elastic modulus. One model relating the microstructure to the shear elastic modulus was developed in colloidal physics by Shih et al. [57]. A brief chronology of the adaptation of this theory to the study of fat networks follows. [Pg.81]
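
For orientation, the scaling relations usually attributed to Shih et al. [57] take the form below. This is quoted from the general colloidal-gel literature only as a reminder of the model's structure; the exact exponents and regime definitions should be checked against the original reference.

```latex
% Scaling of the shear elastic modulus G' with solids volume fraction \Phi
% for a fractal particle network (d = Euclidean dimension, D = fractal
% dimension of the flocs, x = backbone fractal dimension; d = 3 for fats).
G' \propto
\begin{cases}
\Phi^{(d + x)/(d - D)} & \text{strong-link regime (low } \Phi\text{)} \\[4pt]
\Phi^{(d - 2)/(d - D)} & \text{weak-link regime (high } \Phi\text{)}
\end{cases}
```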

