
Multi-layer network

An ANN is a mathematical model that simulates many characteristics of actual neurons in the brain. Generally, an ANN is a structurally multi-layered network that links a large number of nodes (neuron-like computational elements) and operates dynamically. Although mathematical neurons were conceived as early as 1943, only recently have large-scale real-world applications become practical. [Pg.65]

The most common ANN topology is the feed-forward multi-layer network, in which neurons are arranged in layers from input to output, with interconnecting weights between adjacent layers. Each neuron within the network processes one or more incoming inputs and produces an output (determined by its activation function) for the next layer. At each... [Pg.242]
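To make the per-neuron computation concrete, here is a minimal sketch of one neuron forming a weighted sum of its inputs and passing it through an activation function; the logistic activation and all numbers are illustrative assumptions, not taken from the cited source:

```python
import numpy as np

def neuron_output(inputs, weights, bias):
    """One neuron: weighted sum of incoming signals, squashed by an
    activation function (a logistic sigmoid is assumed here)."""
    net = np.dot(weights, inputs) + bias      # weighted sum of inputs
    return 1.0 / (1.0 + np.exp(-net))         # activation function

# Three signals arriving from the previous layer
x = np.array([0.5, -1.2, 0.3])
w = np.array([0.4, 0.1, -0.7])                # interconnecting weights
print(neuron_output(x, w, bias=0.2))          # output fed to the next layer
```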

We have presented a neural-network-based spectrum classifier (NSC) aimed at ultrasonic resonance spectroscopy. Ultrasonic spectroscopy and the NSC have been evaluated in many industrial applications, such as concrete inspection, testing of aerospace composite structures, ball bearings, and aircraft multi-layer structures. The latter application has been presented in some detail. [Pg.111]

The Back-Propagation Algorithm (BPA) is a supervised learning method for training ANNs and one of the most common training techniques. It uses a gradient-descent optimization method, also referred to as the delta rule when applied to feedforward networks. A feedforward network trained with the delta rule is called a Multi-Layer Perceptron (MLP). [Pg.351]
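As a rough illustration of gradient-descent training with the delta rule, the sketch below trains a one-hidden-layer MLP on the XOR toy problem; the layer sizes, learning rate and epoch count are arbitrary choices for the example, not values from the cited source:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, a classic problem a single-layer perceptron cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 units and one output unit
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

eta = 0.5                                        # learning rate (gradient step)
for epoch in range(20000):
    # Forward pass through the network
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)

    # Backward pass: "delta" error terms, propagated layer by layer
    delta_out = (y - t) * y * (1 - y)            # output-layer delta
    delta_hid = (delta_out @ W2.T) * h * (1 - h) # back-propagated delta

    # Gradient-descent weight updates (the generalized delta rule)
    W2 -= eta * h.T @ delta_out; b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * X.T @ delta_hid; b1 -= eta * delta_hid.sum(axis=0)

print(np.round(y, 2))   # should approach [[0], [1], [1], [0]] after training
```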

Chapter 10 covers another important field with a great overlap with CA: neural networks. Beginning with a short historical survey of what is really an independent field, chapter 10 discusses the Hopfield model, stochastic nets, Boltzmann machines, and multi-layered perceptrons. [Pg.19]

Hornik, K., Stinchcombe, M., and White, H., Multi-layer feedforward networks are universal approximators. Neural Networks, 2 (1989) 359-366. [Pg.204]

Many different types of networks have been developed. They all consist of small interconnected units, neurons, whose local behaviour determines the overall behaviour of the network. The most common is the multi-layer feed-forward (MLF) network. Recently, other networks such as the Kohonen, radial basis function and ART networks have raised interest in the chemical application area. In this chapter we focus on MLF networks. The principles of some of the other networks are explained, and we also discuss how these networks relate to other algorithms described elsewhere in this book. [Pg.649]

D. Svozil, Introduction to multi-layer feed-forward neural networks. Chemom. Intell. Lab. Syst., 39 (1997) 43-62. [Pg.695]

W.J. Melssen and L.M.C. Buydens, Aspects of multi-layer feed-forward neural networks influencing the quality of the fit of univariate non-linear relationships. Anal. Proc., 32 (1995) 53-56. [Pg.696]

Gardner, J.W., Craven, M., Dow, C. and Hines, E.L. (1998). The prediction of bacteria type and culture growth phase by an electronic nose with a multi-layer perceptron network. Meas. Sci. Technol., 9, 120-127. [Pg.354]

Derks et al. [70] employed ANNs to cancel out noise in ICP. The results of neural networks (an Adaline network and a multi-layer feed-forward network) were compared with the more conventional Kalman filter. [Pg.272]
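The Adaline mentioned above is a single linear neuron trained with the Widrow-Hoff (LMS) rule. The sketch below applies one as an adaptive filter to a synthetic noisy signal; the data are invented for illustration and are not the ICP data of Derks et al.:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a noisy instrument signal (illustrative only)
n = 2000
clean = np.sin(np.linspace(0, 20 * np.pi, n))    # underlying signal
noisy = clean + 0.5 * rng.standard_normal(n)     # added white noise

# Adaline as an adaptive filter: predict the current sample from a short
# window of past samples, updating the weights with the LMS rule.
taps, mu = 8, 0.01                               # filter length, step size
w = np.zeros(taps)
filtered = np.zeros(n)
for k in range(taps, n):
    x = noisy[k - taps:k]            # input window
    y = w @ x                        # Adaline output (linear unit)
    e = noisy[k] - y                 # prediction error
    w += mu * e * x                  # LMS (Widrow-Hoff delta rule) update
    filtered[k] = y

# The predictable (signal) part survives; the unpredictable noise is reduced
print("noisy   RMSE:", np.sqrt(np.mean((noisy[taps:] - clean[taps:]) ** 2)))
print("Adaline RMSE:", np.sqrt(np.mean((filtered[taps:] - clean[taps:]) ** 2)))
```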

Zhang et al. [78] analysed the metal contents (Fe, Ca, Mg, Cr, Cu, P, Zn and Sr) of serum samples by ICP-AES to diagnose cancer. BAM was compared with multi-layer feed-forward neural networks (error back-propagation). The BAM method was validated with independent prediction samples using cross-validation. The best results were obtained with the BAM networks. [Pg.273]

J.R.M. Smits, W.J. Melssen, L.M.C. Buydens and G. Kateman, Using artificial neural networks for solving chemical problems. Part I. Multi-layer feed-forward networks. Chemom. Intell. Lab. Syst., 22 (1994) 165-189. [Pg.276]

KNN) [13,14] and potential function methods (PFMs) [15,16]. Modeling methods establish volumes in the pattern space with different bounds for each class. The bounds can be based on correlation coefficients, distances (e.g. the Euclidean distance in the Pattern Recognition by Independent Multicategory Analysis [PRIMA] method [17] or the Mahalanobis distance in the Unequal [UNEQ] method [18]), the residual variance [19,20] or supervised artificial neural networks (e.g. the Multi-layer Perceptron [21]). [Pg.367]
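A minimal sketch of a UNEQ-style one-class model based on the Mahalanobis distance follows; the two-feature data and the distance threshold are hypothetical choices for illustration:

```python
import numpy as np

def mahalanobis_class_model(train, threshold):
    """UNEQ-style class model: class centre plus covariance, with
    membership decided by the squared Mahalanobis distance (sketch only)."""
    centre = train.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(train, rowvar=False))
    def belongs(x):
        d = x - centre
        d2 = float(d @ cov_inv @ d)       # squared Mahalanobis distance
        return d2 <= threshold, d2
    return belongs

# Hypothetical two-feature training data for one class
rng = np.random.default_rng(2)
class_a = rng.normal(loc=[5.0, 1.0], scale=[0.3, 0.1], size=(50, 2))
in_class = mahalanobis_class_model(class_a, threshold=9.0)  # ~3-sigma bound

print(in_class(np.array([5.1, 1.05])))   # inside the class volume
print(in_class(np.array([7.0, 2.0])))    # outside the class volume
```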

Figure 12.1. Multi-layered Feed Forward Neural Network Architecture.
Neural network architectures: 2L/FF = two-layer, feed-forward network (i.e., perceptron); 3L or 4L/FF = three- or four-layer, feed-forward network (i.e., multi-layer perceptron). [Pg.104]

Koene, R.A. and Takane, Y. (1999). Discriminant component pruning: regularization and interpretation of multi-layered back-propagation networks. Neural Comput., 11, 783-802. [Pg.150]

As a last resort it is possible to apply neural networks (NN). In principle, NN can model surfaces of any complexity. However, the number of experiments required is large. This, together with the fact that NN is a rather specialised technique, explains why the number of applications in the literature is limited. Examples are to be found in refs. [70-72]. In the latter application two variables (pH and modifier content) were investigated for four chlorophenols, and the authors found that when 15 to 20 experiments are carried out, better results are obtained with a multi-layer feed-forward NN than with quadratic or third-order models. Although we believe that for the optimization of separations NN will prove practical in only a few cases, it seems useful to explain the first principles of the methodology here. A simple network is shown in Fig. 6.25. [Pg.208]
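As a rough illustration of fitting a two-variable response surface with a multi-layer feed-forward NN from a small number of experiments, the sketch below uses scikit-learn's MLPRegressor on invented pH/modifier data; the response function and all settings are assumptions standing in for measured data, not the chlorophenol study itself:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical 2-factor design: ~20 experiments varying pH and modifier
# content. The response function below is invented purely to stand in
# for measured chromatographic data.
rng = np.random.default_rng(3)
pH = rng.uniform(3, 8, 20)
modifier = rng.uniform(10, 50, 20)               # % organic modifier
response = np.exp(-(pH - 5) ** 2) * (60 - modifier) + rng.normal(0, 0.5, 20)

X = np.column_stack([pH, modifier])
net = MLPRegressor(hidden_layer_sizes=(5,), activation="tanh",
                   solver="lbfgs", max_iter=5000, random_state=0)
net.fit(X, response)

# Predict the response surface at an untried factor combination
print(net.predict([[6.0, 25.0]]))
```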

Multi-layer feedforward networks contain an input layer connected to one or more layers of hidden neurons (hidden units) and an output layer (Figure 3.5(b)). The hidden units internally transform the data representation to extract higher-order statistics. The input signals are applied to the neurons in the first hidden layer, the output signals of that layer are used as inputs to the next layer, and so on for the rest of the network. The output signals of the neurons in the output layer reflect the overall response of the network to the activation pattern supplied by the source nodes in the input layer. This type of network is especially useful for pattern association (i.e., mapping input vectors to output vectors). [Pg.62]
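The layer-by-layer signal flow described above can be sketched as follows; the layer sizes, random weights and logistic activation are arbitrary illustrative choices:

```python
import numpy as np

def forward(x, layers):
    """Propagate an input vector layer by layer: the output signals of
    each layer become the input signals of the next (sketch only)."""
    for W, b in layers:
        x = 1.0 / (1.0 + np.exp(-(W @ x + b)))   # logistic activation
    return x

rng = np.random.default_rng(4)
sizes = [3, 5, 4, 2]        # input layer, two hidden layers, output layer
layers = [(rng.normal(size=(m, n)), np.zeros(m))
          for n, m in zip(sizes[:-1], sizes[1:])]

# The final vector is the network's overall response to the input pattern
print(forward(np.array([0.2, -0.4, 0.9]), layers))
```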

