Big Chemical Encyclopedia

Perceptrons multi-layered

In the Neural Spectrum Classifier (NSC), a multi-layer perceptron (MLP) is used to classify spectra. Although the MLP can perform feature extraction itself, an optional preprocessor was included for this purpose (see Figure 1). [Pg.106]
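The NSC's preprocessor is not specified in this excerpt; as a minimal sketch of the layout, with PCA assumed as a stand-in feature extractor and synthetic spectra, the arrangement might look like this:

```python
# Hypothetical analogue of the NSC layout: an optional feature-extraction
# step in front of an MLP classifier. PCA and the synthetic spectra are
# placeholders, not the classifier described in the source.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 512))         # 200 synthetic "spectra", 512 channels
y = rng.integers(0, 3, size=200)        # 3 arbitrary spectral classes

clf = Pipeline([
    ("extract", PCA(n_components=20)),  # the optional preprocessing stage
    ("mlp", MLPClassifier(hidden_layer_sizes=(30,), max_iter=500)),
])
clf.fit(X, y)
```

Dropping the "extract" stage leaves the MLP to work on the raw channels, mirroring the text's point that the preprocessor is optional.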

The Back-Propagation Algorithm (BPA) is a supervised learning method for training ANNs, and is one of the most common training techniques. It uses a gradient-descent optimization method, also referred to as the delta rule when applied to feedforward networks. A feedforward network trained with the delta rule is called a Multi-Layer Perceptron (MLP). [Pg.351]
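As a minimal sketch of back-propagation (delta-rule) training, not taken from the source: sigmoid units, squared error, the learning rate, and the toy XOR task are all illustrative choices.

```python
# One-hidden-layer feedforward network trained by gradient descent
# (back-propagation). All sizes and values here are illustrative.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)    # XOR targets

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)      # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)      # hidden -> output
lr = 0.5

for _ in range(20_000):
    h = sigmoid(X @ W1 + b1)                       # forward pass
    y = sigmoid(h @ W2 + b2)
    d2 = (y - t) * y * (1 - y)                     # output-layer delta
    d1 = (d2 @ W2.T) * h * (1 - h)                 # delta propagated back
    W2 -= lr * (h.T @ d2); b2 -= lr * d2.sum(0)    # gradient-descent updates
    W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(0)

print(np.round(y).ravel())                         # typically [0. 1. 1. 0.]
```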

Chapter 10 covers another important field with great overlap with CA: neural networks. Beginning with a short historical survey of what is really an independent field, chapter 10 discusses the Hopfield model, stochastic nets, Boltzmann machines, and multi-layered perceptrons. [Pg.19]

Being able to construct an explicit solution to a nonlinearly separable problem such as the XOR problem by using a multi-layer variant of the simple perceptron does not, of course, guarantee that a multi-layer perceptron can by itself learn the XOR function. We need to find a learning rule that works not just for information that propagates from an input layer to an output layer, but for information that propagates through an arbitrary number of hidden layers as well. [Pg.538]
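To make the contrast concrete, here is one such explicit (hand-constructed, not learned) solution, using a conventional choice of weights assumed here rather than taken from the source: the hidden units compute OR and NAND of the inputs, and the output unit ANDs them together.

```python
# Explicit two-layer perceptron for XOR with McCulloch-Pitts step units.
# The weight values are one standard construction, assumed here.
import numpy as np

def step(z):
    return (z >= 0).astype(int)

W1 = np.array([[1.0, -1.0],     # column 0: OR unit, column 1: NAND unit
               [1.0, -1.0]])
b1 = np.array([-1.0, 1.5])
W2 = np.array([[1.0], [1.0]])   # output unit: AND of the two hidden units
b2 = np.array([-2.0])

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
h = step(X @ W1 + b1)           # hidden layer: [OR(x1,x2), NAND(x1,x2)]
y = step(h @ W2 + b2)
print(y.ravel())                # [0 1 1 0] -- the XOR truth table
```

Writing these weights down by hand is easy; the learning rule discussed in the text is what lets a network find such weights on its own.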

As we will discuss a bit later in this section, adding hidden layers while replacing the McCulloch-Pitts step-function threshold with a linear activation yields a multi-layer perceptron that is fundamentally no more powerful than a simple perceptron with no hidden layers and the McCulloch-Pitts step-function threshold. [Pg.539]
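A worked step, added here for clarity: with linear activations every layer is an affine map, and the composition of two affine maps is again affine,

$$ y = W_2 (W_1 x + b_1) + b_2 = (W_2 W_1)\,x + (W_2 b_1 + b_2), $$

so a linear network of any depth computes exactly what some single layer can; it is the nonlinear threshold that makes hidden layers add power.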

Fig. 10.12 A schematic representation of the multi-layer perceptron model.
Just as was the case with simple perceptrons, the multi-layer perceptron's fundamental problem is to learn to associate given inputs with desired outputs. The input layer consists of as many neurons as are necessary to set up some natural... [Pg.540]

Multi-layered Perceptrons. As might be expected, additional layers of neurons make the analysis of the pattern capacity of multi-layered perceptrons more difficult. [Pg.550]

The three input parameters to the multi-layer perceptron are the atmospheric pressure and the mole fractions of the liquid (X1) and vapor (Y1) phases. The output parameter is the boiling temperature. [Pg.252]
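As a minimal sketch of this regression set-up (the network size, scaling, and data points below are invented placeholders, not the chapter's model):

```python
# MLP regression: (pressure, liquid mole fraction X1, vapor mole fraction Y1)
# in, boiling temperature out. All data values are fabricated for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

X = np.array([[101.3, 0.10, 0.25],     # columns: P (kPa), X1, Y1
              [101.3, 0.50, 0.70],
              [101.3, 0.90, 0.95]])
T = np.array([368.0, 355.0, 342.0])    # boiling temperatures (K), made up

model = make_pipeline(
    StandardScaler(),                  # scale inputs before the MLP
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
model.fit(X, T)
print(model.predict([[101.3, 0.30, 0.50]]))
```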

Gardner, J.W., Craven, M., Dow, C., Hines, E.L. (1998) The prediction of bacteria type and culture growth phase by an electronic nose with a multi-layer perceptron network. Meas. Sci. Technol. 9: 120–127. [Pg.354]

Alvarez et al. [73] compared the performance of LDA and ANNs to classify different classes of wines. Metal concentrations (Ca, Mg, Sr, Ba, K, Na, P, Fe, Al, Mn, Cu and Zn) were selected as chemical descriptors for discrimination because of their correlation with soil nature, geographical origin and grape variety. Both LDA and ANNs led to a perfect separation of classes, especially when multi-layer perceptron nets were trained by error back-propagation. [Pg.273]
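A sketch of that workflow, with synthetic placeholder data standing in for the metal concentrations (nothing below reproduces the paper's data or results):

```python
# Same feature matrix fed to LDA and to a back-propagation-trained MLP,
# compared by cross-validation. Data and class labels are synthetic.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 12))          # 12 metal concentrations per sample
y = rng.integers(0, 3, size=60)        # 3 arbitrary wine classes

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("MLP", MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000))]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```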

Neural network architectures: 2L/FF = two-layer, feed-forward network (i.e., perceptron); 3L or 4L/FF = three- or four-layer, feed-forward network (i.e., multi-layer perceptron). [Pg.104]

Cigizoglu, H.K. Estimation and forecasting of daily suspended sediment data by multi-layer perceptrons. Adv. Water Resour. 27 (2004): 185–195. [Pg.430]

The WNN is based on the similarity between the inverse WT, Strömberg's equation (Eq. 9.19), and a hidden layer in the Multi-Layer Perceptron (MLP) network structure (Meyer 1993). In fact, the wavelet decomposition can be seen as a neural network model, where the wavelets are indexed by i = ... instead of the double... [Pg.156]
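The standard wavelet-network form (given here as a sketch, not quoted from the source) writes the hidden layer as a sum of dilated and translated copies of a mother wavelet ψ in place of sigmoidal units:

$$ f(x) = \sum_{i=1}^{N} w_i\, \psi\!\left(\frac{x - b_i}{a_i}\right), $$

where the translations b_i and dilations a_i play the role of the hidden units' weights and biases, and the w_i are the output-layer weights.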

Last but not least, the main goal of the present work is feature search in spontaneous speech and in emotional response, aiming at pre-clinical evaluation in order to define tests for AD diagnosis. These features will define the control group (CR) and the three AD levels (ES, IS and AS). A secondary goal is the optimization of computational cost, with the aim of making these techniques useful for real-time applications in real environments; automatic classification is therefore modeled with this in mind. We have used a Multi-Layer Perceptron (MLP) with the number of neurons in the hidden layer (NNHL) set to max(AttributeNumber, ClassesNumber) and the training step (TS) set to NNHL × 10. [Pg.276]
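A sketch of the stated configuration, assuming the heuristic means NNHL = max(attribute count, class count) and TS = NNHL × 10; the feature matrix and labels below are placeholders.

```python
# MLP classifier sized by the stated heuristic. The feature count, class
# count, and data are assumptions made for illustration only.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 24))              # 24 speech/emotion features
y = rng.integers(0, 4, size=120)            # CR plus the three AD levels

nnhl = max(X.shape[1], len(np.unique(y)))   # NNHL = max(attributes, classes)
clf = MLPClassifier(hidden_layer_sizes=(nnhl,), max_iter=nnhl * 10)
clf.fit(X, y)
```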

Fig. 1. Multi-layered perceptron neural network with one hidden layer.
