Big Chemical Encyclopedia


Neural feed-forward

All exocrine and endocrine glands have a considerable nerve supply. Well known is Pavlov's conditioned reflex [13], in which digestive juices can be secreted in response to, for example, a sound. For many relatively fast-reacting hormones - including insulin - neural feed-forward control also plays a considerable role. [Pg.150]

The application of neural feed-forward networks has proved effective in protein 2D structure prediction (see Chapter 6 of Volume I; Rost et al., ... [Pg.430]

Let us start with a classic example. We had a dataset of 31 steroids. The spatial autocorrelation vector (more about autocorrelation vectors can be found in Chapter 8) served as the set of molecular descriptors. The task was to model the corticosteroid-binding globulin (CBG) affinity of the steroids. A feed-forward multilayer neural network trained with the back-propagation learning rule was employed as the learning method. The dataset itself was available in electronic form. More details can be found in Ref. [2]. [Pg.206]
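The kind of training described above can be sketched as follows. This is a minimal illustration, not the cited model: the synthetic data, layer sizes and learning rate are assumptions made purely for demonstration.

```python
import numpy as np

# Minimal sketch of a feed-forward network with one hidden layer, trained by
# back-propagation on a toy regression task (illustrative data, not steroids).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))                                 # 50 samples, 4 descriptors
y = np.tanh(X @ np.array([0.5, -0.3, 0.8, 0.1]))[:, None]    # synthetic target

W1 = rng.normal(scale=0.5, size=(4, 6))  # input -> hidden weights
b1 = np.zeros(6)
W2 = rng.normal(scale=0.5, size=(6, 1))  # hidden -> output weights
b2 = np.zeros(1)
eta = 0.1                                # learning rate

def forward(X):
    H = np.tanh(X @ W1 + b1)             # hidden-layer activations
    return H, H @ W2 + b2                # linear output unit

losses = []
for _ in range(500):
    H, out = forward(X)
    err = out - y
    losses.append(float(np.mean(err ** 2)))
    # Back-propagate the error signal layer by layer
    dW2 = H.T @ err / len(X)
    db2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)     # derivative of tanh
    dW1 = X.T @ dH / len(X)
    db1 = dH.mean(axis=0)
    W2 -= eta * dW2; b2 -= eta * db2
    W1 -= eta * dW1; b1 -= eta * db1
```

The mean squared error recorded in `losses` decreases as the weights are adjusted against the gradient, which is the essence of the back-propagation learning rule.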

Now, one may ask, what if we are going to use feed-forward neural networks with the back-propagation learning rule? Then, obviously, SVD can be used as a data transformation technique. PCA and SVD are often used as synonyms. Below we shall use PCA in the classical context and SVD when it is applied to the data matrix before training any neural network, e.g., Kohonen's self-organizing maps or counter-propagation neural networks. [Pg.217]
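Applying SVD to the data matrix before network training can be sketched as below; the data and the number of retained components are arbitrary assumptions for illustration.

```python
import numpy as np

# Illustrative sketch: SVD of a mean-centred data matrix, keeping the leading
# components as reduced inputs for a subsequent neural network.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 10))            # 30 samples, 10 raw descriptors
Xc = X - X.mean(axis=0)                  # centre columns, as in PCA

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                    # retain the 3 largest components
scores = Xc @ Vt[:k].T                   # projected data fed to the network

# Sanity check: the scores equal the left singular vectors scaled by s
assert np.allclose(scores, U[:, :k] * s[:k])
```

The `scores` matrix (30 x 3) would then replace the raw 30 x 10 data matrix as the network's input, reducing dimensionality while retaining most of the variance.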

The structure of a neural network forms the basis for information storage and governs the learning process. The type of neural network used in this work is known as a feed-forward network: the information flows only in the forward direction, i.e., from input to output in the testing mode. A general structure of a feed-forward network is shown in Fig. 1. Connections are made ... [Pg.2]

Figure 1 A general structure of a feed-forward neural network.
Neural networks can be broadly classified by their architecture as feed-forward and feed-back networks, as shown in Fig. 3. In brief, if a neuron's output is never dependent on the output of subsequent neurons, the network is said to be feed-forward. Input signals go only one way, and the outputs depend only on the signals coming in from other neurons; thus, there are no loops in the system. When dealing with the various types of ANNs, two primary aspects, namely the architecture and the types of computations to be per-... [Pg.4]

Figure 3 Feed-back and feed-forward artificial neural networks.
The second main category of neural networks is the feed-forward type. In this type of network, the signals go in only one direction; there are no loops in the system, as shown in Fig. 3. The earliest neural network models were linear feed-forward. In 1972, two simultaneous articles independently proposed the same model for an associative memory, the linear associator: J. A. Anderson [17], a neurophysiologist, and Teuvo Kohonen [18], an electrical engineer, were unaware of each other's work. Today, the most commonly used neural networks are nonlinear feed-forward models. [Pg.4]

Figure 20 Feed-forward neural network training and testing results with back-propagation training for solvent activity predictions in polar binaries (with learning parameter η = 0.1).
The basic component of the neural network is the neuron, a simple mathematical processing unit that takes one or more inputs and produces an output. For each neuron, every input has an associated weight that defines its relative importance, and the neuron simply computes the weighted sum of all its inputs. This sum is then modified by means of a transformation function (sometimes called a transfer or activation function) before being forwarded to another neuron. This simple processing unit is known as a perceptron, a feed-forward system in which the transfer of data is in the forward direction, from inputs to outputs, only. [Pg.688]
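A single neuron of this kind can be sketched in a few lines; the weight and bias values below are made up for illustration, and tanh stands in for whichever transfer function a given network uses.

```python
import math

# One neuron: weighted sum of the inputs plus a bias, passed through a
# transfer (activation) function - here tanh, an illustrative choice.
def neuron(inputs, weights, bias, transfer=math.tanh):
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return transfer(s)

# Weighted sum: 0.5*0.4 + (-1.0)*0.3 + 2.0*(-0.2) + 0.1 = -0.4
out = neuron([0.5, -1.0, 2.0], [0.4, 0.3, -0.2], bias=0.1)  # tanh(-0.4) ≈ -0.38
```

Chaining such units layer by layer, each feeding its output forward to the next layer, gives exactly the feed-forward architecture described above.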

D. Svozil, Introduction to multi-layer feed-forward neural networks. Chemom. Intell. Lab. Syst., 39 (1997) 43-62. [Pg.695]

W.J. Melssen and L.M.C. Buydens, Aspects of multi-layer feed-forward neural networks influencing the quality of the fit of univariate non-linear relationships. Anal. Proc., 32 (1995) 53-56. [Pg.696]

Artificial neural networks (ANNs) are computing tools made up of simple, interconnected processing elements called neurons, arranged in layers. The feed-forward network consists of an input layer, one or more hidden layers, and an output layer. ANNs are well suited to assimilating knowledge about complex processes when properly trained on input-output patterns from the process. [Pg.36]

Multilayer feed-forward neural networks (MLF) represent the type of ANNs most widely applied to electronic tongue data. Their scheme is shown in Fig. 2.17. [Pg.91]

FIGURE 2.17 Scheme of multilayer feed-forward neural networks. [Pg.92]

An ANN is a set of interconnected neurons (also termed nodes, cells, units or processing elements) distributed in a specific arrangement, usually termed an architecture. In general, neurons are organised in layers (see Figure 5.3). The most common neural nets, the feed-forward nets, are fully connected, i.e. each node is connected to all the nodes in the next layer. The information we want to enter in... [Pg.248]

Neural networks in which information flows from the input to the output layer are frequently termed feed-forward ANNs. They are by far the most often employed type in Analytical Chemistry and are considered here by default, so for brevity this term will not be repeated. [Pg.249]

Some of the pioneering studies published by several reputed authors in the chemometrics field [55] employed Kohonen neural networks to diagnose calibration problems related to the use of AAS spectral lines. As they focused on classifying potential calibration lines, they used Kohonen neural networks to perform a sort of pattern recognition. In general, Kohonen nets (outlined briefly in Section 5.4.1) are best suited to classification tasks, whereas error back-propagation feed-forward networks (BPNs) are preferred for calibration purposes [56]. [Pg.270]

The recoveries of several rare earth elements in leachates obtained from apatite concentrates were determined by Jorjani et al. [68] using ICP-AES and ICP-MS. A neural network model was used to predict the effects of operational variables on the La, Ce, Y and Nd recoveries in the leaching process. The neural network employed was a feed-forward one. [Pg.272]

Derks et al. [70] employed ANNs to cancel out noise in ICP. The results of neural networks (an Adaline network and a multi-layer feed-forward network) were compared with the more conventional Kalman filter. [Pg.272]

Zhang et al. [78] analysed the metal contents of serum samples by ICP-AES (Fe, Ca, Mg, Cr, Cu, P, Zn and Sr) to diagnose cancer. BAM (bidirectional associative memory) was compared with multi-layer feed-forward neural networks (error back-propagation). The BAM method was validated with independent prediction samples using the cross-validation method. The best results were obtained using BAM networks. [Pg.273]

J. R. M. Smits, W. J. Melssen, L. M. C. Buydens and G. Kateman, Using artificial neural networks for solving chemical problems. Part I: multi-layer feed-forward networks, Chemom. Intell. Lab. Syst., 22(2), 1994, 165-189. [Pg.276]

An artificial neural network (ANN) model was developed to predict the structure of mesoporous materials based on the composition of their synthesis mixtures. The predictive ability of the networks was tested by comparing the mesophase structures predicted by the model with those actually determined by XRD. Among the various ANN models available, three-layer feed-forward neural networks with one hidden layer are known to be universal approximators [11, 12]. The neural network retained in this work is described by the following set of equations, which correlate the network output S (here, the structure of the material) to the input variables U, which represent the normalized composition of the synthesis mixture ... [Pg.872]
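The general form of such a three-layer mapping from inputs U to outputs S can be sketched as follows. The weights, layer sizes and sigmoid transfer function below are illustrative assumptions; the fitted parameters of the cited model are not reproduced here.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Three-layer feed-forward network: input U -> one hidden layer -> output S.
def three_layer(U, W_hidden, b_hidden, W_out, b_out):
    H = sigmoid(U @ W_hidden + b_hidden)   # hidden-layer activations
    return sigmoid(H @ W_out + b_out)      # network output S

rng = np.random.default_rng(2)
U = rng.random(5)                          # e.g. 5 normalized composition variables
S = three_layer(U,
                rng.normal(size=(5, 3)), np.zeros(3),   # 3 hidden units (assumed)
                rng.normal(size=(3, 2)), np.zeros(2))   # 2 output units (assumed)
```

With a sigmoid output layer, each component of S lies in (0, 1), which is convenient when the outputs encode class membership such as a mesophase structure label.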

A series of multielectrode sensors was developed based on Drosophila mutant AChE immobilised via photocrosslinking onto screen-printed carbon electrodes [8]. Four different mutant and wild-type AChEs were evaluated for their sensitivity to the organophosphate paraoxon and the carbamate pesticide carbofuran. The responses of the electrodes in thiocholine before and after a 15-min exposure to solutions of the pesticides were compared. The data were then processed using a feed-forward neural network generated with NEMO 1.15.02 as previously described [8,9]. Networks with the smallest errors were selected and further refined. This approach, together with varying the AChE, led to the construction of a sensor capable of analysing binary pesticide mixtures. [Pg.321]

As a chemometric quantitative modeling technique, ANN stands far apart from all of the regression methods mentioned previously, for several reasons. First of all, the model structure cannot be easily expressed as a simple mathematical formula, but rather requires a map of the network architecture. A simplified example of a feed-forward neural network architecture is shown in Figure 8.17. Such a network basically consists of three layers, each of which represents a set of data values and possibly data-processing instructions. The input layer contains the inputs to the model (I1-I4). [Pg.264]



© 2024 chempedia.info