Big Chemical Encyclopedia


Forward-feed neural network

Now, one may ask: what if we are going to use Feed-Forward Neural Networks with the Back-Propagation learning rule? Then, obviously, SVD can be used as a data transformation technique. PCA and SVD are often used as synonyms. Below we shall use PCA in the classical context and SVD in the case when it is applied to the data matrix before training any neural network, i.e., Kohonen's Self-Organizing Maps or Counter-Propagation Neural Networks. [Pg.217]
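The excerpt above treats SVD purely as a transformation of the data matrix carried out before the network is trained. A minimal sketch of that workflow is shown below; it is not taken from the cited text, and the matrix sizes, the number of retained components, and the use of scikit-learn's MLPClassifier are illustrative assumptions.

```python
# Sketch: compress a data matrix with SVD, then train a feed-forward
# network with back-propagation on the compressed scores.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Toy data matrix with low-rank structure: 100 samples x 50 measured variables
T = rng.normal(size=(100, 3))                    # latent sample scores
P = rng.normal(size=(3, 50))                     # latent variable loadings
X = T @ P + 0.1 * rng.normal(size=(100, 50))     # data matrix with noise
y = (T[:, 0] > 0).astype(int)                    # toy class labels

# Truncated SVD of the mean-centred data matrix: X ~ U_k S_k V_k^T
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                            # retained components
scores = U[:, :k] * s[:k]                        # compressed network inputs

# Feed-forward network trained by back-propagation on the SVD scores
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net.fit(scores, y)
print("training accuracy:", net.score(scores, y))
```

Keeping only the leading singular vectors reduces the number of network inputs and suppresses collinear noise, which is the usual motivation for this kind of pre-compression.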

Figure 1 A general structure of a feed-forward neural network.
Figure 20 Feed-forward neural network training and testing results with back-propagation training for solvent activity predictions in polar binaries (with learning parameter η = 0.1).
D. Svozil, Introduction to multi-layer feed-forward neural networks. Chemom. Intell. Lab. Syst., 39 (1997) 43-62. [Pg.695]

W.J. Melssen and L.M.C. Buydens, Aspects of multi-layer feed-forward neural networks influencing the quality of the fit of univariate non-linear relationships. Anal. Proc., 32 (1995) 53-56. [Pg.696]

Multilayer feed-forward neural networks (MLF) represent the type of ANNs most widely applied to electronic tongue data. Their scheme is shown in Fig. 2.17. [Pg.91]

FIGURE 2.17 Scheme of multilayer feed-forward neural networks. [Pg.92]

Zhang et al. [78] analysed the metal contents of serum samples by ICP-AES (Fe, Ca, Mg, Cr, Cu, P, Zn and Sr) to diagnose cancer. BAM was compared with multi-layer feed-forward neural networks (error back-propagation). The BAM method was validated with independent prediction samples using the cross-validation method. The best results were obtained using BAM networks. [Pg.273]

An artificial neural network (ANN) model was developed to predict the structure of the mesoporous materials based on the composition of their synthesis mixtures. The predictive ability of the networks was tested through comparison of the mesophase structures predicted by the model and those actually determined by XRD. Among the various ANN models available, three-layer feed-forward neural networks with one hidden layer are known to be universal approximators [11, 12]. The neural network retained in this work is described by the following set of equations that correlate the network output S (here, the structure of the material) to the input variables U, which represent the normalized composition of the synthesis mixture ... [Pg.872]
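The set of equations referred to above is not reproduced in the excerpt. For orientation only, a generic three-layer feed-forward network with one hidden layer can be written as below; the notation (weights w and v, biases b and c, transfer functions f and g) is standard textbook usage and not necessarily the exact parameterization used in the cited work.

```latex
% Generic three-layer feed-forward network: inputs U_j, hidden units h_i, outputs S_k
\begin{aligned}
h_i &= f\Bigl(\sum_{j} w_{ij}\,U_j + b_i\Bigr), \qquad f(x) = \frac{1}{1+e^{-x}},\\[4pt]
S_k &= g\Bigl(\sum_{i} v_{ki}\,h_i + c_k\Bigr).
\end{aligned}
```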

A series of multielectrode sensors was developed based on Drosophila mutant AChE immobilised via photocrosslinking onto screen-printed carbon electrodes [8]. Four different mutant and wild-type AChEs were evaluated for their sensitivity to the organophosphate paraoxon and the carbamate pesticide carbofuran. The response of the electrodes in thiocholine before and following a 15-min exposure to solutions of the pesticides was compared. The data were then processed using a feed-forward neural network generated with NEMO 1.15.02 as previously described [8,9]. Networks with the smallest errors were selected and further refined. This approach, together with varying the AChE, led to the construction of a sensor capable of analysing binary pesticide mixtures. [Pg.321]

As a chemometric quantitative modeling technique, ANN stands far apart from all of the regression methods mentioned previously, for several reasons. First of all, the model structure cannot be easily shown using a simple mathematical expression, but rather requires a map of the network architecture. A simplified example of a feed-forward neural network architecture is shown in Figure 8.17. Such a network structure basically consists of three layers, each of which represents a set of data values and possibly data processing instructions. The input layer contains the inputs to the model (I1-I4). [Pg.264]
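To make the three-layer structure concrete, the sketch below implements a small input-hidden-output network with sigmoid units and trains it by back-propagation on the XOR problem. It is a generic illustration rather than the network of Figure 8.17; the layer sizes, learning rate, and number of epochs are arbitrary choices.

```python
# Sketch: a three-layer feed-forward network (input -> one hidden layer -> output)
# trained by batch back-propagation on XOR. All hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # input-layer values
y = np.array([[0], [1], [1], [0]], dtype=float)              # target outputs

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights and biases: input(2) -> hidden(4) -> output(1)
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)
lr = 1.0

for epoch in range(10000):
    # Forward pass through the layers
    h = sigmoid(X @ W1 + b1)          # hidden-layer activations
    out = sigmoid(h @ W2 + b2)        # network output

    # Back-propagation of the squared-error gradient
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # should approach [[0], [1], [1], [0]] after training
```

The forward pass is the layer-by-layer mapping described in the excerpt, and the weight updates implement the back-propagation rule mentioned in several of the passages above.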

Figure 12.1. Multi-layered Feed Forward Neural Network Architecture...
Artificial neural networks (feed-forward neural networks, self-organizing neural networks, counterpropagation neural networks, Bayesian neural networks)... [Pg.217]

Burns, J. A. and Whitesides, G. M., Feed-forward neural networks in chemistry: mathematical systems for classification and pattern recognition, Chem. Rev., 93, 2583,... [Pg.140]

FIGURE 10.2 TargetP localization predictor architecture. TargetP is built from two layers of feed-forward neural networks and, on top, a decision-making unit, taking into account cutoff restrictions (if opted for) and outputting a prediction and a reliability class, RC, which is an indication of prediction certainty (see the text). The nonplant version lacks the cTP network unit in the first layer and does not have cTP as a prediction possibility. [Pg.269]

The nnpredict algorithm uses a two-layer, feed-forward neural network to assign the predicted type for each residue (Kneller et al., 1990). In making the predictions, the server uses a FASTA format file with the sequence in either one-letter or three-letter code, as well as the folding class of the protein (α, β, or α/β). Residues are classified... [Pg.264]

Feedback error learning (FEL) is a hybrid technique [113] that uses the mapping to replace the estimation of parameters within the feedback loop in a closed-loop control scheme. FEL employs a feed-forward neural network structure that, during training, learns the inverse dynamics of the controlled object. This method is based on contemporary physiological studies of the human cortex [114], and is shown in Figure 15.6. [Pg.243]
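A heavily simplified sketch of the feedback-error-learning idea is given below for a first-order linear plant. To keep it short, a linear-in-parameters mapping stands in for the feed-forward neural network; the plant parameters, feedback gain, learning rate, and sinusoidal reference are arbitrary assumptions, and the code is not the scheme of Figure 15.6 or of refs. [113, 114].

```python
# Sketch: feedback error learning (FEL) for a first-order plant x' = -a*x + b*u.
# A linear feed-forward mapping u_ff = w . [x_d, x_d_dot] stands in for the
# neural network and is trained online with the feedback command as the error
# signal, so u_fb shrinks as the learned inverse dynamics improve.
import numpy as np

a, b = 1.0, 2.0            # plant parameters (unknown to the learner)
dt, Kp = 0.01, 5.0         # time step and proportional feedback gain
eta = 0.05                 # learning rate
w = np.zeros(2)            # weights of the feed-forward mapping

x = 0.0
for k in range(20000):
    t = k * dt
    x_d = np.sin(t)                   # desired trajectory
    x_d_dot = np.cos(t)
    phi = np.array([x_d, x_d_dot])    # features for the inverse model

    u_ff = w @ phi                    # feed-forward (learned inverse dynamics)
    u_fb = Kp * (x_d - x)             # feedback controller
    u = u_ff + u_fb

    # FEL rule: the feedback command is the teaching signal for the mapping
    w += eta * u_fb * phi * dt

    x += dt * (-a * x + b * u)        # simulate the plant one Euler step

# Exact inverse dynamics: u = (x_d_dot + a*x_d)/b, i.e. weights [a/b, 1/b] = [0.5, 0.5]
print("learned weights:", np.round(w, 2))
```

The essential point is that the feedback command itself serves as the teaching signal, so it shrinks toward zero as the feed-forward mapping approaches the inverse dynamics of the plant.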


See other pages where Forward-feed neural network is mentioned: [Pg.4]    [Pg.5]    [Pg.21]    [Pg.205]    [Pg.251]    [Pg.205]    [Pg.325]    [Pg.132]    [Pg.366]    [Pg.367]    [Pg.351]    [Pg.83]    [Pg.179]    [Pg.117]    [Pg.925]    [Pg.235]    [Pg.209]    [Pg.255]    [Pg.614]    [Pg.270]    [Pg.1789]    [Pg.227]    [Pg.90]   
See also in source #XX -- [ Pg.217 ]

See also in source #XX -- [ Pg.2 ]

See also in source #XX -- [ Pg.83 ]

See also in source #XX -- [ Pg.4 , Pg.2792 ]







Feed-forward

Feed-forward network, artificial neural

Feed-forward networks

Forward

Forwarder

Neural feed-forward

Neural multi-layer-feed-forward network

Neural network

Neural networking

Neural networks feed-forward computational

Three-layer forward-feed neural network

© 2024 chempedia.info