Big Chemical Encyclopedia


Back-propagation networks

To understand neural networks, especially Kohonen, counter-propagation and back-propagation networks, and their applications... [Pg.439]

Kohonen networks, conceptual clustering, Principal Component Analysis (PCA), decision trees, Partial Least Squares (PLS), Multiple Linear Regression (MLR), counter-propagation networks, back-propagation networks, genetic algorithms (GA)... [Pg.442]

Besides the artificial neural networks mentioned above, there are various other types of neural networks. This chapter, however, confines itself to the three most important types used in chemoinformatics: Kohonen networks, counter-propagation networks, and back-propagation networks. [Pg.455]

A back-propagation network usually consists of an input layer, one or more hidden layers, and one output layer. Figure 9-16 gives an example of the architecture. [Pg.462]
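The layered architecture described above can be sketched as a forward pass through successive weight matrices. This is a minimal illustration, not taken from the cited text; the layer sizes and weight initialization are arbitrary placeholders.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation commonly used in classic back-propagation networks
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W_hidden, W_output):
    """One forward pass: input layer -> hidden layer -> output layer."""
    h = sigmoid(W_hidden @ x)   # hidden-unit activations
    y = sigmoid(W_output @ h)   # output-unit activations
    return y

rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(4, 3))   # 3 input units -> 4 hidden units
W_output = rng.normal(size=(2, 4))   # 4 hidden units -> 2 output units
y = forward(np.array([0.5, -1.0, 2.0]), W_hidden, W_output)
```

Each layer is just a matrix multiplication followed by a nonlinearity; deeper networks simply add more hidden layers of the same form.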

Figure 9-23. Weight correction for a back-propagation network.
Of the several approaches that draw upon this general description, radial basis function networks (RBFNs) (Leonard and Kramer, 1991) are probably the best-known. RBFNs are similar in architecture to back propagation networks (BPNs) in that they consist of an input layer, a single hidden layer, and an output layer. The hidden layer makes use of Gaussian basis functions that result in inputs projected on a hypersphere instead of a hyperplane. RBFNs therefore generate spherical clusters in the input data space, as illustrated in Fig. 12. These clusters are generally referred to as receptive fields. [Pg.29]

Back-propagation network (single): linear projection; fixed shape, sigmoid [a, 0]; minimum output prediction error... [Pg.34]

Independent studies (Cybenko, 1988; Hornik et al., 1989) have proven that a three-layered back-propagation network exists that can implement any arbitrarily complex real-valued mapping. The issue is determining the number of nodes in the three-layer network that produces a mapping with a specified accuracy. In practice, the number of nodes in the hidden layer is determined empirically by cross-validation with testing data. [Pg.39]
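The empirical selection of hidden-layer size can be sketched as a validation loop. To keep the example self-contained, training is replaced by an illustrative stand-in (a random sigmoid hidden layer with least-squares output weights, not true back-propagation); the data, candidate sizes, and function names are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_eval(n_hidden, X_train, y_train, X_val, y_val):
    """Illustrative stand-in for training a network of a given hidden size,
    returning its error on held-out validation data."""
    W = rng.normal(size=(X_train.shape[1], n_hidden))
    H_train = 1.0 / (1.0 + np.exp(-X_train @ W))   # hidden activations
    H_val = 1.0 / (1.0 + np.exp(-X_val @ W))
    beta, *_ = np.linalg.lstsq(H_train, y_train, rcond=None)
    return np.mean((H_val @ beta - y_val) ** 2)    # validation MSE

# Toy data: y = sin(x) plus noise
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)
X_tr, y_tr, X_va, y_va = X[:150], y[:150], X[150:], y[150:]

errors = {n: fit_eval(n, X_tr, y_tr, X_va, y_va) for n in (1, 4, 16)}
best = min(errors, key=errors.get)   # hidden size with lowest validation error
```

The pattern is what matters: try several hidden-layer sizes, score each on data not used for fitting, and keep the size with the lowest validation error.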

Optimization of the PPR model is based on minimizing the mean-squared approximation error, as in back-propagation networks and as shown in Table I. The projection directions α, basis functions θ, and regression coefficients β are optimized one at a time for each node, while keeping all other parameters constant. New nodes are added to approximate the residual output error. The parameters of previously added nodes are optimized further by backfitting: the previously fitted parameters are adjusted by cyclically minimizing the overall mean-squared error of the residuals, so that the overall error is further reduced. [Pg.39]
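The stagewise idea above, where each new node is fitted to the residual left by the current model, can be sketched as follows. This is a simplification: the projection directions are fixed in advance and cubic polynomials stand in for the smooth univariate basis functions, whereas real PPR also optimizes the directions and backfits earlier nodes.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 2))
y = np.sin(X @ np.array([1.0, 0.5]))   # toy target (assumed for illustration)

residual = y.copy()
model_terms = []
# Three illustrative projection directions, one per "node"
for a in (np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])):
    z = X @ a                                 # project inputs onto direction a
    coeffs = np.polyfit(z, residual, deg=3)   # fit the current residual
    residual = residual - np.polyval(coeffs, z)  # the next node sees what is left
    model_terms.append((a, coeffs))

mse_start = np.mean(y ** 2)
mse_final = np.mean(residual ** 2)
```

Because every node is a least-squares fit to the remaining residual, the overall error can only stay the same or shrink as nodes are added.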

The most popular techniques for multilayer perceptrons (MLPs) are back-propagation networks (Wythoff [1993]; Jagemann [1998]). The weight matrices W are estimated by minimizing the net error... [Pg.193]
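Estimating the weight matrices by minimizing the net error can be sketched as gradient descent with error back-propagation. This is a minimal two-layer example on XOR data (learning rate, seed, and hidden size are arbitrary choices, and biases are omitted for brevity).

```python
import numpy as np

rng = np.random.default_rng(42)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
sig = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

def net_error():
    return np.mean((sig(sig(X @ W1) @ W2) - t) ** 2)

e0 = net_error()
for _ in range(3000):
    H = sig(X @ W1)                 # forward pass: hidden activations
    Y = sig(H @ W2)                 # forward pass: network output
    dY = (Y - t) * Y * (1 - Y)      # output delta from the squared error
    dH = (dY @ W2.T) * H * (1 - H)  # delta back-propagated to the hidden layer
    W2 -= lr * H.T @ dY             # weight correction, output layer
    W1 -= lr * X.T @ dH             # weight correction, hidden layer
e1 = net_error()
```

Each iteration propagates the output error backwards through the network and moves both weight matrices a small step down the error gradient, which is exactly the weight correction illustrated in Figure 9-23.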


Example of a three-layer back-propagation network applied to physical property prediction. [Pg.208]

The evaluation of the measurements, i.e. the correlations between the medium components and the various ranges of the 2D-fluorescence spectrum, was performed by Principal Component Analysis (PCA), Self-Organizing Maps (SOM), and Discrete Wavelet Transformation (DWT), respectively. A back-propagation network (BPN) was used for the estimation of the process variables [62]. By means of the SOM, the courses of several process variables and the CPC concentration were determined. [Pg.127]

Le Cun, Y., Boser, B., Denker, J. S., Henderson, D., Howard, R. E., et al. (1990a). Handwritten digit recognition with a back-propagation network. Adv. Neural Inf. Process. Syst. 2, 396-404. [Pg.101]





Artificial neural networks back-propagation

Back propagation network pattern recognition

Back-propagation

Back-propagation neural network applications

Back-propagation neural networks

Error back-propagation artificial neural networks

Neural networks feedforward back-propagation

© 2024 chempedia.info