Big Chemical Encyclopedia

Chemical substances, components, reactions, process design ...


Neural backpropagation algorithm

M-CASE/BAIA (see text). BP-ANN = three-layer feedforward artificial neural network trained by the backpropagation algorithm, PANN = probabilistic artificial neural network, CPANN = counterpropagation artificial neural network. [Pg.662]

J. Leonard and M. A. Kramer. Improvement of the backpropagation algorithm for training neural networks. Comput. Chem. Eng., 14(3): 337-341, 1990. [Pg.289]

Figure 8.16 Decision boundaries of a feedforward neural network trained by Bayesian regularization (a) and a conjugate gradient backpropagation algorithm (b).
A schematic diagram of the neural network-based adaptive control technique is shown in Fig. 4.9. A neural network identification model is trained using a static backpropagation algorithm to generate ŷ(k + 1), given past values of y and u. The identification error is then used to update the weights of the neural identification model. The control error is used to update the... [Pg.61]
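The identification loop described above can be sketched as follows. This is a minimal illustration, not the system of Fig. 4.9: the plant equation, layer sizes, learning rate, and excitation signal are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical plant standing in for the controlled process:
# y(k+1) depends on the two most recent outputs and the current input.
def plant(y_k, y_km1, u_k):
    return 0.6 * y_k - 0.1 * y_km1 + 0.5 * np.tanh(u_k)

# One-hidden-layer identification model: yhat(k+1) = f(y(k), y(k-1), u(k)).
W1 = rng.normal(0.0, 0.3, (3, 6)); b1 = np.zeros(6)
w2 = rng.normal(0.0, 0.3, 6);      b2 = 0.0
lr = 0.05

y_km1, y_k = 0.0, 0.0
sq_errs = []
for k in range(2000):
    u_k = np.sin(0.05 * k)                  # excitation input (illustrative)
    x = np.array([y_k, y_km1, u_k])
    h = np.tanh(W1.T @ x + b1)
    yhat = w2 @ h + b2                      # model's prediction of y(k+1)
    y_kp1 = plant(y_k, y_km1, u_k)

    # The identification error drives a static backpropagation (gradient) step.
    e = yhat - y_kp1
    dh = e * w2 * (1.0 - h**2)              # backpropagate through tanh hidden layer
    w2 -= lr * e * h
    b2 -= lr * e
    W1 -= lr * np.outer(x, dh)
    b1 -= lr * dh

    sq_errs.append(float(e**2))
    y_km1, y_k = y_k, y_kp1

early = sum(sq_errs[:200]) / 200            # mean squared error, first 200 steps
late = sum(sq_errs[-200:]) / 200            # mean squared error, last 200 steps
```

After enough excitation the model's squared identification error should shrink, i.e. `late` falls well below `early`; in a full adaptive controller this trained model would then supply the gradient information for the control-error update.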

Different feedforward neural networks with three layers have been tested to describe the particulate formation in the KPP furnace. A linear activation function was used in the first layer, and tan-sigmoid activation functions were used in the hidden and output layers. Training was accomplished over 1000 epochs using a backpropagation algorithm. [Pg.1011]
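A minimal sketch of such a network, assuming toy data in place of the KPP furnace measurements; the layer sizes, learning rate, and data-generating function are illustrative, not those of the cited study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data standing in for the furnace measurements (hypothetical).
X = rng.uniform(-1.0, 1.0, size=(200, 3))              # three process inputs
y = np.tanh(X @ np.array([0.5, -1.0, 0.8]))[:, None]   # one bounded output

n_in, n_hid, n_out = 3, 8, 1                           # layer sizes (illustrative)
W1 = rng.normal(0.0, 0.5, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.5, (n_hid, n_out)); b2 = np.zeros(n_out)
lr = 0.05

mse0 = float(np.mean((np.tanh(np.tanh(X @ W1 + b1) @ W2 + b2) - y) ** 2))

for epoch in range(1000):                              # 1000 training epochs
    # Forward pass: the linear first layer passes X through unchanged;
    # tan-sigmoid (tanh) units form the hidden and output layers.
    h = np.tanh(X @ W1 + b1)
    out = np.tanh(h @ W2 + b2)

    # Backpropagate the squared-error gradient through both layers.
    d_out = (out - y) * (1.0 - out**2)                 # tanh'(z) = 1 - tanh(z)^2
    d_hid = (d_out @ W2.T) * (1.0 - h**2)
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_hid / len(X); b1 -= lr * d_hid.mean(axis=0)

mse = float(np.mean((np.tanh(np.tanh(X @ W1 + b1) @ W2 + b2) - y) ** 2))
```

After the 1000 epochs the mean squared error `mse` should be well below its initial value `mse0`, which is the whole content of "training by backpropagation" in this setting.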

Given a function to implement as a neural network, there are a number of (currently) ill-understood and unconventional stages in the process. Instead of designing an algorithm to meet the specification, we train a network to do so, and in both cases we test the implementation to determine whether it is acceptable. Currently, we are using Multilayer Perceptron nets trained with the backpropagation algorithm [8]. [Pg.225]

B. Walczak, Neural networks with robust backpropagation learning algorithm. Anal. Chim. Acta, 322 (1996) 21-30. [Pg.696]

Riedmiller, M. and Braun, H. (1993). A direct adaptive method for faster backpropagation learning: the Rprop algorithm. In Proceedings of the IEEE International Conference on Neural Networks (ICNN 93) (ed. Ruspini H.), pp. 586-91. [Pg.113]

C. Klawun and C. L. Wilkins, J. Chem. Inf. Comput. Sci., 34, 984 (1994). A Novel Algorithm for Local Minimum Escape in Backpropagation Neural Networks: Application to the Interpretation of Matrix Isolation Infrared Spectra. [Pg.132]

Neural networks, such as backpropagation, have an especially simple reasoning algorithm. The knowledge of the neural network is represented as a matrix of synaptic connections, possibly quite sparse. The information to be evaluated by the neural network is represented as an input vector of the appropriate size, and the reasoning process is to multiply the connection matrix by the input vector to obtain the conclusion as an output vector. [Pg.123]
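The reasoning step described above reduces to a single matrix-vector product followed by a squashing function; the weights and input below are purely illustrative.

```python
import numpy as np

# Synaptic connection matrix (rows: output units, columns: input units);
# the zeros are absent synapses, so in practice this matrix may be quite
# sparse.  All weights and input values here are illustrative.
W = np.array([[0.8, 0.0, -0.5],
              [0.0, 1.2,  0.0]])
x = np.array([1.0, 0.5, -1.0])          # information to evaluate, as a vector

# The reasoning process: multiply the connection matrix by the input
# vector, then squash each unit's activation to obtain the conclusion.
z = W @ x                               # z = [1.3, 0.6]
conclusion = 1.0 / (1.0 + np.exp(-z))   # logistic squashing, values in (0, 1)
```

With sparse `W`, the same product can be computed with a sparse-matrix representation; the reasoning algorithm itself does not change.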




© 2024 chempedia.info