
Error-backpropagation learning

Figure 5. The sequence order for the correction of weights in error-backpropagation learning is oriented bottom-to-top. The ith output of a given layer is marked accordingly; the same is true for the inputs.
The aim of any training, of course, is to reach the smallest RMS value possible in the shortest possible time. It should be mentioned that employing error-backpropagation learning for solving complex problems with many input variables and many objects may require days, if not weeks, of computation time, even on supercomputers. However, smaller tasks involving fewer objects and fewer than ten variables on the input and/or output side require much less computational effort and can be solved on personal computers within seconds. [Pg.1820]
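The RMS value referred to here is the root-mean-square difference between the network outputs and the target values, taken over all output nodes and all training objects. A minimal sketch of how such an error could be computed is given below; the array names and values are illustrative assumptions, not taken from the text:

    import numpy as np

    def rms_error(outputs, targets):
        # Root-mean-square error over all output nodes and all training objects;
        # both arrays have shape (n_objects, n_output_nodes).
        diff = np.asarray(outputs) - np.asarray(targets)
        return float(np.sqrt(np.mean(diff ** 2)))

    # Illustrative values: four training objects, two output nodes each.
    outputs = [[0.9, 0.1], [0.2, 0.8], [0.7, 0.3], [0.1, 0.9]]
    targets = [[1.0, 0.0], [0.0, 1.0], [1.0, 0.0], [0.0, 1.0]]
    print(rms_error(outputs, targets))  # training aims to drive this value down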

Besides the most widely used ANNs, such as error-backpropagation, Kohonen and counter-propagation networks, a number of other ANNs exist as well (for example Hopfield, ABAM, Hamming, and others). These less commonly used ANNs differ from the standard ones in their layout of neurons as well as in their learning strategy, and are therefore seldom used in chemical applications. [Pg.1820]

Several different types of ANN are available, and the most popular is the backpropagation approach. In this procedure, input patterns presented to the input layer, for example signals from an array of chemical sensors, generate a flow of activation towards the output layer. Errors at the output are then propagated back toward the input layer and used to modify the weights of the interconnections. It should be emphasized that backpropagation does not describe a network architecture but a learning algorithm. In this way, the network can be trained with known parameters, such as sensor-array responses to sets of known chemicals. [Pg.437]
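As an illustration of this procedure, the sketch below trains a small one-hidden-layer network by error backpropagation on toy data standing in for sensor-array responses. The network size, learning rate and data are assumptions chosen for the example only and do not come from the text:

    import numpy as np

    rng = np.random.default_rng(0)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    # Toy "sensor array" data: 20 patterns of 4 sensor signals (illustrative only)
    X = rng.random((20, 4))
    y = (X[:, :2].sum(axis=1) > X[:, 2:].sum(axis=1)).astype(float).reshape(-1, 1)

    W1 = rng.normal(scale=0.5, size=(4, 6))   # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(6, 1))   # hidden -> output weights
    lr = 1.0                                  # learning rate (assumed value)

    for epoch in range(5000):
        # Forward pass: activation flows from the input layer to the output layer
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)

        # Backward pass: the output error is propagated back through the layers
        # and used to modify the weights of the interconnections
        delta_out = (out - y) * out * (1 - out)
        delta_hid = (delta_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ delta_out / len(X)
        W1 -= lr * X.T @ delta_hid / len(X)

    rms = np.sqrt(np.mean((out - y) ** 2))
    print(f"RMS error after training: {rms:.3f}")

Here the training data (sensor responses with known target outputs) play the role of the "known parameters" mentioned above, and the RMS error printed at the end is the quantity the training seeks to minimize.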

