Big Chemical Encyclopedia


Training a Layered Network Backpropagation

Once the network geometry and the type of activation function have been chosen, the network will be ready to use as soon as the connection weights have been determined, which requires a period of training. [Pg.30]
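Once geometry, activation, and weights are fixed, using the network is just a forward pass. A minimal sketch (the 2-2-1 geometry, sigmoid activation, and all weight values here are illustrative assumptions, not taken from the text):

```python
import math

def sigmoid(x):
    """A common choice of activation function for a layered network."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, weight_layers):
    """Propagate an input pattern through successive layers.

    weight_layers holds one weight matrix per layer; each row contains
    the weights of the connections into one node of that layer.
    """
    signal = inputs
    for layer in weight_layers:
        signal = [sigmoid(sum(w * s for w, s in zip(node_weights, signal)))
                  for node_weights in layer]
    return signal

# Hypothetical 2-input, 2-hidden-node, 1-output geometry with arbitrary weights.
hidden = [[0.5, -0.3], [0.2, 0.8]]
output = [[1.0, -1.0]]
y = forward([1.0, 0.5], [hidden, output])
```

Training is then the search for the weight values that make such forward passes reproduce the target outputs in the database of sample patterns.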

With the introduction of a hidden layer, training becomes trickier: although the target responses for the output nodes are still available from the database of sample patterns, the database contains no target values for the hidden nodes. Unless we know what output a hidden node should be generating, we cannot adjust the weights of the connections into it so as to reduce the difference between the required output and the output actually delivered. [Pg.30]

The lack of a recipe for adjusting the weights of connections into hidden nodes brought research in neural networks to a virtual standstill until Rumelhart, Hinton, and Williams published the technique now known as backpropagation (BP), which offered a way out of the difficulty. [Pg.30]

Backpropagation is a generalized version of the delta rule, extended to networks with multiple layers. The central assumption of BP is that, when the target output and the actual output at a node differ, responsibility for the error can be divided between the weights of the connections into that node and the output of hidden nodes in the immediately preceding layer that generated the input signals into the node. [Pg.30]
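This division of responsibility can be sketched for a single training step. The 2-2-1 geometry, sigmoid activation, squared-error measure, weight values, and learning rate below are all illustrative assumptions; the point is that each hidden node's delta is its share of the output error, apportioned through the weight that carried its signal forward:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Tiny hypothetical 2-2-1 network with arbitrary starting weights.
w_hidden = [[0.5, -0.3], [0.2, 0.8]]   # weights into each hidden node
w_out = [1.0, -1.0]                    # weights into the single output node
x, target = [1.0, 0.5], 1.0
lr = 0.5                               # assumed learning rate

# Forward pass.
h = [sigmoid(sum(w * xi for w, xi in zip(ws, x))) for ws in w_hidden]
y = sigmoid(sum(w * hi for w, hi in zip(w_out, h)))

# Output node: the delta rule applies directly, since a target exists.
delta_out = (target - y) * y * (1.0 - y)

# Hidden nodes: each node's share of the error is the output delta,
# propagated back through the weight that carried its signal onward.
delta_hidden = [hi * (1.0 - hi) * w * delta_out
                for hi, w in zip(h, w_out)]

# Each connection is then adjusted using its node's delta and the
# signal that entered the connection.
w_out = [w + lr * delta_out * hi for w, hi in zip(w_out, h)]
w_hidden = [[w + lr * d * xi for w, xi in zip(ws, x)]
            for ws, d in zip(w_hidden, delta_hidden)]
```

A forward pass with the adjusted weights produces an output closer to the target, which is exactly the reduction in error that repeated presentation of the training patterns is meant to achieve.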



© 2024 chempedia.info