Big Chemical Encyclopedia


Neural networks feed-forward computational

Artificial neural networks (ANNs) are computing tools made up of simple, interconnected processing elements called neurons. The neurons are arranged in layers. The feed-forward network consists of an input layer, one or more hidden layers, and an output layer. ANNs are well suited to assimilating knowledge about complex processes when they are properly trained on input-output patterns describing the process. [Pg.36]
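The layered structure described above can be sketched in plain Python; the weights and biases below are made-up illustrative values, not parameters from the text:

```python
import math

def sigmoid(z):
    # a common choice of transfer (activation) function
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weights, biases):
    # every neuron forms a weighted sum of its inputs and transforms it
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def feed_forward(inputs, layers):
    # signals travel one way: input layer -> hidden layer(s) -> output layer
    activations = inputs
    for weights, biases in layers:
        activations = layer(activations, weights, biases)
    return activations

# illustrative weights for a 2-input, 2-hidden-neuron, 1-output network
network = [
    ([[0.5, -0.4], [0.3, 0.8]], [0.0, 0.1]),  # hidden layer
    ([[1.0, -1.0]], [0.2]),                   # output layer
]
output = feed_forward([1.0, 0.5], network)
```

Each tuple in `network` is one layer; adding more tuples gives more hidden layers without changing the forward-pass code.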

Figure 2 Schematic diagram of a three-layer, fully-connected, feed-forward computational neural network...
Neural networks can be broadly classified, based on their network architecture, as feed-forward and feed-back networks, as shown in Fig. 3. In brief, if a neuron's output is never dependent on the outputs of subsequent neurons, the network is said to be feed-forward. Input signals travel one way only, and the outputs depend only on the signals arriving from other neurons. Thus, there are no loops in the system. When dealing with the various types of ANNs, two primary aspects, namely the architecture and the types of computations to be per-... [Pg.4]

The basic component of the neural network is the neuron, a simple mathematical processing unit that takes one or more inputs and produces an output. For each neuron, every input has an associated weight that defines its relative importance; the neuron computes the weighted sum of all its inputs to produce an output. This is then modified by means of a transformation function (sometimes called a transfer or activation function) before being forwarded to another neuron. This simple processing unit is known as a perceptron, a feed-forward system in which the transfer of data is in the forward direction, from inputs to outputs, only. [Pg.688]
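A single neuron of this kind can be sketched as follows; the weights, bias, and the choice of tanh as the activation function are illustrative assumptions, not values from the text:

```python
import math

def perceptron(inputs, weights, bias, activation=math.tanh):
    # weighted sum of the inputs, each scaled by its associated weight
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # the transfer/activation function modifies the sum before it is passed on
    return activation(z)

# three inputs with illustrative weights; z = 0.2 - 0.1 - 0.4 + 0.05 = -0.25
out = perceptron([0.5, -1.0, 2.0], [0.4, 0.1, -0.2], bias=0.05)
```

Swapping `activation` for a step function recovers the classical perceptron; smooth functions such as tanh or the sigmoid are what make gradient-based training possible.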

An adaptation of the simple feed-forward network that has been used successfully to model time dependencies is the so-called recurrent neural network. Here, an additional layer (referred to as the context layer) is added. In effect, this means that there is an additional connection from the hidden layer neuron to itself. Each time a data pattern is presented to the network, the neuron computes its output function just as it does in a simple MLP. However, its input now contains a term that reflects the state of the network before the data pattern was seen. Therefore, for subsequent data patterns, the hidden and output nodes will depend on everything the network has seen so far. For recurrent neural networks, therefore, the network behaviour is based on its history. [Pg.2401]
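The effect of the context layer can be illustrated with a single hidden neuron whose input includes its own previous output; the weights here are arbitrary illustrative values:

```python
import math

def recurrent_step(x_t, h_prev, w_in, w_ctx, b):
    # the input now includes the previous hidden state (the "context layer"),
    # so the output reflects the state of the network before this pattern
    return math.tanh(w_in * x_t + w_ctx * h_prev + b)

h = 0.0          # initial context: the network has seen nothing yet
outputs = []
for x in [1.0, 1.0, 1.0]:
    h = recurrent_step(x, h, w_in=0.6, w_ctx=0.9, b=0.0)
    outputs.append(h)
# identical inputs produce different outputs because the state accumulates:
# the network's behaviour is based on its history
```

In a plain feed-forward network the three outputs would be identical; here each step differs because `h_prev` carries everything the network has seen so far.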

R. Kocjancic and J. Zupan, J. Chem. Inf. Comput. Sci., 37, 985 (1997). Application of Feed-Forward Artificial Neural Network as a Mapping Device. [Pg.130]

The above formulation was adapted from Hornik (1991). However, other results concerning approximation by means of feed-forward neural networks (Kůrková, 1992; Hornik et al., 1994; Pinkus, 1998; Kůrková, 2002; Kainen et al., 2007) rely on essentially the same paradigm: the required number of hidden neurons h is unknown; the only guarantee is that some h always exists such that a multilayer perceptron with h hidden neurons can compute a function F with the desired properties. The actual value of h depends on the approximated dependence D and on the aspects discussed in the previous points (the function space considered and the required degree of closeness between F and D); however, the fact that such an h exists is independent of D and of those aspects. [Pg.94]
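The existence guarantee in this passage can be summarized in symbols (a paraphrase of the universal-approximation results cited, not a verbatim statement of any one of them): for a continuous target dependence D on a compact set K,

```latex
\forall \varepsilon > 0 \;\; \exists\, h \in \mathbb{N} \;\; \exists\, F \in \mathcal{F}_h
\quad \text{such that} \quad \sup_{x \in K} \lvert F(x) - D(x) \rvert < \varepsilon ,
```

where \(\mathcal{F}_h\) denotes the set of functions computable by a multilayer perceptron with h hidden neurons. The statement asserts only existence: it gives no constructive bound on h.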

Based on the computations performed, it is observed that a neural network architecture using the feed-forward backpropagation method with 3 layers consists of one input layer, one hidden... [Pg.401]
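A minimal sketch of such a three-layer feed-forward network trained by backpropagation (one input layer, one hidden layer, one output neuron); the squared-error loss, sigmoid activations, learning rate, and starting weights are all illustrative assumptions:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical starting weights for a 2-2-1 network
w_h = [[0.5, -0.3], [0.2, 0.4]]   # hidden-layer weights
b_h = [0.1, -0.1]                  # hidden-layer biases
w_o = [0.7, -0.6]                  # output-layer weights
b_o = [0.05]                       # output bias (in a list so updates persist)

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + b) for w, b in zip(w_h, b_h)]
    y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + b_o[0])
    return h, y

def train_step(x, target, lr=0.5):
    h, y = forward(x)
    err = 0.5 * (y - target) ** 2
    d_o = (y - target) * y * (1.0 - y)              # output-layer delta
    for j in range(2):
        # delta propagated backward through w_o to hidden neuron j
        d_h = d_o * w_o[j] * h[j] * (1.0 - h[j])
        w_o[j] -= lr * d_o * h[j]                   # gradient-descent updates
        for i in range(2):
            w_h[j][i] -= lr * d_h * x[i]
        b_h[j] -= lr * d_h
    b_o[0] -= lr * d_o
    return err

# repeatedly present one input-output pattern; the squared error shrinks
errors = [train_step([1.0, 0.0], target=1.0) for _ in range(50)]
```

Training on a single pattern keeps the sketch short; in practice the same update is looped over a whole set of input-output patterns for many epochs.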


