Big Chemical Encyclopedia


Feedforward ANN

Artificial neural networks arose from efforts to model the functioning of the mammalian brain. The most popular ANN, the feedforward ANN, nevertheless has deeper roots in statistics than in neurobiology. A form of ANN (a Probabilistic Neural Network) has been used within a QPA context to improve sensor-data reliability, but not as an on-line quality model [57]. The best way to represent a feedforward ANN as an on-line quality model for SHMPC is... [Pg.284]

Results from these experimental runs were used as x, q data records to fit the parameters of six ANNs. In the experimental effort, a different feedforward ANN was used after each intermediate secondary measurement was obtained; in the simulation-based effort, only one ANN accommodates all secondary measurements, and averaged dummy inputs are used for those secondary measurements not yet obtained. In addition, in the experimental effort a different ANN was used for final-thickness and final-void-content predictions; in the simulation-based effort, one ANN was used to predict both final thickness and final void content. The advantage of using one ANN to predict all values of q is that the parameters of only one ANN need be fitted; fitting the parameters of a separate ANN for each variable in q is much more time-consuming. The disadvantage, however, is that the parameters A and abias are the same for each variable in q when just one ANN is used as an on-line model. When a different ANN is used for each variable in q, the parameters in A and abias are unique to each of those output variables, which results in increased on-line prediction accuracy. Similar speed-versus-accuracy arguments apply to the choice of one ANN for all secondary measurements versus an ANN for each secondary measurement. [Pg.287]
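The speed-versus-accuracy trade-off above can be sketched in a few lines of NumPy: a single network with shared input-to-hidden parameters (named A and abias here, following the text) predicting every element of q, versus a separate network per output variable. The layer sizes and random initialization are illustrative assumptions, not the model from the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(n_in, n_hid, n_out):
    # A: input-to-hidden weights, abias: hidden biases (names follow the text);
    # B, bbias: hidden-to-output parameters (hypothetical names)
    return {"A": rng.normal(size=(n_hid, n_in)),
            "abias": rng.normal(size=n_hid),
            "B": rng.normal(size=(n_out, n_hid)),
            "bbias": rng.normal(size=n_out)}

def forward(net, x):
    h = np.tanh(net["A"] @ x + net["abias"])
    return net["B"] @ h + net["bbias"]

# One shared ANN predicting both final thickness and final void content:
shared = init_mlp(n_in=4, n_hid=3, n_out=2)

# Two separate ANNs, one per output variable in q:
thick_net = init_mlp(4, 3, 1)
void_net = init_mlp(4, 3, 1)

x = rng.normal(size=4)
q_shared = forward(shared, x)                       # both outputs share A/abias
q_separate = np.concatenate([forward(thick_net, x), # each has its own A/abias
                             forward(void_net, x)])
```

Only one parameter set must be fitted in the shared case; the separate networks double the fitting work but let each output get its own A and abias.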

ANNs have been used to develop behavioral models to enable efficient system-level simulation of integrated microfluidic devices. Magargle et al. [8] reported a feedforward ANN-based model for an injector device that is used in... [Pg.2280]

A multilayer perceptron (MLP) network is the most common and popular ANN. It is a type of feedforward ANN consisting of an input layer, a hidden layer, and an output layer. Consider an MLP network with N_i inputs and N_h hidden nodes, where v_i(t) is the i-th input signal at the t-th sample. The output of the j-th hidden node is given by ... [Pg.668]
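Since the excerpt does not reproduce the hidden-node formula, here is a generic sketch of what it typically looks like: the j-th hidden node forms a weighted sum of the input signals plus a bias and passes it through an activation function (a sigmoid is assumed here; the excerpt leaves the function unspecified).

```python
import numpy as np

def hidden_output(v, W, b, j):
    """Output of the j-th hidden node: h_j = f(sum_i w_ji * v_i + b_j),
    with a sigmoid activation f (a common but assumed choice)."""
    z = W[j] @ v + b[j]
    return 1.0 / (1.0 + np.exp(-z))

v = np.array([0.5, -1.0, 2.0])   # input signals at one sample
W = np.zeros((2, 3))             # hidden weights: 2 hidden nodes, 3 inputs
b = np.zeros(2)                  # hidden biases
print(hidden_output(v, W, b, 0)) # zero weights and bias -> sigmoid(0) = 0.5
```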

An ANN is a network of single neurons joined together by synaptic connections. Figure 10.22 shows a three-layer feedforward neural network. [Pg.349]

The Back-Propagation Algorithm (BPA) is a supervised learning method for training ANNs, and is one of the most common forms of training techniques. It uses a gradient-descent optimization method, also referred to as the delta rule when applied to feedforward networks. A feedforward network that has employed the delta rule for training is called a Multi-Layer Perceptron (MLP). [Pg.351]
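A minimal sketch of the delta rule for a single linear unit, assuming a squared-error loss and the standard update w ← w + η(t − y)x; the toy data and learning rate here are arbitrary.

```python
import numpy as np

def delta_rule_step(w, x, target, lr=0.1):
    """One delta-rule (gradient-descent) update for a linear unit y = w . x:
    w <- w + lr * (target - y) * x."""
    y = w @ x
    return w + lr * (target - y) * x

# Fit y = 2 * x on a one-dimensional toy example by repeated updates.
w = np.zeros(1)
for _ in range(100):
    w = delta_rule_step(w, np.array([1.0]), 2.0)
print(w)  # converges toward [2.0]
```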

Jouyban et al. (2004) applied ANNs to calculate the solubility of drugs in water-cosolvent mixtures, using 35 experimental datasets. The networks employed were feedforward networks trained by back-propagation of errors, with one hidden layer. The topology of the neural network was optimized to a 6-5-1 architecture. All data points in each set were used to train the ANN, and the solubilities were back-calculated employing the trained networks. The difference between calculated solubilities and experimental... [Pg.55]
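A sketch of a 6-5-1 feedforward network trained by back-propagation of errors, matching the architecture described above; the data are synthetic stand-ins, since the 35 experimental solubility datasets are not reproduced in this excerpt.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: 6 inputs, 1 output (NOT the solubility data).
X = rng.normal(size=(200, 6))
y = np.tanh(X @ rng.normal(size=6))[:, None]

# 6-5-1 network: 6 inputs, 5 hidden tanh nodes, 1 linear output.
W1 = rng.normal(scale=0.5, size=(6, 5)); b1 = np.zeros(5)
W2 = rng.normal(scale=0.5, size=(5, 1)); b2 = np.zeros(1)

def predict(X):
    return np.tanh(X @ W1 + b1) @ W2 + b2

mse_before = float(np.mean((predict(X) - y) ** 2))

lr = 0.05
for _ in range(500):
    H = np.tanh(X @ W1 + b1)            # hidden layer (5 nodes)
    err = (H @ W2 + b2) - y             # output error
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)    # error back-propagated through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse_after = float(np.mean((predict(X) - y) ** 2))
```

Training drives the mean squared error down from its random-initialization value; the same loop, run on each dataset, is the fitting step the excerpt describes.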

In this way, the most common ANN used for numerical models is known as the multilayer feedforward network, sketched in Fig. 30.3; the scheme represents the approach for a simultaneous calibration model of two species, A and B, departing from the readings of four ISE sensors. [Pg.728]

M-CASE/BAIA (see text). BP-ANN = three-layer feedforward artificial neural network trained by the backpropagation algorithm; PANN = probabilistic artificial neural network; CPANN = counterpropagation artificial neural network. [Pg.662]

It is now well known that artificial neural networks (ANNs) are nonlinear tools well suited to finding complex relationships among large data sets [43]. Basically, an ANN consists of processing elements (i.e., neurons) organized in different oriented groups (i.e., layers). The arrangement of neurons and their interconnections can have an important impact on the modeling capabilities of the ANNs. Data can flow between the neurons in these layers in different ways. In feedforward networks no loops occur, whereas in recurrent networks feedback connections are found [79,80]. [Pg.663]

Feedforward artificial neural network (ANN); Radial Basis; Perceptron ANN. [Pg.148]

Application of artificial neural networks (ANNs) for modelling the kinetics of a catalytic hydrogenation reaction in a gas-liquid-solid system has been studied and discussed. The kinetics of the hydrogenation of 2,4-DNT over a palladium-on-alumina catalyst has been described with feedforward neural networks of different architectures. A simple experimental procedure to supply learning data has been proposed. The accuracy and flexibility of the hybrid first-principles/neural-network model have been tested and compared with those of the classical model. [Pg.379]

After the learning step, the ANN is used to estimate output values for actual input data. ANN weights and biases are fixed during this process, and the ANN acts as an open-loop feedforward estimator. [Pg.208]
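The open-loop use of a trained network can be made explicit by freezing its weights and biases, as sketched below; the class name, layer sizes, and initialization are illustrative assumptions.

```python
import numpy as np

class FrozenANN:
    """After the learning step, weights and biases are fixed; the network is
    then used purely as an open-loop feedforward estimator (no feedback,
    no further weight updates between calls)."""

    def __init__(self, W1, b1, W2, b2):
        # Copies ensure later training code cannot mutate the estimator.
        self.W1, self.b1 = W1.copy(), b1.copy()
        self.W2, self.b2 = W2.copy(), b2.copy()

    def predict(self, x):
        return np.tanh(x @ self.W1 + self.b1) @ self.W2 + self.b2

rng = np.random.default_rng(2)
net = FrozenANN(rng.normal(size=(3, 4)), np.zeros(4),
                rng.normal(size=(4, 2)), np.zeros(2))
y_first = net.predict(np.ones(3))
y_second = net.predict(np.ones(3))  # identical: nothing adapts between calls
```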

ANN has also been applied to flow control in microfluidic networks. Assadsangabi et al. [13] presented a combined feedback/feedforward strategy to control the output flow rate in the T-junction of microchannels. A finite element model (FEM) was used to generate the training data, and a combined ANN and fuzzy logic (FL) system was utilized to build an inverse model of the flow in the T-junction, which serves as a controller to adjust the output flow rate. [Pg.2280]

Two-layer feedforward/feedback ANNs are heteroassociative. They can store input and output vectors and are useful in recalling an output vector when presented with a noisy or incomplete version of its corresponding input vector. They are also useful for classification problems. Typically, every feedforward connection between two PEs is accompanied by a feedback connection between the same two PEs. Both connections have weights, and these weights are usually different from each other. Examples are the adaptive resonance theory and bidirectional associative memory networks. [Pg.86]
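A minimal bidirectional-associative-memory sketch, assuming the classic Kosko construction in which the feedback weights are the transpose of the feedforward weights (the excerpt says only that the two weight sets are usually different). The stored bipolar pattern pairs are hypothetical.

```python
import numpy as np

# Hypothetical bipolar input/output pattern pairs to store.
x1, y1 = np.array([1, 1, 1, 1]), np.array([1, 1])
x2, y2 = np.array([1, -1, 1, -1]), np.array([-1, -1])

# Hebbian-style weight matrix: sum of outer products over stored pairs.
W = np.outer(y1, x1) + np.outer(y2, x2)   # feedforward weights
# The feedback direction would use W.T, a distinct (transposed) weight set.

def recall(x):
    """Recall the associated output vector from an input vector."""
    return np.sign(W @ x)

x1_noisy = x1.copy()
x1_noisy[0] = -1                 # corrupt one component of the input
print(recall(x1_noisy))          # still recovers y1 despite the noise
```

This is the heteroassociative behavior the text describes: a noisy or incomplete input vector still recalls its stored output vector.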

Perceptron networks are feedforward, heteroassociative (or may be autoassociative) networks that accept continuous inputs. Within the last five years there have been no chemical applications of perceptrons; applications before that time are now largely outmoded by the advent of more powerful ANNs. We mention them briefly for three reasons: they have historical significance, they are ubiquitous in neural network texts, and you will find papers that claim to use perceptrons but in actuality do not. [Pg.98]

Aoyama and Ichikawa have given analytic formulas for the partial derivatives of the output value of either a hidden-layer or output-layer PE with respect to the input value of an input PE. Their formulas, which are applicable to any feedforward network with differentiable transfer functions in the hidden and output layers, allow you to give a precise, analytical answer to the question that sensitivity analysis asks (see above). A similar sensitivity analysis has been performed for a radial basis function ANN. Aoyama and coworkers introduced another technique useful in network analysis: the reconstruction of weight matrices for a backpropagation network. They used a learning... [Pg.123]
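For a one-hidden-layer network with sigmoid hidden units and linear outputs, the analytic output-input derivative is the chain-rule product W2 · diag(σ′(z)) · W1. The sketch below is a generic instance of such a sensitivity formula, not a reproduction of Aoyama and Ichikawa's exact expressions; all sizes are illustrative.

```python
import numpy as np

def sig(z):
    return 1.0 / (1.0 + np.exp(-z))

def output_input_jacobian(x, W1, b1, W2):
    """Analytic dy/dx for y = W2 @ sig(W1 @ x + b1) + b2
    (the output bias b2 does not affect the derivative)."""
    z = W1 @ x + b1
    d = sig(z) * (1 - sig(z))        # sigmoid derivative at the hidden layer
    return W2 @ (d[:, None] * W1)    # chain rule: W2 diag(sig'(z)) W1

rng = np.random.default_rng(3)
W1 = rng.normal(size=(4, 3))         # 3 inputs, 4 hidden nodes
b1 = rng.normal(size=4)
W2 = rng.normal(size=(2, 4))         # 2 outputs
x = rng.normal(size=3)
J = output_input_jacobian(x, W1, b1, W2)  # sensitivity of each output to each input
```

Each entry J[k, i] is the sensitivity of output k to input i, which is exactly the quantity sensitivity analysis asks for.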

The objective of this research is to assess which compounds affect customer contentment with the flavor notes of RTD tea. In order to identify the hidden pattern of the customers' needs, ANNs have been applied to classify the key volatile compounds of the flavors. The architecture of the developed ANNs is a multilayer feedforward network with a 4-7-5 structure: 4 nodes for the customer groups at the input layer, 5 nodes for the flavor compounds at the output layer, and 7 nodes at the hidden layer, selected for minimum MSE by varying the number of nodes from 1 to 20. In the training, we used the BP training algorithm. The results for this structure show that the MSE was 1.54 and the prediction accuracy was 75.5 % (24.5 % error). The compounds that most affect customer contentment are as follows: women-adult, jasmine; women-teen, lemon; men-teen, citrus; and men-adult, plain tea. [Pg.433]

ANNs were developed in an attempt to imitate, mathematically, the characteristics of biological neurons. They are composed of interconnected artificial neurons responsible for the processing of input-output relationships; these relationships are learned by training the ANN with a set of input-output patterns. ANNs can be used for different purposes; approximation of functions and classification are examples of such applications. The most common types of ANNs used for classification are the feedforward neural networks (FNNs) and the radial basis function (RBF) networks. Probabilistic neural networks (PNNs) are a kind of RBF network that uses a Bayesian decision strategy (Dehghani et al., 2006). [Pg.166]

The feedforward network applied is depicted in Fig. 11. There are five input units, corresponding to the five structure parameters described earlier, one hidden layer, and seven output units corresponding to the seven properties mentioned earlier. By training the ANN with several numbers of hidden units and comparing the prediction errors on a test set, the optimum number of hidden units was determined to be eight. [Pg.396]

Three years later, a novel approach was developed by Yanqing using an adaptive ANN-based model and a neurocontroller for online cell SOC determination [10]. A radial basis function (RBF) NN has been adopted to simulate the battery. It is a typical two-layer feedforward NN that allows for fast training and is capable of converging to a global optimum. Its hidden layer comes with a nonlinear RBF activation function given by... [Pg.239]
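The Gaussian form φ_k(x) = exp(−‖x − c_k‖² / (2σ²)) is the most common RBF activation; since the excerpt elides the exact function from [10], the sketch below assumes that standard choice, with hypothetical centers and width.

```python
import numpy as np

def rbf_hidden(x, centers, sigma):
    """Gaussian RBF hidden-layer activations:
    phi_k(x) = exp(-||x - c_k||^2 / (2 * sigma^2)).
    A standard assumed form, not the exact function from [10]."""
    d2 = ((x - centers) ** 2).sum(axis=1)   # squared distance to each center
    return np.exp(-d2 / (2 * sigma ** 2))

centers = np.array([[0.0, 0.0],            # hypothetical RBF centers
                    [1.0, 1.0]])
phi = rbf_hidden(np.array([0.0, 0.0]), centers, sigma=1.0)
print(phi)  # [1.0, exp(-1)] -- the unit centred at the input fires maximally
```

The second (output) layer of such a network is then just a linear combination of these activations, which is what makes RBF networks fast to train.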

