Big Chemical Encyclopedia

Chemical substances, components, reactions, process design ...


Neural network activation function

Casciati F, Faravelli L, Venini P (1993) A neural-network performance-function selection in active structural control. In: Proceedings of the International Workshop on Structural Control, University of Southern California, Los Angeles... [Pg.19]

Neural networks can also be classified by their neuron transfer functions, which are typically either linear or nonlinear. The earliest models used linear transfer functions, whose output values were continuous. Linear functions are of limited use for many applications because most problems are too complex to be handled by simple multiplication. In a nonlinear model, the output of the neuron is a nonlinear function of the sum of its inputs, and the output of a nonlinear neuron can have a very complicated relationship with the activation value. [Pg.4]
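The distinction above can be sketched in a few lines. This is a minimal illustration, not taken from the cited source: the same weighted sum is passed through either a linear (identity) transfer function or a nonlinear one (here the logistic sigmoid, a common choice).

```python
import math

def neuron_output(inputs, weights, transfer="linear"):
    """Weighted sum of inputs passed through a transfer function."""
    s = sum(x * w for x, w in zip(inputs, weights))
    if transfer == "linear":
        return s  # continuous, unbounded output
    elif transfer == "sigmoid":  # one common nonlinear choice
        return 1.0 / (1.0 + math.exp(-s))
    raise ValueError(f"unknown transfer function: {transfer}")

print(neuron_output([0.5, 1.0], [2.0, -1.0], "linear"))   # 0.0
print(neuron_output([0.5, 1.0], [2.0, -1.0], "sigmoid"))  # 0.5
```

With the nonlinear transfer, equal increments of the weighted sum no longer produce equal increments of the output, which is what gives the neuron its "complicated relationship with the activation value."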

The purpose of this case study was to develop a simple neural-network-based model with the ability to predict the solvent activity in different polymer systems. The solvent activities were predicted by an ANN as a function of the binary type and the polymer volume fraction... [Pg.20]

Several nonlinear QSAR methods have been proposed in recent years. Most of these methods are based on either ANN or machine learning techniques. Both back-propagation (BP-ANN) and counterpropagation (CP-ANN) neural networks [33] were used in these studies. Because optimization of many parameters is involved in these techniques, the speed of the analysis is relatively slow. More recently, Hirst reported a simple and fast nonlinear QSAR method in which the activity surface was generated from the activities of training set compounds based on some predefined mathematical functions [34]. [Pg.313]

The basic component of the neural network is the neuron, a simple mathematical processing unit that takes one or more inputs and produces an output. For each neuron, every input has an associated weight that defines its relative importance, and the neuron simply computes the weighted sum of all the inputs to calculate an output. This is then modified by a transformation function (sometimes called a transfer or activation function) before being forwarded to another neuron. This simple processing unit is known as a perceptron: a feed-forward system in which data are transferred in the forward direction only, from inputs to outputs. [Pg.688]
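The unit described above can be written out directly. This is a generic illustration (not from the cited source), using the classic step transfer function and hand-chosen weights so the perceptron computes logical AND:

```python
def perceptron(inputs, weights, bias):
    """One feed-forward unit: the weighted sum of the inputs plus a bias
    (the activation value), passed through a step transfer function."""
    activation_value = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if activation_value >= 0 else 0  # threshold (step) activation

# Weights and bias chosen by hand so the unit computes logical AND:
for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", perceptron([a, b], [1.0, 1.0], bias=-1.5))
```

Only the input (1, 1) pushes the weighted sum above the threshold, so only that pattern yields an output of 1.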

This activation function is much more complicated than the saturated linear function used in recurrent neural networks [152-155] and is actually established by the biochemical system. According to Siegelmann [154], use of a complicated activation function does not increase the computational power of the network. [Pg.133]
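For reference, the saturated linear function mentioned above is simply the identity clipped to a bounded interval. A minimal sketch (the [0, 1] bounds are a common convention, assumed here rather than taken from the cited works):

```python
def saturated_linear(x, lo=0.0, hi=1.0):
    """Piecewise-linear activation: identity in the middle,
    clipped (saturated) at the lower and upper bounds."""
    return max(lo, min(hi, x))

print([saturated_linear(x) for x in (-0.5, 0.3, 2.0)])  # [0.0, 0.3, 1.0]
```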

There are many similarities between recurrent neural networks and the biochemical networks presented in this work. However, the dissimilarities reviewed here are very closely related to the inherent characteristics of biochemical systems, such as their kinetic equations. Thus, in future work one may consider recurrent neural networks that are similar to biochemical networks— having the same activation function and the same connections between neurons—and this approach will allow one to assess their computational capabilities. [Pg.133]

Multivariate models using neural networks, support vector machines and least median squares regression have been used to predict hERG activity [96-98]. These types of models function more as computational black box assays. [Pg.401]

There is a long history of efforts to find simple and interpretable functions for various activities and properties (29, 30). The quest for predictive QSAR models started with Hammett's pioneering work to correlate molecular structures with chemical reactivities (30-32). However, the widespread application of modern predictive QSAR and QSPR actually started with the seminal work of Hansch and coworkers on pesticides (29, 33, 34), and the development of various powerful multivariate analysis tools, such as PLS (partial least squares) and neural networks, has fueled these widespread applications. Nowadays, numerous publications on guidelines, workflows, and... [Pg.40]

Each NN (Figure 12.8) has 6 inputs and 1 output; between 3 and 12 nodes were considered in the hidden layer. The optimal number of nodes, together with the best activation function, was selected according to an algorithm given in Greaves et al. (2003). Note that an array of all the combinations of 11 activation functions was used to help find the most appropriate neural network for each output. The... [Pg.381]
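The selection procedure described above amounts to searching over hidden-layer sizes and candidate activation functions. The sketch below illustrates only that enumeration structure; the names are hypothetical, `validation_error` is a placeholder (the actual selection algorithm follows Greaves et al. (2003) and is not reproduced here), and only 3 stand-in activations are listed instead of the 11 in the text:

```python
import itertools

ACTIVATIONS = ["logistic", "tanh", "linear"]  # stand-ins for the 11 candidates
HIDDEN_SIZES = range(3, 13)                   # 3 to 12 hidden nodes, as stated

def validation_error(n_hidden, activation):
    """Placeholder score. In practice: train the 6-input/1-output network
    with this configuration and return its validation-set error."""
    return abs(n_hidden - 7) + ACTIVATIONS.index(activation) * 0.1

# Evaluate every (size, activation) combination and keep the best one.
best = min(itertools.product(HIDDEN_SIZES, ACTIVATIONS),
           key=lambda cfg: validation_error(*cfg))
print(best)  # (7, 'logistic')
```

In a real run, each candidate configuration would be trained and scored on held-out data, so the search cost grows with the product of the two grids.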

To optimize the neural network design, important choices must be made for the selection of numerous parameters. Many of these are internal parameters that need to be tuned with the help of experimental results and experience with the specific application under study. The following discussion focuses on back-propagation design choices for the learning rate, momentum term, activation function, error function, initial weights, and termination condition. [Pg.92]
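Two of the design choices listed above, the learning rate and the momentum term, appear directly in the back-propagation weight update. A minimal sketch on a one-dimensional error function (generic illustration, not from the cited source; `lr` and `mu` are the learning rate and momentum coefficient):

```python
def gd_momentum(grad, w0, lr=0.1, mu=0.5, steps=50):
    """Weight update with a momentum term:
    v <- mu * v - lr * grad(w);  w <- w + v."""
    w, v = w0, 0.0
    for _ in range(steps):
        v = mu * v - lr * grad(w)  # momentum smooths successive updates
        w = w + v
    return w

# Minimise E(w) = (w - 2)^2, so dE/dw = 2*(w - 2); the minimum is at w = 2.
w = gd_momentum(lambda w: 2.0 * (w - 2.0), w0=0.0)
print(w)
```

Too large a learning rate makes the iteration diverge, while the momentum term damps oscillation and speeds convergence along consistent gradient directions, which is why both must be tuned against experimental results.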

