
Neural transfer functions

Transfer function models are linear in nature, but chemical processes are known to exhibit nonlinear behavior. One could use the same type of optimization objective as given in Eq. (8-26) to determine parameters in nonlinear first-principles models, such as Eq. (8-3) presented earlier. Also, nonlinear empirical models, such as neural network models, have recently been proposed for process applications. The key to the use of these nonlinear empirical models is having high-quality process data, which allows the important nonlinearities to be identified. [Pg.725]
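As a hedged illustration (not from the source; Eq. (8-26) and Eq. (8-3) are not reproduced in the excerpt), the following sketch fits a hypothetical nonlinear first-order model to noisy synthetic process data by minimizing a least-squares objective of the same general type:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical nonlinear first-order model: a = steady-state gain, b = time constant.
def model(t, a, b):
    return a * (1.0 - np.exp(-t / b))

# Synthetic data standing in for real plant measurements.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
y = model(t, 2.0, 3.0) + 0.05 * rng.normal(size=t.size)

# curve_fit minimizes the sum of squared prediction errors over (a, b).
params, _ = curve_fit(model, t, y, p0=[1.0, 1.0])
print("estimated gain and time constant:", params)
```

With high-quality data spanning the operating region, the same objective can be applied to richer nonlinear empirical models, neural networks included.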

A key feature of MPC is that a dynamic model of the process is used to predict future values of the controlled outputs. There is considerable flexibility concerning the choice of the dynamic model. For example, a physical model based on first principles (e.g., mass and energy balances) or an empirical model could be selected. Also, the empirical model could be a linear model (e.g., transfer function, step-response model, or state-space model) or a nonlinear model (e.g., neural net model). However, most industrial applications of MPC have relied on linear empirical models, which may include simple nonlinear transformations of process variables. [Pg.740]
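A minimal sketch of the step-response prediction idea mentioned above (illustrative only, not an industrial MPC implementation; the helper `predict_outputs` and the coefficients are assumptions):

```python
import numpy as np

def predict_outputs(step_coeffs, past_du, future_du, y0):
    """Predict future outputs with a linear step-response model:
    y(k+j) = y0 + sum_i S_i * du(k+j-i), truncated at the model horizon."""
    du = np.concatenate([past_du, future_du])   # all input moves, oldest first
    n = len(step_coeffs)
    preds = []
    for j in range(1, len(future_du) + 1):
        # moves that have had i sampling periods to act on y, i = 1..n
        recent = du[len(past_du) + j - 1 :: -1][:n]
        preds.append(y0 + float(np.dot(step_coeffs[: len(recent)], recent)))
    return np.array(preds)

# Hypothetical step-response coefficients of a first-order process.
S = 2.0 * (1.0 - np.exp(-np.arange(1, 31) / 5.0))
print(predict_outputs(S, past_du=np.zeros(10),
                      future_du=np.array([1.0, 0.0, 0.0]), y0=0.0))
```

The same predict-then-optimize structure carries over unchanged when the linear model is replaced by a nonlinear one such as a neural net; only the prediction step differs.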

A sigmoid (S-shaped) function is continuous, differentiable at all points, and monotonically increasing. Here S_i,p is the transformed output, asymptotic to 0 < S_i,p < 1, and u_i,p is the summed total of the inputs (-∞ < u_i,p < +∞) for pattern p. Hence, when the neural network is presented with a set of input data, each neuron sums up all the inputs modified by the corresponding connection weights and applies the transfer function to the summed total. This process is repeated until the network outputs are obtained. [Pg.3]
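A short sketch of this forward pass (the weight values are arbitrary, for illustration): each neuron forms the summed total u of its weighted inputs and applies the sigmoid transfer function, layer by layer, until the network outputs are obtained.

```python
import numpy as np

def sigmoid(u):
    # Continuous, monotonically increasing, differentiable; output in (0, 1).
    return 1.0 / (1.0 + np.exp(-u))

def forward(x, layers):
    """Propagate one input pattern through (weights, bias) pairs: each neuron
    sums its weighted inputs, then applies the transfer function."""
    for W, b in layers:
        u = W @ x + b      # summed total of the inputs for this layer
        x = sigmoid(u)     # transformed output, asymptotic to (0, 1)
    return x

rng = np.random.default_rng(1)
layers = [(rng.normal(size=(3, 2)), rng.normal(size=3)),   # hidden layer
          (rng.normal(size=(1, 3)), rng.normal(size=1))]   # output layer
print(forward(np.array([0.5, -1.2]), layers))
```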

Neural networks can also be classified by their neuron transfer functions, which are typically either linear or nonlinear. The earliest models used linear transfer functions, wherein the output values were continuous. Linear functions are not very useful for many applications because most problems are too complex to be captured by simple multiplication. In a nonlinear model, the output of the neuron is a nonlinear function of the sum of the inputs, and can therefore have a very complicated relationship with the activation value. [Pg.4]
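The practical consequence can be shown in a few lines (a sketch with arbitrary weights): stacking purely linear transfer functions collapses to a single linear map, while inserting a nonlinear transfer function does not.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))
x = rng.normal(size=3)

# Two stacked linear layers are equivalent to one linear layer W2 @ W1.
deep_linear = W2 @ (W1 @ x)
single_linear = (W2 @ W1) @ x
print(np.allclose(deep_linear, single_linear))      # True

# With a sigmoid between the layers, the equivalence breaks down.
sigmoid = lambda u: 1.0 / (1.0 + np.exp(-u))
deep_nonlinear = W2 @ sigmoid(W1 @ x)
print(np.allclose(deep_nonlinear, single_linear))   # False
```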

Kolmogorov's Theorem (reformulated by Hecht-Nielsen): Any real-valued continuous function f defined on an N-dimensional cube can be implemented by a three-layer neural network consisting of 2N + 1 neurons in the hidden layer, with transfer functions ψ from the input to the hidden layer and φ from all of... [Pg.549]
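Since the excerpt above is truncated, one common statement of the underlying superposition theorem is sketched here for reference (supplied for clarity, not taken from the excerpt):

```latex
f(x_1, \ldots, x_N) \;=\; \sum_{q=1}^{2N+1} \phi_q\!\left( \sum_{p=1}^{N} \psi_{pq}(x_p) \right)
```

In the network reading, the 2N + 1 hidden neurons each compute one of the inner sums of the ψ functions, and the output neuron combines them through the outer functions φ_q.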

When the MLF is used for classification, its non-linear properties are also important. In Fig. 44.12c the contour map of the output of a neural network with two hidden units is shown. It shows clearly that non-linear boundaries are obtained. Totally different boundaries are obtained by varying the weights, as shown in Fig. 44.12d. For modelling as well as for classification tasks, the appropriate number of transfer functions (i.e. the number of hidden units) thus depends essentially on the complexity of the relationship to be modelled and must be determined empirically for each problem. Other functions, such as the hyperbolic tangent (tanh) function (Fig. 44.13a), are also sometimes used. In Ref. [19] the authors came to the conclusion that in most cases a sigmoidal function describes non-linearities sufficiently well. Only in the presence of periodicities in the data... [Pg.669]
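A hedged sketch of this empirical selection of the hidden-layer size, using scikit-learn on a toy two-class problem with a non-linear boundary (the dataset, candidate sizes, and scoring choices are illustrative assumptions):

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Toy two-class problem that no linear boundary can separate well.
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)

# The appropriate number of hidden units is determined empirically,
# e.g. by cross-validated accuracy for each candidate size.
for n_hidden in (1, 2, 4, 8):
    clf = MLPClassifier(hidden_layer_sizes=(n_hidden,), activation="logistic",
                        max_iter=2000, random_state=0)
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{n_hidden} hidden units: mean CV accuracy = {score:.3f}")
```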

So, the basic neuron can be seen as having two operations, summation and thresholding, as illustrated in Figure 2.5. Other forms of thresholding and, indeed, other transfer functions are commonly used in neural network modeling; some of these will be discussed later. For input neurons, the transfer function is typically assumed to be unity, i.e., the input signal is passed through without modification as output to the next layer: F(x) = x. [Pg.24]
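A minimal sketch of these two operations with a hard threshold as the transfer function (the weights and threshold value below are arbitrary):

```python
import numpy as np

def threshold(u, theta=0.0):
    # Hard-limiting transfer function: fire (1) if the summed input exceeds theta.
    return 1.0 if u > theta else 0.0

def neuron(inputs, weights, transfer=threshold):
    # Operation 1: summation of weighted inputs; operation 2: thresholding.
    return transfer(float(np.dot(weights, inputs)))

# Input neurons pass their signal through unmodified (identity transfer),
# so x below is simply the output of the input layer.
x = np.array([0.8, -0.3])
print(neuron(x, np.array([1.5, 2.0])))   # 0.8*1.5 - 0.3*2.0 = 0.6 > 0 -> 1.0
```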

An example of a non-covalent MIP sensor array is shown in Fig. 21.14. Xylene-imprinted poly(styrenes) (PSt) and poly(methacrylates) (PMA) with 70 and 85% cross-linker have been used for the detection of o- and p-xylene. The detection has been performed in the presence of 20-60% relative humidity to simulate environmental conditions. In contrast to the calixarene/urethane layers mentioned before, p-xylene-imprinted PSts still show a better sensitivity to o-xylene. An inversion of the xylene sensitivities can be obtained with PMAs and higher cross-linker ratios. As a consequence of the humidity, multivariate calibration of the array with partial least squares (PLS) and artificial neural networks (ANN) is performed. The evaluated xylene detection limits are in the lower ppm range (Table 21.2), and neural networks with back-propagation training and sigmoid transfer functions provide the most accurate data for o- and p-xylene concentrations as compared to PLS analyses. [Pg.524]
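A hedged sketch of the PLS calibration step on synthetic stand-in data (the sensor responses, concentrations, and sensitivity matrix below are fabricated for illustration only and bear no relation to the study's measurements):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
# Stand-in data: 100 samples, 8 sensor channels, 2 analytes
# (o- and p-xylene concentrations in ppm).
C = rng.uniform(0, 50, size=(100, 2))            # analyte concentrations
K = rng.normal(size=(2, 8))                      # assumed sensitivity matrix
R = C @ K + 0.1 * rng.normal(size=(100, 8))      # noisy array responses

pls = PLSRegression(n_components=2)
pls.fit(R, C)
print(pls.predict(R[:3]))   # predicted o-/p-xylene concentrations
```

An ANN calibration would replace the PLS regressor with a back-propagation-trained network using sigmoid transfer functions, as in the study cited above.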

Autoassociative neural networks provide a special five-layer network structure (Figure 3.6) that can implement nonlinear PCA by reducing variable dimensionality and producing a feature space map that retains the maximum possible amount of information from the original data set [150]. Autoassociative neural networks use conventional feedforward connections and sigmoidal or linear nodal transfer functions. [Pg.63]
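A rough sketch of the five-layer bottleneck idea using scikit-learn's MLPRegressor trained to reproduce its own input. This is an assumption-laden stand-in: sklearn applies the same activation to every hidden layer, so the bottleneck here is sigmoidal rather than linear, and the cited architecture is only approximated.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Synthetic data lying near a one-dimensional curve embedded in 3-D.
t = rng.uniform(-1, 1, size=300)
X = np.column_stack([t, t**2, np.sin(3 * t)]) + 0.02 * rng.normal(size=(300, 3))

# Five layers: input, mapping layer, one-node bottleneck (the nonlinear
# principal component), demapping layer, output; target equals input.
net = MLPRegressor(hidden_layer_sizes=(10, 1, 10), activation="logistic",
                   max_iter=5000, random_state=0)
net.fit(X, X)
print("reconstruction MSE:", np.mean((net.predict(X) - X) ** 2))
```

The single bottleneck node plays the role of the reduced-dimensionality feature space; widening it to k nodes extracts a k-dimensional nonlinear feature map.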

Fig. 4 The architecture of (a) a single-layer neural network with the sigmoidal transfer function, as well as the wavelet neural network for (b) IR spectral data compression and (c) pattern recognition in UV-VIS spectroscopy.
