Big Chemical Encyclopedia


Radial basis function network

Radial basis function (RBF) networks are a variant of three-layer feed-forward networks (see Fig. 44.18). They contain a pass-through input layer, a hidden layer and an output layer, but use a different approach for modelling the data. The transfer function in the hidden layer of RBF networks is called the kernel or basis function; for a detailed description the reader is referred to references [62,63]. Each node in the hidden layer thus contains such a kernel function. The main difference between the transfer function in MLF and the kernel function in RBF is that the latter (usually a Gaussian function) defines an ellipsoid in the input space. Whereas the MLF network basically divides the input space into regions via hyperplanes (see e.g. Figs. 44.12c and d), RBF networks divide the input space into hyperspheres by means of kernel functions with specified widths and centres. This can be compared with the density or potential methods in pattern recognition (see Section 33.2.5). [Pg.681]

The output of a hidden unit, in the case of a Gaussian kernel function, is defined as  [Pg.681]
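The equation itself did not survive extraction; the standard Gaussian kernel output consistent with the widths and centres described above (symbols here are the conventional ones, not necessarily those of the source) is:

```latex
o_j = \exp\!\left( -\frac{\lVert \mathbf{x} - \mathbf{c}_j \rVert^{2}}{2\sigma_j^{2}} \right)
```

where $\mathbf{c}_j$ is the centre and $\sigma_j$ the width of hidden unit $j$.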

The output of these hidden nodes, o, is then forwarded to all output nodes through weighted connections. The output yj of these nodes consists of a linear combination of the kernel functions  [Pg.682]
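The linear combination referred to here can be written as follows (a sketch of the standard form; the bias term $w_{0j}$ is a common but optional addition not stated in the text):

```latex
y_j = \sum_{i} w_{ij}\, o_i + w_{0j}
```

with $o_i$ the output of hidden unit $i$ and $w_{ij}$ the weight of the connection from hidden unit $i$ to output node $j$.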

Approaches for the initialization and training of the position of the centres, the widths and weights are summarized below: [Pg.683]

For the distribution of the position of the centres the following methods or combinations of methods are used [29]: [Pg.683]
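One commonly used method for placing the centres is k-means clustering of the training inputs (cf. the Kiernan et al. reference below). A minimal sketch, where the function name and the width heuristic (mean within-cluster distance) are illustrative choices, not taken from the text:

```python
import numpy as np

def kmeans_centres(X, k, n_iter=50, seed=0):
    """Place k RBF centres by k-means clustering of the training inputs X."""
    rng = np.random.default_rng(seed)
    # Initialize centres as k distinct training points.
    centres = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to its nearest centre (Euclidean distance).
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each centre to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    # Heuristic width: mean distance of cluster members to their centre.
    widths = np.array([
        np.linalg.norm(X[labels == j] - centres[j], axis=1).mean()
        if np.any(labels == j) else 1.0
        for j in range(k)
    ])
    return centres, widths
```

In practice the widths are often rescaled (e.g. by an overlap factor) so that neighbouring basis functions cover the space between centres.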


S. Nair, S. Udpa and L. Udpa, Radial basis functions network for defect sizing. Review of Progress in QNDE, 12 (1993) 819-825. [Pg.104]

J. Park and I.W. Sandberg, Universal approximation using radial basis function networks. Neural Computation, 3 (1991) 246-257. [Pg.698]

L. Kiernan, J.D. Mason and K. Warwick, Robust initialisation of Gaussian radial basis function networks using partitioned k-means clustering. Electron. Lett., 32 (1996) 671-672. [Pg.698]

W. Luo, M.N. Karim, A.J. Morris and E.B. Martin, Control relevant identification of a pH waste water neutralisation process using adaptive radial basis function networks. Computers Chem. Eng., 20 (1996) S1017... [Pg.698]

Of the several approaches that draw upon this general description, radial basis function networks (RBFNs) (Leonard and Kramer, 1991) are probably the best-known. RBFNs are similar in architecture to back propagation networks (BPNs) in that they consist of an input layer, a single hidden layer, and an output layer. The hidden layer makes use of Gaussian basis functions that result in inputs projected on a hypersphere instead of a hyperplane. RBFNs therefore generate spherical clusters in the input data space, as illustrated in Fig. 12. These clusters are generally referred to as receptive fields. [Pg.29]
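The "spherical" character of these receptive fields follows from the fact that a Gaussian basis function depends only on the distance of an input from the field's centre, not on its direction. A small sketch (all numbers are invented for illustration):

```python
import numpy as np

def gaussian_activation(x, centre, width):
    """Activation of a Gaussian receptive field: depends only on ||x - centre||."""
    r = np.linalg.norm(np.asarray(x) - np.asarray(centre))
    return np.exp(-r**2 / (2 * width**2))

centre = np.array([1.0, 1.0])
# Two inputs at the same distance from the centre, in different directions,
# receive identical activation -- the receptive field is spherical.
a = gaussian_activation([3.0, 1.0], centre, width=1.5)
b = gaussian_activation([1.0, 3.0], centre, width=1.5)
```

Points well inside a receptive field give activations near 1, and the activation decays smoothly towards 0 with distance, which is what produces the cluster structure in Fig. 12.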

Among these, the most widely used is the radial basis function network (RBFN). The key distinctions among these methods are summarized in Table I and discussed in detail in Bakshi and Utojo (1998). An RBFN is an example of a method that can be used for both input analysis and input-output analysis. As discussed earlier, the basis functions in RBFNs are of the form φ(||xj - tm||2), where tm denotes the center of the basis function. One of the most popular basis functions is the Gaussian,... [Pg.40]

Chen, S., Cowan, C. F. N., and Grant, P. M., Orthogonal least squares learning algorithm for radial basis function networks, IEEE Trans. Neur. Net. 2(2), 302-309 (1991). [Pg.98]

Leonard, J., and Kramer, M. A., Radial basis function networks for classifying process faults, IEEE Control Systems, April, p. 31 (1991). [Pg.100]

Zhang, Z.; Wang, D.; Harrington, P. de B.; Voorhees, K. J.; Rees, J. Forward selection radial basis function networks applied to bacterial classification based on MALDI-TOF-MS. Talanta 2004, 63, 527-532. [Pg.159]

N. Mai-Duy and R.I. Tanner, Computing non-Newtonian fluid flow with radial basis function networks. International Journal for Numerical Methods in Fluids, 48 (2005) 1309-1336. [Pg.596]

Not all neural networks are the same: their connections, elemental functions, training methods and applications may differ in significant ways. The types of elements in a network and the connections between them are referred to as the network architecture. Commonly used elements in artificial neural networks will be presented in Chapter 2. The multilayer perceptron, one of the most commonly used architectures, is described in Chapter 3. Other architectures, such as radial basis function networks and self-organizing maps (SOM) or Kohonen architectures, will be described in Chapter 4. [Pg.17]

Networks based on radial basis functions have been developed to address some of the problems encountered with training multilayer perceptrons: radial basis function networks are guaranteed to converge, and training is much more rapid. Both are feed-forward networks with similar-looking diagrams, and their applications are similar; however, the principles of action of radial basis function networks and the way they are trained are quite different from those of multilayer perceptrons. [Pg.41]

The essence of the differences between the operation of radial basis function networks and multilayer perceptrons can be seen in Figure 4.1, which shows data from the hypothetical classification example discussed in Chapter 3. Multilayer perceptrons classify data by the use of hyperplanes that divide the data space into discrete areas; radial basis functions, on the other hand, cluster the data into a finite number of ellipsoidal regions. Classification is then a matter of finding which ellipsoid is closest to a given test data point. [Pg.41]
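Under this view, classification reduces to a scaled-distance comparison: a test point is assigned to the class of the region whose centre it is nearest, with distance scaled by the region's width. A minimal sketch, with invented centres, widths and labels:

```python
import numpy as np

# Hypothetical cluster centres, widths and class labels for a 2-D problem.
centres = np.array([[0.0, 0.0], [4.0, 4.0]])
widths  = np.array([1.0, 2.0])
labels  = ["class A", "class B"]

def classify(x):
    """Assign x to the class of the nearest centre, scaling distance by width
    so that wider ellipsoidal regions claim more of the data space."""
    d = np.linalg.norm(centres - np.asarray(x), axis=1) / widths
    return labels[int(d.argmin())]
```

Using equal widths for all dimensions of a unit gives spherical regions; per-dimension widths (as described later for multi-input networks) give true ellipsoids.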

The hidden units of a radial basis function network are not the same as those used for a multilayer perceptron, and the weights between the input and hidden layers have different meanings. Transfer functions typically used include the Gaussian function, spline functions and various quadratic functions; all are smooth functions which taper off as distance from a center point increases. In two dimensions, the Gaussian is the well-known bell-shaped curve; in three dimensions it forms a hill. [Pg.41]

Figure 4.1 Different approach to classification by multilayer perceptrons (left) and radial basis function networks (right).
For radial basis function networks, each hidden unit represents the center of a cluster in the data space. The input to a hidden unit in a radial basis function network is not the weighted sum of its inputs but a distance measure: a measure of how far the input vector is from the center of the basis function for that hidden unit. Various distance measures are used, but perhaps the most common is the well-known Euclidean distance. [Pg.42]

To see how this works, consider the simple radial basis function network in Figure 4.2, which has only one input unit, two hidden units and one output unit. [Pg.43]
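A forward pass through such a one-input, two-hidden-unit, one-output network can be sketched directly. All parameter values here are invented for illustration:

```python
import math

# Invented parameters for a 1-2-1 radial basis function network.
mu    = [0.0, 1.0]   # center of each hidden unit's basis function
sigma = [0.5, 0.5]   # width of each basis function
w     = [0.7, -0.3]  # weights from hidden units to the single output

def rbf_forward(x):
    """Each hidden unit computes a Gaussian of the distance |x - mu|;
    the output is a linear combination of the hidden activations."""
    hidden = [math.exp(-(x - m) ** 2 / (2 * s ** 2)) for m, s in zip(mu, sigma)]
    return sum(wi * h for wi, h in zip(w, hidden))
```

For an input at a hidden unit's center, that unit's activation is exactly 1, and it decays towards 0 as the input moves away, so each hidden unit dominates the output near its own center.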

Radial basis function networks with more than one input unit have more parameters for each hidden node; e.g., if there are two input units, then the basis function for each hidden unit j needs two location parameters, μ1j and μ2j, for the center, and, optionally, two width parameters, σ1j and σ2j, for variability. The dimension of the centers for each of the hidden units matches the dimension of the input vector. [Pg.43]

So, it can be seen that, computationally, a radial basis function network has three separate actions: computing the distance of the input vector from each hidden unit's center, applying the basis function to that distance, and forming a weighted sum of the hidden-unit outputs at the output layer. [Pg.44]

The only difficult part is finding the values of μ and σ for each hidden unit, and the weights between the hidden and output layers, i.e., training the network. This will be discussed later, in Chapter 5. At this point, it is sufficient to say that training radial basis function networks is considerably faster than training multilayer perceptrons. On the other hand, once trained, the feed-forward process for multilayer perceptrons is faster than for radial basis function networks. [Pg.44]

