Big Chemical Encyclopedia


Back propagation technique

Cigizoglu, H.K. and O. Kisi: Flow prediction by two back propagation techniques using k-fold partitioning of neural network training data. Nordic Hydrol. 36 (2005) (in press). [Pg.430]

Now, one may ask: what if we are going to use Feed-Forward Neural Networks with the Back-Propagation learning rule? Then, obviously, SVD can be used as a data transformation technique. PCA and SVD are often used as synonyms. Below we shall use PCA in the classical context and SVD in the case when it is applied to the data matrix before training any neural network, i.e., Kohonen's Self-Organizing Maps or Counter-Propagation Neural Networks. [Pg.217]
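To make the preceding point concrete, the following sketch (Python/NumPy; the data matrix X and the number of retained components k are illustrative assumptions, not taken from the source) applies SVD to a data matrix before the resulting scores are passed to any neural network:

```python
import numpy as np

# Hypothetical data matrix: rows = samples, columns = descriptors.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))

# Centre the columns, then factorize X = U S V^T via SVD.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Keep the first k right singular vectors and project the data onto
# them; the score matrix T can then be fed to any neural network
# (Kohonen SOM, counter-propagation, or back-propagation MLP).
k = 5
T = Xc @ Vt[:k].T   # equivalent to U[:, :k] * S[:k]
print(T.shape)      # (100, 5)
```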

The Back-Propagation Algorithm (BPA) is a supervised learning method for training ANNs, and is one of the most common forms of training techniques. It uses a gradient-descent optimization method, also referred to as the delta rule when applied to feedforward networks. A feedforward network that has employed the delta rule for training is called a Multi-Layer Perceptron (MLP). [Pg.351]
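As an illustration of the delta rule in action, here is a minimal, self-contained sketch (Python/NumPy on the XOR toy problem; the network size, learning rate, and data are illustrative choices, not taken from the source) of training a small MLP by gradient descent:

```python
import numpy as np

# Toy training data: XOR, a classic non-linearly separable problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(42)
W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
eta = 0.5                                  # learning rate

for _ in range(10000):
    # Forward pass through both layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass (delta rule): propagate the error gradient from
    # the output layer back to the hidden layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Steepest-descent updates of weights and biases.
    W2 -= eta * h.T @ d_out
    b2 -= eta * d_out.sum(axis=0, keepdims=True)
    W1 -= eta * X.T @ d_h
    b1 -= eta * d_h.sum(axis=0, keepdims=True)

print(out.round(2))   # should approach [[0], [1], [1], [0]]
```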

Several nonlinear QSAR methods have been proposed in recent years. Most of these methods are based on either ANN or machine learning techniques. Both back-propagation (BP-ANN) and counter-propagation (CP-ANN) neural networks [33] were used in these studies. Because these techniques involve the optimization of many parameters, the analysis is relatively slow. More recently, Hirst reported a simple and fast nonlinear QSAR method in which the activity surface was generated from the activities of training-set compounds based on predefined mathematical functions [34]. [Pg.313]

The back-propagation strategy is a steepest-descent gradient method, i.e. a local optimization technique. It therefore suffers from the major drawback of such methods, namely that it can become locked in a local optimum. Many variants have been developed to overcome this drawback [20-24], but none of them really solves the problem. [Pg.677]
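One common variant (a generic illustration, not necessarily one of the variants in refs. [20-24]) adds a momentum term to the steepest-descent update. The sketch below, in Python/NumPy with an illustrative toy objective, shows the update rule; as the excerpt notes, such heuristics mitigate but do not eliminate the local-optimum problem:

```python
import numpy as np

def gd_momentum(grad, w0, eta=0.01, mu=0.9, steps=500):
    """Steepest descent with a momentum term.

    The velocity v keeps a decaying memory of past gradients, which
    damps oscillations and can carry the search across shallow local
    features -- but it does not guarantee the global optimum.
    """
    w = np.asarray(w0, dtype=float)
    v = np.zeros_like(w)
    for _ in range(steps):
        v = mu * v - eta * grad(w)   # momentum update of the velocity
        w = w + v                    # move along the accumulated direction
    return w

# Toy objective f(w) = (w**2 - 1)**2 with minima at w = +/-1.
grad_f = lambda w: 4.0 * w * (w**2 - 1.0)
print(gd_momentum(grad_f, np.array([2.0])))  # settles near one of +/-1
```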

The most popular techniques for multilayer perceptrons (MLPs) are back-propagation networks (Wythoff [1993]; Jagemann [1998]). The weight matrices W are estimated by minimizing the net error... [Pg.193]
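The sentence is truncated where the formula would follow. A standard choice for the net error minimized in such training (an assumption here, since the original equation is cut off) is the summed squared error over all P training patterns and K output nodes:

```latex
E(\mathbf{W}) \;=\; \frac{1}{2} \sum_{p=1}^{P} \sum_{k=1}^{K}
\bigl( y_{pk} - \hat{y}_{pk}(\mathbf{W}) \bigr)^{2}
```

where y_{pk} is the target value and \hat{y}_{pk}(\mathbf{W}) is the network output for pattern p at output node k.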

Consequently, we will follow the example of Mills et al. [29], who recently presented the first measurements of local solvent concentration using the Rutherford back-scattering technique. They analyzed the case of 1,1,1-trichloroethane (TCE) diffusing into PMMA films in terms of a simpler model developed by Peterlin [30-31], in which the propagating solvent front is preceded by a Fickian precursor. The Peterlin model describes the front end of the steady-state SCP as ... [Pg.394]

One special topic for field propagation techniques in general is the minimization of the effect of the transversal boundaries. If left untreated, they correspond to abrupt changes of the refractive-index distribution, and back-reflections from the boundary into the computational domain occur. After the obvious ansatz of absorbing boundary conditions, transparent boundary conditions (TBC) and perfectly matched layers (PML) represent the major improvements so far, which eliminate the problem almost completely. [Pg.264]

Chen et al. (2008) employed a commercial electronic tongue, based on an array of seven sensors, to classify 80 green tea samples on the basis of their taste grade, which is usually assessed by a panel test. PCA was employed as an explorative tool, while k-NN and a back-propagation artificial neural network (BP-ANN) were used for supervised classification. Both techniques provided excellent results, achieving 100% prediction ability on a test set composed of 40 samples (one-half of the total number). In cases like this, when a simple technique such as k-NN is able to supply excellent outcomes, the use of a complex technique like BP-ANN does not appear justified from a practical point of view. [Pg.105]
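The practical comparison can be reproduced in spirit with scikit-learn (a sketch on synthetic stand-in data; the original 80-sample sensor-array measurements are not available here, so the data generator, class count, and network size are assumptions):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for an 80-sample, 7-sensor data set.
X, y = make_classification(n_samples=80, n_features=7, n_informative=5,
                           n_classes=4, n_clusters_per_class=1,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5,
                                          random_state=0)

# Simple technique: k-nearest neighbours.
knn = KNeighborsClassifier(n_neighbors=3).fit(X_tr, y_tr)

# Complex technique: a small back-propagation network.
bp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)

print("k-NN prediction ability  :", knn.score(X_te, y_te))
print("BP-ANN prediction ability:", bp.score(X_te, y_te))
```

When the simple classifier already scores as well as the network, the excerpt's conclusion applies: the added complexity of BP-ANN buys nothing in practice.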

(... Na and K) employing different pattern recognition techniques [PCA, linear discriminant analysis (LDA) and ANNs]. The ANNs were trained by an error back-propagation algorithm and were found to be very efficient for classifying and discriminating food products. [Pg.273]

The utility of ANNs as a pattern recognition technique in the field of microbeam analysis was demonstrated by Ro and Linton [99]. Back-propagation neural networks were applied to laser microprobe mass spectra (LAMMS) to determine interparticle variations in molecular components. Self-organizing feature maps (Kohonen neural networks) were employed to extract information on molecular distributions within environmental microparticles imaged in cross-section using SIMS. [Pg.276]

In a related paper, Herrador and Gonzalez [144] described the application of PCA and CA, and of two supervised techniques, LDA and back-propagation ANN, to Al, Ba, Ca, Cu, K, Mg, Mn, and Zn data obtained from commercial Spanish tea samples. A minitorch ICP-AES instrument was used for the determinations. The characterization of three classes of tea was achieved. In a paper that expands previous research described in reference [47], trace metal concentrations measured by ICP-AES and ICP-MS were employed by Moreda-Pineiro et al. [145] for a more elaborate chemometric treatment of 85 samples of tea of Asian, African, commercial, and unknown origin. Seventeen elements (Al, Ba, Ca, Cd, Co, Cr, Cu, Cs, Mg, Mn, Ni, Pb, Rb, Sr, Ti, V, and Zn) were determined. In addition to the techniques employed in the papers already mentioned (PCA, CA, LDA), soft independent modeling of class analogy (SIMCA) was also applied. The latter method resulted in the totally correct (100 percent) classification of the Chinese teas. [Pg.487]

To date, two different OCR methods have been implemented in CLiDE. The first used a back-propagation neural network for the classification of characters; the character features used as input to the neural network are determined by template matching (Venczel 1993). The second OCR implementation in CLiDE is based on topological and geometrical feature analysis, and it uses a filtering technique for the classification of characters (Simon 1996). [Pg.63]

Among the nonlinear methods there are, besides nonlinear least-squares regression (i.e. polynomial regression), the nonlinear PLS method, Alternating Conditional Expectations (ACE), SMART, and MARS. Moreover, some artificial neural network techniques, such as the back-propagation method, also have to be counted among the nonlinear regression methods. [Pg.63]

Artificial neural networks (ANNs) represent, as opposed to PLS and MLR, a nonlinear statistical analysis technique [86]. The most commonly used NN is of the feed-forward back-propagation type (Figure 14.2). As is the case with both PLS and MLR, there are a few aspects of NNs to be considered when using this type of analysis technique... [Pg.390]

Neural networks are also being seriously explored for certain classes of optimization applications. These employ parallel solution techniques which are patterned after the way the human brain functions. Statistical routines and back-propagation algorithms are used to force closure on a set of cross-linked circuits (equations). Weighting functions are applied at each of the intersections. [Pg.701]

