Big Chemical Encyclopedia


Learning algorithms

Besides these LFER-based models, approaches have been developed using whole-molecule descriptors and learning algorithms other than multiple linear regression (see Section 10.1.2). [Pg.494]

Learning, in the context of a neural network, is the process of adjusting the weights and biases in such a manner that, for given inputs, the correct responses (outputs) are achieved. Learning algorithms include ... [Pg.350]

To summarize, the simple perceptron learning algorithm consists of the following four steps ... [Pg.514]

Pseudo-Code Implementation of Perceptron Learning Algorithm ... [Pg.514]
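As a concrete illustration of the four-step procedure, here is a minimal sketch of the perceptron learning rule in Python. The ±1 label convention, learning rate, epoch limit, and AND-gate training data are illustrative assumptions, not the book's pseudo-code:

```python
import numpy as np

def perceptron_train(X, y, lr=1.0, epochs=100):
    """Simple perceptron learning: y must contain labels -1 or +1."""
    w = np.zeros(X.shape[1])  # weights, initialized to zero (an assumption)
    b = 0.0                   # bias
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            pred = 1 if np.dot(w, xi) + b >= 0 else -1  # threshold activation
            if pred != yi:            # misclassified: adjust weights and bias
                w += lr * yi * xi
                b += lr * yi
                errors += 1
        if errors == 0:               # converged: every example classified
            break
    return w, b

# Usage: learn logical AND, a linearly separable toy problem
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([-1, -1, -1, 1])
w, b = perceptron_train(X, y)
preds = [1 if np.dot(w, xi) + b >= 0 else -1 for xi in X]
```

By the perceptron convergence theorem, this loop terminates on any linearly separable data set; it does not converge otherwise, which is why the epoch limit is included.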

Pseudo-Code Implementation The Boltzmann Machine Learning Algorithm proceeds in two phases: (1) a positive, or learning, phase and (2) a negative, or unlearning, phase. It is summarized below in pseudo-code. It is assumed that the visible neurons are further subdivided into input and output sets, as shown schematically in figure 10.8. [Pg.535]
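The two-phase rule can be sketched for the simplest case, a fully visible Boltzmann machine (this deliberately omits the hidden units and the input/output split described above; the learning rate, epoch counts, and two-unit toy data are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_step(s, W, T=1.0):
    """One Gibbs sweep over all +/-1 units of a fully visible machine."""
    for i in range(len(s)):
        net = W[i] @ s - W[i, i] * s[i]              # input from the other units
        p = 1.0 / (1.0 + np.exp(-2.0 * net / T))     # P(s_i = +1)
        s[i] = 1.0 if rng.random() < p else -1.0
    return s

def train(data, n_epochs=200, lr=0.05, n_samples=50):
    n = data.shape[1]
    W = np.zeros((n, n))
    # Positive (learning) phase: correlations with visible units clamped to data
    pos = data.T @ data / len(data)
    np.fill_diagonal(pos, 0.0)
    for _ in range(n_epochs):
        # Negative (unlearning) phase: free-running correlations via Gibbs sampling
        neg = np.zeros((n, n))
        s = rng.choice([-1.0, 1.0], size=n)
        for _ in range(n_samples):
            s = gibbs_step(s, W)
            neg += np.outer(s, s)
        neg /= n_samples
        np.fill_diagonal(neg, 0.0)
        W += lr * (pos - neg)            # delta-w = lr (<ss>_clamped - <ss>_free)
        W = (W + W.T) / 2.0              # keep the weight matrix symmetric
    return W

data = np.array([[1., 1.], [-1., -1.]] * 10)  # two perfectly correlated units
W = train(data)
```

After training, the weight between the two correlated units should be positive, reflecting the data statistics the positive phase measured.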

Richards et al.'s idea is to use a genetic algorithm to search through a space of a certain class of cellular automata rules for a local rule that best reproduces the observed behavior of the data. Their learning algorithm (which was applied specifically to sequential patterns of dendrites formed by NH4Br as it solidifies from a supersaturated solution) starts with no a priori knowledge about the physical system. It, instead, builds increasingly sophisticated models that reproduce the observed behavior. [Pg.591]
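The flavor of this approach can be caricatured with an elementary (one-dimensional, binary, radius-1) cellular automaton and a textbook genetic algorithm. Everything here is a stand-in, not the authors' actual configuration: the "observed" data are generated by rule 90 rather than dendrite measurements, and the population size, crossover, and mutation settings are arbitrary assumptions:

```python
import random

random.seed(0)

def ca_run(rule_bits, state, steps):
    """Run an elementary CA; rule_bits[k] is the successor for neighbourhood
    pattern k (k = left*4 + center*2 + right), with periodic boundaries."""
    history = [state]
    for _ in range(steps):
        n = len(state)
        state = [rule_bits[(state[(i - 1) % n] << 2) | (state[i] << 1) | state[(i + 1) % n]]
                 for i in range(n)]
        history.append(state)
    return history

def fitness(rule_bits, observed, init):
    """Count cells of the observed space-time pattern the rule reproduces."""
    run = ca_run(rule_bits, init, len(observed) - 1)
    return sum(a == b for ra, rb in zip(run, observed) for a, b in zip(ra, rb))

# "Observed" behavior: pattern grown by Wolfram rule 90 from a single seed
init = [0] * 7 + [1] + [0] * 8
target = [(90 >> k) & 1 for k in range(8)]
observed = ca_run(target, init, 10)

# GA over 8-bit rule tables: elitist selection, one-point crossover, mutation
pop = [[random.randint(0, 1) for _ in range(8)] for _ in range(30)]
for gen in range(60):
    pop.sort(key=lambda r: -fitness(r, observed, init))
    parents = pop[:10]
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)
        cut = random.randint(1, 7)
        child = a[:cut] + b[cut:]                    # one-point crossover
        if random.random() < 0.2:                    # point mutation
            i = random.randrange(8)
            child[i] ^= 1
        children.append(child)
    pop = parents + children
best = max(pop, key=lambda r: fitness(r, observed, init))
```

The real rule space searched by Richards et al. is vastly larger than these 256 elementary rules; the sketch only shows the learn-by-reproducing-observed-behavior loop.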

At the heart of the platform was a coarse-grained physical description of the binding free energy, which was trained with a proprietary machine learning algorithm. The coarse-grained physical model used was ... [Pg.339]

QSAR modeling. Therefore, considerably larger and more consistent data sets for each enzyme will be required in the future to increase the predictive scope of such models. Evaluating any rule-based metabolite software with a diverse array of molecules will show that it can generate many more metabolites than have been identified in the literature for the respective molecules to date, which may also reflect the sensitivity of the analytical methods available when the data were published. In such cases, efficient machine learning algorithms will be needed to indicate which of the metabolites are relevant and likely to be observed under the given experimental conditions. [Pg.458]

Rather than trying to replace any of the above traditional techniques, this chapter presents the development of complementary frameworks and methodologies, supported by symbolic empirical machine learning algorithms (Kodratoff and Michalski, 1990; Shavlik and Dietterich, 1990; Shapiro and Frawley, 1991). These ideas from machine learning try to overcome some of the weaknesses of the traditional techniques in terms of both (1) the number and type of a priori decisions and assumptions that they require and (2) the knowledge representation formats they choose to express final solutions. [Pg.101]

The search procedure, S, used to uncover promising hyperrectangles in the decision space, X, associated with a desired y value (e.g., y = good ), is based on symbolic inductive learning algorithms, and leads to the identification of a final number of promising solutions, X, such as the ones in Fig. 2b. It is described in the following subsection. [Pg.112]
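The idea of searching the decision space X for hyperrectangles associated with y = good can be sketched numerically. This is not the symbolic inductive procedure S of the text, only a greedy caricature; the synthetic data, the step size, and the purity criterion are all assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def grow_box(X, y, good, step=0.05):
    """Greedy hyperrectangle search: start from the bounding box of the
    'good' examples and shrink one face at a time while purity improves."""
    mask = (y == good)
    lo = X[mask].min(axis=0).copy()
    hi = X[mask].max(axis=0).copy()

    def purity(lo, hi):
        inside = np.all((X >= lo) & (X <= hi), axis=1)
        return mask[inside].mean() if inside.any() else 0.0

    best = purity(lo, hi)
    improved = True
    while improved:
        improved = False
        for d in range(X.shape[1]):
            for which in ("lo", "hi"):
                lo2, hi2 = lo.copy(), hi.copy()
                if which == "lo":
                    lo2[d] += step        # shrink the lower face
                else:
                    hi2[d] -= step        # shrink the upper face
                if lo2[d] >= hi2[d]:
                    continue
                p = purity(lo2, hi2)
                if p > best:
                    best, lo, hi = p, lo2, hi2
                    improved = True
    return lo, hi, best

# Synthetic decision space: points are "good" exactly inside [0.3, 0.7]^2
X = rng.random((400, 2))
y = np.where(np.all((X > 0.3) & (X < 0.7), axis=1), "good", "bad")
lo, hi, purity_val = grow_box(X, y, "good")
```

On this toy data the recovered box sits inside the true good region, analogous to the promising regions of Fig. 2b.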

With the selection of wavelets as the basis functions the learning algorithm can now be finalized. [Pg.186]

As shown earlier, by imposing a threshold on the L empirical error and applying the learning algorithm, an approximating function with... [Pg.190]
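The style of algorithm described, adding wavelet basis functions until an empirical error threshold is met, can be sketched as follows. The Haar wavelet family, the max-error (L-infinity) criterion, the threshold value, and the sine test function are assumptions for illustration, not the chapter's actual construction:

```python
import numpy as np

def haar(x, j, k):
    """Haar mother wavelet dilated by 2**j and translated by k."""
    t = (2 ** j) * x - k
    return np.where((t >= 0) & (t < 0.5), 1.0,
                    np.where((t >= 0.5) & (t < 1.0), -1.0, 0.0))

def fit_until(x, y, eps=0.05, max_level=8):
    """Add wavelet levels one at a time until the empirical max error
    of the least-squares fit drops below the threshold eps."""
    cols = [np.ones_like(x)]                    # scaling (constant) term
    for j in range(max_level):
        for k in range(2 ** j):                 # all translates at level j
            cols.append(haar(x, j, k))
        A = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        err = np.max(np.abs(A @ coef - y))      # empirical max-norm error
        if err < eps:
            return coef, j, err
    return coef, max_level - 1, err

x = np.linspace(0, 1, 256, endpoint=False)
y = np.sin(2 * np.pi * x)
coef, level, err = fit_until(x, y)
```

The coefficients are refit by least squares each time a level is added; a level is accepted only once the error threshold is satisfied, so the approximating function uses as few levels as the threshold allows.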

A. Sankar and R. Mammone, A fast learning algorithm for tree neural networks. In Proc. 1990 Conf. on Information Sciences and Systems, Princeton, NJ, 1990, pp. 638-642. [Pg.240]

Due to the Kohonen learning algorithm, the individual weight vectors in the Kohonen map are arranged and oriented in such a way that the structure of the input space, i.e. the topology, is preserved as well as possible in the resulting... [Pg.691]
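A minimal sketch of Kohonen learning for a one-dimensional map illustrates how topology preservation arises: each input pulls not only its best-matching unit but also that unit's neighbours on the map, so adjacent neurons end up with similar weight vectors. The grid size, decay schedules, and Gaussian neighbourhood are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def train_som(data, grid=10, epochs=30, lr0=0.5, sigma0=3.0):
    """1-D Kohonen map: neurons on a line, weights drawn toward inputs
    together with their map neighbours (topology-preserving update)."""
    W = rng.random((grid, data.shape[1]))       # random initial weight vectors
    pos = np.arange(grid)                       # neuron positions on the map
    n_steps = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            lr = lr0 * (1 - t / n_steps)        # decaying learning rate
            sigma = sigma0 * (1 - t / n_steps) + 0.5   # shrinking neighbourhood
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))  # best-matching unit
            h = np.exp(-((pos - bmu) ** 2) / (2 * sigma ** 2))
            W += lr * h[:, None] * (x - W)      # pull BMU and neighbours toward x
            t += 1
    return W

# Usage: map scalar inputs onto the neuron line; after training, neighbouring
# neurons carry neighbouring weight values, i.e. the input topology is kept
data = rng.random((200, 1))
W = train_som(data)
```

Shrinking the neighbourhood width over time first unfolds the map globally and then fine-tunes individual units, which is what lets the map mirror the structure of the input space.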

B. Walczak, Neural networks with robust backpropagation learning algorithm. Anal. Chim. Acta, 322 (1996) 21-30. [Pg.696]

Chen, S., Cowan, C. F. N., and Grant, P. M., Orthogonal least squares learning algorithm for radial basis function networks, IEEE Trans. Neur. Net. 2(2), 302-309 (1991). [Pg.98]

By means of the learning algorithm, the parameters of the ANN are altered in such a way that the net inputs produce adequate net outputs. Mostly this is effected only by changing the weights (other procedures, like adding or removing neurons and modifying activation functions, are rarely used; see Zell [1994]). [Pg.193]
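Weight-only adaptation of this kind can be sketched with plain gradient-descent backpropagation on a tiny network. The network size, learning rate, epoch count, and AND-gate task are illustrative assumptions; only the weights and biases change during learning, as the text describes:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, hidden=4, lr=0.5, epochs=5000):
    """One-hidden-layer network; learning alters only weights and biases."""
    W1 = rng.normal(0, 1, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1, (hidden, 1));          b2 = np.zeros(1)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)                 # forward pass
        out = sigmoid(h @ W2 + b2)
        err = out - y                            # squared-error gradient
        d2 = err * out * (1 - out)               # output-layer delta
        d1 = (d2 @ W2.T) * h * (1 - h)           # backpropagated hidden delta
        W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)   # adjust weights so the
        W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)   # inputs yield adequate outputs
    return W1, b1, W2, b2

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [0], [0], [1]], float)        # AND target
W1, b1, W2, b2 = train(X, y)
out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
```

The architecture (number of neurons, activation functions) stays fixed throughout; all learning happens in W1, b1, W2, b2, matching the common practice noted above.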







© 2024 chempedia.info