Big Chemical Encyclopedia


Neural networks overtraining

In all modeling techniques, and neural networks in particular, care must be taken not to overtrain or overfit the model. [Pg.474]

Stopping criterion. A rule used to terminate the iterative training process in neural network learning or function minimization. To prevent overtraining, the stopping criterion should not be based solely upon the error function; for example, performance on a separate validation set is often used to decide when to stop training. [Pg.188]
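The validation-based rule described above can be sketched in a few lines. This is a minimal illustration, not taken from the source: the `patience` parameter and the function name are hypothetical, and real implementations track weights as well as errors.

```python
# Hypothetical sketch of a validation-based stopping criterion:
# halt training when the validation error has not improved for
# `patience` consecutive epochs (names are illustrative).
def should_stop(val_errors, patience=5):
    """Return True once the last `patience` epochs fail to beat
    the best validation error seen before them."""
    if len(val_errors) <= patience:
        return False
    best_so_far = min(val_errors[:-patience])
    return min(val_errors[-patience:]) >= best_so_far

# Validation error falls, then rises: the rule signals a stop.
history = [1.0, 0.8, 0.6, 0.5, 0.55, 0.6, 0.65, 0.7, 0.75]
print(should_stop(history, patience=5))
```

Monitoring a held-out set this way is what distinguishes the rule from simply stopping when the training error is small, which is exactly the condition that permits overtraining.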

Bayesian neural networks (BNNs) are an alternative to the more traditional ANNs. The main advantage of BNNs is that they are less prone to overtraining than ANNs. BNNs use Bayesian probability theory for network training, with the network weights determined by Bayesian inference. BNNs have been used successfully, together with automatic relevance determination (ARD), to select relevant descriptors for modeling aqueous solubility [89]. For a good review of BNNs, see Ref. [90]. [Pg.390]

These originate from the machine learning field and compare well with, and often exceed, artificial neural networks for data modelling. The method prevents overtraining and does not require... [Pg.500]

Although a majority of the published ADMET models are based on linear multivariate methods, as discussed in Section 16.3.3.1, other, nonlinear methods have also been employed. The most commonly used nonlinear method in ADMET modeling is neural networks (NNs). Back-propagation NNs have been used to model absorption and permeation, as well as solubility and toxicological effects. A particular problem for many NNs is their tendency to overtrain (see further discussion on model validation in Section 16.3.3.4), which must be closely monitored to prevent the derived model from becoming an "encyclopedia": that is, a model that can perfectly explain the variance of the investigated property for the compounds used to derive it, but has quite poor predictive ability for new compounds. [Pg.1013]

It seems too early to judge the real suitability of neural nets for QSAR studies; further investigations that compare classical structure-activity analyses with results from neural networks (e.g. [570]) are required to evaluate the scope and limitations of neural nets. Some problems of neural networks, e.g. the design of the network, lack of convergence, chance correlations, and overtraining of the network, have been discussed and critically commented on [562, 567-570]. [Pg.89]

Neural Network Studies. 1. Comparison of Overfitting and Overtraining. [Pg.346]

One problem frequently encountered with neural networks is that of overtraining. This occurs when the net trains so closely to the training set that any other data will not be recognized. In this case, when the net is tested, examples that were in the training set will give highly accurate results, while other examples may be wildly inaccurate. [Pg.356]
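The symptom described above, perfect recall on training examples alongside failure on everything else, can be caricatured with a toy model. This sketch is not from the source: the "model" is deliberately an extreme memorizer (a lookup table), and the target function and set sizes are arbitrary choices made for illustration.

```python
import random

# Toy illustration of memorization: a lookup-table "model" that
# reproduces its training targets exactly but cannot generalize.
random.seed(1)
target = lambda x: x * x                      # the true relationship

train_x = [random.uniform(-1, 1) for _ in range(10)]
table = {x: target(x) for x in train_x}       # "training" = memorizing

def predict(x):
    # Exact recall for seen inputs; an arbitrary fallback otherwise.
    return table.get(x, 0.0)

train_err = sum((predict(x) - target(x)) ** 2 for x in train_x) / 10
test_x = [random.uniform(-1, 1) for _ in range(10)]
test_err = sum((predict(x) - target(x)) ** 2 for x in test_x) / 10
print(train_err, test_err)   # zero on the training set, nonzero on new data
```

A real overtrained network is less extreme than a lookup table, but the diagnostic is the same: a large gap between training-set and unseen-set error.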

However, like other regression methods, standard back-propagation neural nets are still prone to overtraining, overfitting, and validation problems. They introduce an additional problem related to overfitting—the need to optimize the neural network architecture. We summarize a number of developments in neural nets, from our work and that of others, which have overcome these shortcomings and allow neural networks to develop very robust models for use in combinatorial discovery. [Pg.331]

To avoid overtraining, the data set is divided into a training set, a cross-validation set, and an external prediction set. The neural network is trained using the compounds from the training set only. Periodically, the training process is halted and the property values for the members of the cross-validation set are predicted. A running tally is kept of the... [Pg.2326]
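The three-way split and periodic monitoring described above can be sketched with a deliberately simple stand-in model. This is an illustrative reconstruction, not the procedure from the source: a one-parameter linear fit replaces the neural network, and the data, learning rate, and set sizes are invented for the example.

```python
import random

# Sketch of the split-and-monitor scheme: train on the training set,
# periodically score the cross-validation set, keep the best weights,
# and reserve the external prediction set for the final assessment.
random.seed(0)
data = [(x / 10, 2 * (x / 10) + random.gauss(0, 0.1)) for x in range(30)]
random.shuffle(data)
train, cross_val, external = data[:18], data[18:24], data[24:]

def mse(w, pts):
    return sum((w * x - y) ** 2 for x, y in pts) / len(pts)

w, lr = 0.0, 0.05
best_w, best_err = w, mse(w, cross_val)
for epoch in range(200):
    # Gradient step uses the training set only
    grad = sum(2 * (w * x - y) * x for x, y in train) / len(train)
    w -= lr * grad
    # Periodic halt: score the cross-validation set, keep the best model
    err = mse(w, cross_val)
    if err < best_err:
        best_w, best_err = w, err

# The external prediction set is touched only once, at the end
print(best_w, mse(best_w, external))
```

The key design point, as in the excerpt, is that the cross-validation set steers when to stop (here, which weights to keep), while the external set plays no role in training and so gives an unbiased estimate of predictive ability.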


See other pages where Neural networks overtraining is mentioned: [Pg.107]    [Pg.500]    [Pg.109]    [Pg.481]    [Pg.394]    [Pg.355]    [Pg.84]    [Pg.213]    [Pg.163]    [Pg.180]    [Pg.351]    [Pg.60]    [Pg.94]    [Pg.2405]    [Pg.653]    [Pg.302]    [Pg.116]    [Pg.116]    [Pg.209]    [Pg.1181]    [Pg.405]    [Pg.436]    [Pg.436]    [Pg.153]    [Pg.83]    [Pg.90]    [Pg.18]    [Pg.138]    [Pg.350]    [Pg.357]    [Pg.232]    [Pg.100]    [Pg.139]    [Pg.163]    [Pg.109]    [Pg.538]    [Pg.334]   
See also in sourсe #XX -- [ Pg.89 ]



