
Limitations of the Neural Networks

There is no doubt about the useful capability of ANNs to yield good solutions in many different applications. They are especially suited to obtaining a solution when noise or strong nonlinearities are present in the data. Nevertheless, they are not trivial to apply and, as with any other regression method, they have disadvantages. [Pg.263]

Kateman [12] considered that ANNs are not really based on theoretical models and that they therefore lack a structured and systematic development. Their implementation and optimisation rely on a trial-and-error approach. This is labour-demanding and time-consuming, and there is always doubt about whether the selected model was the best one that could have been found. [Pg.263]
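A minimal sketch of what such a trial-and-error optimisation looks like in practice, assuming Python with scikit-learn (neither is prescribed by the text); the synthetic data, the candidate topologies and the GridSearchCV settings are illustrative assumptions only:

```python
# Illustrative trial-and-error search over ANN topology and regularisation.
# Every candidate is simply trained and scored; no theory selects the grid.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))                 # 100 samples, 10 predictors (synthetic)
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=100)

param_grid = {
    "hidden_layer_sizes": [(2,), (5,), (10,), (5, 5)],  # candidate topologies
    "alpha": [1e-4, 1e-3, 1e-2],                        # weight-decay strengths
}
search = GridSearchCV(
    MLPRegressor(max_iter=5000, random_state=0),
    param_grid,
    cv=5,
    scoring="neg_root_mean_squared_error",
)
search.fit(X, y)
print(search.best_params_, -search.best_score_)
```

Even an exhaustive grid of this kind only guarantees the best model within the grid, which is exactly the doubt raised above.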

A typical drawback of neural nets is that they need a relatively high number of samples. Nevertheless, good PLS model development also needs as many samples as possible and three data sets [calibration, test (or control) and validation]. [Pg.263]
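As a sketch of the three-way split mentioned above, again assuming Python with scikit-learn; the set names, proportions and seeds are illustrative assumptions, not a prescription from the text:

```python
# Split the available samples into calibration, test (control) and validation sets.
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 8))    # synthetic predictors
y = rng.normal(size=150)         # synthetic response

# Hold out a validation set first, then split the rest into calibration and test.
X_rest, X_val, y_rest, y_val = train_test_split(X, y, test_size=0.2, random_state=1)
X_cal, X_test, y_cal, y_test = train_test_split(X_rest, y_rest, test_size=0.25, random_state=1)

print(len(X_cal), len(X_test), len(X_val))   # 90, 30, 30 samples
```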

Although ANNs can, in theory, model any relationship between predictors and predictands, common regression methods such as PLS have been found to outperform ANN solutions when linear or only slightly nonlinear problems are considered [1-5]. In fact, although ANNs can model linear relationships, they require a long training time, since a nonlinear technique is being applied to linear data. Although, ideally, for a perfectly linear and noise-free data set the ANN performance tends asymptotically towards that of the linear model, in practical situations ANNs only reach a performance qualitatively similar to that of linear methods. Therefore, it does not seem reasonable to apply them before simpler alternatives have been considered. [Pg.264]
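The point can be illustrated with a small comparison, assuming Python with scikit-learn (a hypothetical example, not the studies cited in [1-5]); on a purely linear, low-noise data set a PLS model with a few components typically matches the small ANN while training far faster:

```python
# Compare PLS and a small ANN on synthetic, essentially linear data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 6))
y = X @ np.array([1.0, -0.5, 2.0, 0.0, 0.3, 1.5]) + 0.05 * rng.normal(size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=2)

pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=10000,
                   random_state=2).fit(X_tr, y_tr)

print("PLS RMSE:", mean_squared_error(y_te, pls.predict(X_te).ravel()) ** 0.5)
print("ANN RMSE:", mean_squared_error(y_te, ann.predict(X_te).ravel()) ** 0.5)
```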


The history of neural networks, at least in its popular version, has its angels and demons, and one of the demons is Marvin Minsky. Minsky and Papert wrote a book entitled Perceptrons (1969) in which they showed, among other things, some of the limitations of the neural networks that were popular at the time. This was interpreted by some as a very restrictive limit on all kinds of neural networks, and as marking the end of an era of funding for the then-popular neural networks, making room for the more classical AI approaches promoted by Minsky and others. [Pg.335]

