Big Chemical Encyclopedia

Chemical substances, components, reactions, process design ...


In neural networks

Since 1970 the subject of amorphous semiconductors, in particular silicon, has progressed from obscurity to commercial products such as flat-panel liquid crystal displays, linear sensor arrays for facsimile machines, inexpensive solar panels, and electrophotography. Many other applications are at the developmental stage, such as nuclear particle detectors, medical imaging, spatial light modulators for optical computing, and switches in neural networks (1,2). [Pg.357]

In neural network design, these parameters have no precise answers because they depend on the particular application. The question is nonetheless worth addressing. In general, the more patterns and the fewer hidden neurons used, the better the network. There is a subtle relationship between the number of patterns and the number of hidden-layer neurons: having too few patterns or too many hidden neurons can cause the network to memorize. When memorization occurs, the network performs well during training but tests poorly on a new data set. [Pg.9]
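The pattern-versus-hidden-neuron trade-off above can be sketched as a simple capacity check. The function names and the safety factor of 2 below are illustrative assumptions, not a rule from the text; the point is only that memorization becomes likely when adjustable weights rival the number of training patterns.

```python
# Hypothetical capacity check: to discourage memorization, keep the number
# of training patterns comfortably above the number of adjustable weights.
# (The factor of 2 is an illustrative assumption, not a published rule.)

def n_weights(layer_sizes):
    """Weights + biases in a fully connected feed-forward network."""
    return sum((a + 1) * b for a, b in zip(layer_sizes, layer_sizes[1:]))

def risk_of_memorization(n_patterns, layer_sizes, factor=2.0):
    """Flag geometries where patterns do not outnumber weights by `factor`."""
    return n_patterns < factor * n_weights(layer_sizes)

print(n_weights([4, 3, 1]))                   # (4+1)*3 + (3+1)*1 = 19
print(risk_of_memorization(50, [4, 3, 1]))    # 50 >= 2*19  -> False
print(risk_of_memorization(50, [4, 20, 1]))   # 20 hidden units -> True
```

Enlarging the hidden layer from 3 to 20 neurons pushes the weight count past the pattern count, which is exactly the memorization regime the excerpt warns about.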

The literature on neural networks has been expanding at an enormous rate, driven by the development of new and efficient algorithms. Neural networks have been shown to have enormous processing capability, and the authors have implemented many hybrid approaches based on this technique. The authors have applied an ANN-based approach in several areas of polymer science, and the overall results have been very encouraging. The case studies and algorithms presented in this chapter were very simple to implement. Given the rate at which new approaches appear, readers may find other paradigms that open new opportunities in their area of interest. [Pg.31]

Kramer, M. A., Thompson, M. L. and Bhagat, P. M., Embedding theoretical models in neural networks. Proc. Am. Control Conf. 475 (1992). [Pg.204]

D. Domine, D. Wienke, J. Devillers and L.M.C. Buydens, A new nonlinear neural mapping technique for visual exploration of QSAR data. In Neural Networks in QSAR and Drug Design, J. Devillers (ed.), Academic Press, London, 1996, pp. 223-253. [Pg.699]

Davis, J. F., and Wang, C. M., Pattern-based interpretation of on-line process data, in Neural Networks for Chemical Engineers (A. Bulsari, ed.), Elsevier Science, 1995, pp. 443-470. [Pg.98]

Kavuri, S. N., and Venkatasubramanian, V., Using fuzzy clustering with ellipsoidal units in neural networks for robust fault classification, Comput. Chem. Eng. 17(8), 765 (1993). [Pg.99]

Anzali, S., Barnickel, G., Krug, M., Sadowski, J., Wagener, M. and Gasteiger, J. (1996) Evaluation of molecular surface properties using a Kohonen neural network. In Neural Networks in QSAR and Drug Design, Devillers, J. (Ed.), Academic Press, London. [Pg.79]

The lack of a recipe for adjusting the weights of connections into hidden nodes brought research in neural networks to a virtual standstill until the publication by Rumelhart, Hinton, and Williams2 of a technique now known as backpropagation (BP). This offered a way out of the difficulty. [Pg.30]
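The backpropagation idea the excerpt credits to Rumelhart, Hinton, and Williams can be sketched in a few lines: the output error is propagated backwards through the transfer-function derivatives to yield weight updates for the hidden layer, the very weights earlier rules could not adjust. This is a generic illustration, not the paper's exact formulation; the XOR task, network size, and learning rate are arbitrary choices.

```python
import numpy as np

# Minimal backpropagation sketch (generic illustration, not the original
# paper's formulation): propagate the output error back through the hidden
# layer to obtain gradients for the hidden-layer connection weights.
rng = np.random.default_rng(1)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)        # XOR targets

W1 = rng.normal(size=(2, 3)); b1 = np.zeros(3)         # hidden layer
W2 = rng.normal(size=(3, 1)); b2 = np.zeros(1)         # output layer
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

def mse():
    h = sig(X @ W1 + b1)
    return float(np.mean((sig(h @ W2 + b2) - y) ** 2))

loss_before = mse()
lr = 0.5
for _ in range(5000):
    h = sig(X @ W1 + b1)                    # forward pass
    out = sig(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)     # output-layer error signal
    d_h = (d_out @ W2.T) * h * (1 - h)      # error propagated backwards
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)
loss_after = mse()
print(loss_after < loss_before)
```

The `d_h` line is the step that was missing before BP: it converts the output error into a training signal for connections *into* hidden nodes.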

Overfitting is a potentially serious problem in neural networks. It is tackled in two ways (1) by continually monitoring the quality of training as it occurs using a test set, and (2) by ensuring that the geometry of the network (its size and the way the nodes are connected) is appropriate for the size of the dataset. [Pg.38]
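Strategy (1) above, monitoring a test set during training, amounts to early stopping. A minimal sketch, with synthetic loss values standing in for a real training run and a `patience` parameter that is an illustrative assumption:

```python
# Sketch of monitoring a held-out set during training: stop when the
# held-out loss has failed to improve for `patience` consecutive epochs,
# even if the training loss is still falling. Loss values are synthetic.

def early_stop(val_losses, patience=2):
    """Return the epoch with the best held-out loss seen before stopping."""
    best, best_epoch, bad = float("inf"), 0, 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch, bad = loss, epoch, 0
        else:
            bad += 1
            if bad >= patience:
                break
    return best_epoch

# Held-out loss turns upward after epoch 3 -- the onset of overfitting:
val = [1.0, 0.6, 0.4, 0.35, 0.4, 0.5, 0.7]
print(early_stop(val))  # 3
```

Stopping at the best held-out epoch, rather than at the lowest training error, is what prevents the memorization behaviour described earlier on this page.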

A. Hjelmfelt and J. Ross, Pattern recognition, chaos, and multiplicity in neural networks of excitable systems, Proc. Natl. Acad. Sci. USA, 91, 63-67 (1994). [Pg.143]

In neural networks the interactions between neurons are highly nonlinear. Models based on the concept of dissipative structures can be constructed that account for spatiotemporal patterns in epileptic seizures. [Pg.33]

Luo [86] proposed a kind of neural cluster structure embedded in neural networks. The ANN is based on the error back-propagation learning... [Pg.274]

It is very important to note that this method automatically accounts for the interactions between taste substances. If many measurements of various mixed solutions are made, the output pattern of a test solution can easily be compared with those of the mixed solutions by suitable algorithms, such as neural networks. [Pg.398]
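The comparison step described above can be sketched as nearest-pattern matching: the test solution's sensor output vector is matched against stored patterns of known mixed solutions. A simple Euclidean-distance match stands in for the neural-network comparison the text mentions; the solution names and pattern values are hypothetical.

```python
import math

# Sketch: match a test solution's output pattern to the closest stored
# mixed-solution pattern. A distance match is a stand-in for the
# neural-network comparison in the text; all data here are hypothetical.

def nearest_pattern(test, references):
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(references, key=lambda name: dist(test, references[name]))

references = {
    "salty+sour":   [0.8, 0.7, 0.1],   # stored sensor output patterns
    "sweet+bitter": [0.1, 0.2, 0.9],
}
print(nearest_pattern([0.75, 0.65, 0.2], references))  # salty+sour
```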

So, the basic neuron can be seen as having two operations, summation and thresholding, as illustrated in Figure 2.5. Other forms of thresholding and, indeed, other transfer functions are commonly used in neural network modeling; some of these will be discussed later. For input neurons, the transfer function is typically assumed to be unity, i.e., the input signal is passed through without modification as output to the next layer, F(x) = x. [Pg.24]
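The two-stage neuron described above can be written directly: a weighted summation followed by a thresholding (transfer) function. The hard-threshold and sigmoid choices below are common examples; the specific weights are arbitrary.

```python
import math

# Sketch of the two-stage neuron: weighted summation, then a
# thresholding (transfer) function. Weights and inputs are arbitrary.

def neuron(inputs, weights, bias, transfer):
    s = sum(x * w for x, w in zip(inputs, weights)) + bias  # summation
    return transfer(s)                                      # thresholding

step = lambda s: 1.0 if s >= 0 else 0.0          # hard threshold
sigmoid = lambda s: 1.0 / (1.0 + math.exp(-s))   # smooth threshold
identity = lambda s: s    # unity transfer, as used for input neurons

print(neuron([1.0, 0.5], [0.4, -0.2], 0.0, step))  # sum = 0.3 -> 1.0
print(round(neuron([1.0, 0.5], [0.4, -0.2], 0.0, sigmoid), 3))  # 0.574
```

Swapping `transfer` changes the neuron's behaviour without touching the summation stage, which is why the two operations are usually described separately.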

Ripley, B. D. (1997b). Statistical ideas for selecting network architectures. In Neural Networks: Artificial Intelligence and Industrial Applications (ed. B. Kappen and S. Gielen), pp. 183-190. Springer. [Pg.151]

Current trends in neural networks favor smaller networks with minimal architecture. Two major advantages of smaller networks, discussed previously, are better generalization capability (Section 8.4) and easier rule extraction (Section 13.2). Another advantage is better predictive accuracy, seen when a large network is replaced by many smaller networks, each handling a subtask or a subset of the data. A typical example is the protein classification problem, where n individual networks can be used to classify n different protein families, improving on the prediction accuracy of one large network with n output units. The improvement is especially significant when there is sufficient data for fine-tuning individual neural networks to the particularities of the data subsets. The use of ensembles of small, customized neural networks to improve predictive accuracy has been demonstrated in numerous cases. [Pg.156]
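The one-network-per-family scheme described above can be sketched as a one-vs-rest ensemble: each small network scores membership in one family, and the test example is assigned to the highest-scoring family. The "networks" below are hypothetical stand-in scoring functions, not trained models, and the family names and features are invented for illustration.

```python
# Sketch of n small networks replacing one large n-output network.
# Each "network" here is a hypothetical stand-in scoring function
# returning a membership score in [0, 1]; names/features are invented.

def classify(example, family_nets):
    """Assign the example to the family whose network scores highest."""
    scores = {name: net(example) for name, net in family_nets.items()}
    return max(scores, key=scores.get)

nets = {
    "kinase":   lambda x: x["has_atp_site"],
    "protease": lambda x: x["has_catalytic_triad"],
    "globin":   lambda x: x["has_heme_pocket"],
}

example = {"has_atp_site": 0.1, "has_catalytic_triad": 0.9,
           "has_heme_pocket": 0.2}
print(classify(example, nets))  # protease
```

Because each network sees only one family's discrimination task, it can be kept small and tuned to that family's data subset, which is where the excerpt locates the accuracy gain.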



