Big Chemical Encyclopedia


Neural applications

Similar neural applications of aligned nanofibers include spinal cord repair and regeneration of aligned neural tracts in the brain. These applications, however, are much more difficult to address because of the increased inflammatory response in CNS tissue, as well as the lack of natural regeneration in CNS tissues. [Pg.199]

Laminin, another ECM protein, has also been used to modify materials, particularly for neural applications. When laminin was covalently immobilized on 3D agarose hydrogels, neurite extension from embryonic chick DRGs was stimulated. Further, immobilization was necessary, since laminin simply mixed into the agarose gel was unable to stimulate neurite extension (Yu et al., 1999). Uniform laminin substrates have also been used for the attachment of Schwann cells, but no preferential cell orientation was shown. When micropatterns of laminin and albumin were fabricated on... [Pg.373]

Calcium uptake by muscle sarcoplasmic reticulum following neural application of... [Pg.340]

We have presented a neural network based spectrum classifier (NSC) aimed at ultrasonic resonance spectroscopy. Ultrasonic spectroscopy and the NSC have been evaluated in many industrial applications, such as concrete inspection and the testing of aerospace composite structures, ball bearings, and aircraft multi-layer structures. The latter application has been presented in some detail. [Pg.111]

The results obtained with NSC in different applications show that both flaw detection and localization can be performed automatically by the use of a neural network classifier. [Pg.111]

Freeman, J. A., Skapura, D. M.: Neural Networks: Algorithms, Applications and Programming Techniques. Computation and Neural Systems Series. Addison-Wesley Publishing Company, 1991. [Pg.466]

This format was developed in our group and is used fruitfully in SONNIA, software for producing Kohonen Self-Organizing Maps (KSOM) and Counter-Propagation (CPG) neural networks for chemical applications [6]. This file format is ASCII-based, contains the entire information about patterns, and usually comes with the extension ".dat". [Pg.209]
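Since this excerpt does not spell out the layout of the format, the following is only a hypothetical sketch of loading a generic ASCII pattern file of this kind; the whitespace-separated, one-pattern-per-line layout with an optional trailing class label is an assumption made for illustration, not the documented SONNIA specification.

```python
# Hypothetical reader for an ASCII pattern file; the column layout assumed
# here (one pattern per line, whitespace-separated feature values, optional
# trailing class label) is illustrative, not the actual SONNIA ".dat" spec.
import numpy as np

def read_pattern_file(path, has_label=True):
    """Return (features, labels) parsed from an ASCII pattern file."""
    features, labels = [], []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):  # skip blanks and comments
                continue
            fields = line.split()
            if has_label:
                features.append([float(v) for v in fields[:-1]])
                labels.append(fields[-1])
            else:
                features.append([float(v) for v in fields])
    return np.asarray(features), labels
```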

To understand neural networks, especially Kohonen, counter-propagation and back-propagation networks, and their applications... [Pg.439]

The usage of a neural network varies depending on the aim and especially on the network type. This tutorial covers two applications: on the one hand, the use of a Kohonen network for classification, and on the other, the prediction of object properties with a counter-propagation network. [Pg.463]
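As a rough illustration of the first of these two uses, the sketch below trains a small Kohonen map on toy data with plain numpy and classifies a new pattern by the label of its winning neuron; the grid size, learning schedules, and toy clusters are illustrative choices, not SONNIA's actual settings.

```python
# A toy Kohonen-network classifier in plain numpy: train the map without
# labels, label each neuron by majority vote of the patterns it wins, then
# classify new data. Grid size, schedules, and data are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def train_som(X, rows=5, cols=5, epochs=20, lr0=0.5, sigma0=2.0):
    """Train a Kohonen map; returns weights of shape (rows, cols, n_features)."""
    W = rng.random((rows, cols, X.shape[1]))
    grid = np.dstack(np.mgrid[0:rows, 0:cols]).astype(float)  # neuron coords
    total, step = epochs * len(X), 0
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            step += 1
            frac = step / total
            lr = lr0 * (1.0 - frac) + 0.01        # decaying learning rate
            sigma = sigma0 * (1.0 - frac) + 0.1   # shrinking neighbourhood
            d = np.linalg.norm(W - x, axis=2)     # distance to every neuron
            win = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighbourhood pulls neurons near the winner toward x
            g = np.exp(-((grid - np.asarray(win)) ** 2).sum(axis=2) / (2 * sigma**2))
            W += lr * g[..., None] * (x - W)
    return W

def winner(W, x):
    d = np.linalg.norm(W - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

# two toy clusters in 3-D feature space
X = np.vstack([rng.normal(0.2, 0.05, (20, 3)), rng.normal(0.8, 0.05, (20, 3))])
y = np.array([0] * 20 + [1] * 20)
W = train_som(X)

# label each neuron by the majority class of the training patterns it wins
votes = {}
for xi, yi in zip(X, y):
    votes.setdefault(winner(W, xi), []).append(yi)
labels = {n: np.bincount(ys).argmax() for n, ys in votes.items()}

print(labels.get(winner(W, np.array([0.75, 0.8, 0.82])), "unmapped"))  # expect 1
```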

The possibilities for the application of neural networks in chemistry are huge [10]. They can be used for various tasks: for the classification of structures or reactions, for establishing spectra-structure correlations, for modeling and predicting biological activities, or for mapping the electrostatic potential on molecular surfaces. [Pg.464]

Table 9-3 can act as a guideline for the proper selection of a neural network method. It summarizes the different network types and their learning strategy, and lists different types of applications. [Pg.464]

Table 9-3. Summary of neural network types and different types of applications.
The models are applicable to large data sets with rapid calculation speed, and a wide range of compounds can be processed. Neural networks provided better models than multilinear regression analysis. [Pg.504]

Figure 10.2-9. Application of a counterpropagation neural network as a look-up table for IR spectra simulation. The winning neuron, which contains the RDF code in the upper layer of the network, points to the simulated IR spectrum in the lower layer.
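A minimal sketch of this look-up-table idea, assuming numpy and purely synthetic stand-ins for the RDF codes and IR spectra: a winner-take-all upper layer is trained on the descriptors, each neuron's lower-layer vector is set to the mean spectrum of the patterns it wins, and a query then retrieves the spectrum stored at its winning neuron.

```python
# Sketch of a counterpropagation look-up table with synthetic stand-ins for
# RDF codes and IR spectra; all sizes and the training schedule are
# illustrative, not the settings used in the work cited here.
import numpy as np

rng = np.random.default_rng(1)

n_patterns, rdf_dim, spec_dim, n_neurons = 200, 32, 64, 25
rdf = rng.random((n_patterns, rdf_dim))       # stand-in RDF descriptors
spectra = rng.random((n_patterns, spec_dim))  # stand-in IR spectra

# upper layer: winner-take-all competitive training on the RDF codes
upper = rng.random((n_neurons, rdf_dim))
for lr in np.linspace(0.5, 0.01, 30):         # decaying learning rate
    for x in rdf:
        win = np.argmin(np.linalg.norm(upper - x, axis=1))
        upper[win] += lr * (x - upper[win])

# lower layer: each neuron stores the mean spectrum of the patterns it wins
wins = np.array([np.argmin(np.linalg.norm(upper - x, axis=1)) for x in rdf])
lower = np.zeros((n_neurons, spec_dim))
for k in range(n_neurons):
    if np.any(wins == k):
        lower[k] = spectra[wins == k].mean(axis=0)

# query: the winning neuron for a new RDF code points to its stored spectrum
query = rng.random(rdf_dim)
simulated = lower[np.argmin(np.linalg.norm(upper - query, axis=1))]
```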
In recent decades, much attention has been paid to the application of artificial neural networks as a tool for spectral interpretation (see, e.g., Refs. [104, 105]). The ANN approach applied to vibrational spectra allows the determination of adequate functional groups that can exist in the sample, as well as the complete interpretation of spectra. Elyashberg [106] reported an overall prediction accuracy using ANN of about 80% that was achieved for general-purpose approaches. Klawun and Wilkins managed to increase this value to about 95% [107]. [Pg.536]

Neural networks have been applied to IR spectrum interpreting systems in many variations and applications. Anand [108] introduced a neural network approach to analyze the presence of amino acids in protein molecules with a reliability of nearly 90%. Robb and Munk [109] used a linear neural network model for interpreting IR spectra for routine analysis purposes, with a similar performance. Ehrentreich et al. [110] used a counterpropagation network based on a strategy of Novic and Zupan [111] to model the correlation of structures and IR spectra. Penchev and co-workers [112] compared three types of spectral features derived from IR peak tables for their ability to be used in automatic classification of IR spectra. [Pg.536]

The results presented here imply that a similar approach can be used for comparing two different libraries, for determining the degree of overlap between the compounds in these two libraries. Examples of the application of artificial neural networks or GA in drug design are given in [57, 58, 84, 85]. [Pg.615]

Andrea T A and H Kalayeh 1991. Applications of Neural Networks in Quantitative Structure-Activity Relationships of Dihydrofolate Reductase Inhibitors. Journal of Medicinal Chemistry 34:2824-2836. A-Razzak M and R C Glen 1992. Applications of Rule-induction in the Derivation of Quantitative Structure-Activity Relationships. Journal of Computer-Aided Molecular Design 6:349-383. [Pg.736]

Transfer function models are linear in nature, but chemical processes are known to exhibit nonlinear behavior. One could use the same type of optimization objective as given in Eq. (8-26) to determine parameters in nonlinear first-principle models, such as Eq. (8-3) presented earlier. Also, nonlinear empirical models, such as neural network models, have recently been proposed for process applications. The key to the use of these nonlinear empirical models is having high-quality process data, which allows the important nonlinearities to be identified. [Pg.725]
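As a hedged illustration of such a nonlinear empirical model, the sketch below fits a one-hidden-layer network to synthetic process data by gradient descent on a sum-of-squares objective, standing in for the objective of Eq. (8-26), which this excerpt does not reproduce; the data, network size, and learning rate are arbitrary choices.

```python
# Sketch of fitting a nonlinear empirical model: a one-hidden-layer network
# trained by gradient descent on a sum-of-squares objective. Data, network
# size, and learning rate are illustrative, not from the source text.
import numpy as np

rng = np.random.default_rng(2)

# synthetic process data: output is a nonlinear function of one input
u = np.linspace(-1.0, 1.0, 100)[:, None]
y = np.tanh(3.0 * u) + 0.05 * rng.standard_normal(u.shape)

n_hidden, lr = 8, 0.1
W1, b1 = rng.standard_normal((1, n_hidden)), np.zeros(n_hidden)
W2, b2 = rng.standard_normal((n_hidden, 1)), np.zeros(1)

for _ in range(2000):
    h = np.tanh(u @ W1 + b1)        # hidden-layer activations
    y_hat = h @ W2 + b2             # model prediction
    err = y_hat - y                 # residuals of the least-squares objective
    # backpropagate the squared-error gradient through both layers
    gW2, gb2 = h.T @ err / len(u), err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)
    gW1, gb1 = u.T @ dh / len(u), dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(float(np.mean((y_hat - y) ** 2)))  # final mean squared error
```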

A key feature of MPC is that a dynamic model of the process is used to predict future values of the controlled outputs. There is considerable flexibility concerning the choice of the dynamic model. For example, a physical model based on first principles (e.g., mass and energy balances) or an empirical model could be selected. Also, the empirical model could be a linear model (e.g., transfer function, step response model, or state space model) or a nonlinear model (e.g., neural net model). However, most industrial applications of MPC have relied on linear empirical models, which may include simple nonlinear transformations of process variables. [Pg.740]
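A minimal sketch of the prediction step only, assuming numpy: a linear step-response model, one of the empirical model types named above, predicts future controlled outputs for a candidate sequence of input moves by superposition. The first-order-like coefficients and the move sequence are illustrative values; a full MPC controller would wrap an optimizer around this prediction.

```python
# Sketch of the MPC prediction step with a linear step-response model; the
# response coefficients and the candidate move sequence are made-up values.
import numpy as np

# step-response coefficients: output i steps after a unit step in the input
S = 1.0 - np.exp(-np.arange(1, 31) / 8.0)

def predict(y0, du, S):
    """Predict outputs over the horizon for input moves du, by superposition."""
    P = len(du)
    y = np.empty(P)
    for k in range(P):  # output k+1 steps ahead
        y[k] = y0 + sum(S[i] * du[k - i] for i in range(k + 1))
    return y

du = np.array([0.5, 0.0, 0.0, -0.2, 0.0])  # candidate future input moves
print(predict(0.0, du, S))
# a full MPC controller would choose du by optimizing these predictions
# against a reference trajectory, subject to input and output constraints
```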

SS So, M Karplus. Evolutionary optimization in quantitative structure-activity relationships: An application of genetic neural networks. J Med Chem 39:1521-1530, 1996. [Pg.367]

TA Andrea, H Kalayeh. Applications of neural networks in quantitative structure-activity relationships of dihydrofolate reductase inhibitors. J Med Chem 34:2824-2836, 1991. [Pg.367]

Application of neural networks to modelling, estimation and control... [Pg.358]

Neural networks can also be classified by their neuron transfer function, which is typically either linear or nonlinear. The earliest models used linear transfer functions, in which the output values were continuous. Linear functions are not very useful for many applications because most problems are too complex to be handled by simple multiplication. In a nonlinear model, the output of the neuron is a nonlinear function of the sum of the inputs, and can have a very complicated relationship with the activation value. [Pg.4]
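The contrast can be made concrete with a few lines of Python (a sketch with arbitrary weights and inputs): the same weighted-sum activation is passed once through a linear transfer function and once through a nonlinear sigmoid.

```python
# Sketch contrasting linear and nonlinear transfer functions on the same
# weighted-sum activation; weights, bias, and inputs are arbitrary values.
import numpy as np

def neuron(x, w, b, transfer):
    """Weighted sum of inputs passed through the chosen transfer function."""
    return transfer(np.dot(w, x) + b)

linear = lambda a: a                           # output proportional to activation
sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))   # bounded, nonlinear output

x = np.array([0.2, -1.5, 0.7])
w, b = np.array([0.4, 0.3, -0.8]), 0.1
print(neuron(x, w, b, linear))    # unbounded continuous output
print(neuron(x, w, b, sigmoid))   # squashed into (0, 1)
```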

In neural network design, the above parameters have no precise answers, because they depend on the particular application. The question is nevertheless worth addressing. In general, the more patterns and the fewer hidden neurons used, the better the network. There is a subtle relationship between the number of patterns and the number of hidden-layer neurons: having too few patterns or too many hidden neurons can cause the network to memorize. When memorization occurs, the network performs well during training but tests poorly on a new data set. [Pg.9]
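One hedged way to make this rule of thumb operational is to compare the number of training patterns against the number of adjustable weights implied by a candidate hidden-layer size; the margin factor below is an assumption for illustration, not a fixed rule from the text.

```python
# Rule-of-thumb check: a fully connected one-hidden-layer network can start
# to memorize when the training set is not comfortably larger than its
# weight count. The margin factor is an assumption, not a rule from the text.
def memorization_risk(n_patterns, n_inputs, n_hidden, n_outputs, margin=2.0):
    """True when the network has enough free parameters to memorize."""
    n_weights = (n_inputs + 1) * n_hidden + (n_hidden + 1) * n_outputs
    return n_patterns < margin * n_weights

# 50 patterns against 201 weights: expect good training fit but poor tests
print(memorization_risk(n_patterns=50, n_inputs=8, n_hidden=20, n_outputs=1))
```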

Applications of neural networks are becoming more diverse in chemistry [31-40]. Some typical applications include predicting chemical reactivity, acid strength in oxides, protein structure determination, quantitative structure property relationship (QSPR), fluid property relationships, classification of molecular spectra, group contribution, spectroscopy analysis, etc. The results reported in these areas are very encouraging and are demonstrative of the wide spectrum of applications and interest in this area. [Pg.10]

As a predictive tool, an ANN is most effective within the range of the input variables covered in training; predictions that fall outside that range must be considered of questionable validity. Even so, whenever experimental data are available for validation, neural networks can be put to effective use. Since an extensive body of experimental data on polymers has been published in the literature, the application of neural networks as a predictive tool for physical, thermodynamic, and other fluid properties is promising. It is a novel technique that will continue to be used, and it deserves additional investigation and development. [Pg.32]
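A minimal sketch of such a validity check, assuming numpy: record the per-variable range seen during training and flag any query that falls outside it, so the corresponding prediction can be treated as questionable rather than trusted.

```python
# Sketch of a trained-range guard: remember the per-variable extremes seen
# during training and flag queries outside them as extrapolation, so those
# ANN predictions can be treated as questionable. Data values are made up.
import numpy as np

class RangeGuard:
    def __init__(self, X_train):
        self.lo = X_train.min(axis=0)  # per-variable trained minimum
        self.hi = X_train.max(axis=0)  # per-variable trained maximum

    def inside(self, x):
        """True only when every input lies within the trained range."""
        return bool(np.all((x >= self.lo) & (x <= self.hi)))

X_train = np.array([[0.1, 300.0], [0.5, 350.0], [0.9, 340.0]])
guard = RangeGuard(X_train)
print(guard.inside(np.array([0.4, 320.0])))  # True: interpolation
print(guard.inside(np.array([1.2, 320.0])))  # False: extrapolation, suspect
```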

