
Neural network artificial

In an effort to simulate the theoretical behavior of neuron cells, researchers in the 1940s and 1950s such as McCulloch, Pitts, and Rosenblatt developed functions that mimicked the threshold response of a synapse. Their theory of synapses held that the output of a neuron cell was at or near zero until the sum of all of the input potentials connected to it passed a certain threshold, at which point the cell's output would be at or near one. Mathematically, their neuron functions yielded a binary or a sigmoidal response to a linear combination of their inputs, and could be connected [Pg.367]
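
The threshold and sigmoidal responses described above can be written down in a few lines. The following is a minimal sketch (not taken from the cited text; the weights and thresholds are chosen purely for illustration):

import math

def threshold_neuron(inputs, weights, threshold):
    # Binary (McCulloch-Pitts style) response: output 1 once the weighted sum passes the threshold.
    activation = sum(w * x for w, x in zip(weights, inputs))
    return 1.0 if activation >= threshold else 0.0

def sigmoid_neuron(inputs, weights, threshold):
    # Smooth, sigmoidal version of the same threshold behaviour.
    activation = sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-(activation - threshold)))

# Two inputs with equal weights and a threshold of 1.5 behave like a logical AND.
print(threshold_neuron([1, 1], [1.0, 1.0], 1.5))   # 1.0
print(threshold_neuron([1, 0], [1.0, 1.0], 1.5))   # 0.0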

Artificial neural networks (ANNs) were effectively set aside for 15 years after a 1969 study by Minsky and Papert demonstrated their failure to correctly model a simple exclusive OR (XOR) function. The XOR function describes the result of an operation involving two bits (1 or 0). A simple OR function produces a value of 1 if either bit or both bits have a value of 1. The XOR differs from the OR function in the output of an operation on two bits of value 1: the XOR function yields a 0 while the OR function yields a 1. Interest in ANNs resumed in the 1980s after modifications were made to the layering of their neurons that allowed them to overcome the XOR test as well as a wide variety of other non-linear modeling challenges. [Pg.368]
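
The layering modification can be illustrated with a toy network. In the sketch below (a hand-constructed example, not from the original study, with weights picked by hand rather than learned), a single threshold unit cannot reproduce XOR, but two hidden threshold units feeding one output unit can:

def step(x):
    return 1 if x >= 0 else 0

def xor_network(a, b):
    h1 = step(a + b - 0.5)       # hidden unit 1: fires for "a OR b"
    h2 = step(a + b - 1.5)       # hidden unit 2: fires for "a AND b"
    return step(h1 - h2 - 0.5)   # output unit: "OR but not AND", i.e. XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_network(a, b))   # prints the 0, 1, 1, 0 pattern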

Predictive models are built with ANNs in much the same way as they are with MLR and PLS methods: descriptors and experimental data are used to fit (or "train", in machine-learning nomenclature) the parameters of the functions until the performance error is minimized. Neural networks differ from the previous two methods in that (1) the sigmoidal shapes of the neurons' output equations better allow them to model non-linear systems and (2) they are "subsymbolic", which is to say that the information in the descriptors is effectively scrambled once the internal weights and thresholds of the neurons are trained, making it difficult to examine the final equations to interpret the influences of the descriptors on the property of interest. [Pg.368]
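
As a hedged illustration of this fit/train loop (synthetic descriptor data and a single sigmoidal neuron, chosen only to keep the sketch short), the weights are adjusted by gradient descent until the squared prediction error stops improving:

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))                                # 20 compounds, 3 toy descriptors
y = 1 / (1 + np.exp(-(X @ np.array([1.0, -2.0, 0.5]))))     # synthetic "experimental" data

w = np.zeros(3)                                             # trainable weights
for epoch in range(2000):
    pred = 1 / (1 + np.exp(-(X @ w)))                       # sigmoidal response of the neuron
    error = pred - y
    grad = X.T @ (error * pred * (1 - pred)) / len(y)       # gradient of the squared error
    w -= 0.5 * grad                                         # gradient-descent update

print("trained weights:", w)                                # approach the generating weights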

The cooperative relationship between the structural features of molecules and many physiological processes makes artificial neural network models a frequent choice for predicting the ADMET properties of drug candidates.  [Pg.368]


Artificial Neural Networks. An Artificial Neural Network (ANN) consists of a network of nodes (processing elements) connected via adjustable weights [Zurada, 1992]. The weights can be adjusted so that a network learns a mapping represented by a set of example input/output pairs. An ANN can in theory reproduce any continuous function ℝ^n → ℝ^m, where n and m are the numbers of input and output nodes. In NDT, neural networks are usually used as classifiers... [Pg.98]
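
A minimal sketch of such an adjustable mapping from n inputs to m outputs (one hidden layer and randomly initialized weights, assumed purely for illustration):

import numpy as np

def ann_forward(x, W1, W2):
    # One hidden layer with tanh units: maps R^n to R^m.
    hidden = np.tanh(W1 @ x)
    return W2 @ hidden

n, n_hidden, m = 4, 8, 2
rng = np.random.default_rng(1)
W1 = rng.normal(scale=0.5, size=(n_hidden, n))   # adjustable weights, input -> hidden
W2 = rng.normal(scale=0.5, size=(m, n_hidden))   # adjustable weights, hidden -> output
print(ann_forward(rng.normal(size=n), W1, W2))   # an m-dimensional output vector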

Pattern Recognition with an Artificial Neural Network Applied to Waveform Data. [Pg.263]

The method that was developed builds on computed values of physicochemical effects and uses neural networks for classification. Therefore, for a deeper understanding of this form of reaction classification, later chapters should be consulted on topics such as methods for the calculation of physicochemical effects (Section 7.1) and artificial neural networks (Section 9.4). [Pg.193]

A challenging task in materials science as well as in pharmaceutical research is to custom-tailor a compound's properties. George S. Hammond stated that "the most fundamental and lasting objective of synthesis is not production of new compounds, but production of properties" (Norris Award Lecture, 1968). The molecular structure of an organic or inorganic compound determines its properties. Nevertheless, methods for the direct prediction of a compound's properties based on its molecular structure are usually not available (Figure 8-1). Therefore, the establishment of Quantitative Structure-Property Relationships (QSPRs) and Quantitative Structure-Activity Relationships (QSARs) uses an indirect approach to tackle this problem. In the first step, numerical descriptors encoding information about the molecular structure are calculated for a set of compounds. Secondly, statistical and artificial neural network models are used to predict the property or activity of interest based on these descriptors or a suitable subset. [Pg.401]
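
The two-step QSPR procedure can be sketched in a few lines. Everything below is illustrative: the three "descriptors" are deliberately simple counts derived from a molecular formula, the property values are approximate normal boiling points, and an ordinary least-squares fit stands in for the statistical or neural network model:

import numpy as np

def descriptors(formula_counts):
    # Step 1: toy descriptors - heavy-atom count, heteroatom count, hydrogen count.
    heavy = sum(v for k, v in formula_counts.items() if k != "H")
    hetero = sum(v for k, v in formula_counts.items() if k not in ("C", "H"))
    return [heavy, hetero, formula_counts.get("H", 0)]

compounds = [
    {"C": 2, "H": 6, "O": 1},    # ethanol
    {"C": 3, "H": 8, "O": 1},    # 1-propanol
    {"C": 4, "H": 10, "O": 1},   # 1-butanol
    {"C": 2, "H": 6},            # ethane
]
boiling_points = [78.4, 97.2, 117.7, -88.6]          # approximate values, deg C

X = np.array([descriptors(c) for c in compounds], dtype=float)
X = np.column_stack([np.ones(len(X)), X])            # add an intercept column
coef, *_ = np.linalg.lstsq(X, np.array(boiling_points), rcond=None)
print("fitted coefficients:", coef)                   # step 2: the (linear) QSPR model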

A structure descriptor is a mathematical representation of a molecule resulting from a procedure that transforms the structural information encoded within a symbolic representation of a molecule. This mathematical representation has to be invariant to the molecule's size and number of atoms, to allow model building with statistical methods and artificial neural networks. [Pg.403]

Chirality codes are used to represent molecular chirality by a fixed number of descriptors. These descriptors can then be correlated with molecular properties by way of statistical methods or artificial neural networks, for example. The importance of using descriptors that take different values for opposite enantiomers resides in the fact that observable properties are often different for opposite enantiomers. [Pg.420]

Kohonen networks, also known as self-organizing maps (SOMs), belong to the large group of methods called artificial neural networks. Artificial neural networks (ANNs) are techniques which process information in a way that is motivated by the functionality of biological nervous systems. For a more detailed description see Section 9.5. [Pg.441]
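
A compact sketch of how such a self-organizing map is trained (a one-dimensional grid of map neurons, with the grid size, learning-rate schedule, and neighbourhood function chosen only for illustration):

import numpy as np

rng = np.random.default_rng(2)
data = rng.random((200, 3))            # e.g. 200 objects described by 3 variables
weights = rng.random((10, 3))          # 10 map neurons arranged on a line

for t in range(1000):
    x = data[rng.integers(len(data))]
    winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))   # best-matching unit
    lr = 0.5 * (1 - t / 1000)                                      # decaying learning rate
    for j in range(len(weights)):
        influence = np.exp(-abs(j - winner))                       # neighbourhood on the grid
        weights[j] += lr * influence * (x - weights[j])            # pull towards the input

print(weights)   # neighbouring map neurons end up representing similar inputs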

Problems involving routine calculations are solved much faster and more reliably by computers than by humans. Nevertheless, there are tasks in which humans perform better, such as those in which the procedure is not strictly determined and problems which are not strictly algorithmic. One of these tasks is the recognition of patterns such as faces. For several decades people have been trying to develop methods which enable computers to achieve better results in these fields. One approach, artificial neural networks, which model the functionality of the brain, is explained in this section. [Pg.452]

Artificial Neural Networks (ANNs) are information processing units which process information in a way that is motivated by the functionality of the biological nervous system. Just as the brain consists of neurons which are connected with one another, an ANN comprises interrelated artificial neurons. The neurons work together to solve a given problem. [Pg.452]

Figure 9-16. Artificial neural network architecture with a two-layer design, comprising input units, a so-called hidden layer, and an output layer. The squares enclosing the ones depict the bias, which is an extra weight (see Ref. [10] for further details).
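
The bias convention shown in the figure (an extra weight attached to a constant input of one) can be written out as follows; the particular weights are an assumption made only so the sketch runs:

import numpy as np

def layer(x, W):
    # Forward pass of one layer; the last column of W multiplies a constant "1" (the bias).
    x_with_one = np.append(x, 1.0)                 # the "1" units in the figure
    return 1 / (1 + np.exp(-(W @ x_with_one)))     # sigmoidal outputs

x = np.array([0.2, 0.7])                           # two input units
W_hidden = np.array([[1.0, -1.0, 0.5],             # two input weights + one bias weight
                     [0.3,  0.8, -0.2]])
W_output = np.array([[1.5, -0.5, 0.1]])            # two hidden weights + one bias weight
print(layer(layer(x, W_hidden), W_output))         # hidden layer followed by output layer
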
Woodruff and co-workers introduced the expert system PAIRS [67], a program that is able to analyze IR spectra in the same manner as a spectroscopist would. Chalmers and co-workers [68] used an approach for automated interpretation of Fourier transform Raman spectra of complex polymers. Andreev and Argirov developed the expert system EXPIRS [69] for the interpretation of IR spectra. EXPIRS provides a hierarchical organization of the characteristic groups that are recognized by peak detection in discrete frames. Penchev et al. [70] recently introduced a computer system that performs searches in spectral libraries and systematic analysis of mixture spectra. It is able to classify IR spectra with the aid of linear discriminant analysis, artificial neural networks, and the method of k-nearest neighbors. [Pg.530]

In recent decades, much attention has been paid to the application of artificial neural networks as a tool for spectral interpretation (see, e.g., Refs. [104, 105]). The ANN approach applied to vibrational spectra allows the determination of the functional groups that can exist in the sample, as well as the complete interpretation of spectra. Elyashberg [106] reported an overall prediction accuracy using ANNs of about 80% for general-purpose approaches. Klawun and Wilkins managed to increase this value to about 95% [107]. [Pg.536]

In many cases, structure elucidation with artificial neural networks is limited to backpropagation networks [113] and is therefore performed in a supervised manner... [Pg.536]
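
The backpropagation training referred to here follows the standard scheme; the sketch below is generic (toy data, one hidden layer, sigmoid units) rather than a reproduction of the cited systems:

import numpy as np

rng = np.random.default_rng(3)
X = rng.random((50, 4))                                   # 50 toy input patterns
y = (X.sum(axis=1, keepdims=True) > 2).astype(float)      # supervised targets

W1 = rng.normal(scale=0.5, size=(4, 6))                   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(6, 1))                   # hidden -> output weights
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for epoch in range(2000):
    h = sigmoid(X @ W1)                    # forward pass, hidden layer
    out = sigmoid(h @ W2)                  # forward pass, output layer
    d_out = (out - y) * out * (1 - out)    # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)     # error propagated back to the hidden layer
    W2 -= 0.5 * h.T @ d_out / len(X)       # weight updates
    W1 -= 0.5 * X.T @ d_h / len(X)

print("final training error:", float(np.mean((out - y) ** 2)))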

Chemoinformatics is involved in the drug discovery process in both the lead finding and lead optimization steps. Artificial neural networks can play a decisive role at various stages in this process (cf. Section 10.4.7.1). [Pg.602]

The results presented here imply that a similar approach can be used for comparing two different libraries, for determining the degree of overlap between the compounds in these two libraries. Examples of the application of artificial neural networks or GAs in drug design are given in [57, 58, 84, 85]. [Pg.615]

Concomitantly with the increase in hardware capabilities, better software techniques will have to be developed. It will pay us to continue to learn how nature tackles problems. Artificial neural networks are a far cry from the capabilities of the human brain; measured against the information processing of the human brain, there is a lot of room left to develop more powerful artificial neural networks. Over millions of years, nature has developed efficient optimization methods for adapting to changes in the environment. The development of evolutionary and genetic algorithms will continue. [Pg.624]

Since biological systems can reasonably cope with some of these problems, the intuition behind neural nets is that computing systems based on the architecture of the brain can better emulate human cognitive behavior than systems based on symbol manipulation. Unfortunately, the processing characteristics of the brain are as yet incompletely understood. Consequently, computational systems based on brain architecture are highly simplified models of their biological analogues. To make this distinction clear, neural nets are often referred to as artificial neural networks. [Pg.539]

J. C. Hoskins, K. M. Kahyur, and D. M. Himmelblau, "The Application of Artificial Neural Networks to Fault Diagnosis in Chemical Processing," paper presented at the AIChE Spring Meeting, Houston, Tex., 1988. [Pg.541]

Sometimes fuzzy logic controllers are combined with pattern recognition software such as artificial neural networks (Kosko, Neural Networks and Fuzzy Systems, Prentice Hall, Englewood Cliffs, New Jersey, 1992). [Pg.735]

Watanabe, K., S. Hirota, L. Hou, and D.M. Himmelblau, Diagnosis of Multiple Simultaneous Fault via Hierarchical Artificial Neural Networks, AIChE Journal, 40(5), 1994, 839-848. (Neural network) [Pg.2545]

Terry, P.A. and D.M. Himmelblau, Data Rectification and Gross Error Detection in a Steady-State Process via Artificial Neural Networks, Industrial and Engineering Chemistry Research, 32, 1993, 3020-3028. (Neural networks, measurement test) [Pg.2545]

Intended Use The intended use of the model sets the sophistication required. Relational models are adequate for control within narrow bands of setpoints. Physical models are required for fault detection and design. Even when relational models are used, they are frequently developed by repeated simulations using physical models. Further, artificial neural-network models used in analysis of plant performance, including gross error detection, are in their infancy. Readers are referred to the work of Himmelblau for these developments. [For example, see Terry and Himmelblau (1993) cited in the reference list.] Process simulators are in wide use and readily available to engineers. Consequently, the emphasis of this section is to develop a preliminary physical model representing the unit. [Pg.2555]

New developments which have still to be checked for their usability in data evaluation of depth profiles are artificial neural networks [2.16, 2.21-2.25], fuzzy clustering [2.26, 2.27] and genetic algorithms [2.28]. [Pg.21]

Neural network control systems
10.3.1 Artificial neural networks

Artificial Neural Networks (ANNs) attempt to emulate their biological counterparts. McCulloch and Pitts (1943) proposed a simple model of a neuron, and Hebb (1949) described a technique which became known as Hebbian learning. Rosenblatt (1961) devised a single layer of neurons, called a Perceptron, that was used for optical pattern recognition. [Pg.347]
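
Rosenblatt's perceptron can be trained with a simple error-correction rule. The sketch below uses a toy, linearly separable two-class pattern set; the data, learning rate, and number of passes are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(loc=-1.0, size=(20, 2)),    # class 0 patterns
               rng.normal(loc=+1.0, size=(20, 2))])   # class 1 patterns
t = np.array([0] * 20 + [1] * 20)                     # target labels

w = np.zeros(2)
b = 0.0
for epoch in range(20):
    for x, target in zip(X, t):
        y = 1 if x @ w + b >= 0 else 0                # threshold (perceptron) unit
        w += 0.1 * (target - y) * x                   # perceptron learning rule
        b += 0.1 * (target - y)

print("weights:", w, "bias:", b)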

Artificial Neural Networks have the following potential advantages for intelligent control ... [Pg.348]

Woinaroschy, A., Isopescu, R. and Fillipescu, L., 1994. Crystallisation process optimisation using artificial neural networks. Chemical Engineering & Technology, 17, 269-272. [Pg.326]

Artificial Neural Networks as a Semi-Empirical Modeling Tool for Physical Property Predictions in Polymer Science... [Pg.1]

Recently, a new approach called artificial neural networks (ANNs) has been assisting engineers and scientists in their assessment of fuzzy information. Polymer scientists often face a situation where the rules governing the particular system are unknown or difficult to use. It also frequently becomes an arduous task to develop functional forms or empirical equations to describe a phenomenon. Most of these complexities can be overcome with an ANN approach because of its ability to build an internal model based solely on exposure in a training environment. The fault tolerance of ANNs has been found to be very advantageous in physical property predictions of polymers. This chapter presents a few such cases where the authors have successfully implemented an ANN-based approach for the purpose of empirical modeling. These are not exhaustive by any means. [Pg.1]

To understand why and how artificial neural networks work as they do, it is helpful to study some of the fundamentals... [Pg.1]





