Big Chemical Encyclopedia


NNs, Neural networks

MLR - multiple linear regression, PLS - partial least squares projection to latent structures, NN - neural network. [Pg.1030]

CV = cross-validation; MLR = multiple linear regression; mp = melting point; NIPALS = nonlinear iterative partial least squares; NN = neural networks; PCA = principal... [Pg.2006]

Counterpropagation neural networks (CPG NN) were then used to establish relationships between protons and their 1H NMR chemical shifts. A detailed description of this method is given in the Tools Section 10.2.4.2. [Pg.524]

After selection of descriptors and NN training, the best networks were applied to the prediction of 259 chemical shifts from 31 molecules (the prediction set), which were not used for training. The mean absolute error obtained for the whole prediction set was 0.25 ppm, and for 90% of the cases the mean absolute error was 0.19 ppm. Some stereochemical effects could be correctly predicted. In terms of speed, the neural network method is very fast - the whole process to predict the NMR shifts of 30 protons in a molecule with 56 atoms, starting from an MDL Molfile, took less than 2 s on a common workstation. [Pg.527]

Perhaps the most interesting aspect of this set of studies is the question posed in the recent paper by Schmidt et al. (2004), which deals with the reality of the patterns they observed. Is the polymorphism observed a result of the calculation methods used in the study, neural network (NN) and multivariate statistical analysis (MVA)? Would increased sampling result in a greater number of chemotypes? It is entirely possible, of course, that the numbers obtained in this study are a true reflection of the biosynthetic capacities of the plants studied. The authors concluded—and this is a point made elsewhere in this review—that ... for a correct interpretation a good knowledge of the biosynthetic background of the components is needed. ... [Pg.49]

There are many different methods for selecting those descriptors of a molecule that capture the information encoding the compound's solubility. Currently, the most often used are multiple linear regression (MLR), partial least squares (PLS), and neural networks (NN). The former two methods provide a simple linear relationship between several independent descriptors and the solubility, as given in Eq. (14). This equation yields the independent contribution, h_i, of each descriptor, D_i, to the solubility ... [Pg.302]
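The linear relationship described above can be illustrated with a minimal sketch. The descriptor values, solubilities, and the single-descriptor restriction below are all invented for illustration; a real MLR fit would use many descriptors and a matrix least-squares solution rather than this closed-form one-variable case.

```python
# Illustrative sketch of an MLR-style fit: a one-descriptor ordinary
# least-squares regression log S = h0 + h1 * D.  Data are hypothetical.

def fit_ols(D, logS):
    """Return intercept h0 and contribution h1 minimizing squared error."""
    n = len(D)
    mean_D = sum(D) / n
    mean_S = sum(logS) / n
    h1 = (sum((d - mean_D) * (s - mean_S) for d, s in zip(D, logS))
          / sum((d - mean_D) ** 2 for d in D))
    h0 = mean_S - h1 * mean_D
    return h0, h1

# invented descriptor (e.g. some lipophilicity index) vs. log solubility
D    = [1.0, 2.0, 3.0, 4.0]
logS = [-0.5, -1.0, -1.5, -2.0]

h0, h1 = fit_ols(D, logS)
print(h0, h1)  # exactly linear toy data: h0 = 0.0, h1 = -0.5
```

With several descriptors the same idea generalizes: each fitted coefficient h_i is the independent contribution of descriptor D_i, which is what makes MLR models easy to interpret.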

The nearest-neighbour method is often applied with, in view of its simplicity, surprisingly good results. An example where k-NN performs well in a comparison with neural networks and SIMCA (see further) can be found in [16]. [Pg.225]
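The simplicity referred to above is easy to see in code: k-NN needs no training phase at all, only a distance computation and a majority vote. The two-descriptor samples and class labels below are invented for illustration.

```python
# Minimal sketch of k-nearest-neighbour classification: a query sample is
# assigned the majority class among its k closest training samples
# (squared Euclidean distance).  All data here are hypothetical.

from collections import Counter

def knn_classify(train_X, train_y, x, k=3):
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(xi, x)), yi)
        for xi, yi in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# two invented classes of two-descriptor samples
train_X = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
           (1.0, 1.1), (0.9, 1.0), (1.1, 0.9)]
train_y = ["A", "A", "A", "B", "B", "B"]

print(knn_classify(train_X, train_y, (0.15, 0.1)))  # "A"
print(knn_classify(train_X, train_y, (1.0, 1.0)))   # "B"
```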

Neural networks (NN) represent, as opposed to PLS and MLR, a nonlinear statistical analysis technique [43]. As is the case for both PLS and MLR, several aspects of NN should be considered when using this type of analysis technique ... [Pg.400]
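The nonlinearity that distinguishes NN from PLS and MLR can be demonstrated with a tiny sketch. The weights below are hand-chosen, not learned, and the XOR task is an invented illustration: a one-hidden-layer network with tanh activations reproduces a function that no purely linear model can fit exactly.

```python
# Sketch of a one-hidden-layer network with tanh activations computing
# XOR, a nonlinear function beyond the reach of any linear (MLR/PLS)
# model.  Weights are hand-chosen for illustration; a real NN learns them.

import math

def mlp(x1, x2):
    h1 = math.tanh(10 * (x1 + x2 - 0.5))  # fires if at least one input is 1
    h2 = math.tanh(10 * (x1 + x2 - 1.5))  # fires only if both inputs are 1
    y = 0.5 * (h1 - h2)                   # "one input but not both"
    return round(y)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", mlp(x1, x2))  # 0, 1, 1, 0
```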

Models of the form y = f(x) or y = f(x1, x2, ..., xm) can be linear or nonlinear; they can be formulated as a relatively simple equation or can be implemented as a less evident algorithmic structure, for instance in artificial neural networks (ANN), tree-based methods (CART), local estimations of y by radial basis functions (RBF), k-NN-like methods, or splines. This book focuses on linear models of the form... [Pg.118]

Such applications of NN as a predictive method make artificial neural networks another data-treatment technique, comparable to parametric empirical modeling by, for example, numerical regression methods [e.g., 10,11] briefly mentioned in section 16.1. The main advantage of NN is that the network need not be programmed: it learns from sets of experimental data, which makes it possible to represent even the most complex implicit functions, and to model better without prescribing a functional form for the actual relationship. Another field of... [Pg.705]

MacKay's textbook [114] offers not only comprehensive coverage of Shannon's theory of information but also probabilistic data modeling and the mathematical theory of neural networks. Artificial NNs can be applied to problems of data processing and analysis, prediction, and classification (data mining). The wide range of NN applications also comprises optimization issues. The information-theoretic capabilities of some neural network algorithms are examined, and neural networks are motivated as statistical models [114]. [Pg.707]

The applications of NN to solvent extraction, reported in section 16.4.6.2, suffer from an essential limitation in that they do not apply to processes of a quantum nature; therefore, they are not able to describe metal complexes in extraction systems at the microscopic level. In fact, the networks can describe only the pure states of the simplest quantum systems, without superposition of states. Neural networks that indirectly take into account quantum effects have already been applied to chemical problems. For example, the combination of quantum mechanical molecular electrostatic potential surfaces with neural networks makes it possible to predict the bonding energy for bioactive molecules with enzyme targets. Computational NN were employed to identify the quantum mechanical features of the... [Pg.707]

Chen et al. (2008) employed a commercial electronic tongue, based on an array of seven sensors, to classify 80 green tea samples on the basis of their taste grade, which is usually assessed by a panel test. PCA was employed as an explorative tool, while k-NN and a back-propagation artificial neural network (BP-ANN) were used for supervised classification. Both techniques provided excellent results, achieving 100% prediction ability on a test set composed of 40 samples (one-half of the total number). In cases like this, when a simple technique such as k-NN is able to supply excellent outcomes, the use of a complex technique like BP-ANN does not appear justified from a practical point of view. [Pg.105]
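The "prediction ability" quoted above is simply the percentage of test-set samples assigned to their correct class. A minimal sketch, with invented grade labels standing in for the tea taste grades:

```python
# Sketch of a "prediction ability" calculation: the percentage of test
# samples whose predicted class matches the reference (panel-test) class.
# The labels below are hypothetical.

def prediction_ability(predicted, actual):
    correct = sum(p == a for p, a in zip(predicted, actual))
    return 100.0 * correct / len(actual)

actual    = ["grade1", "grade1", "grade2", "grade2"]
predicted = ["grade1", "grade1", "grade2", "grade2"]
print(prediction_ability(predicted, actual))  # 100.0
```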

There have been many books and reviews written on the subject of NN and parallel computing. Only a token one is listed here, for those who need a traditional book reference (Haykin, 1999). It will probably be obsolete before this book is published. Otherwise, a wealth of up-to-date information is always available on the Internet, where a "neural networks" query produces an avalanche of information. Both lead articles cited for Chapter 10, Hierlemann et al. (1996) and Jurs et al. (2000), discuss their applications in the context of chemical and biological sensing. [Pg.325]

Fig. 10.8 (a) Example of a common neural net (perceptron) architecture with one hidden layer (Hierlemann et al., 1996). (b) A more sophisticated recurrent neural network utilizing adjustable feedback through recurrent variables. (c) Time-delayed neural network in which time has been utilized as an experimental variable... [Pg.326]


See other pages where NNs, Neural networks is mentioned: [Pg.283]    [Pg.358]    [Pg.687]    [Pg.376]    [Pg.409]    [Pg.1216]    [Pg.546]    [Pg.226]    [Pg.206]    [Pg.1657]    [Pg.491]    [Pg.507]    [Pg.28]    [Pg.158]    [Pg.160]    [Pg.10]    [Pg.24]    [Pg.180]    [Pg.236]    [Pg.474]    [Pg.704]    [Pg.705]    [Pg.708]    [Pg.394]    [Pg.137]    [Pg.138]    [Pg.473]    [Pg.31]    [Pg.79]    [Pg.325]    [Pg.234]    [Pg.185]
See also in source #XX -- [Pg.1013, Pg.1037]



