Big Chemical Encyclopedia


Neural networks connection with

The back-propagation network was set up with an input layer of nineteen nodes. Two hidden layers of twelve nodes each connect to the input layer. The paper showed that optimization of yarn quality is easily achieved as a function of the necessary input parameters. The authors also compared the neural network model with multiple regression analysis, and the results showed that the network model can be considered more reliable than its statistical counterpart. [Pg.117]

Artificial Neural Networks (ANNs) are information processing units which process information in a way that is motivated by the functionality of the biological nervous system. Just as the brain consists of neurons which are connected with one another, an ANN comprises interrelated artificial neurons. The neurons work together to solve a given problem. [Pg.452]

Neural networks have been proposed as an alternative way to generate quantitative structure-activity relationships [Andrea and Kalayeh 1991]. A commonly used type of neural net contains layers of units with connections between all pairs of units in adjacent layers (Figure 12.38). Each unit is in a state represented by a real value between 0 and 1. The state of a unit is determined by the states of the units in the previous layer to which it is connected and the strengths of the weights on these connections. A neural net must first be trained to perform the desired task. To do this, the network is presented with a... [Pg.719]

A sigmoid (s-shaped) function is a continuous, monotonically increasing function that has a derivative at all points. Here Si,p is the transformed output, asymptotic to 0 < Si,p < 1, and Ui,p is the summed total of the inputs (-infinity < Ui,p < +infinity) for pattern p. Hence, when the neural network is presented with a set of input data, each neuron sums all the inputs modified by the corresponding connection weights and applies the transfer function to the summed total. This process is repeated until the network outputs are obtained. [Pg.3]
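The transfer step described above can be sketched as follows. This is a minimal illustration, not the implementation from the source; the function names are assumptions.

```python
import math

def sigmoid(u):
    """Sigmoid transfer function: continuous, differentiable everywhere,
    monotonically increasing, with output asymptotic to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-u))

def neuron_output(inputs, weights):
    """Sum the inputs modified by their connection weights, then apply
    the transfer function to the summed total."""
    u = sum(x * w for x, w in zip(inputs, weights))
    return sigmoid(u)
```

For example, `neuron_output([1.0, 2.0], [0.5, -0.3])` applies the sigmoid to the weighted sum 0.5*1.0 + (-0.3)*2.0 = -0.1, yielding a value just below 0.5.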

In this approach, connectivity indices were used as the principal descriptor of the topology of the repeat unit of a polymer. The connectivity indices of various polymers were first correlated directly with the experimental data for six different physical properties. The six properties were van der Waals volume (Vw), molar volume (V), heat capacity (Cp), solubility parameter (d), glass transition temperature (Tg), and cohesive energy (Ecoh) for the 45 different polymers. Available data were used to establish the dependence of these properties on the topological indices. All the experimental data for these properties were trained simultaneously in the proposed neural network model in order to develop an overall cause-effect relationship for all six properties. [Pg.27]
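The idea of training one network on all six properties simultaneously can be sketched as a network with six output nodes. The layer sizes, weights, and three-index input here are illustrative assumptions, not the architecture reported in the study.

```python
import math
import random

random.seed(0)

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

def make_layer(n_in, n_out):
    """Random initial weights for a fully connected layer (illustrative)."""
    return [[random.uniform(-0.5, 0.5) for _ in range(n_in)]
            for _ in range(n_out)]

def forward(x, hidden, output):
    """Map a vector of connectivity indices to six property estimates at once.
    Hidden units use a sigmoid; output nodes are linear."""
    h = [sigmoid(sum(xi * w for xi, w in zip(x, ws))) for ws in hidden]
    return [sum(hi * w for hi, w in zip(h, ws)) for ws in output]

hidden = make_layer(3, 8)   # 3 connectivity indices -> 8 hidden units (assumed sizes)
output = make_layer(8, 6)   # -> Vw, V, Cp, d, Tg, Ecoh (one output node each)
```

Training such a network on all six targets at once is what lets it capture a single overall cause-effect relationship rather than six separate correlations.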

A neural network consists of many neurons organized into a structure called the network architecture. Although there are many possible network architectures, one of the most popular and successful is the multilayer perceptron (MLP) network. This consists of identical neurons all interconnected and organized in layers, with those in one layer connected to those in the next layer so that the outputs in one layer become the inputs in the subsequent... [Pg.688]
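The layer-to-layer flow of an MLP described above can be sketched as a forward pass in which the outputs of one layer become the inputs of the next. This is a minimal sketch with assumed function names, not code from the source.

```python
import math

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

def layer_forward(inputs, weight_matrix):
    """One layer of identical neurons: every input connects to every neuron,
    and each neuron applies the transfer function to its weighted sum."""
    return [sigmoid(sum(x * w for x, w in zip(inputs, ws)))
            for ws in weight_matrix]

def mlp_forward(inputs, layers):
    """Multilayer perceptron: the outputs of each layer are fed forward
    as the inputs of the subsequent layer."""
    signal = inputs
    for weight_matrix in layers:
        signal = layer_forward(signal, weight_matrix)
    return signal
```

Each entry in `layers` is one layer's weight matrix, with one row of weights per neuron in that layer.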

Indeed, if the problem is simple enough that the connection weights can be found by a few moments' work with pencil and paper, there are other computational tools that would be more appropriate than neural networks. It is in more complex problems, in which the relationships that exist between data points are unknown so that it is not possible to determine the connection weights by hand, that an ANN comes into its own. The ANN must then discover the connection weights for itself through a process of supervised learning. [Pg.21]

The ability of an ANN to learn is its greatest asset. When, as is usually the case, we cannot determine the connection weights by hand, the neural network can do the job itself. In an iterative process, the network is shown a sample pattern, such as the X, Y coordinates of a point, and uses the pattern to calculate its output; it then compares its own output with the correct output for the sample pattern, and, unless its output is perfect, makes small adjustments to the connection weights to improve its performance. The training process is shown in Figure 2.13. [Pg.21]
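The iterative training loop described above can be sketched for a single sigmoid neuron learning to classify X, Y points: show a pattern, compare the output with the target, and make a small weight adjustment. The delta-rule update and the learning-rate and epoch values are illustrative assumptions, not the procedure from the source.

```python
import math

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

def train(patterns, targets, epochs=2000, lr=0.5):
    """Iteratively adjust connection weights so the neuron's output for
    each (x, y) pattern moves toward the correct output (delta rule)."""
    weights = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for (x, y), target in zip(patterns, targets):
            out = sigmoid(weights[0] * x + weights[1] * y + bias)
            error = target - out
            # Small adjustment proportional to the error and the
            # sigmoid's derivative out * (1 - out)
            grad = error * out * (1.0 - out)
            weights[0] += lr * grad * x
            weights[1] += lr * grad * y
            bias += lr * grad
    return weights, bias
```

After training on points labelled by whether they lie above or below the line y = x, the neuron's output approaches 1 for points above the line and 0 for points below it.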

A global view of consciousness is that it is generated throughout the entire brain, as a result of synchronisation of relevant neural networks. Specific systems or regions—for example the cerebral cortex, brainstem reticular formation and thalamic nuclei—may be key anatomical integrators. Areas with the most widespread interconnections are pivotal, and on this basis the cortex and thalamus are more relevant than cerebellum and striatum for example. Frontal cortex for example connects with every other brain region, both in terms of input and output, with 80% of such connections accounted for by cortico-cortical connections. Thalamic intralaminar nuclei are, in conjunction with the reticular nucleus, reciprocally connected to all cortical areas. By contrast the cerebellum has very few output pathways and striatal-cortical input is (via the thalamus) confined to frontal lobe. [Pg.5]

The basis of molecular modeling is that all important molecular properties, i.e., stabilities, reactivities and electronic properties, are related to the molecular structure (Fig. 1.1). Therefore, if it is possible to develop algorithms that are able to calculate a structure with a given stoichiometry and connectivity, it must be possible to compute the molecular properties based on the calculated structure, and vice versa. There are many different approaches and related computer programs, including ab initio calculations, various semi-empirical molecular orbital (MO) methods, ligand field calculations, molecular mechanics, purely geometrical approaches, and neural networks, that can calculate structures and one or more additional molecular properties. [Pg.2]





