Random connectivity

Covalent dendritic connection can result from any standard synthetic transformation capable of forming a covalent bond. These include nucleophilic, electrophilic, ionic, radical, and carbenoid reactions. Elements effecting bond formation include metals, non-metals, and metalloids. [Pg.227]


Hiding. Since non-functional CLBs and the unused outputs of functional CLBs can both be removed without affecting the digital design, they should be well hidden. The operations available to hide these elements are: a secure hash function to randomly replace the unused CLBs, and random connection of their inputs and outputs to nearby passing lines and to don't-care inputs of other CLBs. These techniques make the non-functional CLBs appear like functional ones. [Pg.8]

Ti-TUD-1 three-dimensionally randomly connected mesoporous silica... [Pg.26]

The mesoporous materials reported above are usually prepared from relatively expensive surfactants. Some of them have poor hydrothermal stability. Furthermore, the MCM-41 host structure has a one-dimensional pore system with consequent pore blockage and diffusion limitations. Shan et al. (52) reported the synthesis of a three-dimensional and randomly connected mesoporous titanosilicate (Ti-TUD-1, mesopore wall thickness = 2.5-4 nm, surface area = 700-1000 m2/g, tunable pore size = 4.5-5.7 nm) from triethanolamine (TEA). Ti-TUD-1 showed higher activity (about 5.6 times) for cyclohexene epoxidation than the framework-substituted Ti-MCM-41. Its activity was similar to that of the Ti-grafted MCM-41 (52). [Pg.181]

Computer simulation in space takes into account spatial correlations of any range which result in intramolecular reaction. The lattice percolation was mostly used. It was based on random connections of lattice points of a rigid lattice. The main interest was focused on the critical region at the gel point, i.e., on critical exponents and scaling laws between them. These exponents were found to differ from the so-called classical ones corresponding to Markovian systems irrespective of whether cyclization was approximated by the spanning-tree... [Pg.10]

A neural network consists of many processing elements joined together. A typical network consists of a sequence of layers with full or random connections between successive layers. A minimum of two layers is required: the input buffer, where data are presented, and the output layer, where the results are held. However, most networks also include intermediate layers called hidden layers. An example of such an ANN is one used for the indirect determination of the Reid vapor pressure (RVP) and the distillate boiling point (BP) on the basis of 9 operating variables and the past history of their relationships to the variables of interest (Figure 2.56). [Pg.207]
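To make the layer structure concrete, the sketch below wires an input buffer of 9 variables through one hidden layer to a 2-value output layer. The layer sizes echo the RVP/BP example only loosely; the weights are random and untrained, and every numeric choice here is an illustrative assumption, not the published model.

```python
# Minimal sketch of a layered feed-forward network (input buffer, one hidden
# layer, output layer). All sizes, weights, and activation choices here are
# assumptions made for illustration, not the model described in the text.
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    """Random (untrained) weights and biases for one fully connected layer."""
    return rng.normal(scale=0.1, size=(n_in, n_out)), np.zeros(n_out)

W1, b1 = init_layer(9, 6)   # input buffer (9 operating variables) -> hidden layer
W2, b2 = init_layer(6, 2)   # hidden layer -> output layer (e.g. RVP and BP)

def forward(x):
    hidden = np.tanh(x @ W1 + b1)   # hidden-layer activations
    return hidden @ W2 + b2         # linear output layer

print(forward(rng.random(9)))       # two (untrained) output values
```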

In the model of bond percolation on the square lattice, the elements are the bonds formed between the monomers and not the sites, i.e., the elements of the clusters are the connected bonds. The extent of a polymerization reaction corresponds to the fraction of reacted bonds. Mathematically, this is expressed by the probability p for the presence of bonds. These concepts allow one to create randomly connected bonds (clusters) by assigning different values to the probability p. Accordingly, the size of the clusters of connected bonds increases as the probability p increases. It has been found that above a critical value of pc = 0.5 the various bond configurations that can be formed randomly share a common characteristic: a cluster percolates through the lattice. A more realistic case of a percolating cluster can be obtained if the site model of a square lattice is used with probability p = 0.6, Figure 1.5. Notice that the critical value of pc is 0.593 for the 2-dimensional site model. Also, the percolation thresholds vary according to the type of model (site or bond) as well as with the dimensionality of the lattice (2 or 3). [Pg.18]
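As a rough numerical illustration of the spanning criterion, the sketch below occupies sites of a square lattice with probability p and checks whether any occupied cluster connects the top and bottom rows. The lattice size, trial count, and use of scipy.ndimage.label are assumptions made for illustration, not the construction used in the cited source.

```python
# Minimal sketch of site percolation on a square lattice: occupy each site with
# probability p and test whether an occupied cluster spans from the top row to
# the bottom row.
import numpy as np
from scipy.ndimage import label

def spans(p, L=64, rng=np.random.default_rng()):
    grid = rng.random((L, L)) < p              # occupied sites with probability p
    labels, _ = label(grid)                    # 4-connected occupied clusters
    top, bottom = set(labels[0]) - {0}, set(labels[-1]) - {0}
    return bool(top & bottom)                  # some cluster touches both edges

for p in (0.5, 0.593, 0.7):
    frac = np.mean([spans(p) for _ in range(200)])
    print(f"p = {p:.3f}: spanning fraction ~ {frac:.2f}")
```

Near the site threshold pc = 0.593 the spanning fraction rises sharply from near zero to near one, which is the transition the text describes.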

All these different mechanisms of mass transport through a porous medium can be studied experimentally and theoretically through classical models (Darcy's law, Knudsen diffusion, molecular dynamics, Stefan-Maxwell equations, dusty-gas model, etc.), which may or may not be coupled with the interactions or even reactions between the solid structure and the fluid elements. Another method for the analysis of the species motion inside a porous structure can be based on the observation that the motion occurs as a result of two or more elementary evolutions that are randomly connected. This is the stochastic way of analysing species motion inside a porous body. Some examples that will be analysed here by the stochastic method are the result of the particularisations of the cases presented with the development of stochastic models in Sections 4.4 and 4.5. [Pg.286]
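A crude sketch of the idea of "randomly connected elementary evolutions" is given below: a particle alternates between two motion states whose succession is governed by a Markov transition matrix. The two states, the transition probabilities, and the step lengths are invented purely for illustration and are not a model of any particular medium from the cited source.

```python
# Minimal sketch of species motion built from randomly connected elementary
# evolutions: at each step the particle is in one of two "motion states"
# (e.g. fast channel flow vs. slow diffusion in a dead-end pore) and the state
# switches according to a Markov transition matrix. All numbers are invented.
import numpy as np

rng = np.random.default_rng(1)

P = np.array([[0.9, 0.1],          # transition probabilities between the two
              [0.3, 0.7]])         # elementary evolutions (rows sum to 1)
step = np.array([1.0, 0.1])        # displacement scale per step in each state

state, x, trajectory = 0, 0.0, []
for _ in range(1000):
    x += step[state] * rng.normal()        # random displacement in this state
    state = rng.choice(2, p=P[state])      # randomly connect to the next evolution
    trajectory.append(x)

print(f"final position: {trajectory[-1]:.2f}")
```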

If we combine all the aspects above with the descriptions of basic stochastic processes, then we can conclude that we have the case of a stochastic process with complete and random connections (see Section 4.4.1.1). [Pg.293]

At a hydration level of 0.15, long-range connectivity of the surface water is established in a 2-dimensional percolative phase transition. A network of H-bonded water spans the protein surface; the network has fluctuating and random connectivity, the richness of connections increasing with hydration level. [Pg.347]

For randomly connected pore networks, some of the supercritical pores (those larger than the molecular size) are connected only through subcritical pores (those smaller than the molecular size) and are therefore not accessible. In a given network, in which not all the pores can accommodate the probe molecules, the number fraction of the pores that are available is... [Pg.124]
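The accessibility idea (as opposed to the truncated formula above, which is not reproduced here) can be illustrated with a simple graph search: a supercritical pore counts as accessible only if it can be reached from the network boundary through supercritical pores alone. The lattice topology, size, and supercritical fraction f are assumptions for illustration, not the cited derivation.

```python
# Minimal sketch of accessibility in a randomly connected pore network on a
# square lattice: each pore is supercritical (large enough for the probe) with
# probability f; accessible pores are found by breadth-first search from the
# boundary through supercritical pores only.
import numpy as np
from collections import deque

def accessible_fraction(f, L=50, rng=np.random.default_rng(2)):
    supercritical = rng.random((L, L)) < f
    seen = np.zeros((L, L), dtype=bool)
    # start the search from every supercritical pore on the boundary
    queue = deque((i, j) for i in range(L) for j in range(L)
                  if supercritical[i, j] and (i in (0, L - 1) or j in (0, L - 1)))
    for i, j in queue:
        seen[i, j] = True
    while queue:
        i, j = queue.popleft()
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < L and 0 <= nj < L and supercritical[ni, nj] and not seen[ni, nj]:
                seen[ni, nj] = True
                queue.append((ni, nj))
    return seen.sum() / max(supercritical.sum(), 1)   # accessible / supercritical

for f in (0.3, 0.593, 0.8):
    print(f"f = {f:.3f}: accessible fraction of supercritical pores ~ {accessible_fraction(f):.2f}")
```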

Gelation is a connectivity transition that can be described by a bond percolation model. Imagine that we start with a container full of monomers, which occupy the sites of a lattice (as sketched in Fig. 6.14). In a simple bond percolation model, all sites of the lattice are assumed to be occupied by monomers. The chemical reaction between monomers is modelled by randomly connecting monomers on neighbouring sites by bonds. The fraction of all possible bonds that are formed at any point in the reaction is called the extent of reaction p, which increases from zero to unity as the reaction proceeds. A polymer in this model is represented by a cluster of monomers (sites) connected by bonds. When all possible bonds are formed (all monomers are connected into one macroscopic polymer) the reaction is completed (p = 1) and the polymer is a fully developed network. Such fully developed networks will be the subject of Chapter 7, while in this chapter we focus on the gelation transition. [Pg.213]
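The bond-percolation picture of gelation lends itself to a short numerical sketch: nearest-neighbour bonds on a lattice are formed with probability p (the extent of reaction), and the fraction of monomers in the largest cluster serves as a rough "gel fraction". The lattice size and the choice of networkx for cluster bookkeeping are assumptions for illustration, not the treatment in the cited text.

```python
# Minimal sketch of gelation as bond percolation on a square lattice: each of
# the possible nearest-neighbour bonds is formed with probability p, and the
# "gel fraction" is taken as the fraction of monomers in the largest cluster.
import networkx as nx
import numpy as np

def gel_fraction(p, L=40, rng=np.random.default_rng(3)):
    lattice = nx.grid_2d_graph(L, L)                       # monomers on lattice sites
    kept = [e for e in lattice.edges if rng.random() < p]  # bonds formed so far
    g = nx.Graph()
    g.add_nodes_from(lattice.nodes)
    g.add_edges_from(kept)
    largest = max(nx.connected_components(g), key=len)
    return len(largest) / (L * L)

for p in (0.3, 0.5, 0.7, 1.0):
    print(f"extent of reaction p = {p:.1f}: gel fraction ~ {gel_fraction(p):.2f}")
```

At p = 1 every monomer belongs to the single macroscopic cluster, recovering the fully developed network described above.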

Reverse engineering has been demonstrated to work in principle for model genetic networks of binary genes connected through logical rules [18]. A key issue is the data requirement necessary to provide sufficient information to capture the complexity of the molecular network. In model networks it has been shown that only a tiny subset of all possible behaviors needs to be known in order to infer network architecture with accuracy [18], provided that the network exhibits significant constraints (biomolecular networks are far removed from randomly connected networks) [20]. [Pg.568]

C. Borgers and N. Kopell, Synchronization in networks of excitatory and inhibitory neurons with sparse, random connectivity. Neural Computation 15 (2003) 509-538. [Pg.234]

Table 1.1. Some known properties of the olfactory system of insects. For the work described here, the assumption of random connections and the localization of learning to the synapses between intrinsic and extrinsic Kenyon cells are particularly important.
Before presenting our results on structural implications of the observed and predicted levels of activity in the MB and of the requirement of lossless information transmission, some explanation of our probabilistic approach seems warranted. The probability P_PN of activity in a given PN mainly reflects properties of the input space (odor space), and different patterns of PN activity are diced out for every LFP cycle. The connection probabilities p_PN→KC and p_KC→bKC, on the other hand, refer to the random connectivity of each locust, i.e., the connectivity is determined only once for each animal. In building distributions (and taking averages) with respect to both probability spaces, we are making statements about the distribution of (and the typical value of) properties for all locusts in response to all possible odors, in a sense. [Pg.9]

Using the above assumptions of independently chosen random connections and independently and randomly active PNs, we can directly calculate the probability for a given KC to be active. [Pg.9]
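A sketch of the kind of calculation this describes is given below: if each PN→KC connection is present independently with probability p_conn and each PN is active independently with probability p_pn, the number of active inputs to a KC is binomially distributed, and the KC fires when that number reaches a threshold. The parameter values, and the simple binomial/threshold model itself, are assumptions for illustration, not the exact expressions of the cited work.

```python
# Minimal sketch: P(KC active) under independent random connections and
# independent random PN activity. All parameter values below are invented.
from scipy.stats import binom

def p_kc_active(n_pn, p_conn, p_pn, theta):
    """P(KC active) = P(number of active connected inputs >= theta)."""
    p_input = p_conn * p_pn                    # a given PN is connected AND active
    return binom.sf(theta - 1, n_pn, p_input)  # P(X >= theta)

print(p_kc_active(n_pn=800, p_conn=0.05, p_pn=0.15, theta=10))
```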

In summary, we have seen that if parameters (connectivity degree and firing threshold) are chosen wisely, fully random connections allow an almost always (in the loose sense of a very small failure probability) one-to-one projection of activity patterns from the AL to the MB, a necessary requirement for successful odor classification. At the same time, the activity level in the MB can remain reasonably low even though the absolute minimum for the confusion probability is attained at very high activity levels. [Pg.12]

We have seen in this subsection once again that one of the determining factors in making a system successful in the information processing framework with disordered (random) connections is the correct balance of system size, connectivity degrees and firing thresholds. Other factors like learning rates and output redundancy may play equally important roles. [Pg.18]

In summary, the more detailed models reveal how the nervous system of the locust may implement the elements of the odor classification scheme with random connections using simple elements like mutual all-to-all inhibition and Hebbian learning. We have also seen that the implementation of classification with these simple ingredients automatically solves the additional task of detecting the cluster structure of the input pattern set. [Pg.23]



