Big Chemical Encyclopedia


Conditional mutual information

Furthermore, the conditional mutual information between X and Y when Z is known is defined as... [Pg.347]
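The formula elided here is presumably the standard definition; in conventional notation (an assumption, since the source's exact equation is not shown) it reads

```latex
I(X;Y \mid Z) \;=\; H(X \mid Z) - H(X \mid Y,Z)
            \;=\; \sum_{x,y,z} P(x,y,z)\,\log_2 \frac{P(x,y \mid z)}{P(x \mid z)\,P(y \mid z)} ,
```

so the conditional mutual information measures how much knowing Y reduces the remaining uncertainty about X once Z is already known.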

With the definition of conditional mutual information and Rule (4), this yields H(S_i | PK) = I(S_i; Hist_{i-1} | PK) + H(S_i | PK, Hist_{i-1}) >= ..., and therefore... [Pg.359]

This finishes the proof of (3). For i = N+1, and with the definition of conditional mutual information, one obtains... [Pg.366]

Next, the probability function P_ij for the maximum and minimum values of I(C, R) is discussed mathematically. The self-entropy H(C) in Eq. (2.38) is determined only by the fraction of each component in the feed, and its value does not change through the mixing process. The maximum and minimum values of the mutual information entropy are therefore determined by the value of the conditional entropy H(C/R). Since the range of the variable j is fixed as... [Pg.70]
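For readers reconstructing the argument, the standard identity behind this statement (written here in the excerpt's H(C/R) notation, not quoted from the source) is

```latex
I(C,R) \;=\; H(C) \;-\; H(C/R),
```

so with H(C) fixed by the feed composition, the extremes of the mutual information entropy are governed entirely by the conditional entropy H(C/R).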

Examples of mathematical methods include nominal range sensitivity analysis (Cullen & Frey, 1999) and differential sensitivity analysis (Hwang et al., 1997; Isukapalli et al., 2000). Examples of statistical sensitivity analysis methods include sample (Pearson) and rank (Spearman) correlation analysis (Edwards, 1976), sample and rank regression analysis (Iman & Conover, 1979), analysis of variance (Neter et al., 1996), classification and regression tree (Breiman et al., 1984), response surface method (Khuri & Cornell, 1987), Fourier amplitude sensitivity test (FAST) (Saltelli et al., 2000), mutual information index (Jelinek, 1970) and Sobol's indices (Sobol, 1993). Examples of graphical sensitivity analysis methods include scatter plots (Kleijnen & Helton, 1999) and conditional sensitivity analysis (Frey et al., 2003). Further discussion of these methods is provided in Frey & Patil (2002) and Frey et al. (2003, 2004). [Pg.59]
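As a rough illustration of a mutual-information sensitivity index of the kind listed above, the sketch below estimates I(X; Y) between one model input and the model output from Monte Carlo samples using a simple two-dimensional histogram. The toy model, bin count, and function names are illustrative assumptions and not the specific estimator of Jelinek (1970).

```python
import numpy as np

def mutual_information(x, y, bins=20):
    """Histogram-based estimate of I(X; Y) in bits from paired samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()               # joint cell probabilities
    p_x = p_xy.sum(axis=1, keepdims=True)    # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)    # marginal of Y
    nonzero = p_xy > 0                       # skip empty cells (0 log 0 := 0)
    return float(np.sum(p_xy[nonzero] *
                        np.log2(p_xy[nonzero] / (p_x @ p_y)[nonzero])))

# Toy model: the output depends strongly on x1 and only weakly on x2.
rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=10_000), rng.normal(size=10_000)
y = x1**2 + 0.1 * x2 + 0.05 * rng.normal(size=10_000)

print("I(x1; y) =", mutual_information(x1, y))
print("I(x2; y) =", mutual_information(x2, y))
```

Inputs with a larger estimated I(X; Y) are judged more influential; like rank correlation the index requires no model assumptions, but unlike it, it also captures non-monotonic dependence (here x1 enters through x1**2).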

The underlying joint atom-orbital probabilities, P_AB(A, i), i ∈ B, and P_AB(B, i), i ∈ A, to be used as weighting factors in the average conditional-entropy (covalency) and mutual-information (ionicity) descriptors of the AB chemical bond(s), indeed assume appreciable magnitudes only when the electron occupying the atomic orbital χ_i of one atom is simultaneously found with a significant probability on the other atom, thus effectively excluding the contributions to the entropy/information bond descriptors due to the lone-pair electrons. Thus, such joint bond probabilities emphasize those AOs through which both atoms are simultaneously involved in the occupied MOs. [Pg.39]

As we have already mentioned in Section 2, in OCT the complementary quantities characterizing the average noise (conditional entropy of the channel output given input) and the information flow (mutual information in the channel output and input) in the diatomic communication system defined by the conditional AO probabilities of Eq. (48) provide the overall descriptors of the fragment bond covalency and ionicity, respectively. Both molecular and promolecular reference (input) probability distributions have been used in the past to determine the information index characterizing the displacement (ionicity) aspect of the system chemical bonds [9, 46-48]. [Pg.40]

(All the random variables occurring in entropies in the following are finite, i.e., their range only contains a finite number of values x.) H(X | Y) denotes the conditional entropy of X when Y is known and I(X; Y) the mutual information between X and Y. They are defined as follows... [Pg.346]
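The elided definitions are presumably the usual ones for finite-range random variables (standard formulas, not quoted from [Pg.346]):

```latex
H(X \mid Y) \;=\; -\sum_{x,y} P(x,y)\,\log_2 P(x \mid y),
\qquad
I(X;Y) \;=\; \sum_{x,y} P(x,y)\,\log_2 \frac{P(x,y)}{P(x)\,P(y)} \;=\; H(X) - H(X \mid Y).
```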

A more systematic way is to see that any entropy or mutual information with an additional condition Z is the weighted sum over the same terms with a condition z (over all values z), and those are terms of the simpler structure in the probability space induced by conditioning on z. Hence the simpler formulas can be applied. [Pg.347]
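Written out, the decomposition being invoked is the standard identity (same notation as above):

```latex
I(X;Y \mid Z) \;=\; \sum_{z} P(Z=z)\, I(X;Y \mid Z=z),
\qquad
H(X \mid Y,Z) \;=\; \sum_{z} P(Z=z)\, H(X \mid Y, Z=z),
```

where each term on the right-hand side is evaluated in the probability space obtained by conditioning on Z = z, so the unconditional formulas apply term by term.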

At the point p_m where (13) equals zero, the function I(B, F) achieves a maximum, since the mutual information is a concave function of p. By substituting the condition (13) = 0 in... [Pg.28]
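The concavity claim is easy to verify numerically. The sketch below is an illustration only (the channel parameters and notation are assumed, and Eq. (13) of the source is not reproduced): it evaluates the mutual information of a binary asymmetric channel as a function of the input probability p and locates its maximum, the analogue of the point p_m in the excerpt above.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits, with 0 log 0 treated as 0."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def mutual_info(p, e0=0.1, e1=0.3):
    """I(X;Y) of a binary asymmetric channel (flip probabilities e0, e1)
    as a function of the input probability p = P(X = 1)."""
    q = (1 - p) * e0 + p * (1 - e1)          # P(Y = 1)
    h_y_given_x = (1 - p) * h2(e0) + p * h2(e1)
    return h2(q) - h_y_given_x               # I = H(Y) - H(Y|X)

p = np.linspace(0.0, 1.0, 1001)
i = mutual_info(p)
p_max = p[np.argmax(i)]
print(f"maximum of I(p) near p = {p_max:.3f}, I = {i.max():.4f} bits")
```

Because I(p) is concave in p, the stationary point found this way is the unique global maximum (the channel capacity for this toy channel).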

FIGURE 8.12 Entropy for two dependent probability distributions p and q. Two circles enclose areas representing the entropies S(p) and S(q) of two separate probability vectors, while their common (overlap) area corresponds to the mutual information I(p, q) in these two distributions. The remaining part of each circle represents the corresponding conditional entropy S(p|q) or S(q|p), measuring the residual uncertainty about events in one set when one has full knowledge of the occurrence of events in the other set of outcomes. The area enclosed by the envelope of the two circles then represents the entropy of the product (joint) distribution S(p∧q) = S(p) + S(q) - I(p, q) = S(p) + S(q|p) = S(q) + S(p|q). [Pg.162]

In OCT, the entropy/information indices of the covalent/ionic components of chemical bonds represent the complementary descriptors of conditional entropy (average communication noise) and mutual information (amount of information flow, capacity) in the molecular information channel [10-14, 35, 36, 47, 54, 55, 62]. One observes that the molecular input P(a) = p generates the same distribution in the output of the molecular channel... [Pg.166]

FIGURE 8.15 Conservation of the overall entropic bond multiplicity N(P) = 1 bit in the 2-AO model of the chemical bond, combining the conditional entropy (average noise, bond covalency) S(P) = H(P) and the mutual information (information capacity, bond ionicity) I(P) = 1 - H(P). In MO theory, the direct bond order of Wiberg is represented by the (broken-line) parabola M(P) = 4P(1 - P) = 4PQ. [Pg.168]
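To make the conservation law in this caption concrete, the short sketch below (an illustrative calculation; the function name and sample values of P are not from the source) evaluates the covalency S(P) = H(P), the ionicity I(P) = 1 - H(P), their conserved 1-bit sum, and the Wiberg parabola 4P(1 - P) for a few bond polarizations P.

```python
import math

def two_ao_indices(P):
    """Entropic bond indices of the 2-AO model for polarization P (Q = 1 - P)."""
    Q = 1.0 - P
    H = -(P * math.log2(P) + Q * math.log2(Q)) if 0.0 < P < 1.0 else 0.0
    S = H              # covalency: conditional entropy (average noise)
    I = 1.0 - H        # ionicity: mutual information (information capacity)
    M = 4.0 * P * Q    # Wiberg bond order parabola 4P(1 - P)
    return S, I, S + I, M

for P in (0.5, 0.7, 0.9):
    S, I, N, M = two_ao_indices(P)
    print(f"P = {P:.1f}:  S = {S:.3f}  I = {I:.3f}  S + I = {N:.1f} bit  Wiberg M = {M:.2f}")
```

At the symmetric point P = Q = 0.5 the bond is purely covalent (S = 1, I = 0) and the Wiberg order equals 1; as P approaches 1 the covalency is converted into ionicity while S + I stays at 1 bit.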

This weighting procedure can be illustrated in the 2-AO model of Section 8.4. In the bond-weighted approach one uses the elementary subchannels of Figure 8.16 and their partial entropy/information descriptors. The conditional entropy and mutual information quantities for these partial communication systems (i = A, B) are also listed in the diagram. Since these row descriptors represent the IT indices per electron in the diatomic fragment, these contributions have to be multiplied by N_AB = 2 in the corresponding resultant components and in the overall multiplicity of an effective diatomic (localized) bond. [Pg.170]

Even the void fraction together with the particle size distribution does not provide all of the necessary information on the kind of flow. The mutual forces between distinct particles depend not only on the distance between the particles but also on their surface properties. The strength of the attractive forces between particles depends on conditions. For instance, the moisture content of the solid is essential for determining the attractive forces between particles, especially for hygroscopic materials such as wood. Airflow between particles usually tends to separate them, whereas the surface (adhesion) forces tend to bring them together. [Pg.1323]

Its experimental confirmation provides information about the free rotation time τ_j. However, this is very difficult to do in the Debye case. On the one hand, the density must be high enough to reach the perturbation theory (rotational diffusion) region of rotational relaxation, which is valid at k < 1. The two conditions are mutually contradictory. The validity condition of perturbation theory... [Pg.74]


