Informational entropy

Many landmark papers in the history of the exploration of the connections between entropy, information, and computation appear in the excellent collection of reprints edited by Leff and Rex [Leff90]. [Pg.607]

[Leff90] Leff, H.S., and A.F. Rex, editors. Maxwell's Demon: Entropy, Information, Computing. Princeton University Press, 1990. [Pg.772]

The idea here was to examine which pair of techniques and individual columns could lead to the best separations in 2DLC. This is achievable by running 1D separations and then comparing how the retention of each component varies across the separation space. Another innovation here was the use of IT-derived metrics such as information entropy, informational similarity, and synentropy. As stated in this paper, "The informational similarity of 2D chromatographic systems, H, is a measure of global... [Pg.21]

Schrödinger, E. (1967). What is Life? Cambridge University Press, Cambridge (1st edn., 1944)
Trefil, J. (1987). Meditations at Sunset: A Scientist Looks at the Sky. Charles Scribner's Sons, New York
Ulanowicz, R.E. (1997). Ecology, the Ascendent Perspective. Columbia University Press, New York
Weber, B.H., Depew, D.J. and Smith, J.D. (eds.) (1988). Entropy, Information and Evolution. MIT Press, Cambridge, MA [Pg.123]

Corning, P.A. and Kline, S.J. (1998). Thermodynamics and life revisited. Parts I and II. Syst. Res. Behav. Sci., 15, 273-295 and 453-482 (a lucid discussion of many questions related to energy, entropy, information, and evolution; a critical analysis of the different points of view; and a vast bibliography) [Pg.124]

What is complexity? There is no good general definition of complexity, though there are many. Intuitively, complexity lies somewhere between order and disorder, between regularity and randomness, between perfect crystal and gas. Complexity has been measured by logical depth, metric entropy, information content (Shannon's entropy), fluctuation complexity, and many other techniques; some of them are discussed below. These measures are well suited to specific physical or chemical applications, but none describes the general features of complexity. Obviously, the lack of a definition of complexity does not prevent researchers from using the term. [Pg.28]

After a brief summary of the molecular and MO-communication systems and their entropy/information descriptors in OCT (Section 2), the mutually decoupled, localized chemical bonds in simple hydrides will be qualitatively examined in Section 3, in order to establish the input probability requirements that properly account for the nonbonding status of the lone-pair electrons and the mutually decoupled (noncommunicating, closed) character of these localized σ bonds. It will be argued that each such subsystem defines a separate (externally closed) communication channel, which requires an individual, unity-normalized probability distribution of the input signal. This calls for the variable-input revision of the original, fixed-input formulation of OCT, which will be presented in Section 4. This extension will be shown to be capable of a continuous description of the orbital(s) decoupling limit, when an AO subspace does not mix with (exhibits no communication with) the remaining basis functions. [Pg.5]

Throughout this article, the bold symbol X represents a square or rectangular matrix, the bold-italic X denotes a row vector, and the italic X stands for a scalar quantity. The entropy/information descriptors are measured in bits, corresponding to base 2 of the logarithm in the Shannon measure of information. [Pg.6]
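For readers who want the base-2 convention in executable form, here is a minimal sketch (our illustration, not from the source; the function name is hypothetical):

    import math

    def shannon_entropy_bits(p):
        # Shannon entropy H(p) = sum_i -p_i * log2(p_i), measured in bits.
        return sum(-pi * math.log2(pi) for pi in p if pi > 0.0)

    # A uniform two-event distribution carries exactly 1 bit:
    print(shannon_entropy_bits([0.5, 0.5]))  # 1.0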

In OCT, the entropy/information indices of the covalent/ionic components of all chemical bonds in a molecule represent the complementary descriptors of the average communication noise and the amount of information flow in the molecular information channel. The molecular input p(a) = p generates the same distribution in the output of the molecular channel,... [Pg.8]
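To make this covalency/ionicity split concrete, the following hedged sketch (our own illustration; the names are hypothetical) computes the two complementary descriptors for a generic channel: the average conditional entropy of the output given the input measures the communication noise (covalency S), while the mutual information between input and output measures the information flow (ionicity I):

    import math

    def entropy_bits(p):
        # Shannon entropy (in bits) of a probability vector.
        return sum(-x * math.log2(x) for x in p if x > 0.0)

    def channel_descriptors(p, cond):
        # Covalency S = H(B|A) and ionicity I = I(A:B) for input p and
        # row-stochastic conditional probabilities cond[a][b] = P(b|a).
        # Output distribution q(b) = sum_a p(a) P(b|a); for the molecular
        # channel described above, q reproduces the input p.
        q = [sum(p[a] * cond[a][b] for a in range(len(p)))
             for b in range(len(cond[0]))]
        S = sum(p[a] * entropy_bits(cond[a]) for a in range(len(p)))  # noise
        I = entropy_bits(q) - S                                       # flow
        return S, I

    # A deterministic (noiseless) channel is purely ionic:
    print(channel_descriptors([0.5, 0.5], [[1.0, 0.0], [0.0, 1.0]]))  # (0.0, 1.0)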

It should be emphasized that these entropy/information descriptors and the underlying probabilities depend on the selected basis set, for example, the canonical AO of the isolated atoms or the hybrid orbitals (HOs) of their promoted (valence) states, the localized MO (LMO), etc. In what follows we shall examine these IT descriptors of chemical bonds in illustrative model systems. The emphasis will be placed on the orbital decoupling in the molecular communication channels and the need for appropriate changes in their input probabilities, which weigh the contributions to the average information descriptors from each input. [Pg.10]

Scheme 1.1 The molecular information system modeling the chemical bond between two basis functions χ = (a, b) and its entropy/information descriptors. In Panel b, the corresponding nonbonding (deterministic) channel due to the lone-pair hybrid h⁰ is shown. For the molecular input p = (P, Q), the orbital channel of Panel a gives the bond entropy-covalency represented by the binary entropy function H(P). For the promolecular input p⁰ = (1/2, 1/2), when both basis functions contribute a single electron each to form the chemical bond, one thus predicts H(p⁰) = 1 and the bond information-ionicity I = 1 − H(P). Hence, these two bond components give rise to the conserved (P-independent) value of the single overall bond multiplicity N = I + S = 1.
We have recognized in these expressions that each lone-pair (doubly occupied) hybrid hn of the central atom, which does not form any chemical bonds (communications) with the hydrogen ligands, generates the decoupled deterministic subchannel of Scheme 1.1b, thus exhibiting the unit input probability. Therefore, it does not contribute to the resultant entropy/information index of all chemical bonds in the molecule. [Pg.13]
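The P-independent conservation N = S + I = 1 asserted in Scheme 1.1 is easy to verify numerically; the sketch below is our illustration under that scheme's assumptions (promolecular input p⁰ = (1/2, 1/2)):

    import math

    def binary_entropy(P):
        # H(P) = -P log2(P) - (1 - P) log2(1 - P): the entropy-covalency S.
        if P <= 0.0 or P >= 1.0:
            return 0.0
        return -P * math.log2(P) - (1.0 - P) * math.log2(1.0 - P)

    # Ionicity complements covalency, so the total bond index stays 1 bit:
    for P in (0.5, 0.25, 0.1):
        S = binary_entropy(P)   # entropy-covalency
        I = 1.0 - S             # information-ionicity, I = 1 - H(P)
        print(f"P = {P}: S = {S:.3f}, I = {I:.3f}, N = {S + I:.3f}")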

Scheme 1.2 The flexible-input generalization of the two-AO channel of Scheme 1.1a for the promolecular reference distribution p⁰ = (1/2, 1/2). The corresponding partial and average entropy/information descriptors of the chemical bond are also reported.
Scheme 1.8 The probability scattering in benzene (Hückel theory) for the representative input orbital χi = 2pz,i and the associated OCT entropy/information descriptors.
The corresponding entropy/information descriptors then read as follows ... [Pg.29]

The problem can be best illustrated using the simplest allyl case. As discussed elsewhere [9, 22], the entropy/information indices for a given pair of atomic orbitals can be extracted from the relevant partial channel, which includes all AO inputs (the sources of the system's chemical bonds) and the two-orbital outputs in question, defining the localized chemical interaction of interest. In Scheme 1.9, two examples of such partial information systems are... [Pg.29]

Scheme 1.9 The molecular partial information channels and their entropy/information descriptors of the chemical interaction between the adjacent (Panel a) and terminal (Panel b) AO in the π-electron system of allyl.
Scheme 1.12 summarizes the elementary entropy/information increments of the diatomic bond indices generated by the MO channels of Eq. (42). They give rise to the corresponding diatomic descriptors, which are obtained from Eq. (32). For example, by selecting i = 1 and the diatomic fragment additionally containing the j = 2, 3, 4 carbon, one finds the following IT bond indices ... [Pg.32]

Scheme 1.11 The partial MO-information channels and their entropy/information descriptors for the two-orbital interactions in the π-electron system of butadiene.
Scheme 1.12 The elementary entropy/information contributions to chemical interactions between two different AOs in the minimum basis set χi = 2pz,i of the π-electron system in benzene.
The underlying joint atom-orbital probabilities, P_AB(A, i), i ∈ B, and P_AB(B, i), i ∈ A, to be used as weighting factors in the average conditional-entropy (covalency) and mutual-information (ionicity) descriptors of the AB chemical bond(s), indeed assume appreciable magnitudes only when the electron occupying the atomic orbital χi of one atom is simultaneously found with a significant probability on the other atom, thus effectively excluding the contributions to the entropy/information bond descriptors due to the lone-pair electrons. Thus, such joint bond probabilities emphasize those AOs for which both atoms are simultaneously involved in the occupied MOs. [Pg.39]
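Schematically, the weighting works as in the sketch below (a hedged illustration of the averaging structure only; the function and variable names are ours, and the increments would come from the partial channels discussed earlier):

    def weighted_bond_descriptors(joint_p, s_increments, i_increments):
        # Average per-orbital covalency/ionicity increments, each weighted by
        # its joint atom-orbital probability; orbitals with near-zero joint
        # probability (e.g., lone pairs) are effectively excluded.
        S_AB = sum(w * s for w, s in zip(joint_p, s_increments))
        I_AB = sum(w * i for w, i in zip(joint_p, i_increments))
        return S_AB, I_AB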

Table 1.1 Comparison of the diatomic Wiberg and entropy/information bond multiplicity descriptors in selected molecules: the RHF results obtained in the minimum (STO-3G) basis set...
R.F. Nalewajski, E. Broniatowska, Entropy/information indices of the "stockholder" atoms-in-molecules, Int. J. Quantum Chem. 101 (2005) 349. [Pg.46]

R.F. Nalewajski, Partial communication channels of molecular fragments and their entropy/information indices, Mol. Phys. 103 (2005) 451. [Pg.46]

R.F. Nalewajski, Entropy/information bond indices of molecular fragments, J. Math. Chem. 38 (2005) 43. [Pg.46]

Jochum-Gasteiger canonical numbering: see canonical numbering. Joint Entropy-based Diversity Analysis: see cell-based methods. Joint entropy: see information content. [Pg.425]

A reference to Figs 2 and 3 shows that qualitatively the density and entropy displacement functions are very similar, with the latter providing a somewhat more resolved picture of entropy/information changes in the valence shell. These plots demonstrate that both functions can be used to probe changes in the electronic structure due to bond formation in molecules, reflecting the promotion (polarization) of bonded atoms in the molecular valence state, as a result of the electron excitation and orbital hybridization, and the inter-atomic electron CT effects. [Pg.155]


See other pages where Informational entropy is mentioned: [Pg.60]    [Pg.796]    [Pg.2]    [Pg.15]    [Pg.17]    [Pg.22]    [Pg.32]    [Pg.35]    [Pg.44]    [Pg.23]    [Pg.409]    [Pg.410]    [Pg.333]    [Pg.203]    [Pg.729]    [Pg.729]    [Pg.160]    [Pg.161]   
See also in source #XX -- [Pg.17]

Analysis of Chemical Information Content Using Shannon Entropy

Entropy and information

Entropy information theory

Entropy of information

Entropy-based information theory

Information Entropy with Globally Identifiable Case

Information entropy

Information entropy based on continuous variable

Maximum information entropy

Probability density distribution function for the maximum information entropy

Robust Information Entropy

Sensitiveness of human experience for quantity and information entropy

Shannon Entropy (Information Content)
