Big Chemical Encyclopedia


Descriptor entropy

Recently, an entropy-based approach has been introduced to compare the intrinsic and extrinsic variability of different descriptors, independent of their units and value ranges. The method is based on Shannon entropy, originally introduced in communication theory, and calculates descriptor-entropy values from histogram representations of the descriptor value distributions. Shannon entropy is defined as ... [Pg.147]
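As a minimal sketch of the histogram-based calculation described above (the function name, bin count, and test data are illustrative assumptions, not from the original source):

```python
import numpy as np

def shannon_entropy(values, bins=64):
    """Shannon entropy (in bits) of a descriptor's histogram distribution.

    A low value indicates the descriptor is nearly constant across the
    data set; the maximum, log2(bins), indicates a uniform spread.
    """
    counts, _ = np.histogram(values, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]  # by convention, 0 * log2(0) contributes nothing
    return float(-(p * np.log2(p)).sum())
```

For example, a constant descriptor yields an entropy of 0 bits, while values spread evenly across all 64 bins approach the maximum of log2(64) = 6 bits, so descriptors with very different units can still be ranked on the same scale.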

The extremes of data distributions are depicted in Figure 2 along with an arbitrary midpoint in a calculation of descriptor entropy. [Pg.267]

Campbell [campj82], in his Grammatical Man: Information, Entropy, Language and Life, was one of the first to assert the importance of adding information to our short list of fundamental descriptors of nature ... [Pg.632]

Wei, D.T., Meadows, J.C., and Kellogg, G.E. Effects of entropy on QSAR equations for HIV-1 protease: 1. Using hydropathic binding descriptors. [Pg.372]

Stahura, F., Godden, J. W., Xue, L., Bajorath, J. (2000) Distinguishing between natural products and synthetic molecules by descriptor Shannon entropy analysis and binary QSAR calculations. J Chem Inf Comput Sci 40, 1245-1252. [Pg.124]

Godden, J. W., Stahura, F. L., Bajorath, J. (2000) Variabilities of molecular descriptors in compound databases revealed by Shannon entropy calculations. J Chem Inf Comput Sci 40, 796-800. [Pg.151]

The term entropy, which literally means a change within, was first used in 1851 by Rudolf Clausius, one of the formulators of the second law of thermodynamics. A rigorous quantitative definition of entropy involves statistical and probability considerations. However, its nature can be illustrated qualitatively by three simple examples, each demonstrating one aspect of entropy. The key descriptors of entropy are randomness and disorder, manifested in different ways. [Pg.24]

After a brief summary of the molecular and MO-communication systems and their entropy/information descriptors in OCT (Section 2), the mutually decoupled, localized chemical bonds in simple hydrides will be qualitatively examined in Section 3, in order to establish the input probability requirements which properly account for the nonbonding status of the lone-pair electrons and the mutually decoupled (noncommunicating, closed) character of these localized sigma bonds. It will be argued that each such subsystem defines a separate (externally closed) communication channel, which requires its own unity-normalized probability distribution of the input signal. This calls for a variable-input revision of the original, fixed-input formulation of OCT, which will be presented in Section 4. This extension will be shown to be capable of a continuous description of the orbital decoupling limit, in which an AO subspace does not mix with (exhibits no communication with) the remaining basis functions. [Pg.5]

Throughout this article, the bold symbol X represents a square or rectangular matrix, the bold-italic X denotes the row vector, and italic X stands for the scalar quantity. The entropy/information descriptors are measured in bits, which correspond to the base 2 in the logarithmic (Shannon) measure of information. [Pg.6]

In OCT, the entropy/information indices of the covalent/ionic components of all chemical bonds in a molecule represent the complementary descriptors of the average communication noise and the amount of information flow in the molecular information channel. The molecular input p(a) = p generates the same distribution in the output of the molecular channel,... [Pg.8]

Thus, this average noise descriptor expresses the difference between the Shannon entropies of the molecular one- and two-orbital probabilities,... [Pg.9]

It should be emphasized that these entropy/information descriptors and the underlying probabilities depend on the selected basis set, for example, the canonical AO of the isolated atoms or the hybrid orbitals (HOs) of their promoted (valence) states, the localized MO (LMO), etc. In what follows we shall examine these IT descriptors of chemical bonds in illustrative model systems. The emphasis will be placed on the orbital decoupling in the molecular communication channels and the need for appropriate changes in their input probabilities, which weigh the contributions to the average information descriptors from each input. [Pg.10]

Scheme 1.1 The molecular information system modeling the chemical bond between two basis functions χ = (a, b) and its entropy/information descriptors. In Panel b, the corresponding nonbonding (deterministic) channel due to the lone-pair hybrid b⁰ is shown. For the molecular input p = (P, Q), the orbital channel of Panel a gives the bond entropy-covalency represented by the binary entropy function H(P). For the promolecular input p⁰ = (1/2, 1/2), when both basis functions contribute a single electron each to form the chemical bond, one thus predicts H(p⁰) = 1 and the bond information-ionicity I = 1 − H(P). Hence, these two bond components give rise to the conserved (P-independent) value of the single overall bond multiplicity N = I + S = 1.
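The two-AO bond decomposition in Scheme 1.1 can be sketched numerically; this is an illustrative reimplementation of the stated relations S = H(P), I = 1 − H(P), N = S + I (function names are assumptions, not from the source):

```python
import math

def h_binary(P):
    """Binary entropy H(P) in bits, with H(0) = H(1) = 0 by convention."""
    if P in (0.0, 1.0):
        return 0.0
    Q = 1.0 - P
    return -(P * math.log2(P) + Q * math.log2(Q))

def two_ao_bond_indices(P):
    """Entropy-covalency S, information-ionicity I, and total multiplicity
    N = S + I for the two-AO channel with molecular input p = (P, Q) and
    promolecular input p0 = (1/2, 1/2), for which H(p0) = 1 bit."""
    S = h_binary(P)   # bond entropy-covalency
    I = 1.0 - S       # bond information-ionicity, I = H(p0) - H(P)
    return S, I, S + I
```

At P = 1/2 the bond is purely covalent (S = 1, I = 0); at P = 1 or P = 0 it is purely ionic (S = 0, I = 1); for any intermediate P the total N stays conserved at 1 bit, as the scheme's caption states.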
Scheme 1.2 The flexible-input generalization of the two-AO channel of Scheme 1.1a for the promolecular reference distribution p⁰ = (1/2, 1/2). The corresponding partial and average entropy/information descriptors of the chemical bond are also reported.
Scheme 1.8 The probability scattering in benzene (Hückel theory) for the representative input orbital χi = 2pz,i and the associated OCT entropy/information descriptors.
The corresponding entropy/information descriptors then read as follows ... [Pg.29]

Scheme 1.9 The molecular partial information channels and their entropy/information descriptors of the chemical interaction between the adjacent (Panel a) and terminal (Panel b) AO in the π-electron system of allyl.
Scheme 1.12 summarizes the elementary entropy/information increments of the diatomic bond indices generated by the MO channels of Eq. (42). They give rise to the corresponding diatomic descriptors, which are obtained from Eq. (32). For example, by selecting i = 1 and the diatomic fragments that additionally contain the j = 2, 3, 4 carbons, one finds the following IT bond indices ... [Pg.32]

Scheme 1.11 The partial MO-information channels and their entropy/information descriptors for the two-orbital interactions in the π-electron system of butadiene.
The underlying joint atom-orbital probabilities, PAB(A, i), i ∈ B, and PAB(B, i), i ∈ A, to be used as weighting factors in the average conditional-entropy (covalency) and mutual-information (ionicity) descriptors of the AB chemical bond(s), indeed assume appreciable magnitudes only when the electron occupying the atomic orbital χi of one atom is simultaneously found with a significant probability on the other atom, thus effectively excluding the contributions to the entropy/information bond descriptors due to the lone-pair electrons. Such joint bond probabilities therefore emphasize those AOs through which both atoms are simultaneously involved in the occupied MOs. [Pg.39]

As we have already mentioned in Section 2, in OCT the complementary quantities characterizing the average noise (conditional entropy of the channel output given input) and the information flow (mutual information in the channel output and input) in the diatomic communication system defined by the conditional AO probabilities of Eq. (48) provide the overall descriptors of the fragment bond covalency and ionicity, respectively. Both molecular and promolecular reference (input) probability distributions have been used in the past to determine the information index characterizing the displacement (ionicity) aspect of the system chemical bonds [9, 46-48]. [Pg.40]

Table 1.1 Comparison of the diatomic Wiberg and entropy/information bond multiplicity descriptors in selected molecules: the RHF results obtained in the minimal (STO-3G) basis set ...
R.F. Nalewajski, Entropy descriptors of the chemical bond in information theory. I. Basic concepts and relations, Mol. Phys. 102 (2004) 531. [Pg.46]

R.F. Nalewajski, Many-orbital probabilities and their entropy/information descriptors in orbital communication theory of the chemical bond, J. Math. Chem. 47 (2010) 692. [Pg.48]

