
Shannon information content

We have defined above a way of quantifying the structure of water based on the profile of fx values, which encode the number of molecules in each possible joined state. It is now possible to use this profile as a measure of the structure of water at different temperatures and, as an application of this metric, to relate it to physical properties. We have shown the results of our earlier work in Table 3.3. The reader is encouraged to repeat these studies and to explore other structure-property relationships using the fx values as single or multiple variables. A unified parameter derived from the five fx values, each expressed as a fraction of 1.0, might be the Shannon information content. This could be calculated from all the data created in the above studies and used as a single variable in the analysis of water and other liquid properties. [Pg.56]

Record the average fx values for water, then convert these to fractions of 1.00, then calculate the Shannon information content. [Pg.70]

The rules in Example 4.6 are used to estimate the effective temperature resulting from the presence of a solute. In this study, replace 30 water molecules with 30 solute molecules. Use parameters for these solute molecules that reflect a moderately polar character, such as Tb(SS) = 0.5, J(SS) = 0.7, Pb(WS) = 0.2, and T(WS) = 2.0. Run the dynamics and collect the fx values for the water. Convert these fx values to fractions of 1.00 and compute the Shannon information content, H, for this set of parameters. [Pg.70]
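A minimal Python sketch of this calculation (the five fx fractions below are hypothetical placeholders; the real values come from the dynamics runs described above):

    import math

    def shannon_information(fractions):
        """Shannon information content H = -sum(p * log2(p)) over a
        probability profile (here, the five fx values as fractions of 1.0)."""
        assert abs(sum(fractions) - 1.0) < 1e-9, "fractions must sum to 1.0"
        return -sum(p * math.log2(p) for p in fractions if p > 0)

    # Hypothetical profile of the five joined states f0..f4 for water:
    fx = [0.05, 0.20, 0.35, 0.30, 0.10]
    H = shannon_information(fx)
    print(f"H = {H:.3f} bits")  # maximum for five states: log2(5) = 2.322 bits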

The Shannon information content (in thermodynamics, the entropy) can be calculated from the probability distribution of allowed amino acid substitutions (Fontana and Schuster, 1987; Saven and Wolynes, 1997; Dewey and Donne, 1998). Counting the number of sequences at a given fitness is mathematically isomorphic to calculating the entropy at a given fitness, S(F) = ln Ω, where the number of states Ω is the number of sequences at fitness F. The entropy s_i for a given site i can also be calculated from Eq. (25)... [Pg.128]

The Shannon equation (Eq. (1)) [4] enables one to evaluate the information content, I (also known as the Shannon entropy), of the system. [Pg.208]

Given an object composed of N interconnected and interacting parts, one might at first be tempted to equate the complexity of an object with its conventional information content, as defined by Shannon [shann49]... [Pg.616]

Another drawback to using Shannon information as a measure of complexity is the fact that it is based on an ensemble of all possible states of a system and therefore cannot describe the information content of a single state. Shannon information thus resembles traditional statistical mechanics - which describes the average or aggregate behavior of, say, a gas, rather than the motion of its constituent molecules - more so than it does a complexity theory that must address the complexity of individual objects. [Pg.616]

According to Shannon, the well-known relationship between the information content of an item of news and its expectation probability can be expressed by the formula... [Pg.215]
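The formula itself is truncated in this excerpt; the standard Shannon relationship it refers to, expressed in bits, is

    I = \log_2 \frac{1}{p} = -\log_2 p

so an improbable item of news (small expectation probability p) carries a high information content.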

Jeffrey W. Godden and Jürgen Bajorath, Analysis of Chemical Information Content Using Shannon Entropy. [Pg.450]

Information theory (71, 72) is a convenient basis for the quantitative characterization of structures. It introduces simple structural indices called the information content (total or mean) of any structured system. For such a system having N elements distributed into equivalence classes N1, N2, ..., Nk, a probability distribution P = {p1, p2, ..., pk} is constructed (pi = Ni/N). The entropy of this distribution, calculated (71) by the Shannon formula... [Pg.42]
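A short Python sketch of this construction (the class sizes are a made-up example; the symbols follow the excerpt's definitions):

    import math

    def information_content(class_sizes):
        """Mean and total information content of a system of N elements
        distributed into equivalence classes N1, ..., Nk (Shannon formula)."""
        N = sum(class_sizes)
        probs = [n_i / N for n_i in class_sizes]                  # p_i = N_i / N
        mean_ic = -sum(p * math.log2(p) for p in probs if p > 0)  # bits per element
        return mean_ic, N * mean_ic                               # (mean, total)

    # Hypothetical system: N = 10 elements in classes of sizes 5, 3, 2.
    mean_ic, total_ic = information_content([5, 3, 2])
    print(f"mean IC = {mean_ic:.3f} bits, total IC = {total_ic:.3f} bits")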

What is complexity? There is no good general definition of complexity, though there are many. Intuitively, complexity lies somewhere between order and disorder, between regularity and randomness, between perfect crystal and gas. Complexity has been measured by logical depth, metric entropy, information content (Shannon's entropy), fluctuation complexity, and many other techniques; some of them are discussed below. These measures are well suited to specific physical or chemical applications, but none describes the general features of complexity. Obviously, the lack of a definition of complexity does not prevent researchers from using the term. [Pg.28]

If I is large, then the amino acid found at that position is highly conserved. Conversely, a low I indicates that many amino acids are allowed at that site. This information content is directly related to the Shannon entropy through a Boltzmann weighting of the fitness changes w_i(a), which gives the probability p_i(a) that a residue is in amino acid state a. [Pg.129]
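A minimal sketch of that weighting step (the fitness changes and the inverse temperature beta are hypothetical placeholders; the sign and scaling conventions may differ in the cited work):

    import math

    def site_probabilities(w, beta=1.0):
        """Boltzmann-weight the fitness changes w[a] at one site:
        p(a) = exp(beta * w[a]) / Z, summed over the allowed amino acids."""
        weights = {a: math.exp(beta * wa) for a, wa in w.items()}
        Z = sum(weights.values())
        return {a: v / Z for a, v in weights.items()}

    def site_entropy(p):
        """Shannon entropy (nats) of the amino-acid distribution at one site."""
        return -sum(q * math.log(q) for q in p.values() if q > 0)

    # Hypothetical fitness changes for three amino acids at site i:
    w_i = {"A": 0.0, "G": -1.0, "V": -3.0}
    p_i = site_probabilities(w_i)
    print(f"s_i = {site_entropy(p_i):.3f} nats")  # low entropy -> conserved site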

Suppose a certain system comprises N elements. Then its information content is determined by Shannon's formula... [Pg.142]

Binary descriptors should be used when the considered characteristic is really a dual characteristic of the molecule or when the considered quantity cannot be represented in a more informative numerical form. In any case, the mean information content of a binary descriptor, Ichar, is low (the maximum value is 1 when the proportions of 0 and 1 are equal); thus the standardized Shannon's entropy, Ichar/log2 n, where n is the number of elements, gives a measure of the efficiency of the collected information. [Pg.234]
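A small sketch for one binary descriptor (the bit vector is an invented example):

    import math

    def binary_descriptor_entropy(bits):
        """Mean information content of a binary descriptor and its
        standardized form Ichar / log2(n), n being the number of elements."""
        n = len(bits)
        p1 = sum(bits) / n
        i_char = -sum(p * math.log2(p) for p in (p1, 1 - p1) if p > 0)
        return i_char, i_char / math.log2(n)

    bits = [1, 0, 0, 1, 0, 0, 0, 1]   # hypothetical descriptor over 8 molecules
    i_char, h_std = binary_descriptor_entropy(bits)
    print(f"Ichar = {i_char:.3f} bits, standardized = {h_std:.3f}")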

From the obtained equivalence classes in the hydrogen-filled multigraph, for each rth order (usually r = 0-6), the rth order neighbourhood Information Content IC is calculated as defined by Shannon's entropy... [Pg.235]
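A rough sketch of the idea on a toy hydrogen-filled graph (the equivalence criterion used here, element plus the multiset of elements within radius r, is a simplified placeholder, not the full published neighbourhood-symmetry algorithm):

    import math
    from collections import Counter

    def neighborhood_ic(atoms, bonds, r):
        """r-th order neighbourhood information content: partition atoms by a
        radius-r signature, then apply Shannon's entropy to the class sizes."""
        adj = {i: set() for i in range(len(atoms))}
        for a, b in bonds:
            adj[a].add(b)
            adj[b].add(a)
        sigs = []
        for i in range(len(atoms)):
            sphere, frontier = {i}, {i}
            for _ in range(r):                      # grow the neighbourhood
                frontier = {j for k in frontier for j in adj[k]} - sphere
                sphere |= frontier
            sigs.append((atoms[i],
                         tuple(sorted(Counter(atoms[j] for j in sphere).items()))))
        n = len(atoms)
        return -sum((c / n) * math.log2(c / n) for c in Counter(sigs).values())

    # Hydrogen-filled water graph: O bonded to two H atoms.
    atoms, bonds = ["O", "H", "H"], [(0, 1), (0, 2)]
    print(neighborhood_ic(atoms, bonds, 0))  # classes {O} and {H, H}: ~0.918 bits
    print(neighborhood_ic(atoms, bonds, 1))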

The mean information content I, also called Shannon's entropy H [Shannon and Weaver, 1949], is defined as... [Pg.239]

The standardized Shannon's entropy (or standardized information content) is the ratio between the actual mean information content and the maximum available information content (i.e., the Hartley information)... [Pg.240]

Model complexity is defined as the ratio between the multivariate entropy SX of the X-block (n objects and p variables) of the model and Shannon's entropy Hy of the y response vector, thus also accounting for the information content of the y response [Authors, This Book]... [Pg.296]

Shannon's entropy: see mean information content; information content; shape descriptors. [Pg.390]

When the total information content is calculated on molecules, n being the total number of atoms and ng the number of equivalent atoms of type g, it is often referred to as molecular negentropy. The term H is Shannon's entropy, which is defined below. [Pg.413]
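For a rough illustration, partitioning the atoms of ethanol (C2H6O, n = 9) simply by element gives classes of sizes 2, 6, and 1; the Shannon formula then yields a mean information content of about 1.22 bits per atom and a total information content (molecular negentropy) of about 11.0 bits. Partitioning by element is a coarse choice of equivalence; finer criteria, such as topological equivalence, would split these classes further.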

Several descriptors are based on the concepts of information content and entropy; among these are the topological information indices, indices of neighborhood symmetry, Shannon... [Pg.416]

Shannon's entropy is also widely applied to the analysis of the information content of molecular descriptors within data sets of molecules and to assess the diversity of chemical libraries [Lin, 1996b; Bajorath, 2001]. [Pg.416]

Information content and Shannon's entropy of molecular descriptors were extensively studied by Bajorath, Godden, and coworkers in several papers [Godden, Stahura et al., 2000; Godden and Bajorath, 2000, 2002, 2003]. [Pg.516]

Shannon's entropy = mean information content; see information content. [Pg.684]

Shannon entropy applied to gene expression time series. The equation for Shannon entropy (H) is shown above. For each gene, entropy accounts for the probability, p (or frequency), of a level of gene expression, i. Data from [8] (see Figure 5.1) are binned into three expression levels. SOD, superoxide dismutase; NFM, neurofilament medium; GRalpha4, GABA receptor alpha 4 subunit. Actin has zero entropy (zero information content) because its expression is... [Pg.561]
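A sketch of this per-gene calculation (the time series below are invented placeholders, not the data from [8]):

    import math
    from collections import Counter

    def expression_entropy(series, n_bins=3):
        """Bin one gene's expression time series into n_bins levels and compute
        Shannon entropy H = -sum(p_i * log2(p_i)) over the level frequencies."""
        lo, hi = min(series), max(series)
        if hi == lo:                        # constant expression -> zero entropy
            return 0.0
        width = (hi - lo) / n_bins
        levels = [min(int((x - lo) / width), n_bins - 1) for x in series]
        n = len(levels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(levels).values())

    print(expression_entropy([5.0] * 8))                 # 0.0 bits, like actin
    print(expression_entropy([1.0, 2.5, 4.0, 1.2, 3.9, 2.6, 0.9, 4.1]))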


See other pages where Shannon information content is mentioned: [Pg.69] [Pg.237] [Pg.28] [Pg.618] [Pg.633] [Pg.635] [Pg.263] [Pg.135] [Pg.43] [Pg.177] [Pg.3] [Pg.10] [Pg.409] [Pg.410] [Pg.127] [Pg.203] [Pg.560]

See also in source: [Pg.237]