Big Chemical Encyclopedia


Entropic descriptors

Average entropic descriptors of diatomic chemical interactions... [Pg.40]

It should be noted that the predictive capability could still be somewhat improved by replacing the London dispersive interaction term with other properties (e.g., the air–hexadecane partition constant), by expressing the surface area in a more sophisticated way, and/or by including additional terms. From our earlier discussions, we should recall that we do not yet fully understand all the molecular factors that govern the solvation of organic compounds in water, particularly with respect to the entropic contributions. It is important to realize that for many of the molecular descriptors presently used in the literature to model γiw or related properties (see Section 5.5), it is not known exactly how they contribute to the excess free energy of the compound in aqueous solution. Therefore, when also considering that some of the descriptors used are correlated with each other (a fact that... [Pg.151]

In addition to actual partition coefficient measurements, there are a number of other descriptors in which the main property involved is a measure of hydrophobicity. Examples are various chromatographic measurements, such as thin-layer, paper, and reversed-phase high-performance liquid chromatography. Another interesting treatment is the decomposition of partition coefficients into enthalpic and entropic components, in an attempt to provide more mechanism-based parameters for hydrophobicity. [Pg.223]
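The enthalpic/entropic decomposition mentioned above follows directly from the relations ΔG = ΔH − TΔS and log10 K = −ΔG/(2.303 RT). A minimal sketch, with purely hypothetical ΔH and ΔS values chosen only for illustration:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def log10_partition_coefficient(dH, dS, T=298.15):
    """Log10 of a partition coefficient K from a hypothetical enthalpic
    component dH (J/mol) and entropic component dS (J/(mol*K)):
    dG = dH - T*dS and log10 K = -dG / (2.303*R*T)."""
    dG = dH - T * dS
    return -dG / (math.log(10) * R * T)

# Hypothetical transfer: exothermic (dH < 0) with a favorable entropy change
logK = log10_partition_coefficient(dH=-10_000.0, dS=20.0)
```

Splitting ΔG this way shows how much of an observed log K is driven by the enthalpic versus the entropic term, which is the mechanistic motivation described in the excerpt.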

FIGURE 3.2 SE and DSE calculations. Histogram representations of values of a molecular descriptor with relatively high information content in two compound databases (A and B) and either distinct (top) or similar (bottom) value distributions. SE values are an entropic measure of information content. For the distributions, calculated scaled SE and DSE values are reported. DSE calculations add value range dependence as a parameter to information content analysis. [Pg.58]

This survey of IT probes of chemical bonds continues with some rudiments on the entropic characteristics of dependent probability distributions and the information descriptors of signal transmission in communication systems [3,4,7,8]. For two mutually dependent (discrete) probability vectors of two separate sets of events a and b, P(a) = {P(a_i) = p_i} = p and P(b) = {P(b_j) = q_j} = q, one decomposes the joint probabilities of the simultaneous events a∧b = {a_i∧b_j}, P(a∧b) = {P(a_i∧b_j) = π_ij} = π, into products of the marginal probabilities of events in one set, say a, and the corresponding conditional probabilities P(b|a) = {P(j|i) = π_ij/p_i} of outcomes in set b, given that events a have already occurred: π_ij = p_i P(j|i). The relevant normalization conditions for the joint and conditional probabilities then read ... [Pg.160]
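A minimal numerical sketch of this decomposition, using a hypothetical 2×2 joint-probability scheme (the π_ij values are invented for illustration); it checks the standard normalizations of the joint and conditional probabilities:

```python
# Hypothetical 2x2 joint probabilities: pi[i][j] = P(a_i and b_j)
pi = [[0.3, 0.2],
      [0.1, 0.4]]

p = [sum(row) for row in pi]            # marginals of scheme a: p_i
q = [sum(col) for col in zip(*pi)]      # marginals of scheme b: q_j
cond = [[pij / p_i for pij in row]      # conditionals P(j|i) = pi_ij / p_i
        for row, p_i in zip(pi, p)]

# Normalization of the joint probabilities: sum_ij pi_ij = 1
assert abs(sum(p) - 1.0) < 1e-12
# Normalization of the conditionals: sum_j P(j|i) = 1 for every i
assert all(abs(sum(row) - 1.0) < 1e-12 for row in cond)
# The joint indeed factorizes as pi_ij = p_i * P(j|i)
assert all(abs(p[i] * cond[i][j] - pi[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```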

Because SE is a nonparametric distribution metric, one of the essential features of an entropic approach to descriptor information content analysis is that descriptors with different units, numerical ranges, and variability can be compared directly, a task that would otherwise not be possible. This allows us to ask questions such as: Which descriptors carry high levels of information for a specific compound set, and which carry very little? To answer this question, we have systematically studied 1-D descriptors and 2-D... [Pg.272]
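A sketch of how such a scaled Shannon entropy (SE) could be computed for a binned descriptor distribution. The fixed-width binning and the scaling by log2(bins), which maps SE onto [0, 1] so that descriptors with different units and ranges become directly comparable, are assumptions made for illustration:

```python
import math
from collections import Counter

def scaled_se(values, bins=10):
    """Scaled Shannon entropy of a binned value distribution.
    SE = -sum(p * log2 p) over occupied bins; dividing by log2(bins)
    scales the result to [0, 1] regardless of units or value range."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0   # guard against a degenerate range
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in values)
    n = len(values)
    se = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return se / math.log2(bins)

# A uniform spread carries maximal information; a single spike carries none
flat = scaled_se([0.05 + 0.1 * i for i in range(10)], bins=10)
spike = scaled_se([1.0] * 100, bins=10)
```

Under this scaling, `flat` evaluates to 1 (every bin equally occupied) and `spike` to 0, matching the intuition that a descriptor taking a single value cannot discriminate between compounds.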

Because the ES is scaled to the information content of the descriptor in the compared databases, a descriptor with a broader distribution (higher average SE) must show a greater peak separation in order to achieve the same level of ES as another descriptor. The ES is therefore an entropic (and nonparametric) analog of the classical statistical phrase "separated by so many sigma." This measure is related to, yet distinct from, DSE. Figure 8 illustrates the application of the ES metric to a pair of hypothetical data distributions. [Pg.278]
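The related DSE can itself be sketched, assuming the definition DSE = SE(pooled A+B) − [SE(A) + SE(B)]/2 with every histogram binned over the combined value range; the bin count and the two toy distributions below are invented for illustration:

```python
import math
from collections import Counter

def se(values, lo, hi, bins=10):
    """Shannon entropy (bits) of values histogrammed over [lo, hi]."""
    width = (hi - lo) / bins or 1.0
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def dse(a, b, bins=10):
    """Assumed form: DSE = SE(pooled) - mean(SE(A), SE(B)), all histograms
    binned over the combined value range, so two sets peaked in different
    ranges score high even when each set's internal spread is modest."""
    lo, hi = min(a + b), max(a + b)
    return se(a + b, lo, hi, bins) - 0.5 * (se(a, lo, hi, bins)
                                            + se(b, lo, hi, bins))

# Hypothetical descriptor values peaked in clearly different ranges
a = [0.10, 0.12, 0.15, 0.18, 0.20]
b = [0.80, 0.82, 0.85, 0.88, 0.90]
separation = dse(a, b)
```

Identical distributions give DSE = 0, while the shifted pair above gives a clearly positive value; this is the value-range dependence that DSE adds on top of SE.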

The highest ES descriptors reflect some known differences between synthetic and natural molecules, including, for example, the degree of saturation or aromatic character. It is also interesting to note that the descriptor with the highest ES value, ICM, is itself calculated using entropic principles: it accounts for the entropy of the distribution of the elemental composition of the compound. [Pg.281]

Two conclusions can be derived from these results. First, it is feasible to use entropy-based information theory to select fewer than 10 chemical descriptors that systematically distinguish between compounds from different sources. Second, when selecting descriptors to distinguish between compounds, it is important that they carry high information content that supports separability, i.e., that differentiates compounds between the datasets. The power of the entropic separation revealed in this analysis gave rise to the development of the DSE and, ultimately, the SE-DSE metric, as described earlier. [Pg.283]





