
Entropy Metric Construction (EMC)

In this chapter (based on [1]) we develop further the theory of the correlation method introduced in chapter 7. Consider the expression for the pair correlation function.
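The displayed equation for the pair correlation function is not reproduced in this excerpt. In correlation-method work of this kind it is typically the normalized, time-lagged covariance of two time series, and that form is assumed in the following sketch (the function name and the synthetic data are illustrative, not from the source):

```python
import numpy as np

def pair_correlation(x, y, tau):
    """Normalized, time-lagged pair correlation r_xy(tau): the
    covariance of x(t) and y(t + tau), divided by the standard
    deviations of the two overlapping segments."""
    n = len(x)
    if not 0 <= tau < n:
        raise ValueError("lag must satisfy 0 <= tau < len(x)")
    x0 = x[: n - tau] - x[: n - tau].mean()
    y0 = y[tau:] - y[tau:].mean()
    denom = x0.std() * y0.std()
    return float((x0 * y0).mean() / denom) if denom > 0 else 0.0

# Two noisy series in which y trails x by about 0.5 time units:
t = np.linspace(0.0, 10.0, 500)
x = np.sin(t) + 0.1 * np.random.randn(500)
y = np.sin(t - 0.5) + 0.1 * np.random.randn(500)
print(pair_correlation(x, y, tau=25))   # near 1 at the matching lag
```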

We choose a new measure of the correlation distance, one based on an information-theoretic formulation [2,3]. A natural measure of the correlation distance between two variables is the number of states jointly available to them (the size of the support set) compared with the number of states available to each individually. We therefore require that the measure of the statistical closeness between variables X and Y be the fraction of the number of states jointly available to them versus the total number of states available to X and Y individually. Further, we demand that the measure of the support sets weight the states according to their probabilities. Thus, two variables are close and the support set is small if knowledge of one predicts the most likely state of the other, even if a substantial number of other states exists simultaneously [4-6]. A short numerical illustration of this probability weighting follows.
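As a concrete illustration of the weighting (a numerical aside, not from the source): a variable with four nominal states, one of which is overwhelmingly likely, has an effective support set barely larger than a single state.

```python
import numpy as np

# Shannon entropy weights states by probability: four nominal states,
# but one carries almost all of the probability mass.
p = np.array([0.97, 0.01, 0.01, 0.01])
H = -(p * np.log2(p)).sum()   # entropy in bits, ~0.242
print(2 ** H)                 # ~1.18 effective states, far fewer than 4
```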

The information entropy gives the distance demanded by these requirements. The effective size of the support set of a continuous variable X is 2^{h(X)}, where h(X) is the differential entropy measured in bits (e^{h(X)} with entropy in nats) [7].
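A minimal numerical sketch of this quantity (the histogram estimator and the nats convention are my choices, not the source's):

```python
import numpy as np

def differential_entropy(samples, bins=64):
    """Histogram estimate of the differential entropy h(X) in nats:
    h(X) ~ -sum_i p_i * log(p_i / w_i), where p_i is the probability
    mass and w_i the width of bin i (p_i / w_i estimates the density)."""
    counts, edges = np.histogram(samples, bins=bins)
    widths = np.diff(edges)
    p = counts / counts.sum()
    nz = p > 0
    return float(-(p[nz] * np.log(p[nz] / widths[nz])).sum())

def effective_support(samples, bins=64):
    """Effective size of the support set, exp(h(X))."""
    return float(np.exp(differential_entropy(samples, bins)))

u = np.random.uniform(0.0, 2.0, 100_000)
print(effective_support(u))   # ~2.0: a uniform on [0, 2] occupies length 2
```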

We define the EMC correlation distance based on information entropy as the minimum of eq. (9.5) over all values of the time lag τ.
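Eq. (9.5) itself is not reproduced in this excerpt. One standard entropy-based distance with the support-set interpretation described above is d(X, Y) = H(X, Y) - I(X; Y) = H(X|Y) + H(Y|X); the sketch below assumes that form and minimizes it over the lag (the names and the binned estimator are assumptions, not the source's definitions):

```python
import numpy as np

def entropy_distance(x, y, bins=16):
    """d(X, Y) = H(X, Y) - I(X; Y) = H(X|Y) + H(Y|X), estimated from
    a 2-D histogram (nats). Small when either variable predicts the
    other; large when the joint support set is large."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()
    def H(q):
        q = q[q > 0]
        return float(-(q * np.log(q)).sum())
    hxy = H(p.ravel())
    mi = H(p.sum(axis=1)) + H(p.sum(axis=0)) - hxy   # I(X; Y)
    return hxy - mi

def emc_distance(x, y, max_lag=50, bins=16):
    """EMC-style distance: the entropy distance minimized over time
    lags. Assumes len(x) > max_lag so every slice is non-empty."""
    n = len(x)
    return min(entropy_distance(x[: n - tau], y[tau:], bins)
               for tau in range(max_lag + 1))
```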

The CMC and EMC distances satisfy the first three requirements of a metric: non-negativity, identity of indiscernibles, and symmetry.
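A quick numerical check of these three properties, continuing with entropy_distance from the sketch above (synthetic data; this probes the estimator, it is not a proof):

```python
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = x + 0.3 * rng.normal(size=2000)   # y strongly coupled to x
z = rng.normal(size=2000)             # z independent of x

print(entropy_distance(x, x))                          # ~0 (identity)
print(entropy_distance(x, y), entropy_distance(y, x))  # equal (symmetry)
print(0.0 <= entropy_distance(x, y) < entropy_distance(x, z))  # True
```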

