Big Chemical Encyclopedia


Kullback-Leibler distance

Recent papers have noted the connection of the FR to the Kullback-Leibler distance, or relative entropy, which measures the irreversibility of the system; Sevick et al. consider a similar property, the departure of the average of the time-averaged dissipation function from zero, as a measure of irreversibility. The effect of system size on reversibility is discussed in ref. 213. [Pg.200]

Hence, Akaike, who considered the AIC ... a natural extension of the maximum likelihood principle, created an empirical function that linked the Kullback-Leibler distance to maximum likelihood, thereby allowing information theory to be used as a practical tool in data analysis and model selection. [Pg.25]
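Akaike's criterion, AIC = 2k - 2 ln L-hat, penalizes a model's maximized log-likelihood by its parameter count k; the model with the lowest AIC is the one estimated to lie closest (in Kullback-Leibler terms) to the data-generating distribution. A minimal sketch, with purely hypothetical log-likelihood values:

```python
def aic(log_likelihood, k):
    """Akaike information criterion: AIC = 2k - 2 ln L_hat.
    A lower AIC corresponds to a smaller estimated Kullback-Leibler
    distance between the candidate model and the truth."""
    return 2 * k - 2 * log_likelihood

# Hypothetical maximized log-likelihoods for two candidate models:
aic_simple  = aic(log_likelihood=-120.0, k=2)   # 244.0
aic_complex = aic(log_likelihood=-118.5, k=5)   # 247.0

# The simpler model wins despite its slightly worse raw fit.
print(min(("simple", aic_simple), ("complex", aic_complex), key=lambda t: t[1]))
```

This illustrates the usual trade-off: the extra three parameters of the complex model cost more (in the 2k term) than its likelihood gain is worth.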

Kullback, S. The Kullback-Leibler distance. American Statistician 1987, 41, 340-341. [Pg.373]

It will be useful to measure how close two distributions are. One measure is the Kullback-Leibler distance (or relative entropy). This is defined as... [Pg.566]
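For two discrete distributions the standard definition is D_KL(p || q) = sum_i p_i ln(p_i / q_i). A minimal sketch with made-up distributions:

```python
import math

def kl_distance(p, q):
    """Relative entropy D_KL(p || q); p and q must be normalized.
    Terms with p_i = 0 contribute nothing (0 * log 0 = 0 by convention)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_distance(p, q))   # strictly positive, since p != q
print(kl_distance(p, p))   # 0.0
```

Note that D_KL is not symmetric (kl_distance(p, q) != kl_distance(q, p) in general), which is why "distance" is used loosely here.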

The number of components necessary can usually be judged from the data, but the appropriateness of a particular value of n can be assessed by comparing different values of n and calculating the entropy distance, or Kullback-Leibler divergence. [Pg.329]

Kullback-Leibler style distances. The Kullback-Leibler divergence is a measure of the distance between two probability distributions. From this, we can derive an expression that calculates the distance between two spectral vectors [251], [472]... [Pg.512]
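The cited expression for spectral vectors is not reproduced here, so the following is only an assumed stand-in: one common construction normalizes each non-negative spectral vector to a probability distribution and uses the symmetrized KL divergence, so that the result does not depend on argument order.

```python
import math

def _normalize(v):
    s = sum(v)
    return [x / s for x in v]

def symmetric_kl(spec_a, spec_b):
    """Symmetrized KL distance between two non-negative spectral vectors,
    treating each normalized spectrum as a probability distribution.
    (A hypothetical stand-in for the expression cited in [251], [472].)"""
    p, q = _normalize(spec_a), _normalize(spec_b)
    fwd = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    bwd = sum(qi * math.log(qi / pi) for pi, qi in zip(p, q))
    return fwd + bwd

a = [1.0, 4.0, 2.0, 0.5]   # illustrative spectral magnitudes
b = [1.2, 3.5, 2.5, 0.4]
print(symmetric_kl(a, b))  # small positive value
print(symmetric_kl(a, a))  # 0.0
```

Symmetrizing restores the symmetry property of a true distance, though the triangle inequality still does not hold in general.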

The above studies shed interesting light on the relationship between acoustic measures and their perception, but they also show that there seems to be an upper limit as to how far this approach can go. From Table 16.9, we see that the best correlation between an acoustic cost and perceptual judgment is only 0.66, which is far from the type of correlation that we would be happy to accept as a scientific rule. Given the number of studies, and that nearly all the well-known acoustic measures (MFCCs, LSFs, formants, etc.) and all the distance metrics (Euclidean, Mahalanobis, Kullback-Leibler) have been studied, we can be fairly sure that this area has been thoroughly investigated and that no combination of features and distance metric is likely to significantly improve on the results in Table 16.9. [Pg.512]

Measure for comparing approximations. In order to conveniently compare the approximations, let f be a target FCD and let g be an approximating function to f. The cross-entropy distance between f and g with respect to f, also known as the Kullback-Leibler (KL) information of g at f, has been used to assess the performance in statistical terms of g when approximating f. In fact, a number of authors have adopted the KL distance for measuring the quality of proposal functions in inference over their target densities; Neil et al. (2007) and Keith et al. (2008) are some recent examples. The KL distance is defined as the expected value... [Pg.62]
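Since D(f || g) = E_f[ln f(X) - ln g(X)] is an expectation under the target f, it can be estimated by Monte Carlo using draws from f. A sketch under stated assumptions: the target and approximation here are two illustrative Gaussians, not the FCDs of the cited studies, chosen because the exact KL value is known for checking.

```python
import math
import random

def log_normal_pdf(x, mu, sigma):
    """Log-density of a normal distribution N(mu, sigma^2)."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def kl_monte_carlo(mu_f, s_f, mu_g, s_g, n=200_000, seed=1):
    """Estimate D(f || g) = E_f[log f(X) - log g(X)] by sampling from f."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu_f, s_f)          # draw from the target f
        total += log_normal_pdf(x, mu_f, s_f) - log_normal_pdf(x, mu_g, s_g)
    return total / n

est = kl_monte_carlo(0.0, 1.0, 0.5, 1.0)
print(est)  # close to the closed form (0.5)**2 / 2 = 0.125
```

For equal-variance Gaussians the exact KL is (difference of means)^2 / 2, so the estimate can be verified directly.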

In this section, the density Ai (r) of the Kullback-Leibler functional for the information distance between molecular and promolecular electron distributions is examined. [Pg.145]

An important generalization of Shannon entropy, called relative (cross) entropy (also known as entropy deficiency, missing information, or directed divergence), has been proposed by Kullback and Leibler [5] and Kullback [6]. It measures the information distance between two (normalized) probability distributions for the same set of events ... [Pg.145]
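The definition trails off above; the entropy deficiency of Kullback and Leibler presumably takes the standard relative-entropy form, written here for both discrete probabilities and continuous densities:

```latex
\Delta S[p \mid p_0] \;=\; \sum_i p_i \ln \frac{p_i}{p_{0,i}},
\qquad
\Delta S[p \mid p_0] \;=\; \int p(\mathbf{r}) \, \ln \frac{p(\mathbf{r})}{p_0(\mathbf{r})} \, d\mathbf{r},
```

where both p and the reference p_0 are normalized to the same value; the functional vanishes only when the two distributions coincide.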


See other pages where Kullback-Leibler distance is mentioned: [Pg.193]    [Pg.25]    [Pg.566]    [Pg.552]    [Pg.110]    [Pg.24]    [Pg.122]    [Pg.168]    [Pg.152]    [Pg.350]
See also in source #XX -- [ Pg.552 ]








