Big Chemical Encyclopedia


Kullback-Leibler function

The same proportionality relations follow from the entropy deficiency rules, in which the Kullback-Leibler functional is formulated directly in terms of the... [Pg.174]

In this section the density Δi(r) of the Kullback-Leibler functional for the information distance between molecular and promolecular electron distributions is examined. [Pg.145]

As can easily be seen from expression (9.77), the introduction of a reference probability distribution P0(x) yields a measure independent of the choice of the coordinate system. The Kullback-Leibler functional quantifies the amount of... [Pg.164]

Besides the lack of symmetry, the Kullback-Leibler functional has other formal limitations: it is not bounded, nor is it always well defined. In [60] the lack of these properties was addressed and the Jensen-Shannon divergence was introduced as a symmetrized version of the Kullback-Leibler functional. In [61] the Jensen-Shannon divergence was first proposed as a measure of distinguishability of two quantum states. Chatzisavvas et al. investigated the quantity for atomic density functions [62]. [Pg.165]
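The symmetry and boundedness gained by the Jensen-Shannon construction can be sketched numerically. In the short Python fragment below (the two discrete distributions are arbitrary illustrations, not taken from the cited works), the divergence is built from two Kullback-Leibler evaluations against the midpoint distribution, making it symmetric and bounded by ln 2:

```python
from math import log

def kl(p, q):
    # Kullback-Leibler divergence KL(p || q) = sum_k p_k * ln(p_k / q_k);
    # fails (division by zero) if q has an empty bin where p does not
    return sum(pk * log(pk / qk) for pk, qk in zip(p, q) if pk > 0)

def jensen_shannon(p, q):
    # symmetrized, bounded variant: JSD = (KL(p||m) + KL(q||m)) / 2,
    # with m the midpoint distribution (p + q) / 2
    m = [0.5 * (pk + qk) for pk, qk in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.1, 0.4, 0.5]
q = [0.3, 0.3, 0.4]
print(jensen_shannon(p, q))        # same value as jensen_shannon(q, p)
print(jensen_shannon(p, q) <= log(2))  # bounded above by ln 2
```

Because the midpoint m is strictly positive wherever either argument is, the Jensen-Shannon divergence stays well defined even when one distribution has empty bins, which is precisely the failure mode of the plain Kullback-Leibler functional.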

The Kullback-Leibler formulation evaluates the relative entropy between two data distributions. It is not, however, symmetrical with respect to the two distributions under comparison: one must declare one distribution as the base set, or reference, from which the other is assumed to depart. Concerning the connection between Jaynes' entropy and the Kullback-Leibler function, maximum entropy is achieved when qk is replaced with a distribution about which there is prior knowledge and pk is adjusted so as to maximize KL. Prior knowledge could, for example, be the mean or expectation value of a data distribution. Importantly, because of the quotient involved (Eq. [4]), the Kullback-Leibler function becomes undefined if any bin is unpopulated. This renders the function inappropriate for estimating the information content of chemical descriptor sets, which is discussed below. [Pg.269]
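Both caveats, the asymmetry and the unpopulated-bin failure, are easy to demonstrate in a short numerical sketch (the toy distributions here are invented for illustration):

```python
from math import log

def kl(p, q):
    # relative entropy of p from the reference q: sum_k p_k * ln(p_k / q_k)
    return sum(pk * log(pk / qk) for pk, qk in zip(p, q) if pk > 0)

p = [0.1, 0.9]
q = [0.5, 0.5]
print(kl(p, q), kl(q, p))  # the two directions give different values

# an unpopulated reference bin makes the quotient, and hence KL, undefined
try:
    kl([0.5, 0.5], [1.0, 0.0])
except ZeroDivisionError:
    print("undefined when a reference bin is empty")
```

Swapping the roles of p and q changes the result, so a deliberate choice of reference is unavoidable, and a single empty reference bin is enough to make the whole quantity blow up.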

Recent papers have noted the connection of the FR to the Kullback-Leibler distance, or relative entropy, which measures the irreversibility of the system, and Sevick et al. consider a similar property, the departure of the average of the time-averaged dissipation function from zero, as a measure of irreversibility. The effect of system size on reversibility is discussed in ref. 213. [Pg.200]

Hence, Akaike, who considered the AIC "... a natural extension of the maximum likelihood principle," created an empirical function that linked the Kullback-Leibler distance to maximum likelihood, thereby allowing information theory to be used as a practical tool in data analysis and model selection. [Pg.25]
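As a concrete sketch of that link, the criterion trades goodness of fit (maximized log-likelihood) against parameter count; a model with more parameters must buy enough extra likelihood to justify them. The residual vectors below are fabricated purely for illustration:

```python
from math import log, pi

def aic(n_params, log_likelihood):
    # Akaike information criterion: AIC = 2k - 2 ln L_hat, an estimate of
    # relative Kullback-Leibler distance to the unknown true distribution;
    # lower is better
    return 2 * n_params - 2 * log_likelihood

def gaussian_loglik(residuals):
    # maximized Gaussian log-likelihood of a residual vector, with the
    # maximum-likelihood variance estimate plugged in
    n = len(residuals)
    sigma2 = sum(r * r for r in residuals) / n
    return -0.5 * n * (log(2 * pi * sigma2) + 1)

resid_simple = [0.9, -1.1, 0.8, -0.7, 1.2]   # hypothetical 2-parameter model
resid_rich   = [0.5, -0.6, 0.4, -0.3, 0.7]   # hypothetical 4-parameter model
print(aic(2, gaussian_loglik(resid_simple)))
print(aic(4, gaussian_loglik(resid_rich)))
```

In this fabricated example the richer model shrinks the residuals enough that its AIC comes out lower despite the two-parameter penalty; with residuals only marginally smaller, the penalty would dominate and the simpler model would be preferred.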

Figure 5 reports a comparison between the contour maps of the density difference, Δρ(r), the Kullback-Leibler integrand, Δh(r), and the entropy displacement function [equation (94)], ΔS(r), for the planes of sections shown in Fig. 4. The corresponding central bond profiles of the density and entropy difference functions are compared in Fig. 6. The optimized geometries of propellanes have been determined from UHF calculations (GAMESS program) using the 3-21G basis set. The contour maps have been obtained from DFT calculations (deMon program) in the DZVP basis set.
Measure for comparing approximations. In order to conveniently compare the approximations, let f be a target FCD and let g be an approximate function to f. The cross-entropy distance between f and g with respect to f, also known as the Kullback-Leibler (KL) information of g at f, has been considered to assess the performance of g, in statistical terms, when approximating f. In fact, a number of authors have adopted the KL distance for measuring the quality of proposal functions in inferring over their target densities; Neil et al. (2007) and Keith et al. (2008) are some recent examples. The KL distance is defined as the expected value... [Pg.62]
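The "expected value" form lends itself to a Monte Carlo sketch: with draws from the target f, KL(f‖g) is estimated by the sample mean of ln f(X) − ln g(X). The normal target and proposal below are illustrative stand-ins, not the FCDs of the cited works; for two normals the distance also has a closed form, which lets the estimate be checked:

```python
import random
from math import exp, log, pi, sqrt

def normal_pdf(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2.0 * pi))

def kl_estimate(draw, f, g, n=50_000):
    # Monte Carlo estimate of KL(f || g) = E_f[ln f(X) - ln g(X)]
    xs = [draw() for _ in range(n)]
    return sum(log(f(x) / g(x)) for x in xs) / n

random.seed(0)
f = lambda x: normal_pdf(x, 0.0, 1.0)   # target density
g = lambda x: normal_pdf(x, 0.5, 1.2)   # candidate approximation
est = kl_estimate(lambda: random.gauss(0.0, 1.0), f, g)

# closed form for two normals: ln(s2/s1) + (s1^2 + (m1 - m2)^2) / (2 s2^2) - 1/2
exact = log(1.2) + (1.0 + 0.25) / (2.0 * 1.44) - 0.5
print(est, exact)  # the estimate should lie close to the exact value
```

A small KL value indicates that g loses little information when standing in for f, which is exactly the sense in which the cited authors grade proposal functions against their target densities.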

Abstract This contribution reviews a selection of findings on atomic density functions and discusses ways for reading chemical information from them. First an expression for the density function for atoms in the multi-configuration Hartree-Fock scheme is established. The spherical harmonic content of the density function and ways to restore the spherical symmetry in a general open-shell case are treated. The evaluation of the density function is illustrated in a few examples. In the second part of the paper, atomic density functions are analyzed using quantum similarity measures. The comparison of atomic density functions is shown to be useful to obtain physical and chemical information. Finally, concepts from information theory are introduced and adopted for the comparison of density functions. In particular, based on the Kullback-Leibler form, a functional is constructed that reveals the periodicity in Mendeleev's table. A quantum similarity measure is then constructed, based on the integrand of the Kullback-Leibler expression, and the periodicity is regained in a different way. [Pg.139]

Tempted by the interpretation of the Kullback-Leibler expression (9.78) as a tool to distinguish two probability distributions, the possibility of using it to compare atomic density functions is explored. To make a physically motivated choice of the reference density P0(x), we consider the construction of Sanderson's electronegativity scale [63], which is based on the compactness of the electron cloud. Sanderson introduced a hypothetical noble gas atom with an average density scaled by the number of electrons. This gives us the argument to use renormalized noble gas densities as reference in expression (9.78), yielding the quantity... [Pg.166]
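The structure of such a comparison can be illustrated schematically. In the sketch below, simple exponential model densities normalized to N electrons stand in for the real atomic and renormalized noble-gas densities; the functional forms and exponents are invented for the illustration and the integral is the radial Kullback-Leibler-type information distance:

```python
from math import exp, log, pi

def radial_density(r, n_elec, zeta):
    # toy spherically symmetric density rho(r) = N * zeta^3 / (8 pi) * exp(-zeta r),
    # normalized so that the 4 pi r^2 integral yields n_elec electrons
    return n_elec * zeta ** 3 / (8.0 * pi) * exp(-zeta * r)

def kl_information(n_elec, zeta_atom, zeta_ref, rmax=40.0, steps=20_000):
    # numerical integral of 4 pi r^2 * rho_atom * ln(rho_atom / rho_ref)
    h = rmax / steps
    total = 0.0
    for i in range(1, steps):
        r = i * h
        rho_a = radial_density(r, n_elec, zeta_atom)
        rho_0 = radial_density(r, n_elec, zeta_ref)
        total += 4.0 * pi * r * r * rho_a * log(rho_a / rho_0)
    return total * h

# identical densities carry zero information distance; a more diffuse
# reference (smaller zeta) yields a positive distance
print(kl_information(10, 2.0, 2.0))
print(kl_information(10, 2.0, 1.5))
```

The positive value for differing exponents reflects how far the model atomic cloud departs in compactness from the reference, which is the physical content Sanderson's construction is meant to capture.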

There are other metrics of information content, and several of them are based on the Shannon entropy. About 10 years after the introduction of the Shannon entropy concept, Jaynes formulated the maximum entropy approach, which is often referred to as Jaynes' entropy and is closely related to Shannon's work. Jaynes' introduction of the notion of maximum entropy has become an important approach to any study of statistical inference where all or part of a model system's probability distribution remains unknown. Jaynes' entropy, or the relations that guide the parameterization to achieve a model of minimum bias, is built on the Kullback-Leibler (KL) function, sometimes referred to as the cross-entropy or relative entropy function, which is often shown (in which p and q represent two probability distributions indexed by k) as... [Pg.269]
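A minimal sketch of the minimum-bias construction, assuming the standard exponential-tilting result: among all distributions p satisfying a prescribed mean, the one closest to a prior q in the Kullback-Leibler sense has the form p_k ∝ q_k exp(λ x_k), with λ fixed by the constraint. The die-with-prescribed-mean example below is the textbook illustration, not one drawn from the cited text:

```python
from math import exp

def tilted(q, values, lam):
    # exponential tilting of the prior q: p_k proportional to q_k * exp(lam * x_k)
    w = [qk * exp(lam * v) for qk, v in zip(q, values)]
    z = sum(w)
    return [wk / z for wk in w]

def min_kl_with_mean(q, values, target, lo=-10.0, hi=10.0):
    # bisection on the Lagrange multiplier lam until the tilted mean hits target
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        m = sum(p * v for p, v in zip(tilted(q, values, mid), values))
        lo, hi = (mid, hi) if m < target else (lo, mid)
    return tilted(q, values, 0.5 * (lo + hi))

values = [1, 2, 3, 4, 5, 6]
q = [1.0 / 6.0] * 6                      # uniform prior: no bias at all
p = min_kl_with_mean(q, values, 4.5)     # prior knowledge: mean must be 4.5
print(p)
print(sum(pk * vk for pk, vk in zip(p, values)))  # mean constraint satisfied
```

Only the stated mean is imposed; everything else about p is inherited from the uniform prior, which is the "minimum bias" character of the maximum entropy prescription.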


See other pages where Kullback-Leibler function is mentioned: [Pg.110], [Pg.193], [Pg.24], [Pg.165], [Pg.140], [Pg.168], [Pg.1527]
See also in source #XX: [Pg.269]





© 2024 chempedia.info