Big Chemical Encyclopedia


Jaynes entropy

The choice of the preferred solution within the feasible set can be achieved by maximizing some function F[a(r)] of the spectrum that introduces the fewest artefacts into the distribution. It has been proved that only the Shannon-Jaynes entropy S will give the least correlated solution [1]. All other maximization (or regularization) functions introduce correlations into the solution that are not demanded by the data. The function S is defined as... [Pg.187]
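As a minimal numerical sketch of this selection principle (not from the source text; the one-datum "spectrum" and all variable names here are hypothetical), one can pick, from all normalized spectra consistent with a linear datum, the one maximizing S = −Σ aₘ ln aₘ:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical underdetermined problem: many 4-bin spectra a satisfy
# the single linear datum G @ a = d; the maximum-entropy one is preferred.
G = np.array([[1.0, 2.0, 3.0, 4.0]])
d = np.array([3.0])

def neg_entropy(a):
    # -S, with S = -sum(a_m * ln a_m); minimized -S means maximized S
    return np.sum(a * np.log(a))

cons = [{"type": "eq", "fun": lambda a: G @ a - d},        # match the data
        {"type": "eq", "fun": lambda a: a.sum() - 1.0}]    # normalization
res = minimize(neg_entropy, x0=np.full(4, 0.25),
               bounds=[(1e-9, None)] * 4, constraints=cons)
a_maxent = res.x   # the preferred (least-correlated) feasible spectrum
```

Any other regularizer would impose extra structure; the entropy functional returns the smoothest exponential-family spectrum compatible with the constraint.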

Jaynes, E. T. (1986). Where do we stand on maximum entropy? In J. H. Justice (ed.), Maximum Entropy and Bayesian Methods in Applied Statistics, Cambridge University Press, Cambridge, pp. 26-58. [Pg.352]

X-ray Charge Densities and Chemical Bonding: the entropy formula (Jaynes 1968)... [Pg.116]

The quantity $-\sum_m n_m \ln n_m$, one term of the expression to be maximized, is the Shannon entropy H as used by Jaynes and familiar to those who have studied thermal physics. For a given set $\{n_m\}$ obeying $\sum_{m=1}^{M} n_m = N$, maximum H is attained when all the $n_m$ have the same value $n_m = N/M$. The requirement that H be maximum, even when other constraints are attached, tends to force the $n_m$ toward this constant value and hence inhibits large excursions. This property of maximum-entropy restorations is certainly desirable. [Pg.117]
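A quick numerical check of the claim (a sketch in NumPy, not from the source): among positive occupation numbers with fixed total N, the uniform assignment $n_m = N/M$ gives the largest H:

```python
import numpy as np

def shannon_entropy(n):
    """H = -sum(n_m * ln n_m) for positive occupation numbers n_m."""
    n = np.asarray(n, dtype=float)
    return -np.sum(n * np.log(n))

N, M = 100.0, 5                       # total count N spread over M sites
uniform = np.full(M, N / M)           # n_m = N/M

rng = np.random.default_rng(0)
for _ in range(1000):
    trial = rng.random(M)
    trial *= N / trial.sum()          # random positive n_m, same total N
    assert shannon_entropy(trial) <= shannon_entropy(uniform) + 1e-9
```

Every randomized trial with the same total lands at or below the uniform value, illustrating why the maximum-entropy requirement pulls the $n_m$ toward $N/M$.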

Principle (17) now consists of just the Bose-Einstein degeneracy factor, exactly Kikuchi and Soffer's form (1977). Also, as these authors showed (see also Section IX.B), in the case (11a) of sparsely occupied df it becomes Jaynes's maximum-entropy form (39) (Jaynes, 1968). Hence, both the Kikuchi-Soffer and Jaynes estimators are special cases of the ML approach, corresponding to the prior knowledge that the unknown spectrum is equal-energy white with the highest conviction. [Pg.239]

We can summarize this situation by the statement that the ML object obeys Jaynes's maximum-entropy principle when the white object definition (31) of maximum ignorance is used and when the object is of such low intensity that the df sites are mostly unoccupied. The latter situation is obeyed by weak astronomical objects such as planets in the visible and IR regions and the sun in the visible region (see Kikuchi and Soffer, 1977). [Pg.248]

We note that the first sum in Eq. (45a) has the form of the negative of an entropy [see Eq. (39)]. This entropy will be called photon-site entropy because it involves the df sites $z_m$. Likewise, the second sum in Eq. (45a) is the negative of an empirical entropy, in that it involves the empirical data $m_m$. Hence, overall, estimator (45a) is one of minimum photon-site entropy plus minimum empirical entropy plus maximum Jaynes entropy (third sum). [Pg.249]

E. T. Jaynes: Papers on Probability, Statistics and Statistical Physics (R. D. Rosenkrantz, ed., Kluwer, Dordrecht, 1989); Probability Theory (Cambridge Univ. Press, 2003); B. Buck and V. A. Macaulay, eds., Maximum Entropy in Action (Clarendon, Oxford, 1991). A desperate attempt was made by J. M. Keynes, A Treatise on Probability (Macmillan, London, 1921, 1973). [Pg.22]

Jaynes, E. T., Tribus, M.: "The Maximum Entropy Formalism". MIT Press, Cambridge (1978). [Pg.130]

The formal maximization of this entropy is familiar (Jaynes, 2003). We consider the functional... [Pg.75]

The topic arises from the following sequence of aspects of entropy. When entropy is introduced on a thermodynamic basis, the issue is the motion of heat (Jaynes, 1988), and the assessment involves calorimetry: an entropy change is evaluated. When entropy is formalized with the classical view of statistical thermodynamics, the entropy is found by evaluating a configurational integral (Bennett, 1976). But a macroscopic physical system at a particular thermodynamic state has a particular entropy, a state function, and the whole description of the physical system shouldn't involve more than a mechanical trajectory for the system in a stationary, equilibrium condition. How are these different concepts compatible? [Pg.103]

Jaynes, E. T., The evolution of Carnot's principle. In G. J. Erickson and C. R. Smith (eds.), Maximum-Entropy and Bayesian Methods in Science and Engineering, volume 1. Kluwer, Dordrecht (1988). [Pg.220]

A very useful criterion in this respect is given by the maximum entropy principle in the sense of Jaynes. The ingredients of the maximum entropy principle are (i) some reference probability distribution on the pure states and (ii) a way to estimate the quality of some given probability distribution $p$ on the pure states with respect to the reference distribution. As our reference probability distribution, we shall take the equidistribution defined in Eq. (30) for a two-level system (this definition of equipartition can be generalized to arbitrary $d \times d$ matrices, being the canonical measure on the $d$-dimensional complex projective plane). The relative entropy of some probability distribution $p$ [see Eq. (35)] with respect to the equidistribution is defined as... [Pg.125]
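For discrete distributions, the relative entropy alluded to here is the Kullback-Leibler divergence; a minimal sketch (my own illustration, approximating the equidistribution on pure states by a uniform distribution over d levels):

```python
import numpy as np

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i * ln(p_i / q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0                      # 0 * ln 0 contributes nothing
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

d = 2                                 # two-level system, as in the text
u = np.full(d, 1.0 / d)               # equidistribution (reference measure)

d_far  = relative_entropy([0.9, 0.1], u)   # positive: far from reference
d_self = relative_entropy(u, u)            # zero: the reference itself
```

The divergence is nonnegative and vanishes only at the reference distribution, which is exactly what makes it a quality measure relative to the equidistribution.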

Since this expression is similar to that found for entropy in statistical mechanics, it is called the entropy of the probability distribution $p_i$. Jaynes [260] shows that the thermodynamic entropy is identical to the information-theory entropy except for the presence of the Boltzmann constant in the former, made necessary by our arbitrary temperature scale. [Pg.407]

Nine years after Shannon's paper, Edwin T. Jaynes published a synthesis of the work of Cox and Shannon (11). In this paper Jaynes presented the "Maximum Entropy Principle" as a principle of general statistical inference, applicable in a wide variety of fields. The principle is simple: if you know something but don't know everything, encode what you know using probabilities as defined by Cox, and assign the probabilities to maximize the entropy, defined by Shannon, consistent with what you know. This is the principle of "minimum prejudice." Jaynes applied the principle in communication theory and statistical physics. It was easy to extend the theory to include classical thermodynamics and supply the equations complementary to the Rothstein paper (12). [Pg.279]
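Jaynes's own stock illustration of this principle is the "Brandeis dice" problem: given only that a die's average throw is 4.5 (not 3.5), assign the six face probabilities by maximizing entropy under that single constraint. A sketch using SciPy (the helper name `mean_for` is mine); the maximum-entropy solution has the exponential form $p_i \propto e^{-\lambda i}$, with the Lagrange multiplier $\lambda$ fixed by the observed mean:

```python
import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)

def mean_for(lam):
    """Mean of the maximum-entropy distribution p_i proportional to exp(-lam*i)."""
    w = np.exp(-lam * faces)
    return (faces * w).sum() / w.sum()

# Choose the Lagrange multiplier reproducing the known mean of 4.5.
lam = brentq(lambda l: mean_for(l) - 4.5, -5.0, 5.0)
p = np.exp(-lam * faces)
p /= p.sum()          # maximum-entropy face probabilities
```

Because the constrained mean exceeds 3.5, the multiplier comes out negative and the probabilities rise monotonically toward the high faces; nothing beyond the stated mean has been smuggled into the assignment, which is the "minimum prejudice" idea in action.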

Some of the previous articles have inspired several investigations on the relation between entropy and chemical change. In work by Bernstein and Levine (1972), optimal means of characterizing the distribution of product energies were discussed in terms of an information measure I (see, e.g., Jaynes, 1963; Katz, 1967 for uses of I in thermodynamical problems). Global, detailed experiments provide average transition probabilities p(A, B), from which the surprisal I(A, B) is defined to be... [Pg.42]
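The surprisal compares an observed product distribution P with the statistical "prior" P0 via I = −ln(P/P0); a minimal sketch (the four-channel numbers below are illustrative, not from the cited experiments):

```python
import numpy as np

# Observed product-state distribution P and statistical prior P0
# (illustrative values over four hypothetical product channels).
P  = np.array([0.50, 0.30, 0.15, 0.05])
P0 = np.array([0.25, 0.25, 0.25, 0.25])

surprisal = -np.log(P / P0)           # I(v) = -ln(P(v) / P0(v))

# Entropy deficiency: sum P ln(P/P0), nonnegative, zero iff P == P0.
entropy_deficiency = np.sum(P * np.log(P / P0))
```

Channels populated above their statistical expectation get negative surprisal, under-populated ones positive; the entropy deficiency summarizes how far the measured distribution departs from the prior.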

In equilibrium and nonequilibrium thermodynamics, the spatial organization of a system is reflected in the configurational entropy. As has been discussed by Denbigh [113] and Jaynes [114], there are several statistical analogues of the entropy, each of which has the desired thermodynamic properties. For example, the function S, defined by... [Pg.375]

Ben-Naim, A.: On the so-called Gibbs paradox, and on the real paradox. Entropy 9, 132-136 (2007). [electronic] http://www.mdpi.org/entropy/papers/e9030132.pdf. Jaynes, E. T.: The Gibbs paradox. In C. R. Smith, G. J. Erickson, P. O. Neudorfer (eds.), Maximum Entropy and Bayesian Methods, Fundamental Theories of Physics, vol. 50, pp. 1-22. Kluwer Academic Publishers, Dordrecht, Holland (1992)... [Pg.311]

Jaynes, E. T., 1988, The evolution of Carnot's principle, in Maximum Entropy and... [Pg.632]

Jaynes, E. T. 1982. On the rationale of maximum-entropy methods. Proceedings of the IEEE... [Pg.410]





