Big Chemical Encyclopedia


Maximum entropy criterion

We shall use the Schrödinger representation which, among other advantages, reveals the temporal evolution of the density and provides a direct connection with standard quantum mechanics. Let us begin by considering only one observable A of interest. The maximum entropy criterion (Ref. [61] and IX-21 of Ref. [21]) provides the initial nonequilibrium density (t = 0)... [Pg.34]
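The maximum entropy criterion selects, among all densities consistent with the constrained mean value of the observable A, the one of largest entropy, which takes an exponential form p_i ∝ exp(λ a_i) in the eigenbasis of A. A minimal numerical sketch, not the authors' own system: the three eigenvalues and the target mean below are invented purely for illustration.

```python
import math

def maxent_probs(a_vals, target, tol=1e-12):
    """Maximum-entropy distribution p_i proportional to exp(lam * a_i)
    whose mean of the observable equals `target`; lam is found by
    bisection, exploiting that the mean is increasing in lam."""
    def mean(lam):
        w = [math.exp(lam * a) for a in a_vals]
        z = sum(w)
        return sum(wi * a for wi, a in zip(w, a_vals)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * a) for a in a_vals]
    z = sum(w)
    return [wi / z for wi in w]

# Hypothetical three-level observable with eigenvalues +1, 0, -1,
# constrained to a mean value of 0.3
p = maxent_probs([1.0, 0.0, -1.0], 0.3)
```

Any other distribution meeting the same constraint has strictly lower entropy, which is what singles out this density among all those compatible with the measured mean.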

The commercially available software (Maximum Entropy Data Consultant Ltd, Cambridge, UK) allows reconstruction of the distribution α(z) (or f(z)) which has the maximal entropy S subject to the constraint of the chi-squared value. The quantified version of this software takes a full Bayesian approach and includes a precise statement of the accuracy of quantities of interest, i.e., the position, area, and width of peaks in the distribution. The distributions are recovered by using an automatic stopping criterion for successive iterates, which is based on a Gaussian approximation of the likelihood. [Pg.189]

Other constrained methods have also been applied. Beatham and Orchard (1976) experimented with Biraud's method but experienced only limited success. Vasquez et al. (1981) found that maximum entropy is capable of yielding excellent results on simulated ESCA data. The authors, who used Burg's method, cite its freedom from the need for trial-and-error optimization. They did, however, have to develop methods of dealing with problems of instability and the lack of an order-selecting criterion. [Pg.143]

In Section 1.3.6, it was shown that the entropy of a system and its surroundings dictates the direction of spontaneity of a process. A spontaneous process proceeds until the entropy reaches its maximum value; at that point the system and its surroundings are at equilibrium. However, it is not always easy to measure or calculate the entropy or entropy changes for the surroundings. A new criterion... [Pg.34]

The maximum entropy method (MEM) is based on the philosophy of using a number of trial spectra generated by the computer to fit the observed FID by a least squares criterion. Because noise is present, there may be a number of spectra that provide a reasonably good fit, and the distinction is made within the computer program by looking for the one with the maximum entropy as defined in information theory, which means the one with the minimum information content. This criterion ensures that no extraneous information (e.g., additional spectral... [Pg.74]

A very useful criterion in this respect is given by the maximum entropy principle in the sense of Jaynes. The ingredients of the maximum entropy principle are (i) some reference probability distribution on the pure states and (ii) a way to estimate the quality of some given probability distribution p on the pure states with respect to the reference distribution. As our reference probability distribution, we shall take the equidistribution defined in Eq. (30) for a two-level system (this definition of equipartition can be generalized to arbitrary d×d matrices, being the canonical measure on the d-dimensional complex projective plane). The relative entropy of some probability distribution p [see Eq. (35)] with respect to the equidistribution is defined as... [Pg.125]
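For discrete distributions, the relative entropy invoked here is the Kullback–Leibler divergence D(p‖μ) = Σᵢ pᵢ ln(pᵢ/μᵢ). A minimal sketch against a uniform (equidistributed) reference; the choice d = 4 is only illustrative:

```python
import math

def relative_entropy(p, mu):
    """Kullback-Leibler divergence D(p || mu) = sum_i p_i * ln(p_i / mu_i).
    Terms with p_i == 0 contribute zero by the usual convention."""
    return sum(pi * math.log(pi / mi) for pi, mi in zip(p, mu) if pi > 0)

# Reference: equidistribution over d = 4 states
d = 4
mu_eq = [1.0 / d] * d

# D vanishes exactly when p equals the reference, and grows as p
# concentrates: a point mass on one state gives D = ln(d)
d_uniform = relative_entropy(mu_eq, mu_eq)
d_peaked = relative_entropy([1.0, 0.0, 0.0, 0.0], mu_eq)
```

Minimizing this relative entropy (equivalently, maximizing entropy relative to the equidistribution) picks out the distribution closest to equipartition among those satisfying the constraints.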

In principle, it is possible to compute a very large number of electron density maps (Eqn. 2, section 2(a)) based on all possible combinations of trial values for the phases. How can the correct solution be selected? Maximum entropy (minimum information) provides a method for introducing constraints which reflect prior knowledge (e.g., the electron density must be positive) and provides a criterion for selection of the best map. [Pg.406]

Although linear prediction can be used to extract spectral data directly from a FID ( parametric LP ), it is commonly used to extrapolate the experimental time-domain data, which are then weighted and transformed as normal. Maximum entropy reconstruction, in contrast, seeks to fit the experimental FID with a model function that contains the minimum amount of information consistent with fitting experiment to within the estimated noise level. The criterion of minimum information corresponds to the maximum Shannon informational entropy S(p), which for a probability distribution p is defined as... [Pg.359]
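The selection rule can be sketched in a few lines. The two candidate "spectra" below are invented for illustration: among candidates that fit the data comparably well, MEM keeps the one maximizing S(p) = −Σᵢ pᵢ ln pᵢ, i.e., the flattest one, since it asserts the least extra spectral information.

```python
import math

def shannon_entropy(p):
    """S(p) = -sum_i p_i * ln(p_i) for a normalized, non-negative spectrum."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Two hypothetical candidate spectra (already normalized) assumed to
# reproduce the FID to within the estimated noise level
smooth = [0.25, 0.25, 0.25, 0.25]
spiky = [0.70, 0.10, 0.10, 0.10]

# MEM keeps the higher-entropy candidate
best = max([smooth, spiky], key=shannon_entropy)
```

The flat candidate wins because any extra peak structure not demanded by the data lowers S(p), which is exactly how the criterion guards against introducing extraneous features.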

Gibbs criterion (I) In an isolated equilibrium system, the entropy function has the mathematical character of a maximum with respect to variations that do not alter the energy. [Pg.150]

According to the Gibbs criterion, the entropy function S is a maximum (with respect to certain allowed variations). Recall that for a general function f(x, y,...), the conditions that f be a maximum are ... [Pg.152]
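The excerpt breaks off before listing the conditions. For a twice-differentiable function f(x, y), the standard interior-maximum conditions (a textbook completion, not the book's own wording) are:

```latex
% stationarity
\frac{\partial f}{\partial x}=0, \qquad \frac{\partial f}{\partial y}=0,
% and a negative-definite Hessian at the stationary point
\frac{\partial^{2} f}{\partial x^{2}}<0, \qquad
\frac{\partial^{2} f}{\partial x^{2}}\,\frac{\partial^{2} f}{\partial y^{2}}
-\left(\frac{\partial^{2} f}{\partial x\,\partial y}\right)^{2}>0 .
```

Applied to S, the first-order conditions locate the equilibrium state and the second-order conditions guarantee it is a genuine maximum rather than a saddle.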

The entropy provides a criterion of spontaneous change and equilibrium at constant U and V because (dS)_{U,V} ≥ 0. Thus the entropy of an isolated system can only increase and has its maximum value at equilibrium. The internal energy also provides a criterion for spontaneous change and equilibrium. That criterion is (dU)_{S,V} ≤ 0, which indicates that when spontaneous changes occur in a system described by equation 2.2-1 at constant S and V, U can only decrease and has its minimum value at equilibrium. [Pg.22]

This fundamental equation for the entropy shows that S has the natural variables U, V, and n_i. The corresponding criterion of equilibrium is (dS)_{U,V,n_i} ≥ 0. Thus the entropy increases when a spontaneous change occurs at constant U, V, and n_i. At equilibrium the entropy is at a maximum. When U, V, and n_i are constant, we can refer to the system as isolated. Equation 2.2-13 shows that partial derivatives of S yield 1/T, P/T, and μ_i/T, which is the same information that is provided by partial derivatives of U, and so nothing is gained by using equation 2.2-13 rather than 2.2-8. Since equation 2.2-13 does not provide any new information, we will not discuss it further. [Pg.24]

By comparing Equation IV.5 with the equilibrium condition expressed by Equation IV.3, we see that dG for a system equals zero at equilibrium at constant temperature and pressure. Moreover, G depends only on U, P, V, T, and S of the system. The extremum condition, dG = 0, actually occurs when G reaches a minimum at equilibrium. This useful attribute of the Gibbs free energy is strictly valid only when the overall system is at constant temperature and pressure, conditions that closely approximate those encountered in many biological situations. Thus our criterion for equilibrium shifts from a maximum of the entropy of the universe to a minimum in the Gibbs free energy of the system. [Pg.563]

Box and Hill (1967) gave a more natural criterion that the next event be designed for maximum expected information gain (entropy decrease). They used the entropy function... [Pg.117]

Equation (1) has been obtained in the maximum power output regime and recovered later by various procedures [5,10,26,27], among others. Moreover, in [4] an optimization criterion of merit for the Curzon and Ahlborn cycle was advanced which takes the entropy production into account: the ecological criterion, based on maximization of the ecological function,... [Pg.84]

S_A = C_vA log T + R log V_A + S_0, or S_A = C_vA log T − R log C_A + S_0, where C_A is the concentration of A expressed as the reciprocal of the volume which contains one gram-molecule. It was shown in Chap. II that as a criterion of equilibrium we can make use of the expression (δS) = 0, i.e., the entropy of an isolated system is a maximum at the equilibrium point, and if any very small possible change be imagined to occur in the system, the change in entropy is zero. For the equilibrium system considered this relation takes the form— [Pg.118]

From this figure it is clear that ΔS, the change in entropy from the initial state, and therefore the total entropy of the system, reaches a maximum value when P_1 = P_2 = 1 bar. Consequently, the equilibrium state of the system under consideration is the state in which the pressure in both cells is the same, as one would expect. (Since the use of the entropy function leads to a solution that agrees with one's intuition, this example should reinforce confidence in the use of the entropy function as a criterion for equilibrium in an isolated constant-volume system.) ... [Pg.134]
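The entropy maximum described here can be reproduced numerically. A sketch under simplifying assumptions not taken from the book: an ideal gas at a fixed common temperature, two rigid cells of equal volume, and a fixed total amount of gas, so equal pressures correspond to equal mole numbers.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def total_entropy(n1, n_total=2.0, v1=1.0, v2=1.0):
    """Configuration-dependent part of the entropy of an ideal gas split
    between two rigid cells held at a common fixed temperature:
    S = n1*R*ln(V1/n1) + n2*R*ln(V2/n2) + (terms independent of n1)."""
    n2 = n_total - n1
    return n1 * R * math.log(v1 / n1) + n2 * R * math.log(v2 / n2)

# Scan the mole split between the cells and locate the entropy maximum
splits = [0.01 * k for k in range(1, 200)]
best = max(splits, key=total_entropy)
```

The maximum falls at one mole per cell, i.e., equal densities and hence equal pressures at the common temperature, in agreement with the intuitive equilibrium state described above.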

Since the entropy function can only increase in value during the approach to equilibrium (because of the sign of S_gen), the entropy must be a maximum at equilibrium. Thus, the equilibrium criterion for a closed, isolated system is... [Pg.270]

Therefore a system will try to increase its entropy, and when the entropy reaches its maximum value the system will be at equilibrium. One can show that for a system at constant temperature and pressure the criterion corresponding to (10.22) is... [Pg.438]

If the criterion for a stable equilibrium configuration is that entropy assumes a maximum value, it follows immediately from (9.5) that an equivalent statement for the class of systems described is that a stable equilibrium configuration is one in which free energy assumes a minimum value. Furthermore, it follows from the second law (9.3) that, if a nonequilibrium system of this class alters its configuration spontaneously, it must do so with... [Pg.699]

