Big Chemical Encyclopedia


Entropy maximum principle

The fact that each of the two variables, S and U, may be expressed as a function of the other indicates that the extremum principle can likewise be stated in terms of either entropy or energy. The alternative to the maximum entropy principle is a minimum energy principle, valid at constant entropy, as illustrated graphically for a one-component system in figure 1 below. [Pg.417]

In many atomization processes, the physical phenomena involved are not yet understood well enough for mean droplet size to be expressed by equations derived directly from first principles, although some attempts have been made to predict droplet size and velocity distributions in sprays via the maximum entropy principle [252, 432]. Therefore, the correlations proposed by numerous studies on droplet size distributions are mainly empirical in nature. However, the empirical correlations prove to be a practical way to determine droplet sizes from process parameters and the relevant physical properties of the liquid and gas involved. In addition, these previous studies have provided insightful information about the effects of process parameters and material properties on droplet sizes. [Pg.253]

The maximum entropy principle: to produce an image or map which is maximally non-committal, or minimally biased, with respect to missing data, maximize the entropy of the map subject to the constraint that the map must reproduce the data which generated it within experimental error. [Pg.339]
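The statement above can be illustrated with a deliberately tiny example (the pixel counts and numbers below are hypothetical, not from the cited text): when the data constrain only part of a map, the maximum entropy map is uniform over everything the data leave undetermined.

```python
import math

# Hypothetical sketch: an 8-pixel normalized "map" f, where the only
# datum is that the left half carries 70% of the total intensity.
# Among all maps reproducing that datum, the entropy -sum f log f is
# maximized by the map that is uniform within each half, i.e. the map
# that is maximally non-committal about the missing detail.
n = 8
left_total = 0.7
f = [left_total / 4] * 4 + [(1 - left_total) / 4] * 4

def entropy(f):
    return -sum(fi * math.log(fi) for fi in f if fi > 0)

# Any other map with the same half-sums has strictly lower entropy,
# e.g. one that piles most of the left half onto a single pixel:
g = [0.4, 0.2, 0.05, 0.05] + [(1 - left_total) / 4] * 4
assert entropy(f) > entropy(g)
```

By the strict concavity of the entropy, every redistribution within a half that departs from uniformity lowers the entropy, which is why the maximum entropy map adds no structure the data do not demand.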

In addition to this, average properties like ⟨r^n⟩ or ⟨p^n⟩ play a special role in the formulation of bounds or approximations to different properties such as the kinetic energy [4,5], the averages of the radial and momentum densities [6,7], and ρ(0) itself [8,9,10]. They are also the basic information required for the application of bounds to the radial electron density ρ(r), the momentum density γ(p), the form factor, and related functions [11,12,13]. Moreover, they are required as input in some applications of the maximum entropy principle to model the electron radial and momentum densities [14,15]. [Pg.216]

The Maximum-Entropy Principle. The mean-field and the quasi-chemical approximations can be extended to larger clusters. Using the derivation of Section 3.1.3 to obtain a quasi-chemical approximation for a cluster with more sites is quite cumbersome. In this section we present a new approach that unifies the various approximations for dealing with multi-site probabilities. It is based on the maximum-entropy principle. Suppose we have a cluster of n sites with occupations X1, X2, ..., Xn. We define an entropy... [Pg.134]
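The multi-site entropy referred to above can be sketched numerically (the coverage value and cluster size here are illustrative assumptions, not taken from the text): with only the average occupation constrained, the entropy of the cluster distribution is maximized by independent sites, recovering the mean-field limit.

```python
import itertools
import math

# P maps each occupation pattern (X1, ..., Xn), with Xi = 1 (occupied)
# or 0 (empty), to its probability; the cluster entropy is -sum P log P.
def cluster_entropy(P):
    return -sum(p * math.log(p) for p in P.values() if p > 0)

# Hypothetical numbers: a 3-site cluster at coverage theta = 0.25.
# With only the average occupation constrained, the maximum entropy
# distribution factorizes over sites (mean-field form).
n, theta = 3, 0.25
P = {x: math.prod(theta if xi else 1.0 - theta for xi in x)
     for x in itertools.product((0, 1), repeat=n)}

# For the factorized distribution the cluster entropy is exactly
# n times the single-site binary entropy.
s_site = -(theta * math.log(theta) + (1 - theta) * math.log(1 - theta))
assert abs(cluster_entropy(P) - n * s_site) < 1e-12
```

Correlated (non-factorized) choices of P with the same coverage necessarily have lower entropy, which is the sense in which the mean-field approximation is the least-committal one.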

The maximum-entropy principle that we have used above does not include any effects of lateral interactions. We can include them as well if we extend the definition of the entropy in eqn. (19) to that of a free energy... [Pg.138]

The proportionality constant in this expression can be obtained by normalization of the probabilities. As with the maximum-entropy principle, we are more interested in the situation where we have other restrictions. Suppose we look again at the restrictions given by eqn. (21). Introducing Lagrange multipliers gives us... [Pg.138]

Analytical expressions for the probabilities may be obtained from the maximum-entropy principle, but it may be necessary to make additional assumptions. For example, eqn. (25) is not sufficient to get the mean-field approximation. This can be seen as follows. Instead of eqn. (26) we get... [Pg.139]

We can summarize this situation by the statement that the ML object obeys Jaynes's maximum-entropy principle when the white-object definition (31) of maximum ignorance is used and when the object is of such low intensity that the df sites are mostly unoccupied. The latter situation is obeyed by weak astronomical objects such as planets in the visible and IR regions and the sun in the visible region (see Kikuchi and Soffer, 1977). [Pg.248]

In this review, we begin with a treatment of the functional theory employing as basis the maximum entropy principle for the determination of the density matrix of equilibrium ensembles of any system. This naturally leads to the time-dependent functional theory which will be based on the TD-density matrix which obeys the von Neumann equation of motion. In this way, we present a unified formulation of the functional theory of a condensed matter system for both equilibrium and non-equilibrium situations, which we hope will give the reader a complete picture of the functional approach to many-body interacting systems of interest to condensed matter physics and chemistry. [Pg.175]

A very useful criterion in this respect is given by the maximum entropy principle in the sense of Jaynes. The ingredients of the maximum entropy principle are (i) some reference probability distribution on the pure states and (ii) a way to estimate the quality of some given probability distribution p on the pure states with respect to the reference distribution. As our reference probability distribution, we shall take the equidistribution defined in Eq. (30) for a two-level system (this definition of equipartition can be generalized to arbitrary d×d matrices, being the canonical measure on the d-dimensional complex projective plane). The relative entropy of some probability distribution p [see Eq. (35)] with respect to the equipartition measure is defined as... [Pg.125]
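For a finite set of states, the relative entropy mentioned above can be computed directly. The snippet below uses the common Kullback-Leibler convention D(p||q) = Σ p_i log(p_i/q_i); the sign convention of the text's Eq. (35) may differ, and the distributions are illustrative numbers, not from the text.

```python
import math

# Relative entropy (Kullback-Leibler divergence) of p with respect to a
# reference distribution q: D(p||q) = sum_i p_i log(p_i / q_i).
def relative_entropy(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Against the equidistribution u on n states, D(p||u) = log(n) - S(p),
# so minimizing the relative entropy with respect to equipartition is
# the same as maximizing the Shannon entropy S(p).
n = 4
u = [1.0 / n] * n
p = [0.4, 0.3, 0.2, 0.1]
S = -sum(pi * math.log(pi) for pi in p)
assert abs(relative_entropy(p, u) - (math.log(n) - S)) < 1e-12
assert relative_entropy(u, u) == 0.0
```

The identity D(p||u) = log(n) − S(p) is what links the "relative entropy with respect to equipartition" formulation to the ordinary maximum entropy principle.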

The maximum entropy principle can then be formulated as follows Assume that a given quantum system is in a pure state but that this pure state is unknown. Assume furthermore that the only knowledge about the system is some nonpure state with density operator D. Then the probability of finding the pure state in some subset of all pure states is described by the probability distribution p of pure states having maximum entropy with respect to equipartition. The probability distribution p is chosen from all the probability distributions p of pure states to yield the given density operator via mixing in the sense of Eq. (37). [Pg.125]

A thermal state can be decomposed according to the maximum entropy principle. The advantages are that (a) this decomposition conforms with dynamical stability under external perturbations, (b) it may lead to quantum or chemical descriptions in the sense of Section VII, depending on the particular situation (i.e., the level splitting or the number of spins considered, etc.), (c) no particular operators such as the nuclear position operators are distinguished, and (d) the maximum entropy decomposition is uniquely determined. [Pg.126]

Let us now turn to the individual formalism of quantum mechanics again, where thermal states are decomposed in a canonical way according to the maximum entropy principle. We cannot yet compute maximum entropy decompositions and large-deviation entropies for molecular situations. Nevertheless, the (simpler) example of the Curie-Weiss magnet suggests that even in molecular situations... [Pg.132]

In modern physics there exist alternative theories of equilibrium statistical mechanics [1, 2] based on generalized statistical entropies [3-12]. They are compatible with the second part of the second law of thermodynamics, i.e., the maximum entropy principle [13, 14], which leads to uncertainty in the definition of the statistical entropy and, consequently, of the equilibrium probability density functions. This means that equilibrium statistical mechanics is in a crisis. Thus, the requirements of equilibrium thermodynamics must play the decisive role in selecting the right theory of equilibrium statistical mechanics. The main difficulty in founding a statistical mechanics on a generalized statistical entropy, i.e., a deformed Boltzmann-Gibbs entropy, is the problem of its connection with equilibrium thermodynamics. The proof of the zeroth law of thermodynamics and the principle of additivity... [Pg.303]

The maximum entropy method also offers the possibility of "super-resolution", i.e., better resolution than might be anticipated from the simple analogy with optical systems (section 2(a)). Series-termination effects in conventional Fourier syntheses lead to negative regions around the peaks. The maximum entropy principle ensures that the density is everywhere positive and gives much sharper peaks, in which series-termination effects have been suppressed [261, 263]. [Pg.408]

The maximum entropy principle has already been applied, for example, in the structure determination of the Pf1 filamentous virus at 4 A resolution [264]. Combination of the data from one heavy-atom derivative with the maximum entropy method led to an interpretable map which showed the helical subunits of the virion. [Pg.408]

Nine years after Shannon's paper, Edwin T. Jaynes published a synthesis of the work of Cox and Shannon (11). In this paper Jaynes presented the "Maximum Entropy Principle" as a principle of general statistical inference, applicable in a wide variety of fields. The principle is simple: if you know something but don't know everything, encode what you know using probabilities as defined by Cox, and assign the probabilities so as to maximize the entropy, defined by Shannon, consistent with what you know. This is the principle of "minimum prejudice." Jaynes applied the principle in communication theory and statistical physics. It was easy to extend the theory to include classical thermodynamics and supply the equations complementary to the Rothstein paper (12). [Pg.279]
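Jaynes's prescription can be carried out explicitly for a toy problem (the numbers are illustrative, in the spirit of his well-known die example, not taken from the cited paper): a die is known only to have a mean roll of 4.5 rather than the fair value 3.5, and everything else is left to maximum entropy.

```python
import math

# The maximum entropy assignment consistent with the single constraint
# <i> = 4.5 has the exponential form p_i proportional to exp(-lam * i),
# with the Lagrange multiplier lam fixed by the constraint itself.
faces = list(range(1, 7))

def dist(lam):
    w = [math.exp(-lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

def mean(lam):
    return sum(i * p for i, p in zip(faces, dist(lam)))

# mean(lam) decreases monotonically in lam, so simple bisection finds
# the unique multiplier meeting the constraint.
lo, hi = -5.0, 5.0
for _ in range(200):
    mid = (lo + hi) / 2.0
    if mean(mid) > 4.5:
        lo = mid
    else:
        hi = mid
p = dist((lo + hi) / 2.0)
```

The resulting probabilities rise monotonically toward the face 6, the least biased way to shift the mean upward: no structure is assumed beyond what the single known fact demands.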

In May of 1978, we held a conference at M.I.T. dealing with the Maximum Entropy Principle. Papers were given in a variety of fields illustrating how widely these influences have spread. [Pg.279]

Jim Keck used maximum entropy methods to treat the very large number of simultaneous equations which occur when we try to calculate the pollutants produced in minute quantities during combustion (25). Faced with a large number of rate equations (say 20 or more) for which rate constants were not available, Professor Keck relied upon the maximum entropy principle to make the best possible estimate, again with good results. [Pg.284]

