Information theory thermodynamics

A third hint of a connection between physics and information theory comes from the thermodynamics of black holes, which remains a deep mystery embodying principles from quantum mechanics, statistical mechanics, and general relativity. [Pg.636]

The entropies per unit time, as well as the thermodynamic entropy production entering formula (101), can be interpreted in terms of the numbers of paths satisfying different conditions. In this regard, important connections exist between information theory and the second law of thermodynamics. [Pg.121]

It is most remarkable that the entropy production in a nonequilibrium steady state is directly related to the time asymmetry in the dynamical randomness of nonequilibrium fluctuations. The entropy production turns out to be the difference in the amounts of temporal disorder between the backward and forward paths or histories. In nonequilibrium steady states, the temporal disorder h^R of the time reversals is larger than the temporal disorder h of the paths themselves. This is expressed by the principle of temporal ordering, according to which the typical paths are more ordered than their corresponding time reversals in nonequilibrium steady states. This principle is proved with nonequilibrium statistical mechanics and is a corollary of the second law of thermodynamics. Temporal ordering is possible out of equilibrium because of the increase of spatial disorder. There is thus no contradiction with Boltzmann's interpretation of the second law. Contrary to Boltzmann's interpretation, which deals with disorder in space at a fixed time, the principle of temporal ordering is concerned with order or disorder along the time axis, in the sequence of pictures of the nonequilibrium process filmed as a movie. The emphasis on the dynamical aspects is a recent trend that finds its roots in Shannon's information theory and modern dynamical systems theory. This can explain why we had to wait until the last decade before these dynamical aspects of the second law were discovered. [Pg.129]
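In symbols, this principle can be stated compactly (a hedged reconstruction in the spirit of Gaspard's formulation; identifying formula (101) above with this expression is our reading, not a quotation from the source):

```latex
\frac{d_i S}{dt} \;=\; k_B \left( h^{\mathrm{R}} - h \right) \;\geq\; 0
```

Here h is the entropy per unit time (temporal disorder) of the typical paths, h^R that of their time reversals, and d_i S/dt the thermodynamic entropy production; the second law then takes the form h^R ≥ h, with equality at equilibrium.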

The maximum entropy method (MEM) is an information-theory-based technique that was first developed in the field of radioastronomy to enhance the information obtained from noisy data (Gull and Daniell 1978). The theory is based on the same equations that are the foundation of statistical thermodynamics. Both the statistical entropy and the information entropy deal with the most probable distribution. In the case of statistical thermodynamics, this is the distribution of the particles over position and momentum space ("phase space"), while in the case of information theory, the distribution of numerical quantities over the ensemble of pixels is considered. [Pg.115]
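The core computation behind such a maximum-entropy fit can be sketched in a few lines (all values and names below are hypothetical illustrations, not taken from Gull and Daniell): among all distributions over pixel values that reproduce a measured mean, the most probable one has the Gibbs form familiar from statistical thermodynamics.

```python
import numpy as np
from scipy.optimize import brentq

# Maximum-entropy sketch: among all distributions p over pixel values x
# reproducing a measured mean m, the one maximizing H = -sum p_i ln p_i
# is p_i ∝ exp(-lam * x_i), with lam fixed by the constraint.
# (Pixel range and mean are hypothetical.)

x = np.arange(11)   # allowed pixel values 0..10
m = 2.5             # measured mean intensity (the constraint)

def mean_for(lam):
    w = np.exp(-lam * x)
    return (x * w).sum() / w.sum()

lam = brentq(lambda l: mean_for(l) - m, -10.0, 10.0)  # solve the constraint
p = np.exp(-lam * x)
p /= p.sum()

print(p)              # the most probable (maximum-entropy) distribution
print((x * p).sum())  # reproduces the constrained mean, 2.5
```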

In this form the 125 letters contain little or no information, but they are very rich in entropy. Such considerations have led to the conclusion that information is a form of energy; indeed, information has been called "negative entropy." In fact, the branch of mathematics called information theory, which is basic to the programming logic of computers, is closely related to thermodynamic theory. Living organisms are highly ordered, nonrandom structures, immensely rich in information and thus entropy-poor. [Pg.25]
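A rough numerical illustration of that statement (the phrase and the use of compressed size as a disorder proxy are our own choices, not from the source): the same 125 letters compress well when ordered and poorly when scrambled.

```python
import random
import zlib

# Same letters, different order: ordered text is redundant (information-rich,
# entropy-poor), while a scrambled permutation is close to incompressible.
# Compressed size under DEFLATE serves here as a crude proxy for entropy.

ordered = ("to be or not to be " * 7)[:125]                # 125 ordered characters
scrambled = "".join(random.sample(ordered, len(ordered)))  # same letters, random order

print(len(zlib.compress(ordered.encode())))    # small: ordered, information-rich
print(len(zlib.compress(scrambled.encode())))  # larger: entropy-rich, information-poor
```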

Geometrical tools prove useful in addressing various problems of finite-time thermodynamics and optimal control theory. These methods also have potential applicability to thermodynamic-type applications in subjects ranging from the chemical, biological, and materials sciences to information theory. Efficient vector-algebraic tools allow such applications to be extended to systems of virtually unlimited complexity, beyond realistic reach of classical methods. [Pg.421]

The information theory approach studied here grew out of earlier studies of the formation of atomic-sized cavities in molecular liquids (Pohorille and Pratt, 1990; Pratt and Pohorille, 1992, 1993). Since we deal with rigid and spherical solutes in the discussion, we will drop the explicit indication of conformational coordinates and discuss p(n) = p_A(n|R). We emphasize that the overall distribution p(n) is well described by the information theory with the first two moments, ⟨n⟩₀ and ⟨n(n−1)/2⟩₀. It is the prediction of the extreme member p(0) that makes the differences in these default models significant. Computing thermodynamic properties demands more than merely observing typical behavior. [Pg.182]
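A minimal sketch of this two-moment model (the moment values, the occupancy cutoff, and the flat default model are hypothetical, chosen only to illustrate the construction described above):

```python
import numpy as np
from scipy.optimize import fsolve

# Two-moment information model: p(n) ∝ exp(l1*n + l2*n(n-1)/2), with the
# multipliers l1, l2 fixed by the observed moments <n>_0 and <n(n-1)/2>_0.
# The extreme member p(0) then gives the cavity-formation free energy via
#   beta * mu_ex = -ln p(0).

n = np.arange(0, 21)   # occupancies considered (hypothetical cutoff)
m1, m2 = 4.0, 7.0      # hypothetical <n>_0 and <n(n-1)/2>_0

def probs(lams):
    logw = lams[0] * n + lams[1] * n * (n - 1) / 2.0
    w = np.exp(logw - logw.max())   # guard against overflow
    return w / w.sum()

def residuals(lams):
    p = probs(lams)
    return [(n * p).sum() - m1, (n * (n - 1) / 2.0 * p).sum() - m2]

lams = fsolve(residuals, x0=[0.0, -0.1])
p = probs(lams)

print(p[0])           # predicted extreme member p(0)
print(-np.log(p[0]))  # beta * mu_ex in units of kT
```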

D.J.W. Grant, T. Higuchi, Solubility Behavior of Organic Compounds. Wiley (1990). (Much information on theory, thermodynamics, activities, group contributions, aqueous and non-aqueous solvents.)... [Pg.242]

The unique features of our system enable us to use three different theoretical tools: a molecular dynamics simulation, models that focus on the repulsion between atoms, and a statistical approach based on an information theory analysis. What enables us to use a thermodynamic-like language under the seemingly extreme nonequilibrium conditions are the high density, the very high energy density, and the hard-sphere character of the atom-atom collisions, which contribute to an unusually rapid thermalization. These conditions lead to short-range repulsive interactions and therefore enable us to use the kinematic point of view in a useful way. [Pg.28]

Since this expression is similar to that found for entropy in statistical mechanics, it is called the entropy of the probability distribution p_i. Jaynes [260] shows that the thermodynamic entropy is identical to the information theory entropy except for the presence of the Boltzmann constant in the former, made necessary by our arbitrary temperature scale. [Pg.407]
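Jaynes's identification can be made concrete in a few lines (the three-state distribution below is hypothetical): the thermodynamic entropy is the information entropy of the same distribution, scaled by Boltzmann's constant.

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

def info_entropy(p):
    """Shannon entropy H = -sum_i p_i ln p_i, in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0.0]               # 0 ln 0 -> 0 by convention
    return float(-(p * np.log(p)).sum())

p = [0.7, 0.2, 0.1]              # any probability distribution
print(info_entropy(p))           # dimensionless (information theory)
print(k_B * info_entropy(p))     # J/K (thermodynamic entropy)
```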

Thus, the statistical mechanics development of the thermodynamic laws is interesting and straightforward enough, and it sheds helpful light on thermodynamics. So is the use of information theory to develop and discuss statistical mechanics. They supplement the understanding generated by the macroscopic statements, but in no sense do they replace that understanding. [Pg.253]

In thermodynamics (or more accurately, thermostatics) the principal contribution of information theory is to redefine the basic ideas of classical thermostatics along the lines forecast by Rothstein. The first quantitative treatment was published in 1961, followed by a textbook and later a sequence of papers (12, 13, 14, 15, 16, 17, 18, 19, 20, 21). ... [Pg.281]

Tribus, Myron. "Information Theory as the Basis for Thermostatics and Thermodynamics," Jour. Appl. Mech., March 1961. ... [Pg.285]

Tribus, Myron. "Information Theory and Thermodynamics," Boelter Anniversary Volume, McGraw-Hill Book Co., 1963 ... [Pg.286]

Tribus, Myron; Shannon, Paul T.; Evans, Robert B. "Why Thermodynamics is a Logical Consequence of Information Theory," A.I.Ch.E. Jour., March 1966, pp. 244-248. [Pg.286]

Tribus, Myron; Costa de Beauregard, Olivier. "Information Theory and Thermodynamics: A Rebuttal," Helvetica Physica Acta, 1974, Vol. 47. [Pg.286]

Molecular fragments are the mutually open subsystems, which exhibit fluctuations in their electron densities and overall numbers of electrons. In chemistry one is interested both in the equilibrium distributions of electrons and in non-equilibrium processes characterized by rates. Recently, it has been demonstrated [23] that information theory provides all the necessary tools for a local dynamical description of the density fluctuations and electron flows between molecular subsystems, which closely follows the thermodynamic theory of irreversible processes [146]. ... [Pg.163]

The Förster cycle is not capable of observing the details of the transfer reaction, since it is a thermodynamic determination of pK* (and hence ΔG_r). Thermodynamics provides no direct information about the kinetics. When used in conjunction with a model for proton transfer, such as the Marcus theory, thermodynamic quantities such as ΔG_r do provide some insight into the reaction rate. The rates derived using the thermodynamic quantity ΔG_r, when compared with the measured rate, provide a test of the Marcus theory. Free energy relationships like Brønsted plots provide information on how chemical reactivity varies with chemical structure and on the nature of the transition state. ... [Pg.648]
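How a thermodynamic quantity such as ΔG_r feeds into a rate prediction can be sketched with the classical Marcus expression (a hedged illustration; the reorganization energy and prefactor below are hypothetical inputs, not values from the source):

```python
import math

R = 8.314462618  # gas constant, J/(mol K)

def marcus_rate(dG_r, lam, A=1.0e11, T=298.15):
    """Classical Marcus rate with barrier dG_act = (lam/4)*(1 + dG_r/lam)**2.

    dG_r : reaction free energy in J/mol (e.g. from a Forster-cycle pK*)
    lam  : reorganization energy in J/mol (hypothetical)
    A    : preexponential factor in 1/s (hypothetical)
    """
    dG_act = (lam / 4.0) * (1.0 + dG_r / lam) ** 2
    return A * math.exp(-dG_act / (R * T))

# hypothetical proton transfer: dG_r = -20 kJ/mol, lam = 80 kJ/mol
print(marcus_rate(-20.0e3, 80.0e3))  # ~1e9 s^-1
```

Comparing such a predicted rate with the measured one is the test of Marcus theory mentioned above.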

This analysis of living systems uses concepts of thermodynamics, information theory, cybernetics, and systems engineering, as well as the classical concepts appropriate to each level. The purpose is to produce a description of living structure and process in terms of input and output, flows through systems, steady states, and feedbacks, which will clarify and unify the facts of life. The approach generates hypotheses relevant to single individuals, types, and levels of living systems, or relevant across individuals, types, and levels. These hypotheses can be confirmed, disconfirmed, or evaluated by experiments and other empirical evidence. [Pg.361]

The principle of maximum entropy is widespread in disciplines beyond thermodynamics, for example in information theory and economics. In fact, it amounts to maximizing the probability of a state of a system under certain constraints [5]. [Pg.117]
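The link between "most probable" and "maximum entropy" is the standard combinatorial one (sketched here for concreteness; this derivation is not quoted from reference [5]). For N items distributed over states with occupation numbers n_i, the multiplicity is

```latex
W(\{n_i\}) = \frac{N!}{\prod_i n_i!},
\qquad
\ln W \;\approx\; -N \sum_i p_i \ln p_i
\quad \text{(Stirling, } p_i = n_i/N\text{)},
```

so the most probable macrostate, the one of largest W compatible with the constraints, is precisely the one of maximum entropy.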

