Entropy interpretation

C. A. Angell, Free volume-entropy interpretation of the electrical conductance of aqueous electrolyte solutions in the concentration range 2-20 N, J. Phys. Chem., 70, 1966, 3988-3998. [Pg.427]

One approach to a mathematically well-defined performance measure is to interpret the amplitude values of a processed signal as realizations of a stochastic variable x which can take a discrete number of values with probabilities P_n, n = 1, 2, ..., N. As briefly motivated in the introduction, an interesting quality measure is then the entropy H(x) of the amplitude distribution... [Pg.90]

In the experiments, the probabilities were estimated from the processed signal by means of a histogram. It is well known that the entropy is large for nearly uniform distributions and small for distributions with few peaks. Thus it is an interesting candidate as a performance measure when the goal is to process a signal so that it becomes more easily interpreted. [Pg.91]
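As a concrete illustration of this performance measure, the sketch below estimates the entropy of a signal's amplitude distribution from a histogram; the function name, bin count, amplitude range, and test signals are illustrative choices, not taken from the cited source.

```python
import numpy as np

def histogram_entropy(signal, n_bins=64, value_range=(-1.0, 1.0)):
    """Estimate the entropy H(x) of a signal's amplitude distribution,
    with the probabilities P_n taken from a histogram of the samples."""
    counts, _ = np.histogram(signal, bins=n_bins, range=value_range)
    p = counts / counts.sum()        # empirical probabilities P_n
    p = p[p > 0]                     # treat 0 * log(0) as 0
    return -np.sum(p * np.log2(p))   # entropy in bits

# Large for a nearly uniform amplitude distribution, small for a peaked one.
rng = np.random.default_rng(0)
print(histogram_entropy(rng.uniform(-1, 1, 10_000)))    # close to log2(64) = 6 bits
print(histogram_entropy(rng.normal(0, 0.05, 10_000)))   # strongly peaked -> much smaller
```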

For those who are familiar with the statistical mechanical interpretation of entropy, which asserts that at 0 K substances are normally restricted to a single quantum state, and hence have zero entropy, it should be pointed out that the conventional thermodynamic zero of entropy is not quite that, since most elements and compounds are mixtures of isotopic species that in principle should separate at 0 K, but of course do not. The thermodynamic entropies reported in tables ignore the entropy of isotopic mixing, and in some cases ignore other complications as well, e.g. ortho- and para-hydrogen. [Pg.371]

The principle of the unattainability of absolute zero in no way limits one's ingenuity in trying to obtain lower and lower thermodynamic temperatures. The third law, in its statistical interpretation, essentially asserts that the ground quantum level of a system is ultimately non-degenerate, that some energy difference Δε must exist between states, so that at equilibrium at 0 K the system is certainly in that non-degenerate ground state with zero entropy. However, the Δε may be very small, and temperatures of the order of Δε/k (where k is the Boltzmann constant, the gas constant per molecule) may be obtainable. [Pg.373]
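As a quick numerical illustration of this order-of-magnitude argument, the sketch below evaluates Δε/k for an assumed splitting; the value of Δε is arbitrary and purely illustrative.

```python
# Order-of-magnitude estimate of the temperature scale T ~ Δε/k.
k_B = 1.380649e-23      # Boltzmann constant, J/K (exact by SI definition)
delta_eps = 1e-27       # assumed energy splitting in joules -- illustrative only
print(delta_eps / k_B)  # ~7e-5 K: the smaller Δε, the lower the reachable temperature scale
```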

As we have seen, the third law of thermodynamics is closely tied to a statistical view of entropy. It is hard to discuss its implications from the exclusively macroscopic view of classical thermodynamics, but the problems become almost trivial when the molecular view of statistical thermodynamics is introduced. Guggenheim (1949) has noted that the usefulness of a molecular view is not unique to the situation of substances at low temperatures, and that there are other limiting situations where molecular ideas are helpful in interpreting general experimental results... [Pg.374]

As defined above, the Lyapunov exponents effectively determine the degree of chaos that exists in a dynamical system by measuring the rate of the exponential divergence of initially closely neighboring trajectories. An alternative, and, from the point of view of CA theory, perhaps more fundamental interpretation of their numeric content is an information-theoretic one. It is, in fact, not hard to see that Lyapunov exponents are very closely related to the rate of information loss in a dynamical system (this point will be made more precise during our discussion of entropy in the next section). [Pg.205]
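As an illustrative sketch (not from the cited text), this divergence rate can be estimated for a concrete system such as the logistic map x -> r·x·(1 - x) by averaging ln|f'(x)| = ln|r(1 - 2x)| along a trajectory; the parameter values and iteration counts below are arbitrary choices.

```python
import math

def lyapunov_logistic(r, x0=0.3, n_transient=1000, n_iter=100_000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    as the trajectory average of ln|f'(x)| = ln|r*(1 - 2x)|."""
    x = x0
    for _ in range(n_transient):          # discard transient behavior
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))
    return total / n_iter                 # > 0 signals exponential divergence (chaos)

print(lyapunov_logistic(3.5))   # negative: periodic orbit, no chaos
print(lyapunov_logistic(4.0))   # ~ ln 2: chaotic
```

A positive exponent means neighboring trajectories separate exponentially; dividing it by ln 2 gives a rate in bits per iteration, which is the information-loss reading mentioned above.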

In this expression, the term ½(1 + o_i) is interpreted as the probability that the hypothesis represented by the neuron is true (with o_i = +1 meaning true and o_i = -1 meaning false), and the term ½(1 + s_i) is interpreted as the set of desired probabilities. With these interpretations, E effectively yields the relative entropy between these two sets of probability measures. [Pg.546]
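A minimal sketch of such a relative-entropy measure, assuming the conventional mapping p = (1 + o)/2 from ±1-valued activations to probabilities; the function and symbol names are illustrative, not taken from the cited source.

```python
import math

def relative_entropy(s, o, eps=1e-12):
    """Relative entropy (Kullback-Leibler divergence) between the desired
    probabilities q = (1+s)/2 and the network's probabilities p = (1+o)/2,
    where s, o are sequences of values in (-1, +1), summed over neurons."""
    total = 0.0
    for si, oi in zip(s, o):
        q = (1 + si) / 2                   # desired probability of "true"
        p = (1 + oi) / 2                   # neuron's probability of "true"
        q = min(max(q, eps), 1 - eps)      # guard against log(0)
        p = min(max(p, eps), 1 - eps)
        total += q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))
    return total

# Zero when the outputs match the targets; positive otherwise.
print(relative_entropy([0.8, -0.6], [0.8, -0.6]))   # ~0.0
print(relative_entropy([0.8, -0.6], [0.1, 0.1]))    # > 0
```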

At first sight, self-organization appears to violate the Second Law of Thermodynamics, which asserts that the entropy S of an isolated system never decreases (or, more formally, dS/dt ≥ 0); see figure 11.2-a. Since entropy is essentially a measure of the degree of disorder in a system, the Second Law is usually interpreted to mean that an isolated system will become increasingly more disordered with time. How, then, can structure emerge after a system has had a chance to evolve... [Pg.559]

Correlation between Ionic Entropy and Viscosity. In Chapter 9, when we noticed that certain ions in aqueous solution cause a decrease in viscosity, and asked how this should be explained, it seemed natural to interpret the effect in terms of order and disorder. In pure water at room temperature there is a considerable degree of short-range order ... [Pg.173]

The thermodynamic analysis of these systems played an important role in the interpretation of these data and of the high selectivity. It was found that selective sorption of complex organic ions is accompanied by an increase in the entropy of the system (Table 6). [Pg.20]

The third approach is called the thermodynamic theory of passive systems. It is based on the following postulates: (1) the introduction of the notion of entropy is avoided for nonequilibrium states, and the principle of local state is not assumed; (2) the inequality is replaced by an inequality expressing the fundamental property of passivity, which follows from the second law of thermodynamics and the condition of thermodynamic stability (further, the inequality is known to be meaningful only for states of equilibrium); (3) the temperature is assumed to exist for non-equilibrium states; (4) as a consequence of the fundamental inequality, the class of processes under consideration is limited to processes in which deviations from the equilibrium conditions are small, which enables full linearization of the constitutive equations. An important feature of this approach is the clear physical interpretation of all the quantities introduced. [Pg.646]

The interpretation of these results is, however, problematic, since no data on the absolute enthalpy and entropy of the respective triple helix and coiled state are available. Though it may be taken as an established fact that the entropy of conformation of a (Pro-Pro-Gly)n coil is lower than in the case of a (Pro-Ala-Gly)n coil, we are not sure whether the entropy of the triple helix depends on the imino acid content. [Pg.196]

In a channel context, with X as the input ensemble and Y as the output ensemble, we can interpret H(X|Y) as the average additional information required at the output to specify an input when the output is given; thus H(X|Y) is known as equivocation. Similarly, H(Y|X) can be interpreted as the part of the entropy of Y that is not information about X, and thus H(Y|X) is known as noise. [Pg.207]
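As a concrete sketch, the following computes the equivocation H(X|Y), the noise H(Y|X), and the mutual information for a binary symmetric channel with uniform input; the channel model and crossover probability f are illustrative assumptions, not taken from the cited text.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector, with 0*log(0) = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution P(x, y) for a binary symmetric channel with
# uniform input and crossover probability f.
f = 0.1
p_xy = np.array([[0.5 * (1 - f), 0.5 * f],
                 [0.5 * f,       0.5 * (1 - f)]])

H_XY = entropy(p_xy.ravel())      # joint entropy H(X,Y)
H_X = entropy(p_xy.sum(axis=1))   # input entropy H(X)
H_Y = entropy(p_xy.sum(axis=0))   # output entropy H(Y)

equivocation = H_XY - H_Y         # H(X|Y): uncertainty about the input given the output
noise = H_XY - H_X                # H(Y|X): the part of H(Y) carrying no input information
mutual_info = H_Y - noise         # I(X;Y) = H(Y) - H(Y|X)
print(equivocation, noise, mutual_info)
```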

The integral of (8) must be interpreted as follows: T refers to the temperature of the body from which the element of heat δQ is taken, and the integral sums up all the quantities δQ/T for that body. The symbol Σ further extends this to all the external bodies concerned. Thence the sum of all the magnitudes δQ/T is negative. Now δQ/T represents the entropy lost by the external body during the small change, because δQ, being the heat absorbed by the system, will be heat lost by the external body, and the relations (8) and (8a) may therefore be expressed in words as follows... [Pg.80]
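Relations (8) and (8a) themselves are not reproduced in this excerpt; in modern notation the statement summed up here presumably corresponds to the Clausius form (a reconstruction under that assumption, not verbatim from the source):

```latex
% Presumed form of relation (8): \delta Q is the heat absorbed by the
% system from an external body at temperature T; the sum runs over all
% the external bodies concerned.
\sum \int \frac{\delta Q}{T} \;\le\; 0
```

Equivalently, the total entropy change of the external bodies, −Σ∫δQ/T, is non-negative, which matches the description of δQ/T as entropy lost by each external body.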

The transition obtained under stress can in some cases be reversible, as found, for instance, for PBT. In that case, careful studies of the stress and strain dependence of the molar fractions of the two forms have been reported [83]. The observed stress-strain curves (Fig. 16) have been interpreted as due to the elastic deformation of the α form, followed by a plateau region corresponding to the α toward β transition, and then followed by the elastic deformation of the β form. On the basis of the changes with temperature of the critical stresses (associated with the plateau region), the enthalpy and the entropy of the transition have also been evaluated [83]. [Pg.202]

