
Statistical entropy

Voigt-Martin, I.G., Zhang, Z.H., Kolb, U. and Gilmore, C.J. (1997) The use of maximum entropy statistics combined with simulation methods to determine the structure of 4-dimethylamino-3-cyanobiphenyl. Ultramicroscopy, 68, 43-59. [Pg.354]

Ben-Naim, A. (2008a), A Farewell to Entropy: Statistical Thermodynamics Based on Information. World Scientific, Singapore. [Pg.612]

In the literature the terms entropy and information are frequently interchanged. Arieh Ben-Naim, the author of A Farewell to Entropy: Statistical Thermodynamics Based on Information [52], insists on going one step further: he advocates not only using the principle of maximum entropy to predict probability distributions (as is done in statistical physics), but replacing the concept of entropy altogether with the more suitable term information. In his opinion this would replace an essentially... [Pg.161]

Ben-Naim A (2008) A farewell to entropy: statistical thermodynamics based on information. World Scientific Publishing Co. Pte. Ltd., Singapore... [Pg.171]

A quantitative theory of rate processes has been developed on the assumption that the activated state has a characteristic enthalpy, entropy, and free energy; the concentration of activated molecules may thus be calculated using statistical mechanical methods. Whilst the theory gives a very plausible treatment of very many rate processes, it suffers from the difficulty of calculating the thermodynamic properties of the transition state. [Pg.402]

Perception and Entropy Inspired Ultrasonic Grain Noise Suppression Using Noncoherent Detector Statistics. [Pg.89]

A novel approach for suppression of grain noise in ultrasonic signals, based on noncoherent detector statistics and signal entropy, is presented. The performance of the technique is demonstrated using ultrasonic B-scans from samples with coarse material structure. [Pg.89]

In statistical terms, a perceptual improvement is therefore obtained if the amplitude distribution in the filtered signal (image) is more concentrated around zero than in the raw data (contrast enhancement). A more concentrated amplitude distribution generally means smaller entropy. Thus, from an operator perception point of view, interesting results should be obtained if the raw data can be filtered to yield low entropy amplitude distributions. However, one should note that the entropy can be minimized by means of a (pathological) filter which always outputs zero or another constant value. Thus, appropriate restrictions must be imposed on the filter construction process. [Pg.89]
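
A rough numerical illustration of this entropy criterion: the sketch below (Python/NumPy; the signals and bin layout are invented for the example, not taken from the paper) compares the amplitude-histogram entropy of a broad raw signal, of a filtered signal concentrated around zero, and of the pathological constant output mentioned above.

```python
import numpy as np

def amplitude_entropy(x, edges):
    """Shannon entropy (bits) of the amplitude histogram of x over fixed bin edges."""
    hist, _ = np.histogram(x, bins=edges)
    p = hist / hist.sum()
    p = p[p > 0]                                      # empty bins contribute nothing
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(0)
edges = np.linspace(-4.0, 4.0, 65)                    # common amplitude axis
raw = rng.normal(0.0, 1.0, 10_000)                    # broad raw-data amplitudes
filtered = 0.2 * raw                                  # concentrated around zero

print(amplitude_entropy(raw, edges))                  # higher entropy
print(amplitude_entropy(filtered, edges))             # lower entropy
print(amplitude_entropy(np.zeros(10_000), edges))     # pathological filter: 0 bits
```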

Statistical Thermodynamics of Adsorbates. First, from a thermodynamic or statistical mechanical point of view, the internal energy and entropy of a molecule should be different in the adsorbed state from that in the gaseous state. This is quite apart from the energy of the adsorption bond itself or the entropy associated with confining a molecule to the interfacial region. It is clear, for example, that the adsorbed molecule may lose part or all of its freedom to rotate. [Pg.582]

It is of interest in the present context (and is useful later) to outline the statistical mechanical basis for calculating the energy and entropy that are associated with rotation [66]. According to the Boltzmann principle, the time average energy of a molecule is given by... [Pg.582]
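
The excerpt breaks off before the formula, but the Boltzmann average it refers to has the standard form <E> = sum E_i exp(-E_i/kT) / sum exp(-E_i/kT). A minimal numerical sketch, checked against the harmonic-oscillator closed form (the frequency and temperature are arbitrary assumptions):

```python
import numpy as np

k_B = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34      # Planck constant, J s

def boltzmann_average_energy(E, T):
    """<E> = sum E_i exp(-E_i/kT) / sum exp(-E_i/kT) over discrete levels."""
    E = np.asarray(E)
    w = np.exp(-(E - E.min()) / (k_B * T))   # shift to the ground state for stability
    return (E * w).sum() / w.sum()

# Check against the harmonic oscillator, where the closed form is
# <E> = h*nu/2 + h*nu/(exp(h*nu/kT) - 1).
nu, T = 3.0e13, 300.0                        # assumed frequency (Hz) and temperature
E = (np.arange(200) + 0.5) * h * nu          # enough levels to converge the sum
exact = h*nu/2 + h*nu/(np.exp(h*nu/(k_B*T)) - 1)
print(boltzmann_average_energy(E, T), exact)  # the two agree
```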

Thus from an adsorption isotherm and its temperature variation, one can calculate either the differential or the integral entropy of adsorption as a function of surface coverage. The former probably has the greater direct physical meaning, but the latter is the quantity usually first obtained in a statistical thermodynamic adsorption model. [Pg.645]
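
A sketch of the kind of calculation described. Given equilibrium pressures at the same coverage at two temperatures, the Clausius-Clapeyron form gives the isosteric heat; and since the differential free energy of adsorption vanishes at equilibrium (relative to gas at the equilibrium pressure), the differential entropy of adsorption follows as dH/T. The pressures below are hypothetical.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def differential_adsorption_quantities(P1, T1, P2, T2):
    """Isosteric heat from two isotherm points at the same coverage:
    q_st = R ln(P2/P1) / (1/T1 - 1/T2);  dH = -q_st;
    relative to gas at the equilibrium pressure (dG = 0), dS = dH/T."""
    q_st = R * np.log(P2 / P1) / (1.0 / T1 - 1.0 / T2)
    dH = -q_st
    T_mid = 0.5 * (T1 + T2)
    return q_st, dH, dH / T_mid

# hypothetical equilibrium pressures (Pa) read off two isotherms at fixed coverage
q_st, dH, dS = differential_adsorption_quantities(120.0, 273.0, 480.0, 298.0)
print(q_st, dH, dS)   # ~3.7e4 J/mol, ~-3.7e4 J/mol, ~-130 J/(mol K)
```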

In general, it seems more reasonable to suppose that in chemisorption specific sites are involved and that therefore definite potential barriers to lateral motion should be present. The adsorption should therefore obey the statistical thermodynamics of a localized state. On the other hand, the kinetics of adsorption and of catalytic processes will depend greatly on the frequency and nature of such surface jumps as do occur. A film can be fairly mobile in this kinetic sense and yet not be expected to show any significant deviation from the configurational entropy of a localized state. [Pg.709]

For those who are familiar with the statistical mechanical interpretation of entropy, which asserts that at 0 K substances are normally restricted to a single quantum state, and hence have zero entropy, it should be pointed out that the conventional thermodynamic zero of entropy is not quite that, since most elements and compounds are mixtures of isotopic species that in principle should separate at 0 K, but of course do not. The thermodynamic entropies reported in tables ignore the entropy of isotopic mixing, and in some cases ignore other complications as well, e.g. ortho- and para-hydrogen. [Pg.371]

The principle of the unattainability of absolute zero in no way limits one's ingenuity in trying to obtain lower and lower thermodynamic temperatures. The third law, in its statistical interpretation, essentially asserts that the ground quantum level of a system is ultimately non-degenerate, that some energy difference Δε must exist between states, so that at equilibrium at 0 K the system is certainly in that non-degenerate ground state with zero entropy. However, Δε may be very small, and temperatures of the order of Δε/k (where k is the Boltzmann constant, the gas constant per molecule) may be obtainable. [Pg.373]

As we have seen, the third law of thermodynamics is closely tied to a statistical view of entropy. It is hard to discuss its implications from the exclusively macroscopic view of classical thermodynamics, but the problems become almost trivial when the molecular view of statistical thermodynamics is introduced. Guggenheim (1949) has noted that the usefulness of a molecular view is not unique to the situation of substances at low temperatures, that there are other limiting situations where molecular ideas are helpful in interpreting general experimental results... [Pg.374]

The entropy of mixing of very similar substances, i.e. the ideal solution law, can be derived from the simplest of statistical considerations. It too is a limiting law, of which the most nearly perfect example is the entropy of mixing of two isotopic species. [Pg.374]

By the standard methods of statistical thermodynamics it is possible to derive for certain entropy changes general formulas that cannot be derived from the zeroth, first, and second laws of classical thermodynamics. In particular one can obtain formulas for entropy changes in highly disperse systems, for those in very cold systems, and for those associated with the mixing of very similar substances. [Pg.374]

Nearly ten years ago, Tsallis proposed a possible generalization of Gibbs-Boltzmann statistical mechanics. [1] He built his intriguing theory on a reexpression of the Gibbs-Shannon entropy S = -k ∫ p(r) ln p(r) dr written... [Pg.197]

When q = 1 the extensivity of the entropy can be used to derive the Boltzmann entropy equation S = k ln W in the microcanonical ensemble. When q ≠ 1, it is the odd property that the generalization of the entropy Sq is not extensive that leads to the peculiar form of the probability distribution. The non-extensivity of Sq has led to speculation that Tsallis statistics may be applicable to gravitational systems where interaction length scales comparable to the system size violate the assumptions underlying Gibbs-Boltzmann statistics. [4]... [Pg.199]
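
A small numerical sketch of the two entropies (Python, with k = 1; the distributions are arbitrary examples): the Tsallis form Sq = (1 - sum p_i^q)/(q - 1) reduces to the Gibbs-Shannon sum as q approaches 1, and for independent subsystems it is pseudo-additive rather than extensive.

```python
import numpy as np

def shannon(p):
    p = np.asarray(p); p = p[p > 0]
    return -(p * np.log(p)).sum()                         # Gibbs-Shannon entropy, k = 1

def tsallis(p, q):
    return (1.0 - (np.asarray(p)**q).sum()) / (q - 1.0)   # S_q, k = 1, q != 1

p = np.array([0.5, 0.3, 0.2])
print(shannon(p))
for q in (1.1, 1.01, 1.001):                              # S_q -> S as q -> 1
    print(q, tsallis(p, q))

# Non-extensivity: for independent subsystems A and B (with k = 1),
# S_q(A+B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B).
pa, pb = np.array([0.6, 0.4]), np.array([0.7, 0.3])
pab = np.outer(pa, pb).ravel()            # joint distribution of independent A and B
q = 1.5
lhs = tsallis(pab, q)
rhs = tsallis(pa, q) + tsallis(pb, q) + (1 - q) * tsallis(pa, q) * tsallis(pb, q)
print(lhs, rhs)                           # equal: S_q is not additive
```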

To reiterate a point that we made earlier, these problems of accurately calculating the free energy and entropy do not arise for isolated molecules that have a small number of well-characterised minima which can all be enumerated. The partition function for such systems can be obtained by standard statistical mechanical methods involving a summation over the minimum energy states, taking care to include contributions from internal vibrational motion. [Pg.329]
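
A sketch of such a summation, assuming harmonic vibrations about each minimum. The two minima, their relative energies, and their frequencies are invented for illustration.

```python
import numpy as np

k_B = 1.380649e-23; h = 6.62607015e-34   # J/K, J s

def q_vib(freqs_Hz, T):
    """Quantum harmonic-oscillator vibrational partition function for one minimum
    (energy zero taken at that minimum's zero-point level)."""
    x = h * np.asarray(freqs_Hz) / (k_B * T)
    return np.prod(1.0 / (1.0 - np.exp(-x)))

def partition_over_minima(minima, T):
    """Z = sum over minima of exp(-E_min/kT) * q_vib(minimum)."""
    return sum(np.exp(-E / (k_B * T)) * q_vib(nus, T) for E, nus in minima)

# two hypothetical minima: (energy in J relative to the global minimum, frequencies in Hz)
T = 300.0
minima = [(0.0,     [3.0e13, 5.0e13, 9.0e13]),
          (4.0e-21, [2.5e13, 4.5e13, 8.0e13])]   # higher but "looser" minimum
Z = partition_over_minima(minima, T)
pops = [np.exp(-E / (k_B * T)) * q_vib(nus, T) / Z for E, nus in minima]
print(Z, pops)   # partition function and equilibrium conformer populations
```

The free energy of the molecule then follows in the usual way as A = -kT ln Z.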

The Boltzmann distribution is fundamental to statistical mechanics. The Boltzmann distribution is derived by maximising the entropy of the system (in accordance with the second law of thermodynamics) subject to the constraints on the system. Let us consider a system containing N particles (atoms or molecules) such that the energy levels of the... [Pg.361]
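
The excerpt is truncated, but the maximization it describes is easy to check numerically: among all distributions with the same mean energy, the Boltzmann distribution has the largest entropy. A sketch with arbitrary evenly spaced levels (k = 1):

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

E = np.arange(6.0)                          # six evenly spaced levels, arbitrary units
beta = 1.0
p_b = np.exp(-beta * E); p_b /= p_b.sum()   # Boltzmann distribution

# Perturb along (1, -2, 1, 0, 0, 0): the components sum to zero (normalisation
# preserved) and, for evenly spaced levels, so does the mean energy.
d = np.array([1.0, -2.0, 1.0, 0.0, 0.0, 0.0])
p_alt = p_b + 0.02 * d
assert abs(p_alt.sum() - 1.0) < 1e-12
assert abs((p_alt * E).sum() - (p_b * E).sum()) < 1e-12

print(entropy(p_b), entropy(p_alt))         # the Boltzmann value is the larger one
```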

It is not particularly difficult to introduce thermodynamic concepts into a discussion of elasticity. We shall not explore all of the implications of this development, but shall proceed only to the point of establishing the connection between elasticity and entropy. Then we shall go from phenomenological thermodynamics to statistical thermodynamics in pursuit of a molecular model to describe the elastic response of cross-linked networks. [Pg.138]

A great many liquids have entropies of vaporization at the normal boiling point in the vicinity of this value (see benzene above), a generalization known as Trouton's rule. Our interest is clearly not in evaporation, but in the elongation of elastomers. In the next section we shall apply Eq. (3.21) to the stretching process for a statistical—and therefore molecular—picture of elasticity. [Pg.144]
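
A quick arithmetic check of Trouton's rule, dS_vap = dH_vap/T_b in the region of 85-88 J/(mol K). The heats of vaporization below are approximate handbook values, quoted as illustrations; water is included as the usual hydrogen-bonded exception.

```python
# (Delta_H_vap at the normal boiling point in kJ/mol, T_b in K) -- approximate values
data = {
    "benzene":  (30.7, 353.2),
    "CCl4":     (29.8, 349.9),
    "n-hexane": (28.9, 341.9),
    "water":    (40.7, 373.2),   # H-bonded liquid: well above the Trouton value
}
for name, (dH, Tb) in data.items():
    print(f"{name:9s} dS_vap = {1000.0 * dH / Tb:5.1f} J/(mol K)")
```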

By combining random flight statistics from Chap. 1 with the statistical definition of entropy from the last section, we shall be able to develop a molecular model for the stress-strain relationship in a cross-linked network. It turns out to be more convenient to work with the ratio of stretched to unstretched lengths L/L0 than with γ itself. Note the relationship between these variables... [Pg.145]
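
For orientation, the molecular model this passage leads up to gives, for an ideal (purely entropic, affine Gaussian) network, a nominal stress of the form sigma = n k T (lambda - 1/lambda**2) with lambda = L/L0 and n the number density of network chains. A sketch with an assumed chain density; the numbers are illustrative, not from the source.

```python
import numpy as np

k_B = 1.380649e-23   # J/K

def ideal_network_stress(lam, n_chains_per_m3, T):
    """Nominal stress of an ideal affine Gaussian network:
    sigma = n k T (lambda - lambda**-2), lambda = L/L0."""
    lam = np.asarray(lam)
    return n_chains_per_m3 * k_B * T * (lam - lam**-2)

lam = np.linspace(1.0, 3.0, 5)
print(ideal_network_stress(lam, n_chains_per_m3=1.0e26, T=300.0))  # Pa
# n k T is ~0.4 MPa here, a typical order of magnitude for a rubber modulus
```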

In Chap. 8 we discuss the thermodynamics of polymer solutions, specifically with respect to phase separation and osmotic pressure. We shall devote considerable attention to statistical models to describe both the entropy and the enthalpy of mixtures. Of particular interest is the idea that the thermodynamic... [Pg.495]

A second way of dealing with the relationship between a1 and the experimental concentration requires the use of a statistical model. We assume that the system consists of N1 molecules of type 1 and N2 molecules of type 2. In addition, it is assumed that the molecules, while distinguishable, are identical to one another in size and interaction energy. That is, we can replace a molecule of type 1 in the mixture by one of type 2 and both ΔV and ΔH are zero for the process. Now we consider the placement of these molecules in the N1 + N2 = N sites of a three-dimensional lattice. The total number of arrangements of the N molecules is given by N!, but since interchanging any of the 1s or 2s makes no difference, we divide by the number of ways of doing the latter—N1! and N2!, respectively—to obtain the total number of different ways the system can come about. This is called the thermodynamic probability Ω of the system, and we saw in Sec. 3.3 that Ω is the basis for the statistical calculation of entropy. For this specific model... [Pg.511]
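
The counting just described gives Ω = N!/(N1! N2!) and hence S = k ln Ω. The sketch below (with k = 1) evaluates this exactly via the log-gamma function and shows it approaching the Stirling-approximation (ideal-mixing) form -(N1 ln x1 + N2 ln x2) as the system grows.

```python
from math import lgamma, log

def ln_factorial(n):
    return lgamma(n + 1)               # ln(n!) without overflow

def mixing_entropy_exact(N1, N2):
    """S/k = ln Omega, with Omega = N!/(N1! N2!)."""
    return ln_factorial(N1 + N2) - ln_factorial(N1) - ln_factorial(N2)

def mixing_entropy_stirling(N1, N2):
    """Stirling limit: S/k = -(N1 ln x1 + N2 ln x2), the ideal-mixing law."""
    N = N1 + N2
    return -(N1 * log(N1 / N) + N2 * log(N2 / N))

for N1, N2 in [(5, 5), (50, 50), (5000, 5000)]:
    print(N1 + N2, mixing_entropy_exact(N1, N2), mixing_entropy_stirling(N1, N2))
```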

The thermal conductivity of solid iodine between 24.4 and 42.9°C has been found to remain practically constant at 0.004581 J/(cm·s·K) (33). Using the heat capacity data, the standard entropy of solid iodine at 25°C has been evaluated as 116.81 J/(mol·K), and that of gaseous iodine at 25°C as 62.25 cal/(mol·K) (about 260.5 J/(mol·K)), which compares satisfactorily with the value of 61.81 cal/(mol·K) calculated by statistical mechanics (34,35). [Pg.359]
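
The statistical-mechanical value quoted can be reproduced from the standard ideal-gas formulas (Sackur-Tetrode translation, rigid rotor, harmonic oscillator). The sketch below uses approximate spectroscopic constants for I2 (B of about 0.0374 cm^-1, vibrational wavenumber of about 214.5 cm^-1); these inputs are assumptions, not taken from the source.

```python
import numpy as np

k = 1.380649e-23; h = 6.62607015e-34; N_A = 6.02214076e23
c = 2.99792458e10          # speed of light in cm/s, to match cm^-1 constants
R = k * N_A

def S_diatomic_ideal_gas(M_kg_mol, B_cm, nu_cm, sigma, T=298.15, p=1.0e5):
    """Standard entropy of an ideal diatomic gas, rigid-rotor/harmonic-oscillator."""
    m = M_kg_mol / N_A
    # translation: Sackur-Tetrode
    S_tr = R * (np.log((2*np.pi*m*k*T)**1.5 / h**3 * (k*T / p)) + 2.5)
    # rotation: rigid rotor, high-temperature limit, symmetry number sigma
    S_rot = R * (np.log(k*T / (sigma * h * c * B_cm)) + 1.0)
    # vibration: harmonic oscillator
    x = h * c * nu_cm / (k * T)
    S_vib = R * (x / np.expm1(x) - np.log(1.0 - np.exp(-x)))
    return S_tr + S_rot + S_vib

print(S_diatomic_ideal_gas(253.809e-3, 0.03737, 214.5, sigma=2))
# about 260 J/(mol K), i.e. about 62 cal/(mol K), matching the quoted comparison
```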

More fundamental treatments of polymer solubility go back to the lattice theory developed independently and almost simultaneously by Flory (13) and Huggins (14) in 1942. By imagining the solvent molecules and polymer chain segments to be distributed on a lattice, they statistically evaluated the entropy of solution. The enthalpy of solution was characterized by the Flory-Huggins interaction parameter χ, which is related to solubility parameters by equation 5. For high molecular weight polymers in monomeric solvents, the Flory-Huggins solubility criterion is χ ≤ 0.5. [Pg.435]
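
A sketch of the resulting Flory-Huggins free energy of mixing per lattice site (in units of kT) for a solvent plus an N-segment polymer, illustrating what happens on crossing the ≈0.5 criterion; the chain length and χ values below are arbitrary.

```python
import numpy as np

def fh_g_mix(phi2, N, chi):
    """Flory-Huggins free energy of mixing per lattice site, in units of kT:
    g = phi1 ln phi1 + (phi2/N) ln phi2 + chi phi1 phi2."""
    phi1 = 1.0 - phi2
    return phi1 * np.log(phi1) + (phi2 / N) * np.log(phi2) + chi * phi1 * phi2

phi2 = np.linspace(0.01, 0.99, 9)
for chi in (0.3, 0.6):        # below vs above the ~0.5 miscibility criterion
    print(chi, np.round(fh_g_mix(phi2, N=1000, chi=chi), 4))
# for chi above ~0.5 (more precisely chi_c = (1 + 1/sqrt(N))**2 / 2) the curve
# loses convexity and the mixture phase-separates
```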

Various equations of state have been developed to treat association in supercritical fluids. Two of the most often used are the statistical associating fluid theory (SAFT) (60,61) and the lattice fluid hydrogen bonding model (LFHB) (62). These models include parameters that describe the enthalpy and entropy of association. The most detailed description of association in supercritical water has been obtained using molecular dynamics and Monte Carlo computer simulations (63), but this requires much larger amounts of computer time (64-66). [Pg.225]

Most of the assumptions are based on idealized models, indicating the limitations of the mathematical methods employed and the quantity and type of experimental data available. For example, the details of the combinatorial entropy of a binary mixture may be well understood, but modeling requires, in large measure, uniformity so the statistical relationships can be determined. This uniformity is manifested in mixing rules and a minimum number of adjustable parameters so as to avoid problems related to the mathematics, e.g., local minima and multiple solutions. [Pg.252]









Entropy statistical calculation

Entropy statistical definition

Entropy statistical description

Entropy statistical interpretation

Entropy statistical mechanical

Entropy statistical mechanics

Entropy statistical thermodynamics definition

Nonequilibrium statistical mechanics entropy production

Statistical Assemblies and the Entropy

Statistical Treatment of Entropy

Statistical analogues of the entropy and Helmholtz free energy

Statistical and Classical Entropy

Statistical definition of entropy

Statistical interpretation, of entropy

Statistical model residual entropy

Statistical thermodynamics Gibbs entropy function

Statistical thermodynamics entropy

Temperature and Entropy in Quantum Statistics

The Statistical Definition of Entropy

The Statistical Interpretation of Entropy

The relation between thermodynamic and statistical entropy

The statistical mechanical interpretation of entropy
