
The meaning of entropy

The L function has not found much use in geochemistry to date, and we will not use it in this text. It is included here to demonstrate that thermodynamics can in fact handle truly open systems, and because it is an interesting exercise in the use of the Legendre transform. It also seems quite possible that inventive geochemists will find much greater use for this (and other thermodynamic potentials) in the future. Its derivation and use in metamorphic studies has had a somewhat troubled history, nicely reviewed and discussed by Rumble (1982). [Pg.103]

If we had to rely on classical thermodynamics, we would know little more than we have already said about entropy. It is a parameter, with a method of measurement, which increases in spontaneous processes, even when no energy changes are possible, that is, in isolated systems. We would also notice, after measuring the entropy of many substances, that the entropies of gases are relatively large, those of solids relatively small, and those of liquids somewhere in between, but we would probably not have any mental picture of what entropy represents physically. [Pg.104]

This question has been around since Clausius invented the term in 1865, and the answer takes on many forms. Some follow the historical route, from steam engines to Carnot, Clausius, Thomson, Joule, Rankine, and so on. A particularly lucid, concise account of this history is Purrington (1997). A central feature of this approach is Carnot cycles, as used by Clausius to deduce the existence of the entropy parameter. This approach is rather abstract, and requires some manipulation before its connection to thermodynamic potentials and chemical reactions becomes apparent. Others emphasize the impossibility of some processes, or the availability of energy, and some have a rather unusual viewpoint, such as Reiss (1965), who considers entropy as the "degree of constraint". [Pg.105]

Virtually since the beginning, however, a popular viewpoint has been to see entropy as a measure of disorder. Helmholtz used the word Unordnung (disorder) in 1882. This view stems from familiar relationships such as S(liquids) > S(solids), and the universally positive entropy of mixing. We used this relationship in the previous section when we spoke of the degree of "mixed-upness". However, the disorder analogy can involve a serious fallacy, as made clear by Lambert (1999; see also http://www.entropysite.com). [Pg.105]

Much of the discussion in statistical mechanics concerns probability distributions and their application to microstates. Information theory (Shannon, 1949) defines a quantity called the entropy of a probability distribution... [Pg.106]
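
As a brief illustration of the quantity the excerpt names (my own sketch, not from the excerpted text), here is a minimal Python implementation of the Shannon entropy H = -Σ p·log₂(p) of a discrete probability distribution:

    import math

    def shannon_entropy(probs):
        """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits."""
        assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
        # Terms with p = 0 contribute nothing (the limit p*log p -> 0).
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: uniform, maximal spread
    print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits: a certain outcome

The uniform distribution maximizes the entropy and a certain outcome has zero entropy, which is the sense in which this quantity parallels the thermodynamic one.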

On the other hand, Frank is involved with Cathy. All the other girls think that he's a nerd. But Cathy meets Frank after school so that they can study The History of Art in Frank's room. This is like having a small amount of heat, but it's so concentrated that it's really useful. Having a small amount of heat available at a really high temperature means the heat has only a little entropy, which is good. [Pg.244]

Liz has just reviewed the preceding analogy and has accused me of sexist stupidity. Thus, I'll offer a better example of entropy. Let's say I've got 1000 Btu worth of heat in a big pot. But the temperature of the pot is only 68°F. It's nice to have the heat, but at 68°F, I can't think of any way to use it to make money. The problem is my pot has too much entropy. [Pg.244]

Alternatively, I've got a tiny pot with 100 Btu of heat. The temperature of this tiny pot is 2000°F. I can use this heat to... [Pg.244]

All of which I can use to make money. My tiny hot pot has very little entropy. [Pg.244]

Concentrating more heat in a smaller pot makes the stuff inside the pot hotter and reduces its entropy. And because concentrated heat with low entropy is easier to do things with than dilute, colder heat with high entropy, we say that... [Pg.244]
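
To put rough numbers on the two pots (my own back-of-the-envelope sketch, not from the excerpt): if each pot is treated as a reservoir at a fixed absolute temperature, the entropy accompanying its heat is approximately ΔS = Q/T, so the big cool pot carries far more entropy than the tiny hot one:

    def fahrenheit_to_kelvin(t_f):
        """Convert °F to the absolute (Kelvin) scale required by dS = Q/T."""
        return (t_f - 32.0) * 5.0 / 9.0 + 273.15

    def reservoir_entropy(q_btu, t_f):
        """Approximate entropy carried by heat Q from a reservoir at fixed T: dS = Q/T."""
        return q_btu / fahrenheit_to_kelvin(t_f)

    big_pot  = reservoir_entropy(1000.0, 68.0)    # ~3.41 Btu/K
    tiny_pot = reservoir_entropy(100.0, 2000.0)   # ~0.07 Btu/K
    print(f"big cool pot: {big_pot:.2f} Btu/K")
    print(f"tiny hot pot: {tiny_pot:.2f} Btu/K")

The tiny hot pot's heat is "low entropy" in exactly the sense the excerpt describes: each Btu arrives with far less entropy, leaving more of it available to do useful work.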


In 1877, the Austrian physicist Ludwig Boltzmann proposed a molecular definition of entropy that enables us to calculate the absolute entropy at any temperature (Fig. 7.6). His formula provided a way of calculating the entropy when measurements could not be made and deepened our insight into the meaning of entropy at the molecular level. The Boltzmann formula for the entropy is S = k ln W, where k is Boltzmann's constant and W is the number of microstates available to the system. [Pg.397]
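
A minimal numerical sketch of the formula (my illustration, not from the text): for a single two-state unit W = 2, so each unit contributes S = k ln 2, and a mole of independent units contributes N·k ln 2 ≈ 5.76 J/K:

    import math

    K_B = 1.380649e-23        # Boltzmann's constant, J/K (exact SI value)
    N_AVOGADRO = 6.02214076e23

    def boltzmann_entropy(n_microstates):
        """Boltzmann's formula S = k ln W for W equally likely microstates."""
        return K_B * math.log(n_microstates)

    s_unit = boltzmann_entropy(2)        # one two-state unit: k ln 2
    print(N_AVOGADRO * s_unit)           # ~5.76 J/K for a mole of such units

Working per unit avoids evaluating the astronomically large W = 2^N directly, since ln(2^N) = N ln 2.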

Understand the meaning of entropy (S) in terms of the number of microstates over which a system's energy is dispersed; describe how the second law provides the criterion for spontaneity, how the third law allows us to find absolute values of standard molar entropies (S°), and how conditions and properties of substances influence S° (§20.1) (SP 20.1) (EPs 20.4-20.7, 20.10-20.23) [Pg.676]

Equations (29) and (30) demonstrate explicitly the meaning of entropy and energy elasticity, respectively. [Pg.215]

What is the meaning of the entropy of copolymerization becoming positive? [Pg.313]

As a result of the above discussion, what can be said about the meaning of entropy if our basic assumptions are correct? It is evidently a measure of the "mixed-up-ness" of a system, a phrase used by Gibbs. It can also be said that high-entropy states are those which have a high probability. The quantity Ω is sometimes referred to as the "thermodynamic probability", because the ratio for the... [Pg.55]
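
To illustrate why high-entropy ("mixed-up") states are the high-probability ones (a hedged sketch of my own, not from the excerpt), count the microstates Ω(n) = C(N, n) for the macrostate "n of N particles in the left half of a box"; the evenly mixed macrostate dwarfs the ordered ones:

    from math import comb

    N = 100  # particles distributed between the two halves of a box
    total = 2 ** N  # each particle independently left or right

    for n in (0, 25, 50):
        omega = comb(N, n)  # microstates of the macrostate "n particles on the left"
        print(f"n_left={n:3d}: Omega = {omega:.3e}, probability = {omega / total:.3e}")

    # The evenly mixed macrostate (n=50, Omega ~ 1.009e29) is about 1e29 times more
    # probable than the fully ordered one (n=0, Omega = 1): the state of maximal
    # mixed-up-ness maximizes Omega, and hence S = k ln(Omega).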
