Entropy (energy dispersal)

In thermodynamics, the interpretation of entropy as a measure of energy dispersal has been developed against the background of the traditional view, introduced by Ludwig Boltzmann, of entropy as a quantitative measure of disorder. The energy dispersal approach avoids the ambiguous term 'disorder'. An early advocate of the energy dispersal conception was Edward A. Guggenheim, who in 1949 used the word 'spread'.[1][2]

In this alternative approach, entropy is a measure of energy dispersal or spread at a specific temperature. Changes in entropy can be quantitatively related to the distribution or the spreading out of the energy of a thermodynamic system, divided by its temperature.
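
In classical thermodynamic terms, this quantitative relation is the standard definition of an entropy change: for a reversible transfer of a small quantity of heat δq_rev to a system at absolute temperature T,

```latex
dS = \frac{\delta q_{\mathrm{rev}}}{T},
\qquad
\Delta S = \int \frac{\delta q_{\mathrm{rev}}}{T}
```

so an entropy change directly measures dispersed energy per unit of the temperature at which it spreads.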

Some educators propose that the energy dispersal idea is easier to understand than the traditional approach. The concept has been used to facilitate teaching entropy to students beginning university chemistry and biology.

Comparisons with traditional approach

The term "entropy" has been in use from early in the history of classical thermodynamics, and with the development of statistical thermodynamics and quantum theory, entropy changes have been described in terms of the mixing or "spreading" of the total energy of each constituent of a system over its particular quantized energy levels.

Such descriptions have tended to be used together with commonly used terms such as disorder and randomness, which are ambiguous[3][4][5] and whose everyday meaning is the opposite of what they are intended to mean in thermodynamics. Not only does this situation cause confusion, it also hampers the teaching of thermodynamics. Students were being asked to grasp meanings that directly contradict their normal usage, with equilibrium equated to "perfect internal disorder" and the mixing of milk into coffee, a change from apparent chaos to uniformity, described as a transition from an ordered state to a disordered state.

The description of entropy as the amount of "mixedupness" or "disorder", as well as the abstract nature of the statistical mechanics grounding this notion, can lead to confusion and considerable difficulty for those beginning the subject.[6][7] Even though courses emphasised microstates and energy levels, most students could not get beyond simplistic notions of randomness or disorder. Many of those who learned by practising calculations did not understand well the intrinsic meanings of the equations, and qualitative explanations of the thermodynamic relationships were needed.[8][9]

Arieh Ben-Naim recommends abandonment of the word entropy, rejecting both the 'dispersal' and the 'disorder' interpretations; instead he proposes the notion of "missing information" about microstates as considered in statistical mechanics, which he regards as commonsensical.[10]


Increase of entropy in a thermodynamic process can be described in terms of "energy dispersal" and the "spreading of energy," while avoiding mention of "disorder" except when explaining misconceptions. All explanations of where and how energy is dispersing or spreading have been recast in terms of energy dispersal, so as to emphasise the underlying qualitative meaning.[6]

In this approach, the second law of thermodynamics is introduced as "Energy spontaneously disperses from being localized to becoming spread out if it is not hindered from doing so," often in the context of common experiences such as a rock falling, a hot frying pan cooling down, iron rusting, air leaving a punctured tyre and ice melting in a warm room. Entropy is then depicted as a sophisticated kind of "before and after" yardstick — measuring how much energy is spread out over time as a result of a process such as heating a system, or how widely spread out the energy is after something happens in comparison with its previous state, in a process such as gas expansion or fluids mixing (at a constant temperature). The equations are explored with reference to the common experiences, with emphasis that in chemistry the energy that entropy measures as dispersing is the internal energy of molecules.
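
As a worked illustration of the "yardstick" reading, consider the ice melting in a warm room. Using the standard molar enthalpy of fusion of ice (about 6.01 kJ/mol at 273.15 K), the energy spread into the ice per unit temperature is

```latex
\Delta S_{\mathrm{fus}}
  = \frac{\Delta H_{\mathrm{fus}}}{T_{\mathrm{fus}}}
  = \frac{6010\ \mathrm{J\,mol^{-1}}}{273.15\ \mathrm{K}}
  \approx 22.0\ \mathrm{J\,mol^{-1}\,K^{-1}}
```

a positive entropy change that reflects energy dispersing from the warm room into the ice.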

The statistical interpretation is related to quantum mechanics in describing the way that energy is distributed (quantized) amongst molecules on specific energy levels, with all the energy of the macrostate always in only one microstate at one instant. Entropy is described as measuring the energy dispersal for a system by the number of accessible microstates, the number of different arrangements of all its energy at the next instant. Thus, an increase in entropy means a greater number of microstates for the final state than for the initial state, and hence more possible arrangements of a system's total energy at any one instant. Here, the greater 'dispersal of the total energy of a system' means the existence of many possibilities.[11]
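
As an illustrative sketch (not taken from the sources cited here), the relation between accessible microstates and entropy, S = k_B ln W, can be put into code; the microstate counts below are arbitrary assumed values, chosen only to show that a larger W means a larger entropy:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI definition)

def boltzmann_entropy(w: float) -> float:
    """Entropy from the number of accessible microstates, S = k_B ln W."""
    return K_B * math.log(w)

# Arbitrary illustrative microstate counts for an initial and a final state.
w_initial = 1e20
w_final = 1e25  # more ways to arrange the same total energy

delta_s = boltzmann_entropy(w_final) - boltzmann_entropy(w_initial)

# Entropy increases because the final state has more accessible microstates,
# i.e. the system's energy is "more dispersed" over possible arrangements.
print(delta_s > 0)
```

The absolute numbers are tiny because k_B is tiny; what matters for the dispersal picture is only that ΔS grows with the ratio of microstate counts.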

Continuous movement and molecular collisions visualised as being like bouncing balls blown by air as used in a lottery can then lead on to showing the possibilities of many Boltzmann distributions and continually changing "distribution of the instant", and on to the idea that when the system changes, dynamic molecules will have a greater number of accessible microstates. In this approach, all everyday spontaneous physical happenings and chemical reactions are depicted as involving some type of energy flows from being localized or concentrated to becoming spread out to a larger space, always to a state with a greater number of microstates.[12]
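
A minimal numerical sketch of this picture, under assumed values: the Boltzmann distribution p_i ∝ exp(−E_i / k_B T) gives the most probable populations of the quantized levels, and at a higher temperature the population spreads over more of them. The level spacing and the two temperatures below are arbitrary choices for illustration:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def level_populations(energies_j: list[float], temp_k: float) -> list[float]:
    """Boltzmann populations p_i proportional to exp(-E_i / k_B T), normalized."""
    factors = [math.exp(-e / (K_B * temp_k)) for e in energies_j]
    total = sum(factors)
    return [f / total for f in factors]

# Arbitrary evenly spaced energy levels: 0, 1, 2, 3 units of 1e-21 J.
levels = [i * 1e-21 for i in range(4)]

cold = level_populations(levels, 50.0)
hot = level_populations(levels, 500.0)

# At the higher temperature the ground level holds a smaller fraction of the
# molecules: the same total population is spread over more energy levels.
print(cold[0] > hot[0])
```

Repeatedly recomputing such populations as conditions change is one way to picture the continually changing "distribution of the instant".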

This approach provides a good basis for understanding the conventional approach, except in very complex cases where the qualitative relation of energy dispersal to entropy change can be so inextricably obscured that it is moot.[12] Thus, in situations such as the entropy of mixing, where two or more different substances at the same temperature and pressure are mixed so that there is no net exchange of heat or work, the entropy increase is due to the literal spreading out of the motional energy of each substance in the larger combined final volume. Each component's energetic molecules become more separated from one another than in the pure state, where they collided only with identical adjacent molecules, and this leads to an increase in each substance's number of accessible microstates.[13]
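
The entropy of mixing described above can be sketched numerically with the standard ideal-solution formula ΔS_mix = −nR Σ x_i ln x_i; the amounts mixed below are illustrative assumptions:

```python
import math

R = 8.314462618  # molar gas constant, J/(mol·K)

def mixing_entropy(moles: list[float]) -> float:
    """Ideal entropy of mixing, delta-S = -n R sum(x_i ln x_i), in J/K.

    Positive for any mixture, because every mole fraction x_i < 1.
    """
    n_total = sum(moles)
    return -n_total * R * sum(
        (n / n_total) * math.log(n / n_total) for n in moles
    )

# Illustrative: mixing 1 mol each of two ideal gases at the same T and P.
delta_s = mixing_entropy([1.0, 1.0])
print(round(delta_s, 2))  # 2 R ln 2, about 11.53 J/K
```

For one mole each of two gases the result is 2R ln 2, independent of which gases are mixed, consistent with the dispersal picture: only the larger volume accessible to each component's energy matters.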

Current adoption

Variants of the energy dispersal approach have been adopted in a number of undergraduate chemistry texts, mainly in the United States. One respected text states:

The concept of the number of microstates makes quantitative the ill-defined qualitative concepts of 'disorder' and the 'dispersal' of matter and energy that are used widely to introduce the concept of entropy: a more 'disorderly' distribution of energy and matter corresponds to a greater number of micro-states associated with the same total energy. — Atkins & de Paula (2006)[14]:81


History

The concept of 'dissipation of energy' was used in Lord Kelvin's 1852 article "On a Universal Tendency in Nature to the Dissipation of Mechanical Energy."[15] He distinguished between two types or "stores" of mechanical energy: "statical" and "dynamical." He discussed how these two types of energy can change from one form to the other during a thermodynamic transformation. When heat is created by any irreversible process (such as friction), or when heat is diffused by conduction, mechanical energy is dissipated, and it is impossible to restore the initial state.[16][17]

Using the word 'spread', an early advocate of the energy dispersal concept was Edward Armand Guggenheim.[1][2] In the mid-1950s, with the development of quantum theory, researchers began speaking about entropy changes in terms of the mixing or "spreading" of the total energy of each constituent of a system over its particular quantized energy levels, such as by the reactants and products of a chemical reaction.[18]

In 1984, the Oxford physical chemist Peter Atkins, in his book The Second Law, written for a lay audience, presented a nonmathematical interpretation of what he called the "infinitely incomprehensible entropy", describing the second law of thermodynamics as "energy tends to disperse". His analogies included an imaginary intelligent being called "Boltzmann's Demon", who runs around reorganizing and dispersing energy, in order to show how the W in Boltzmann's entropy formula relates to energy dispersal. This dispersal is transmitted via atomic vibrations and collisions. Atkins wrote: "each atom carries kinetic energy, and the spreading of the atoms spreads the energy…the Boltzmann equation therefore captures the aspect of dispersal: the dispersal of the entities that are carrying the energy."[19]:78,79

In 1997, John Wrigglesworth described spatial particle distributions as represented by distributions of energy states. According to the second law of thermodynamics, isolated systems tend to redistribute their energy into a more probable arrangement, a maximum-probability energy distribution, i.e. from being concentrated to being spread out. By virtue of the first law of thermodynamics, the total energy does not change; instead, the energy tends to disperse over the space to which it has access.[20] In his 1999 Statistical Thermodynamics, M.C. Gupta defined entropy as a function that measures how energy disperses when a system changes from one state to another.[21] Other authors defining entropy in a way that embodies energy dispersal are Cecie Starr[22] and Andrew Scott.[23]

In a 1996 article, the physicist Harvey S. Leff set out what he called "the spreading and sharing of energy."[24] Another physicist, Daniel F. Styer, published an article in 2000 showing that "entropy as disorder" was inadequate.[25] In an article published in the Journal of Chemical Education in 2002, Frank L. Lambert argued that portraying entropy as "disorder" is confusing and should be abandoned. He has gone on to develop detailed resources for chemistry instructors, describing entropy increase as the spontaneous dispersal of energy: how much energy is spread out in a process, or how widely dispersed it becomes, at a specific temperature.[6][26]

References

  1. Dugdale, J.S. (1996). Entropy and its Physical Meaning, Taylor & Francis, London, ISBN 0748405682, Dugdale cites only Guggenheim, on page 101.
  2. Guggenheim, E.A. (1949), Statistical basis of thermodynamics, Research: A Journal of Science and its Applications, 2, Butterworths, London, pp. 450–454.
  3. Denbigh K. (1981). The Principles of Chemical Equilibrium: With Applications in Chemistry and Chemical Engineering. London: Cambridge University Press. pp. 55–56.
  4. Jaynes, E.T. (1989). Clearing up mysteries — the original goal, in Maximum Entropy and Bayesian Methods , J. Skilling, Editor, Kluwer Academic Publishers, Dordrecht, pp. 1–27, page 24.
  5. Grandy, Walter T., Jr. (2008). Entropy and the Time Evolution of Macroscopic Systems. Oxford University Press. pp. 55–58. ISBN 978-0-19-954617-6.
  5. Frank L. Lambert, 2002, "Disorder – A Cracked Crutch for Supporting Entropy Discussions," Journal of Chemical Education 79: 187. Updated version archived April 24, 2014, at the Wayback Machine.
  7. Frank L. Lambert, "The Second Law of Thermodynamics (6)."
  8. Carson, E. M., and Watson, J. R., (Department of Educational and Professional Studies, Kings College, London), 2002, "Undergraduate students' understandings of entropy and Gibbs Free energy," University Chemistry Education - 2002 Papers, Royal Society of Chemistry.
  9. Sozbilir, Mustafa, PhD studies: Turkey, A Study of Undergraduates' Understandings of Key Chemical Ideas in Thermodynamics, Ph.D. Thesis, Department of Educational Studies, The University of York, 2001.
  10. Review of Arieh Ben-Naim, Entropy and the Second Law: Interpretation and Misss-Interpretationsss, in Chemistry World.
  11. Frank L. Lambert, The Molecular Basis for Understanding Simple Entropy Change
  12. Frank L. Lambert, Entropy is simple, qualitatively
  13. Frank L. Lambert, Notes for a “Conversation About Entropy”: a brief discussion of both thermodynamic and "configurational" ("positional") entropy in chemistry.
  14. Atkins, Peter; de Paula, Julio (2006). Physical Chemistry (8th ed.). Oxford University Press. ISBN 0-19-870072-5.
  15. Jensen, William. (2004). "Entropy and Constraint of Motion." Journal of Chemical Education (81) 693, May
  16. Thomson, William (1852). "On a Universal Tendency in Nature to the Dissipation of Mechanical Energy." Proceedings of the Royal Society of Edinburgh, April 19.
  17. Thomson, William (1874). "Kinetic Theory of the Dissipation of Energy", Nature IX: 441-44. (April 9).
  18. Denbigh, Kenneth (1981). The Principles of Chemical Equilibrium, 4th Ed. Cambridge University Press. ISBN 0-521-28150-4.
  19. Atkins, Peter (1984). The Second Law. Scientific American Library. ISBN 0-7167-5004-X.
  20. Wrigglesworth, John (1997). Energy and Life (Modules in Life Sciences). CRC. ISBN 0-7484-0433-3. (see excerpt)
  21. Gupta, M.C. (1999). Statistical Thermodynamics. New Age Publishers. ISBN 81-224-1066-9. (see excerpt)
  22. Starr, Cecie; Taggart, R. (1992). Biology - the Unity and Diversity of Life. Wadsworth Publishing Co. ISBN 0-534-16566-4.
  23. Scott, Andrew (2001). 101 Key ideas in Chemistry. Teach Yourself Books. ISBN 0-07-139665-9.
  24. Leff, H. S., 1996, "Thermodynamic entropy: The spreading and sharing of energy," Am. J. Phys. 64: 1261-71.
  25. Styer, D. F., 2000, "Insight into entropy," Am. J. Phys. 68: 1090-96.
  26. "A Student's Approach to the Second Law and Entropy". 2009-07-17. Archived from the original on July 17, 2009. Retrieved 2014-12-12.

Further reading

Texts using the energy dispersal approach

  • Atkins, P. W., Physical Chemistry for the Life Sciences. Oxford University Press, ISBN 0-19-928095-9; W. H. Freeman, ISBN 0-7167-8628-1
  • Benjamin Gal-Or, "Cosmology, Physics and Philosophy", Springer-Verlag, New York, 1981, 1983, 1987 ISBN 0-387-90581-2
  • Bell, J., et al., 2005. Chemistry: A General Chemistry Project of the American Chemical Society, 1st ed. W. H. Freeman, 820pp, ISBN 0-7167-3126-6
  • Brady, J.E., and F. Senese, 2004. Chemistry, Matter and Its Changes, 4th ed. John Wiley, 1256pp, ISBN 0-471-21517-1
  • Brown, T. L., H. E. LeMay, and B. E. Bursten, 2006. Chemistry: The Central Science, 10th ed. Prentice Hall, 1248pp, ISBN 0-13-109686-9
  • Ebbing, D.D., and S. D. Gammon, 2005. General Chemistry, 8th ed. Houghton-Mifflin, 1200pp, ISBN 0-618-39941-0
  • Ebbing, Gammon, and Ragsdale. Essentials of General Chemistry, 2nd ed.
  • Hill, Petrucci, McCreary and Perry. General Chemistry, 4th ed.
  • Kotz, Treichel, and Weaver. Chemistry and Chemical Reactivity, 6th ed.
  • Moog, Spencer, and Farrell. Thermodynamics, A Guided Inquiry.
  • Moore, J. W., C. L. Stanitski, P. C. Jurs, 2005. Chemistry, The Molecular Science, 2nd ed. Thomson Learning. 1248pp, ISBN 0-534-42201-2
  • Olmsted and Williams, Chemistry, 4th ed.
  • Petrucci, Harwood, and Herring. General Chemistry, 9th ed.
  • Silberberg, M.S., 2006. Chemistry, The Molecular Nature of Matter and Change, 4th ed. McGraw-Hill, 1183pp, ISBN 0-07-255820-2
  • Suchocki, J., 2004. Conceptual Chemistry 2nd ed. Benjamin Cummings, 706pp, ISBN 0-8053-3228-6
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.