My Misconception of Entropy

Chemistry Asked by dval98 on December 31, 2020

It was recently brought to my attention that my understanding of entropy is wonky at best.

In my experience, entropy was introduced (superficially at best) during general chemistry/foundations of inorganic chem and described in terms of order/disorder of a system, usually followed by some messy room analogy. In one of my textbooks, the author, Gary Wulfsberg, was discussing how a reaction tends to favor the products "if it increases the dispersal of ions/molecules over a larger volume of space…" (Wulfsberg, 2018). Wulfsberg goes on to say that

The measure of this dispersal or disorder is known as the entropy (S) of the substance. A
positive entropy change for a reaction indicates increasing dispersal or disorder
(Wulfsberg, G., Foundations of Inorganic Chemistry; ch. 4, pg. 200)

which I interpreted as saying that all systems that increase in entropy have a corresponding increase in disorder.

Later in my undergraduate career, Boltzmann's entropy was introduced during thermodynamics and described by my professor as the number of microstates available to the particles within the system, with an increase in entropy corresponding to an increase in the number of states the system can occupy.

All was fine and peachy until recently, when I came across an article discussing the Shannon Measure of Information (SMI) and entropy. In it, Arieh Ben-Naim argues that the idea of order/disorder in relation to entropy is not necessarily correct; in fact, it is a fallacy that does not hold in general, nor can order/disorder be measured definitively for all systems. He states,

It is true that many spontaneous processes may be viewed as proceeding from an ordered to a disordered state. However, there are two difficulties with this interpretation. First, the concept of order is not well-defined, and in many processes it is difficult, if not impossible, to decide which of the two states of the system is more or less ordered.
J. Chem. Educ. 2011, 88 (5), 594–596

Additionally, he notes that while some systems do have "order parameters", these are not related to entropy, and that not every process with an increase in entropy shows a corresponding increase in disorder. He goes on to describe the SMI treatment of entropy "as the number of binary questions one needs to ask to find the location of the particle." Hence, if the number of yes/no questions one must ask to locate a particle increases, so has the thermodynamic entropy.
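To make the "binary questions" picture concrete, here is a minimal Python sketch (my own illustration, not from Ben-Naim's article): for a particle equally likely to be in any of N cells, the Shannon entropy in bits equals log2(N), which is exactly the number of yes/no questions a binary search over the cells requires.

```python
import math

def shannon_entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A particle equally likely to be in any of 8 cells:
n_cells = 8
uniform = [1 / n_cells] * n_cells
print(shannon_entropy_bits(uniform))  # 3.0 bits = 3 yes/no questions (binary search)

# Concentrating the probability (more "order") lowers the entropy:
peaked = [0.9] + [0.1 / 7] * 7
print(shannon_entropy_bits(peaked))   # ~0.75 bits: fewer questions needed on average
```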

So here are my questions:

Is entropy described by Boltzmann/statistical mechanics the same as the entropy described by Shannon information theory?

Additionally, is there any validity in relating order/disorder in describing the change in entropy of chemical systems?

One Answer

The association of entropy with disorder is anthropocentric. "Disorder" is a concept derived from our experience of the world. The association with "disorder" is clearer once we explain what we mean by "order". The following definition among those provided by Merriam-Webster most closely fits the intended meaning:

a regular or harmonious arrangement

Since regularity, all else being equal, implies lower entropy, it is fair to associate "order" with lower entropy.

The association with "order" or regularity also syncs with Boltzmann's statistical mechanical definition of entropy, $S = k_\mathrm{B} \ln \Omega$, where $\Omega$ is the number of microstates available to the system. More possible unique microstates implies a higher entropy. Greater regularity implies more constraints on the arrangement of the system and therefore fewer possible microstates. Solids are usually more regular and therefore have lower entropy than fluid states at the same temperature; the same holds when comparing liquids and gases. The number of possible microstates increases from solid to liquid to gas.
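A toy calculation makes this concrete (my own sketch, assuming $\Omega$ counts equally probable microstates; the "solid" and "gas" counts below are invented for illustration):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega):
    """S = k_B * ln(Omega), for Omega equally probable microstates."""
    return K_B * math.log(omega)

# Toy model: 4 distinguishable particles on a lattice of 10 sites.
# A rigid "solid" pins each particle to one site (a single arrangement);
# a "gas" lets each particle occupy any of the 10 sites independently.
omega_solid = 1
omega_gas = 10 ** 4
print(boltzmann_entropy(omega_solid))  # 0.0 J/K: perfect regularity, no freedom
print(boltzmann_entropy(omega_gas))    # ~1.27e-22 J/K: more microstates, higher S
```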

This also jibes with the information-content definition. You can use less information (a more compact description) to describe an orderly (regular) system. You need more information to describe all possible arrangements of molecules in a gas or liquid than in a solid. Think of entropy as measuring the length of the recipe required to build all possible arrangements of the system.
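One rough way to see the "length of the recipe" point (compressed size is only a stand-in for information content here, not thermodynamic entropy itself): a regular arrangement compresses to a much shorter description than a randomized arrangement of the same symbols.

```python
import random
import zlib

random.seed(0)  # reproducible "disorder"

# An "ordered" configuration: one arrangement repeated throughout.
ordered = b"AB" * 500

# A "disordered" configuration: the same two symbols in random order.
disordered = bytes(random.choice(b"AB") for _ in range(1000))

print(len(zlib.compress(ordered)))     # short recipe: regularity compresses well
print(len(zlib.compress(disordered)))  # much longer recipe: more information needed
```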

Answered by Buck Thorn on December 31, 2020
