
Microscopic Origins of Entropy in Materials

In pure substances, entropy may be divided into three parts.

  1. Translational degrees of freedom (e.g., monatomic ideal gas molecules have these).

  2. Rotational degrees of freedom (e.g., non-spherical molecules in fluids have these).

  3. Vibrational degrees of freedom (e.g., non-spherical fluid molecules and solids have these).

In non-pure substances (e.g., a solution of $ A$-$ B$), another degree of freedom arises which relates to the number of ways that the system can be mixed up. Entropy tends to increase with the addition of a solute--this will be discussed when we consider solutions.

Statistical Mechanical Definition of Entropy
Extra (i.e., not on test) but Potentially Instructive Material
Ludwig Boltzmann is credited with the discovery of the microscopic definition of entropy. The definition is remarkably simple--however, it is not at all simple to show that Boltzmann's microscopic definition of entropy is the same as the one we use in continuum thermodynamics: $ dS = dq_{rev}/T$.

Boltzmann's definition relates the entropy of a system to the number of different ways that the system can store a fixed amount of energy:

$\displaystyle S(U) = k \log \Omega(U)$ (16-4)

where $ U$ is the internal energy of a system and $ \Omega$ is the number of distinguishable ways that the system can be arranged with a fixed amount of internal energy. The proportionality constant is the same one that relates kinetic energy to temperature--it is called Boltzmann's constant and is the ideal gas constant $ R$ divided by Avogadro's number.
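Boltzmann's formula can be evaluated directly once $ \Omega$ is counted. The following sketch (not from the lecture; the toy system of $N$ two-state particles and its size are assumptions for illustration) computes $ k = R/N_A$ and $ S = k \log \Omega$:

```python
import math

# Boltzmann's constant: ideal gas constant R divided by Avogadro's number
R = 8.314462618          # J/(mol K)
N_A = 6.02214076e23      # 1/mol
k = R / N_A              # ~1.380649e-23 J/K

def boltzmann_entropy(omega):
    """S = k log(Omega), with omega the number of distinguishable states."""
    return k * math.log(omega)

# Assumed toy system: N two-state particles, n of them excited.
# Omega = (N choose n) counts the distinguishable arrangements.
N, n = 100, 50
omega = math.comb(N, n)
print(f"Omega = {omega:.3e}, S = {boltzmann_entropy(omega):.3e} J/K")
```

Note that a system with a single accessible state ($ \Omega = 1$) has $ S = k \log 1 = 0$, consistent with entropy vanishing for a perfectly ordered system.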

One way to interpret equation 16-4 is that if an isolated system (i.e., fixed $ U$) can undergo some change such that the number of states increases, it will do so. The interpretation is that if all microscopic states are equally probable, then the macroscopic state comprising the largest number of microscopic states is the one most likely to be observed. This is a slight change from our continuum statement of the second law, which held that an observation of decreased universal entropy is impossible. The statistical mechanical definition says that an observation of decreased universal entropy is merely improbable.

Every course in statistical mechanics has a section that demonstrates that--for all but the very smallest systems--the probability of observing an entropy decrease is so small that it virtually never occurs.
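This collapse of probability with system size can be seen with a simple counting exercise (an illustrative sketch, not from the lecture; the coin model and the system sizes are assumptions). For $N$ fair coins, the chance of observing a strongly ordered, low-entropy outcome--say, at least 90% heads--is an exact binomial tail sum:

```python
import math

def prob_at_least(N, frac):
    """Probability that at least frac*N of N fair coins land heads."""
    m = math.ceil(frac * N)
    favorable = sum(math.comb(N, h) for h in range(m, N + 1))
    return favorable / 2**N

# The probability of an "ordered" observation shrinks catastrophically
# as the system grows (sizes chosen for illustration):
for N in (10, 100, 1000):
    print(N, prob_at_least(N, 0.9))
```

Already at $N = 100$ the probability is below $10^{-15}$; for a mole of particles it is unobservably small, which is why entropy decreases are "virtually never" seen.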

Why should there be a logarithm in equation 16-4? Remember that entropy is an extensive quantity. The number of distinguishable states of two systems $ A$ and $ B$, when those systems are considered together, is the product $ \Omega_A \Omega_B$.1 Therefore, if $ S_{total} = S_{A} + S_{B}$, then a function is needed that turns a product of state counts into a sum of entropies. The logarithm has this behavior: $ \log \Omega_{AB} = \log (\Omega_A \Omega_B) = \log \Omega_A + \log \Omega_B$.

1 To demonstrate this, consider the number of states that a coin (heads ($ h$) or tails ($ t$), $ \Omega_{coin} = 2$) and a die (die is the singular of dice, $ \Omega_{die} = 6$) can occupy when the coin and die are considered together. The possible states are $ h1, h2, \ldots, h6, t1, \ldots, t6$, and there are $ 2 \times 6 = 12$ different states.
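The coin-and-die count above can be checked by direct enumeration (a minimal sketch of the footnote's example), which also verifies that the logarithm converts the product of state counts into a sum:

```python
from itertools import product
import math

coin = ["h", "t"]             # Omega_coin = 2
die = [1, 2, 3, 4, 5, 6]      # Omega_die = 6

# Every joint state of the combined (coin, die) system:
states = list(product(coin, die))
print(len(states))            # prints 12, i.e. 2 * 6

# Multiplicative state counts become additive under the logarithm:
assert math.isclose(math.log(len(states)),
                    math.log(len(coin)) + math.log(len(die)))
```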


Figure 16-4: Ludwig Boltzmann's ground state and, apparently, the formula by which he wished to be remembered.
\begin{figure}\resizebox{6in}{!}
{\epsfig{file=figures/boltztomb.eps}}
\end{figure}



W. Craig Carter 2002-10-15