
Tuesday, January 10, 2012

Where does the ln come from in S = k ln(W) ?

The relationship between entropy (S) and degeneracy (W, or g, depending on the book), S = k\ln(W), is one of the fundamental equations of statistical mechanics. But where does it come from? Or, more precisely, why \ln(W) and not, say, \sin(W)? And why does k have units of J/K?

I believe it goes back to the second law of thermodynamics, i.e. it is based on observation. The second law can be stated in many ways, and one is that heat (q) flows spontaneously only from hot (T_h) to cold (T_c) bodies.

Mathematically, this can be stated as \frac{q}{T_c}-\frac{q}{T_h}>0, or \Delta S >0, where S is defined* as dS=\frac{dq_{rev}}{T}. This definition establishes the units of S and, hence, of k.
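To make the link between the two statements explicit: if an amount of heat q > 0 leaves the hot body at T_h and enters the cold body at T_c, the total entropy change is

\Delta S = \frac{q}{T_c}-\frac{q}{T_h} = q\,\frac{T_h - T_c}{T_c T_h}

which is positive precisely when T_h > T_c, i.e. when heat flows from hot to cold.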

Furthermore, since q (like all forms of energy) is additive, S must be additive: S_{total}=S_A+S_B. However, the degeneracies multiply, W_{total}=W_AW_B, so S=kW won't work, but S = k\ln(W) will.
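As a quick numerical check of the additivity: if subsystem A can be in any of W_A = 4 microstates and subsystem B in any of W_B = 3, the combined system has W_{total} = 4 \times 3 = 12 microstates, and

k\ln(12) = k\ln(4) + k\ln(3)

so the entropies add, exactly as the heat-based definition requires.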

The final question is then whether the logarithm is the only mathematical function for which f(xy)=f(x)+f(y). It turns out that it is, up to a constant factor that can be absorbed into k,** provided we require the function to be continuous (thanks to Niels Grønbæk for help here), which we do since S as defined by the second law is continuous (non-discrete), like the energy. A sketch of the argument is given below.
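The sketch, for W > 0 (which always holds, since W counts microstates): write x = e^u, y = e^v and define g(u) = f(e^u). Then f(xy)=f(x)+f(y) becomes Cauchy's functional equation,

g(u+v) = g(u) + g(v)

and for a continuous g the only solutions are g(u) = cu for some constant c, so f(x) = c\ln(x). The constant c is then fixed by the units of S, giving Boltzmann's constant k.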

Another important property of \ln(W) is that, because \ln is monotonically increasing, it is largest when W is largest. So the most probable state, the one with the most microstates, will have the largest entropy.
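For example, consider N particles distributed between the two halves of a box. The number of microstates with n particles in the left half is

W(n) = \binom{N}{n}

For N = 10 this is largest at the even split n = 5, where W = 252, compared to W = 1 for all particles on one side; since \ln is monotonically increasing, S = k\ln W(n) \approx 5.5k is likewise largest for the even split, which is the most probable arrangement.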

* This definition raises the question: if heat is transferred, what T do you use, the one before or after the heat transfer? The answer is that dq has to be so small that T is not affected. Since T is not affected, this is a reversible process. Furthermore, notice that this law also introduces the concept of temperature.

** See also http://www.physicsforums.com/showthread.php?t=566358. Not that I understand all of it!

Related blog posts
Illustrating entropy
Entropy, volume, and temperature
