A useful equation: $S=R \ln(g)$
Open any P-Chem textbook and you'll find this expression for the entropy (and often a reference to the fact that it is inscribed on Boltzmann's tombstone):$$S=k\ln(W)$$ $W$ is the multiplicity of the system, i.e. the number of (microscopic) arrangements producing the same (macroscopic) state, and is given by$$W=\frac{N!}{N_1!N_2!N_3!\cdots N_g!}$$Here $N$ is the total number of molecules and $N_i$ is the number of molecules in a particular microscopic arrangement $i$, of which there are $g$ different kinds.
Confused? Believe me, you are not the only one, and most scientists never use this form of the equation anyway. Instead they usually assume that all these microscopic arrangements have the same energy, or equivalently that they are degenerate. This means that each microscopic arrangement is equally likely and $N_1=N_2=\cdots=N_g=N/g$. This simplifies the expression for the multiplicity,$$W=g^N$$and the entropy$$S=Nk\ln(g)$$significantly, and for a mole of molecules we have$$S=R\ln(g)$$This formula relates the entropy to the degeneracy $g$, the number of microscopic arrangements with the same energy.
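To put some numbers on this, here is a minimal Python sketch (not part of the original post; the function name is just for illustration) that evaluates $S=R\ln(g)$ for a few degeneracies:

```python
import math

R = 8.314  # gas constant in J/(mol K)

def molar_entropy(g):
    """Molar entropy from degeneracy: S = R ln(g)."""
    return R * math.log(g)

for g in (1, 2, 4):
    print(f"g = {g}: S = {molar_entropy(g):.2f} J/(mol K)")
```

For example, $g=4$ gives $S\approx 11.5$ J mol$^{-1}$ K$^{-1}$, while $g=1$ gives zero.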
A simple example
Let's say two molecules A and B bind to a receptor R through a single hydrogen bond (indicated by "||||" in the figure) with the same strength.
If you mix equal amounts of A, B, and R you will get more R-A than R-B at equilibrium even though the hydrogen bond strength is the same in the two complexes. This is because molecule A can bind in four different ways while B can only bind one way, i.e. the R-A complex has a degeneracy of four ($g=4$) and the R-B complex has a degeneracy of one ($g=1$). Put another way, the R-A complex is more likely because it has a higher entropy ($S=R\ln(4)$) than the R-B complex ($S=R\ln(1)$).
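As a rough numerical sketch (assuming ideal behavior, identical hydrogen-bond enthalpies, and a room temperature of 298 K, which is not specified in the example above), the entropy difference can be converted into a free-energy difference and an equilibrium ratio:

```python
import math

R = 8.314    # gas constant, J/(mol K)
T = 298.15   # assumed room temperature, K

S_RA = R * math.log(4)  # R-A complex, degeneracy g = 4
S_RB = R * math.log(1)  # R-B complex, degeneracy g = 1

# With identical hydrogen-bond strengths, the free-energy difference is purely entropic
ddG = -T * (S_RA - S_RB)          # J/mol; negative means R-A is favored
ratio = math.exp(-ddG / (R * T))  # equilibrium ratio [R-A]/[R-B]

print(f"ddG = {ddG/1000:.2f} kJ/mol, [R-A]/[R-B] = {ratio:.1f}")
```

This recovers the expected 4:1 preference for R-A over R-B.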
Ifs, ands, or buts
Of course, this is a simplified picture in which we focus only on conformational entropy and ignore contributions from translation, rotation, and vibration, not only in the complexes but also in free A and B.
Also, it is quite unlikely that the hydrogen bond strengths for two molecules will be identical, or that molecule A will be so perfectly symmetric that the four binding modes are exactly degenerate. In general, $S=R\ln(g)$ will give you an estimate of the maximum possible value of the conformational entropy.
See for example this interesting blog post on a paper where the authors rationalize the measured difference in binding entropy in terms of conformation. As I point out in the comments section, the conformational entropy difference ($\Delta S=R\ln(2)$) is smaller than the measured entropy difference, so there must be other, more important, contributions to the entropy change.
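For scale (assuming room temperature, which the post does not specify), a degeneracy difference of two corresponds to$$T\Delta S = RT\ln(2) \approx 8.314 \times 298 \times 0.693 \text{ J/mol} \approx 1.7 \text{ kJ/mol} \approx 0.4 \text{ kcal/mol}$$so a measured entropy contribution much larger than this cannot come from conformational degeneracy alone.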
Derivation
If $N_1=N_2=\cdots=N_g=N/g$ then$$W=\frac{N!}{\left((N/g)!\right)^g}$$For large $N$ we can use Stirling's approximation, $x!\approx (x/e)^x$:$$W=\frac{(N/e)^N}{\left(\frac{N}{ge}\right)^{(N/g)g}}\\W=\left(\frac{N/e}{N/(ge)}\right)^N\\W=g^N$$
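As a quick numerical sanity check (a minimal Python sketch, not from the original post), one can compare the exact $\ln W$ with the Stirling result $N\ln(g)$; the relative agreement improves rapidly as $N$ grows:

```python
import math

def ln_W_exact(N, g):
    """ln of the exact multiplicity N!/((N/g)!)^g, via log-gamma to avoid overflow."""
    return math.lgamma(N + 1) - g * math.lgamma(N / g + 1)

def ln_W_stirling(N, g):
    """Stirling-approximation result: ln(g^N) = N ln(g)."""
    return N * math.log(g)

g = 4
for N in (100, 10_000, 1_000_000):
    print(f"N = {N:>9}: exact ln W = {ln_W_exact(N, g):.1f}, N ln g = {ln_W_stirling(N, g):.1f}")
```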
Other posts on statistical mechanics
This work is licensed under a Creative Commons Attribution 3.0 Unported License.