A useful equation: $S=R \ln(g)$
Open any P-Chem textbook and you'll find this expression for the entropy (and often a reference to the fact that it is inscribed on Boltzmann's tombstone):$$S=k\ln(W)$$ $W$ is the multiplicity of the system, i.e. the number of (microscopic) arrangements producing the same (macroscopic) state, and is given by$$W=\frac{N!}{N_1!N_2!N_3!\cdots N_g!}$$Here $N$ is the number of molecules and $N_i$ is the number of molecules with a particular microscopic arrangement $i$, of which there are $g$ different kinds.
Confused? Believe me, you are not the only one, and most scientists never use this form of the equation anyway. Instead they usually assume that all these microscopic arrangements have the same energy, i.e. that they are degenerate (same thing). This means that each microscopic arrangement is equally likely and $N_1=N_2=...=N/g$. This simplifies the expression for the multiplicity,$$W=g^N$$and the entropy$$S=Nk\ln(g)$$significantly, and for a mole of molecules we have$$S=R\ln(g)$$This formula relates the entropy to the degeneracy $g$, the number of microscopic arrangements with the same energy.
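The simplification can be checked numerically. A minimal sketch in Python; the values of $N$ and $g$ are arbitrary illustrative choices:

```python
# Numerical check that ln W = ln( N!/((N/g)!)^g ) approaches N ln(g).
# N and g are arbitrary illustrative values; N must be divisible by g.
import math

N, g = 1200, 4
ln_W = math.lgamma(N + 1) - g * math.lgamma(N // g + 1)  # exact ln W via log-Gamma
ln_W_approx = N * math.log(g)                            # the simplified N ln(g)

print(ln_W, ln_W_approx)  # agree to within ~1% for this N
```

The agreement gets better as $N$ grows, which is why the simple $S=Nk\ln(g)$ form is safe for macroscopic numbers of molecules.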
A simple example
Let's say two molecules A and B bind to a receptor R through a single hydrogen bond (indicated by "||||" in the figure) with the same strength.
If you mix equal amounts of A, B, and R you will get more R-A than R-B at equilibrium even though the hydrogen bond strength is the same in the two complexes. This is because molecule A can bind in four different ways while B can only bind one way, i.e. the R-A complex has a degeneracy of four ($g=4$) and the R-B complex has a degeneracy of one ($g=1$). Put another way, the R-A complex is more likely because it has a higher entropy ($S=R\ln(4)$) than the R-B complex ($S=R\ln(1)$).
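The preference for R-A can be put into numbers. A minimal sketch; with identical hydrogen-bond strengths the ratio of binding constants is set entirely by the degeneracy:

```python
# Ratio of binding constants K(R-A)/K(R-B) implied by the degeneracy argument.
# With identical hydrogen-bond strength, the only difference is the entropy
# term DeltaS = R ln(4) - R ln(1).
import math

R = 8.314       # gas constant, J/(mol K)
T = 298.15      # temperature, K

dS = R * math.log(4) - R * math.log(1)  # entropy difference, J/(mol K)
ratio = math.exp(dS / R)                # K(R-A)/K(R-B)

print(ratio)             # 4.0
print(-T * dS / 1000)    # entropic contribution to the free energy difference, kJ/mol
```

So the four-fold degeneracy is worth about 3.4 kJ/mol at room temperature, enough to shift the equilibrium noticeably.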
Ifs, ands, or buts
Of course, this is a simplified picture where we focus only on conformational entropy and ignore contributions from translation, rotation, and vibration, not only in the complexes but also for free A and B.
Also, it is quite unlikely that the hydrogen bond strengths of two molecules will be identical, or that molecule A will be perfectly symmetrical so that the four binding modes are perfectly degenerate. In general, $S=R\ln(g)$ gives you an estimate of the maximum possible value of the conformational entropy.
See for example this interesting blog post on a paper where the authors rationalize the measured difference in binding entropy in terms of conformation. As I point out in the comments section, the conformational entropy difference ($S=R \ln(2)$) is smaller than the measured entropy difference, so there must be other, more important, contributions to the entropy change.
Derivation
If $N_1=N_2=...=N/g$ then $$W=\frac{N!}{\left((N/g)!\right)^g}$$For large $N$ we can use Stirling's approximation, $x!\approx (x/e)^x$:$$W=\frac{(N/e)^N}{(N/(ge))^{(N/g)g}}=\left(\frac{N/e}{N/(ge)}\right)^N=g^N$$
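How good is the crude Stirling form $x!\approx (x/e)^x$ used here? A quick numerical check with illustrative values of $x$:

```python
# Relative error of the crude Stirling approximation ln(x!) ≈ x ln(x) - x,
# the form used in the derivation above; the error shrinks as x grows.
import math

errors = []
for x in (10, 100, 1000):
    exact = math.lgamma(x + 1)       # ln(x!)
    approx = x * math.log(x) - x     # ln((x/e)^x)
    errors.append((exact - approx) / exact)
    print(x, errors[-1])
```

For $x$ of the order of Avogadro's number the neglected $\tfrac{1}{2}\ln(2\pi x)$ correction is completely negligible.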
Other posts on statistical mechanics
This work is licensed under a Creative Commons Attribution 3.0 Unported License.
8 comments:
Is it fair to call 'W' the partition function of the system?
Short answer: no. Long answer:
Another expression for the entropy is $S=k\ln(Q)+\frac{kT}{Q}\left(\frac{\partial Q}{\partial T}\right)_V$ where $Q$ is the partition function. $Q=W$ only if $Q$ is independent of $T$. This happens only if all energy levels are degenerate.
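For the degenerate case one can check this explicitly. A short sketch, assuming all $W$ microstates share a common energy $E_0$:
$$Q=We^{-E_0/kT},\qquad \left(\frac{\partial Q}{\partial T}\right)_V=We^{-E_0/kT}\,\frac{E_0}{kT^2}$$
$$S=k\ln Q+\frac{kT}{Q}\left(\frac{\partial Q}{\partial T}\right)_V=k\ln W-\frac{E_0}{T}+\frac{E_0}{T}=k\ln W$$
So $S=k\ln(W)$ holds for any common $E_0$, while $Q=W$ itself additionally requires $E_0=0$.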
Yes, and you wrote "...they usually assume that all these microscopic arrangements have the same energy or are degenerate (same thing)."
so you're actually implying that Q=W, not?
For that special case, yes.
Just want to comment this paragraph: «If you mix equal amounts of A, B, and R you will get more R-A than R-B at equilibrium even though the hydrogen bond strength is the same in the two complexes. This is because molecule A can bind in four different ways while B can only bind one way, i.e. the R-A complex has a degeneracy of four (g=4) and the R-B complex has a degeneracy of one (g=1). Put another way, the R-A complex is more likely because it has a higher entropy (S=Rln(4)) than the R-B complex (S=Rln(1)).»
Isn't the entropy of complexation higher in case A because the rotational entropy of the unbound molecule A is smaller, rather than the entropy of the complex itself being higher?
Shouldn't the degeneracy be 8 (A) and 2 (B), instead of 4 (A) and 1 (B)? For molecule A, four with the molecule facing up and four with the molecule facing down. And for molecule B, one with the molecule facing up and one with the molecule facing down.
Good questions! Let me take the last question first since it is a bit easier. If I understood your question correctly, you're talking about the extra degree of freedom due to rotation around the horizontal axis. In that case it depends on whether you view the model as being 2-dimensional or 3-dimensional. I view it as 2D, where this degree of freedom is not allowed. However, if you view it as 3D (i.e. flat molecules in a 3D world) then you are correct.
Now to your first question. Yes, you can view it like that if the molecule is perfectly symmetric. Then this effect is included in the rotational entropy of free A via the symmetry number (see this post). However, if molecule A is even *slightly* asymmetric then the effect enters as the conformational entropy of the complex. So I think in the symmetric case the effect can also be ascribed to a conformational entropy of the complex.
One could argue that the entropy of A can be measured to settle this issue. However, I don't think that is so straightforward - even conceptually. The entropy at temperature T is measured relative to absolute zero where the entropy is zero by definition. However, this state cannot be reached, and even at very, very low T symmetric molecules will have so-called residual entropy related to their symmetry which will be hard - if not impossible - to measure accurately. But I am not sure.
Bottom line: I think the answer to your question is: ultimately what is measured is an entropy *change* and the higher entropy change for A can be explained *either* as a lower conformational entropy of the complex or a higher rotational entropy of the free ligand. Both are formally correct but I would argue the former is more general as it also works for non-symmetric molecules.
Very nice reply. Just some further points:
«However, if molecule A is even *slightly* asymmetric then the effect enters as the conformational entropy of the complex.» Is this a «law» according to Quantum Mechanics? Because, if you apply the continuous symmetry arguments of Avnir, a molecule which is «slightly» asymmetric (a flexible molecule) will have a «slight» degree of asymmetry. In this case, a flexible A, for example, would have S = R ln(3.9).
«Both are formally correct but I would argue the former is more general as it also works for non-symmetric molecules.» I do not fully agree with the reason here, because if for some reason the complex has a higher entropy because the ligand has more possibilities of binding, it means that the ligand itself has some symmetry operation somewhere, and this can still be present in the rotational entropy of the ligand.
It is true that if you use Avnir's method you will get a non-zero entropy contribution like $S = R \ln(3.9)$, but it won't be exactly the same as the conformational entropy. $S=-R\sum_i^4 p_i\ln(p_i)$ where $p_i=e^{-(G^\circ_i-G^\circ)/RT}$ and $G^\circ=-RT\ln\sum_i^4 e^{-G^\circ_i/RT}$. $G^\circ$ is the free energy of the R-A complex, which has four different binding modes. If A has four-fold symmetry then $G^\circ_1=G^\circ_2=G^\circ_3=G^\circ_4$, $p_i=\frac{1}{4}$ and $S=R\ln(4)$. If you include the symmetry number when computing the entropy of A then this contribution to the free energy is taken care of that way.
If A is formally $C_1$ then the four binding free energies will be different and the conformational entropy will be a value between $R\ln(4)$ and $R\ln(1)$. This is also true for Avnir's method but, based on his equations, I don't see why the entropies computed in these two ways would be the same. However, Avnir's method might yield a good approximation to the conformational entropy, I don't know.
Notice also that the binding free energies can in principle be similar completely by accident, and have nothing to do with symmetry in any way.
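The conformational entropy formula above can be evaluated numerically. A minimal sketch; the free energies for the asymmetric case are hypothetical values chosen for illustration:

```python
# Conformational entropy S = -R sum_i p_i ln(p_i) from the binding free
# energies G_i of the four binding modes (Boltzmann-weighted populations).
import math

R = 8.314   # gas constant, J/(mol K)

def conf_entropy(G, T=298.15):
    """Entropy in J/(mol K) from a list of free energies G (J/mol)."""
    RT = R * T
    q = sum(math.exp(-Gi / RT) for Gi in G)
    p = [math.exp(-Gi / RT) / q for Gi in G]
    return -R * sum(pi * math.log(pi) for pi in p)

# Perfectly symmetric A: four degenerate binding modes -> S = R ln(4)
S_sym = conf_entropy([0.0, 0.0, 0.0, 0.0])
print(S_sym, R * math.log(4))

# Slightly asymmetric A (hypothetical G_i, J/mol): S falls between
# R ln(1) and R ln(4)
S_asym = conf_entropy([0.0, 500.0, 500.0, 1000.0])
print(S_asym)
```

As the spread in the $G^\circ_i$ grows, the populations concentrate in the lowest-energy binding mode and $S$ approaches $R\ln(1)=0$.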