Figure: Busts of Ludwig Boltzmann and Charles Darwin, two personalities who shaped science and society in the last century. The second-law relation \(S = k \log W\) is engraved on Boltzmann's tomb, in the form first written down by Max Planck.
The way we describe a system often determines the properties we attribute to it. This is particularly true for general laws such as the second law of thermodynamics and Darwin's theory of evolution: whereas one predicts the generation of disorder, the other predicts the generation of organized complexity, or order. How can we reconcile the two?
Darwin's theory is about the generation of order, and Boltzmann's theory is about the disappearance of order (interestingly, Boltzmann was a great admirer of Darwin). When Darwin originally proposed his theory the main opposition came from *creationists*; the microscopic interpretation of Darwin's theory took some time to become evident.
Essentially it implied that there was spontaneous generation of order from disorder (think, for example, of Miller's experiment). Boltzmann, on the other hand, considered the progress from order to disorder. Let us first understand the implication of the relation engraved on that tomb.
* Equation on Boltzmann's tomb: \[S = k \log W\]
The equation means that if S is the entropy and W the number of possible microstates of the system, then S is proportional to the logarithm of W, k being the famous Boltzmann constant, \[ k_B = k = 1.38064852 \times 10^{-23}\ \mathrm{J\,K^{-1}} = 8.61 \times 10^{-5}\ \mathrm{eV\,K^{-1}}.\]
Let us now consider the following two relations, involving the extensive (additive) quantity entropy S and the multiplicative quantity W (the number of ways of arranging a set of states): \[ S = S_1 + S_2\]
Like probabilities, the numbers of ways of arranging microstates are multiplicative. If we have two systems, each having 3 microstates, there will be 9 possible joint microstates. If we consider 4 nucleotides, a code with 3 positions leads to \(4 \times 4 \times 4 = 64\) states, which, due to the degeneracy of the code, map to 20 amino acids. Thus for system 1 with \(W_1\) states and system 2 with \(W_2\) states we can write \[W = W_1 W_2.\]
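As a quick check of this multiplicativity, here is a minimal Python sketch (the state labels are illustrative choices, not from the text) that enumerates the joint microstates of two 3-state systems and the \(4^3 = 64\) three-letter codes built from 4 nucleotides:

```python
from itertools import product

# Two independent systems with 3 microstates each:
# the joint system has W = W1 * W2 = 9 microstates.
system_1 = ["a", "b", "c"]          # illustrative state labels
system_2 = ["x", "y", "z"]
joint_states = list(product(system_1, system_2))
print(len(joint_states))            # 9

# Four nucleotides read in codons of three positions
# give 4 * 4 * 4 = 64 possible codes.
nucleotides = ["A", "U", "G", "C"]
codons = list(product(nucleotides, repeat=3))
print(len(codons))                  # 64
```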
Let us now consider entropy as a function of the number of microstates, \[S = F(W).\] If we combine the additivity of entropy with the multiplicativity of W, we have:
\[ F(W_1) + F(W_2) = F(W_1 W_2)\]
This is Cauchy's functional equation: if
\[f(x+y) = f(x) + f(y),\]
then (for a continuous f) the solution is
\[f(x) = kx,\]
where k is a constant.
Now let \[x = \ln(u), \qquad y = \ln(v).\]
If we introduce \[\phi(u) = f(\ln u),\] Cauchy's functional equation takes the product form,
\[\phi(u) + \phi(v) = \phi(uv)\]
The solution is:
\[ \phi(u) = k \ln(u)\]

### Boltzmann Equation

As \(\phi \equiv F\) and \(u \equiv W\), what follows is Boltzmann's equation:
\[S = F(W) = k \ln(W)\]

### How the Boltzmann equation expresses the second law

Essentially the equation says that an increase of W, which happens when the disorder of the system increases, implies an increase of the entropy S. Since the more probable state of a system is the more disordered one, W tends to grow with time, and this in turn implies that the entropy S increases.
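A minimal numerical sketch (Python; the microstate counts below are arbitrary illustrative values) of how \(S = k \ln W\) turns the multiplicative W into an additive S, and of the fact that a larger W gives a larger S:

```python
import math

k_B = 1.38064852e-23  # Boltzmann constant in J/K (value used in the text)

def boltzmann_entropy(W):
    """Entropy S = k ln W for W microstates."""
    return k_B * math.log(W)

W1, W2 = 3, 4  # illustrative microstate counts
S_combined = boltzmann_entropy(W1 * W2)
S_additive = boltzmann_entropy(W1) + boltzmann_entropy(W2)
print(math.isclose(S_combined, S_additive))  # True: S is additive

# More microstates (more disorder) means higher entropy.
print(boltzmann_entropy(10) < boltzmann_entropy(1000))  # True
```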
Shannon in 1948 suggested the following measure of uncertainty (H), which is commonly known as the Shannon entropy: \[H = -\sum_i p_i \ln p_i\]
We will show how this entropy is similar to the Boltzmann entropy (with k = 1), although there are debates about considering the two (Shannon and Boltzmann entropies) equivalent.

### The Shannon formula

Let us assume that there exists a die rolled N times. The number of possible outcomes of the N rolls can be expressed as \[ W = \frac{N!}{\prod_i N_i!}\]
where, for a conventional die, \(i \in \{1, \dots, 6\}\) and \(N_i\) is the number of times the outcome of the die is i. Taking the logarithm, the equation can be rewritten as \[ \ln W = \ln N! - \sum_i \ln N_i!\]
If N is large we can use the Stirling approximation,
\[\ln N! = N \ln N - N,\]
and rewrite the equation as
\[\ln W = N \ln N - N - \sum_i N_i \ln N_i + \sum_i N_i.\]
Using \[\sum_i N_i = N, \qquad p_i = \frac{N_i}{N},\]
we obtain \[\frac{1}{N}\ln W = \ln N - \sum_i p_i \ln N - \sum_i p_i \ln p_i = -\sum_i p_i \ln p_i,\] or \[H = \frac{\ln W}{N} = -\sum_i p_i \ln p_i.\]
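The limit \(H = \ln W / N \to -\sum_i p_i \ln p_i\) can be checked numerically. The sketch below (Python; the count vector for the six die faces is an arbitrary illustrative choice) computes \(\ln W\) exactly through log-factorials and compares it with the Shannon formula:

```python
import math

# Counts N_i of each die face over N rolls (illustrative values).
counts = [100, 150, 200, 250, 150, 150]
N = sum(counts)

# Exact ln W = ln N! - sum_i ln N_i!, using lgamma(n + 1) = ln n!.
ln_W = math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts)

# Shannon entropy -sum_i p_i ln p_i with p_i = N_i / N.
H = -sum((n / N) * math.log(n / N) for n in counts)

print(ln_W / N)  # close to H for large N
print(H)
```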
Figure: A fair coin toss (red curved arrow), with \(p_H = p_T = 0.5\), and a biased coin toss (black curved arrow), with \(p_H = 0.2\) and \(p_T = 0.8\).
For a single trial we can write the information in bits (writing the logarithm as \(\log_2\)):
\[H = -\sum_i p_i \log_2 p_i = -(p_H \log_2 p_H + p_T \log_2 p_T) \]
where H and T represent head and tail, respectively. It follows that for a fair coin
\[p_H=p_T=\frac{1}{2}\]
\(\therefore\) we obtain the information per coin toss (normalized by the number of tossing attempts) \[ H_{fair} = -\frac{1}{2} \log_2 \left(\frac{1}{2}\right) - \frac{1}{2} \log_2 \left(\frac{1}{2}\right) = -\log_2 \left(\frac{1}{2}\right) = 1\ \mathrm{bit}. \] If, on the other hand, we toss with cheating, the Shannon entropy of the biased tossing trial is given by \[H = -0.2 \log_2(0.2) - 0.8 \log_2(0.8) = 0.7219\ \mathrm{bit}.\]
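The two values above can be reproduced with a few lines of Python (a sketch using the probabilities quoted in the text):

```python
import math

def shannon_entropy_bits(probs):
    """H = -sum p_i log2 p_i, in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy_bits([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy_bits([0.2, 0.8]))  # biased coin: ~0.7219 bit
```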
Conditions like thermal or osmotic equilibrium do not hold for the cell. The question is: how does the cell defy the Boltzmann principle? The second law of thermodynamics states that the entropy of a system tends to increase as time progresses, because disorganization increases. The law of entropy is considered a basic law of nature and of the universe. However, as we have seen, living things behave in the opposite manner. Life in Siberia and in the Sahara sees completely different weather outside, but for any normal human being the body temperature will always be about 37°C, and this remains true as long as he or she is alive. Compare this with Figure 1.2, where we find that equilibrium enforces equal temperature (we may consider the left and right chambers as inside and outside, for example).
In 1943 Erwin Schrödinger, Nobel Laureate in Physics, used the concept of "negative entropy" in his popular-science book *What is Life?*: a living system imports negentropy and stores it; life feeds on negative entropy! To understand Schrödinger's idea we should understand what open, closed, and isolated systems are.
Above we have the generally valid relation \[ \Delta S = \Delta S_e + \Delta S_i,\] where \(\Delta S_e\) is the entropy supplied by exchange with the surroundings and \(\Delta S_i\) is the entropy produced internally.
The second law imposes a restriction on the entropy production but not on the entropy supply. Thus the supply term \(\Delta S_e\) can be either positive or negative, whereas the second law of thermodynamics constrains the internal entropy production. The second law demands \[ \Delta S_i \ge 0.\] A functioning biological system, on the other hand, demands \[ \Delta S_e \le 0.\]
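A toy bookkeeping example (Python; the numerical values are arbitrary and only illustrate the sign convention) of how an open system can lower its own entropy while still satisfying \(\Delta S_i \ge 0\):

```python
# Entropy balance dS = dS_e + dS_i for an open system (units: J/K).
# Illustrative numbers only.
delta_S_i = 2.0    # internal entropy production, must be >= 0 (second law)
delta_S_e = -3.0   # entropy supply: here negative, i.e. entropy is exported

delta_S = delta_S_e + delta_S_i
print(delta_S)     # -1.0: the system's own entropy decreases

# The second law is not violated: the surroundings receive the exported
# entropy, so the total entropy of system plus surroundings does not decrease.
assert delta_S_i >= 0
```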