
Boltzmann's Interpretation

Information as the Flip Side of Entropy

von Neumann Entropy, the Quantum Version

Attractive Force vs Entropy

Thermodynamics provides a macroscopic description of matter and energy. It is a branch of physics developed in the 19th century, at the beginning of the industrial revolution, with the invention of the steam engine (Figure 01). It was driven by the need for better sources and more efficient use of energy than one's competitors (among the English, French, and Germans). It was a case where technology drove basic research rather than vice versa.

## Figure 01 Steam Engine

In modern physics, the heat Q represents "energy transfer" associated with the random motion of particles, while the work W is the "energy transfer" in the organized motion of a collection of particles (Figure 02).

The mathematical formula for heat transfer is : Q = mc_{p}ΔT, where m is the total mass of the particles in the system, c_{p} is the specific heat capacity at constant pressure, i.e., the heat capacity of the substance per unit mass, and ΔT is the temperature difference between the environment and the system. It has to be positive for the energy to transfer naturally.
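As a quick numeric sketch of this relation (the mass, heat-capacity, and temperature values below are illustrative assumptions, not taken from the text):

```python
# Heat transfer Q = m * c_p * dT, in cgs units (erg), with assumed inputs.

def heat_transfer(m, c_p, delta_T):
    """Heat Q in erg for mass m (g), specific heat c_p (erg/(g K)), dT (K)."""
    return m * c_p * delta_T

# Assumed example: 100 g of water (c_p ~ 4.18e7 erg/(g K)) warmed by 10 K.
Q = heat_transfer(100.0, 4.18e7, 10.0)
print(f"Q = {Q:.2e} erg")  # positive dT => energy flows into the system
```

A negative ΔT would make Q negative, i.e., heat would have to be forced out rather than flowing in naturally.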

## Figure 02 Heat Transfer

- An adiabatic process pumps cold water at temperature T_{C} to the boiler, heating it up to T_{H}. This quick step is reversible since there is no exchange of heat.
- The huge reservoir at T_{H} delivers the amount of heat Q_{H} to the system. The process would be isothermal if the delivery runs slowly.
- Another adiabatic process delivers work by abrupt expansion, dropping the temperature to T_{C}.
- The remaining heat Q_{C} is removed by cooling in another isothermal process.

## Figure 03 Thermodynamic Entropy

W = (1 - T_{C}/T_{H})Q_{H} ---------- Eq.(1),

and for a reversible process W = Q_{H} - Q_{C} ---------- Eq.(2).

Equating Eqs. (1) and (2) yields :

Q_{C}/T_{C} = Q_{H}/T_{H} ---------- Eq.(3).

For a real process such as the one shown in Figure 03,b, the efficiency would be less, i.e.,

W < (1 - T_{C}/T_{H})Q_{H} ---------- Eq.(4),

Q_{C}/T_{C} > Q_{H}/T_{H} ---------- Eq.(5).

This new variable is called "entropy" (en for within, trope for transformation in Greek) : S = Q/T ---------- Eq.(6). The formulas in Eqs.(3) and (5) together make up the original form of the Second Law : the entropy of a closed system tends to remain constant or to increase.
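The chain from Eq.(1) to Eq.(5) can be checked numerically. The reservoir temperatures, the heat input, and the 80% "real engine" factor below are assumed for illustration only:

```python
# Carnot relations (Eqs. 1-5), with assumed reservoir temperatures.
T_H, T_C = 500.0, 300.0          # hot and cold reservoirs in kelvin (assumed)
Q_H = 1000.0                     # heat drawn from the hot reservoir (assumed)

W_ideal = (1 - T_C / T_H) * Q_H  # Eq.(1): maximum (reversible) work
Q_C = Q_H - W_ideal              # Eq.(2): heat rejected in the reversible case

# Eq.(3): for the reversible cycle the entropy bookkeeping balances exactly.
assert abs(Q_H / T_H - Q_C / T_C) < 1e-9

# Eqs.(4)-(5): a real engine delivers less work, so more heat is dumped.
W_real = 0.8 * W_ideal           # assumed 80% of the ideal output
Q_C_real = Q_H - W_real
print(W_ideal)                       # 400.0
print(Q_C_real / T_C > Q_H / T_H)    # True: entropy of the surroundings rises
```

The final comparison is exactly the inequality of Eq.(5): the irreversible cycle exports more entropy to the cold reservoir than it absorbs from the hot one.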

## Figure 04
Third Law : By writing Eq.(6) in differential form, i.e., dQ = TdS, it shows that dQ → 0 as T → 0, from which this law states that it is impossible to attain a temperature of absolute zero (Figure 04) - a corollary of the 2nd Law.

First Law : Q = ΔU + W ---------- Eq.(7),

where ΔU denotes the change of the internal energy of the system. ΔU is needed for book-keeping purposes - to balance the amount of heat and work; ΔU = 0 in a reversible process.

The sign convention : "+" signifies heat added to the system or work done by the system; "-" signifies heat removed from the system or work done on the system.

So far, the thermodynamic variables such as temperature, pressure, and density are assumed to be in equilibrium. There is thermal equilibrium, which demands that these state variables within the system be spatially uniform and temporally constant; and thermodynamic equilibrium, for which there are no net macroscopic flows of matter or of energy, either within a system or between systems. The changes in these variables are idealized as a succession of equilibrium states. Many important biochemical and physical processes

(such as in microfluidics, chemical reactions, molecular folding, cell membranes, and cosmic expansion) operate far from equilibrium, where the standard theory of thermodynamics does not apply. Figure 05 shows the cases for different kinds of thermodynamic theory. Case 1 is overall equilibrium in the system, which is described by classical thermodynamics. Case 2 has local equilibrium in different regions. A theory of nonequilibrium thermodynamics (using the concept of flow or flux) has been developed for such a situation. In case 3 the molecules become a chaotic jumble such that the concept of temperature is not applicable anymore. A new theory has been formulated by using a new set of

## Figure 05 Thermodynamics Theory
variables within the very short timescale for the transformation. The second law of thermodynamics has been shown to be valid for all these cases (see more in "Non-equilibrium thermodynamics").

- Each arrangement is called a macrostate, which has a unique number of particles in the two boxes.
- The number of different arrangements for each macrostate is given by Z_{n} = 8!/[n!(8-n)!] ---------- Eq.(8),
where n = 0, 1, ... 8. This formula is a special case of permutation where the ordering is ignored; the n!-fold degeneracy is removed in the formula.
- If the constraint between the 2 boxes is removed, the particles will diffuse from the macrostate with a low number of microstates to higher numbers until reaching the maximum Z_{4} = 70 in this example. Such a configuration is called equilibrium.
- Entropy is defined as S_{n} = k ln(Z_{n}) ---------- Eq.(9),
where k = 1.38x10^{-16} erg/K is the Boltzmann constant.
- Thus, S_{4} > S_{2} > S_{1}, i.e., the entropy reaches its maximum at equilibrium.
- It is highly improbable for the system to move back to a lower entropy state even for the case of 8 particles, unless work is done to force it back (but the work would create more entropy in the environment outside). If the number of particles N is very large, then it is virtually impossible to return to a lower entropy state naturally - and hence the Second Law of Thermodynamics.
- There is a 2-fold degeneracy, i.e., Z_{n} = Z_{8-n}, except for Z_{4}.
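The eight-particle bookkeeping of Eqs.(8) and (9) is easy to reproduce (a minimal sketch with N = 8, as in the example above):

```python
# Multiplicity Z_n = 8!/[n!(8-n)!] (Eq.8) and entropy S_n = k ln(Z_n) (Eq.9).
from math import comb, log

k = 1.38e-16  # Boltzmann constant in erg/K

def Z(n, N=8):
    """Number of microstates with n of the N particles in one box."""
    return comb(N, n)

def S(n, N=8):
    return k * log(Z(n, N))

print([Z(n) for n in range(9)])   # [1, 8, 28, 56, 70, 56, 28, 8, 1]
print(Z(4))                       # 70: the equilibrium macrostate
print(S(4) > S(2) > S(1))         # True: entropy is maximal at equilibrium
print(Z(2) == Z(6))               # True: the 2-fold degeneracy Z_n = Z_{8-n}
```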


## Figure 06 Macro/Micro-state

In the application of such formalism to thermodynamics, the macrostate is specified by N (number of particles), V (the volume of the system) and E (the energy of the particles). Each particle is represented by 6 variables - 3 spatial coordinates and 3 components of its momentum. The phase space is the orthogonal combination of the configuration and momentum spaces of all particles, having altogether 6N dimensions. Each point in this phase space constitutes a microstate of the system, and each macrostate contains a certain number of microstates corresponding to the arrangement (multiplicity) as mentioned in the simple example previously. Figure 07 illustrates the different sizes of the macrostates (called "coarse grains") and the tendency of the phase point to wander off from a small region to a larger one. Once it reaches a very large region, the phase point is virtually trapped inside, i.e., reaching a macrostate of equilibrium.

## Figure 07 Equilibrium in Large Grain (Multiplicity)

Z(N,V,E) ~ {[(2mE)^{3/2}V]/h^{3}}^{N}/N! ---------- Eq.(10a),

and the statistical entropy S(N,V,E) ~ Nk ln[(2mE)^{3/2}V/(N h^{3})] ---------- Eq.(10b),

or in terms of the temperature T (with E ~ NkT) : S(N,V,T) ~ Nk ln[(2mkT)^{3/2}V/(N h^{3})] ---------- Eq.(10c),

where we have used the Stirling approximation N! ~ (N/e)^{N},

p = (2mE)^{1/2} is the momentum of the particles with total energy E,

V is the spatial volume containing the particles,

N! is for removing the degeneracy related to the permutation symmetry of identical particles,

and h = 6.625x10^{-27} erg-sec is the Planck constant.

For the case of atmospheric air at 300 K,

Z(N) ~ e^{N} ---------- Eq.(11),

which shows that the number of microscopic states is enormous for the rather common condition at the surface of the Earth; while the corresponding entropy

S(N) = k ln[Z(N)] ~ Nk ---------- Eq.(12).

The change of entropy by varying the number of particles is : ΔS(N) ~ (N_{f} - N_{i})k ---------- Eq.(13a),

where the subscripts "i" and "f" denote the initial and final state.

For the case of changing the volume V only, according to Eq.(10b) ΔS(V) ~ Nk ln(V_{f}/V_{i}) ---------- Eq.(13b),

and from Eq.(10c) for changing the temperature only, ΔS(T) ~ (3/2)Nk ln(T_{f}/T_{i}) ---------- Eq.(13c).

Similarly, for a small change of the range of energy ΔE, ΔS(ΔE) ~ k ln(ΔE_{f}/ΔE_{i}) ---------- Eq.(13d).

These examples show that entropy increases by varying degrees with increasing N, V, T, or ΔE.
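These scalings can be illustrated with assumed numbers (the particle number N and the doubling factors below are arbitrary choices, not from the text):

```python
# Entropy changes for doubling V, T, or N, following the ideal-gas scalings above.
from math import log

k = 1.38e-16   # Boltzmann constant in erg/K
N = 1.0e20     # assumed number of particles

dS_V = N * k * log(2.0)           # doubling the volume:      Nk ln(V_f/V_i)
dS_T = 1.5 * N * k * log(2.0)     # doubling the temperature: (3/2)Nk ln(T_f/T_i)
dS_N = (2.0e20 - N) * k           # doubling N:               ~ (N_f - N_i)k

print(f"dS(V) ~ {dS_V:.2e} erg/K")
print(dS_T > dS_N > dS_V > 0)     # True: every doubling raises the entropy
```

Note how the logarithm makes the entropy gain from doubling V or T comparable to the linear gain from doubling N, even though V and T enter the microstate count very differently.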

By equating the thermodynamic and statistical definitions of entropy in Eq.(6) and Eq.(9) respectively, i.e.,

S = Q/T = k ln(Z) ---------- Eq.(14),

we obtain a new definition of heat in terms of the number of microscopic states :

Q ~ NkT ln{[(2mE)^{3/2}V]/(N h^{3})} ---------- Eq.(15).

For the above example of 10 cubic cm of air molecules, this definition yields Q of the order of NkT; while from thermodynamics, with the specific heat capacity at constant volume c_{v},

Q = c_{v}m ΔT,

in good agreement between the two definitions.
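The agreement can be checked as an order-of-magnitude estimate. All the input values below are assumed standard figures for roughly 10 cm^3 of air at room conditions, not numbers taken from the text:

```python
# Order-of-magnitude check: statistical Q ~ NkT vs thermodynamic Q = c_v * m * dT,
# for ~10 cm^3 of air.  All inputs are assumed standard estimates.
from math import log10

k = 1.38e-16     # Boltzmann constant, erg/K
T = 300.0        # K (taking dT ~ T for the estimate)
N = 2.7e20       # molecules in 10 cm^3 at room conditions
Q_stat = N * k * T

c_v = 0.72e7     # specific heat of air at constant volume, erg/(g K)
m = 1.2e-2       # mass of 10 cm^3 of air, g
Q_thermo = c_v * m * T

print(f"Q_stat ~ {Q_stat:.1e} erg, Q_thermo ~ {Q_thermo:.1e} erg")
print(abs(log10(Q_stat) - log10(Q_thermo)) < 1.0)  # True: same order of magnitude
```

Both estimates come out around 10^{7} erg, which is the sense in which the statistical and thermodynamic definitions agree here.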

Since then the concept of statistical entropy has been applied to systems not quite related to thermodynamics. For example, each outcome of throwing a pair of dice can be considered as a microstate. A macrostate has a certain number of microstates having the same total numerical value from the combination of the two dice (see Figure 08, where the number of arrangements corresponds to Z, i.e., the partition function mentioned above).
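The dice example can be enumerated directly (a minimal sketch):

```python
# Pair of dice: each ordered outcome (a, b) is a microstate,
# the total a + b labels the macrostate, and the count is its multiplicity.
from collections import Counter

microstates = [(a, b) for a in range(1, 7) for b in range(1, 7)]
Z = Counter(a + b for a, b in microstates)

print(dict(sorted(Z.items())))  # multiplicities of totals 2..12
print(Z[7])                     # 6: the most probable "equilibrium" total
print(sum(Z.values()))          # 36 microstates in all
```

The total 7 plays the role of the equilibrium macrostate: it has the largest multiplicity, just as Z_{4} = 70 did in the two-box example.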

The generalization has degenerated to merely a qualitative descriptor (Figure 09a). For example, the books neatly cataloged on the bookshelf are considered to have lower entropy, while those scattered around on the table have higher entropy.

## Figure 08 Entropy of Pair of Dice
## Figure 09a
## Figure 09b Order and Disorder of Books
Such loose association has now linked entropy to order/disorder for most laymen (Figure 09b).

Information is a concept first used to resolve the paradox of Maxwell's demon. About 150 years ago, the physicist J. C. Maxwell came up with an intriguing idea. He conceived a thought experiment in which a little demon operates a friction-free trap door to separate air molecules of one type from the other (see Figure 10), and finally arrives at a system with lower entropy. Such organizing ability of Maxwell's demon seems to violate the second law of thermodynamics, as the demon only selects molecules but does no work. This paradox kept physicists in suspense for half a century until Leo Szilard showed that the demon's stunt really isn't free of charge. By selecting a molecule out of the alternative of 2 types, he creates something called information, which produces an amount of entropy (through mental processing in the brain, see Integrated Information Theory of Consciousness) exactly offsetting the decrement in the re-arrangement. The unit of this commodity is the bit, and each time the demon chooses a molecule to shuffle, he shells out one bit of information for this cognitive act, precisely balancing the thermodynamic accounts. The new concept has since shown its usefulness in communication and computing, but perhaps

## Figure 10 Maxwell's Demon
its greatest power lies in biology, for the organizing entities in living beings - the proteins and certain RNAs - are but the demon's trick in reverse.

Z_{n} = 2^{n} ---------- Eq.(16),

where n = 0, 1, 2, 3 corresponds to one of the boxes in diagrams a, b, c, d, respectively, e.g., Z_{3} = 8.

Information for the n^{th} partition is defined as I_{n} = log_{2}(1/Z_{n}) = -n ---------- Eq.(17a),

with corresponding entropy S_{n} = k ln(Z_{n}) ---------- Eq.(17b),

where the negative sign in I_{n} indicates that information is given up as the number of possible arrangements grows.

According to the definition of information in Eq.(17a), the example of demonic selection gives I = log_{2}(1/2) = -1, i.e., one bit for each binary choice.

In terms of a lottery draw, you become rich if you somehow know the winning number out of the Z = N = 13983816 (for Lotto 6/49, Figure 11) combinations (arrangements) beforehand, i.e., you get a lot of information. On the other hand, there is very little information when you randomly pick a ticket. In this example, information is related to the probability p = n/N, and I = log_{2}(p). The choice is unique and contains a lot of information for n = 1, while n = N for a random pick, which is certainly among the combinations but may not be the winning one. Since the probability is always equal to or less than 1, I is negative by definition. There is no need to place a minus sign in front. The original formulation of information made use of probability in binary choice, as explained in the following.
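The Lotto 6/49 numbers quoted above can be verified directly:

```python
# Information I = log2(p) for Lotto 6/49: p = 1/N for the unique winning choice.
from math import comb, log2

N = comb(49, 6)          # combinations of 6 numbers drawn from 49
print(N)                 # 13983816, as quoted above
I_win = log2(1 / N)      # knowing the winning combination beforehand
print(round(I_win, 1))   # -23.7: about 24 bits of information
I_random = log2(N / N)   # random pick: p = 1, no information at all
print(I_random)          # 0.0
```

The sign follows the convention of the text: I stays negative (or zero) because p ≤ 1, and its magnitude measures how much the choice narrows down the possibilities.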

## Figure 11 Probability in Lottery
Essentially, the alternate definition just turns the number of arrangements upside down, i.e., turning N into 1/N in the formulation, albeit including some refinements.

I = Σ_{i} p_{i}log_{2}(p_{i}) ---------- Eq.(18a),

where p_{i} = n_{i}/N is the probability of finding the subject (the "ace" in the example) among the n_{i} members in the partition, and N is the total number in the sample. For the special case where the partitions have equal numbers of members, as in Figure 12,

I_{i} = log_{2}(p_{i}) ---------- Eq.(18b).

## Figure 12 Information, Definition
Note that I_{8} = log_{2}(1/N) = -3 for a unique choice as shown in Figure 12.

The relentless increase of entropy holds for living as well as nonliving matter. Living beings continuously lose information and would sink to thermodynamic equilibrium just as surely as nonliving systems do. There is only one way to keep a system from sinking to equilibrium: to infuse new information (Figure 13). Organisms manage to stave off the sinking a little by linking their molecular building blocks with bonds that don't come apart easily. However, this only delays their decay, but doesn't stop it. No chemical bond can resist the

continuous thermal jostling of molecules forever. Thus, to maintain its highly non-equilibrium state, an organism must continuously pump in information. The organism takes information from the environment and funnels it inside. This means that the environment must undergo an equivalent increase in thermodynamic entropy; for every bit of information the organism gains, the entropy in the external environment must rise by a corresponding amount. In other words, information is just the flip side of entropy; while entropy is related to the number of microstates, information would specify only one (sometimes a few) microstate out of such a configuration. The process just turns the demon's trick around to infuse information by lowering entropy.

## Figure 13 Life Cycle

## Figure 14 Density Matrix

Anyway, here's the simplest example of 2 identical electrons in empty space (i.e., independent of spatial coordinates) with x' in

## Figure 15 Superposition of Spin
The von Neumann entropy S(ρ) is the quantum version of the "Shannon's Measure of Information" : SMI = -Σ_{i} p_{i}log_{2}(p_{i}) (also see Eq.(18a)), i.e.,
S(ρ) = -Tr[ρ ln(ρ)], where Tr is the trace of the matrix.
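A minimal numeric sketch of S(ρ), computed from the eigenvalues of the density matrix (natural log, so the unit is nats; the two density matrices below are assumed examples):

```python
# Von Neumann entropy S(rho) = -Tr[rho ln(rho)], computed from eigenvalues.
import numpy as np

def von_neumann_entropy(rho):
    """Eigenvalues of a density matrix act as classical probabilities."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]               # drop zeros: 0 ln 0 -> 0 by convention
    return float(-np.sum(p * np.log(p)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # pure state: one definite microstate
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit

print(von_neumann_entropy(pure))    # 0.0: no uncertainty
print(von_neumann_entropy(mixed))   # ln 2 ~ 0.693: one bit, in nats
```

Diagonalizing ρ reduces the quantum formula to the classical Shannon form over the eigenvalue probabilities, which is why the pure state gives zero and the maximally mixed state gives the maximum ln 2.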

Entanglement measures for the electrons in 5 diatomic molecules have been published in a 2011 paper entitled "Quantum Entanglement and the Dissociation Process of Diatomic Molecules". The post-Hartree-Fock computational method is employed to calculate the wave function and ultimately the entanglement measure as a function of the inter-atomic distance R.
Figure 16 shows the entanglement measure between one electron and the rest in 5 different

## Figure 16 Entanglement of Electrons in Diatomic Molecules
diatomic molecules as a function of the inter-atomic distance R. Figure 17 is a graph showing the limiting cases of the entanglement measure as R → 0 and R → ∞ (at different scales in a ratio of 0.1/1).

It has been shown previously that the entanglement measure is expressed in terms of the von Neumann entropy S(ρ). Disorder is the usual notion of entropy (von Neumann and otherwise). A more useful interpretation in the current context would be in terms of "multiplicity", which is the number of different arrangements that can arrive at the same configuration (state).
Thus, the general trend of increasing entanglement measure at large R (as shown in Figure 16) can be understood as increasing entropy with larger volume. However, it could not explain the bump near the united-atom limit for some of the molecules.

## Figure 17
## Figure 18
Figure 17 also shows that the entanglement measure decreases rapidly as the number of electrons increases. Such a trend may be related to the sharing of entanglement between electrons. Entanglement is at its maximum with monogamy (such as the case with the H_{2} molecule); shared entanglement is called polygamy, which produces weaker entanglement with more partners (see "Degree of Entanglement").

Although the Second Law of Thermodynamics dictates that entropy tends to increase relentlessly, we see orderly structures around us, from galaxies to houses (Figure 19). Such regular features are derived from special properties of nature and humans. It is the attractive forces such as gravity and electromagnetism in nature, and the information (meaning work) supplied by humans, that

## Figure 19 Orderly Structures
have reversed the Second Law "locally" at the expense of the larger environment. The following are some examples showing the cause and consequence.

- The evolution of entropy is governed by physical forces as well as the size of the coarse grains in phase space. For example, a refrigerator restricts the phase point to a range of space and momentum corresponding to a certain temperature T and fixed volume V; air molecules of the atmosphere are confined to a layer of about 100 km by Earth's gravity, ...
- A structure such as a building requires a lot of work (organized energy) to complete. It relies on the steel frame and concrete to provide the static forces that keep it in place. Figure 21a illustrates the connection of entropy with randomness once the support of orderliness is removed - by implosion in this case. All man-made structures slowly crumble down with the ravages of time, by thermal bombardment and chemical interactions with air and water (Figure 21b).

#### Figure 21a Rise of Entropy by Implosion

#### Figure 21b Great Wall

- Besides the infusion of energy to support life, the organization of complexity requires the removal of entropy as shown in Figure 22. Such a requirement is fulfilled by the re-transmission of lower-frequency radiation back to the cold dark space, as illustrated in the figure. By Boltzmann's definition of entropy, this has the effect of enlarging the phase space volume, and hence returning more entropy to space than is received from the Sun, in accordance with the second law of thermodynamics. Ultimately, it is the stars which generate the carbon atoms (see "Origin of Elements"), and the Sun which supplies energy to all life on Earth.

#### Figure 22 Entropy and Life

- If entropy is likened to the degree of randomness, then it can be expressed alternatively as the degrees of freedom in a system (the degrees of freedom are the number of different parameters or arrangements needed to specify completely the state of a particle or system). The evolution of entropy in the universe as a whole can be separated into four phases as described briefly below (also see Figure 23, which supplies more details):
- The inflaton field is a coherent system changing rapidly until the end of the inflationary era. Such a system has very few degrees of freedom, so it has a very low entropy. (Figure 24 also shows a brief portrayal of leptogenesis, which is a theory to explain the asymmetry between matter and anti-matter.)
- At the end of inflation the energy density of the inflaton field decays to zero (see Figure 03a), thereby releasing lots of energy to produce particle anti-particle pairs and to heat up the universe. It is this "reheating" that produces lots of degrees of freedom, and thus lots of entropy.
- The infusion of energy dU ceased once the inflaton energy density vanished, i.e., dU = 0. According to the thermodynamic relation dU = TdS - pdV (where p = pressure, V = volume, T = temperature), the entropy now varies as dS = (p/T)dV. The universe was dominated by radiation up to 10^{4} years after the Big Bang. During this era p ∝ T^{4}; since, in terms of the size of the universe R, T ∝ 1/R and dV ∝ R^{2}dR, thus dS ∝ dR/R and the entropy S ∝ log(R).
- In a matter dominated universe p = 0, thus dS = 0; the entropy is conserved as a whole for the rest of the cosmic expansion.
- If the acceleration of the cosmic expansion is taken into account, then there is an infusion of energy by an amount dU. The entropy dS ∝ dU/T will increase until space is nearly empty, attaining the highest entropy state.
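The radiation-era step above (dS = (p/T)dV with p ∝ T^{4} and T ∝ 1/R, giving S ∝ log R) can be checked by direct numerical integration, in arbitrary units:

```python
# With p/T ~ 1/R^3 and V = R^3 (so dV/dR = 3R^2), dS = (p/T)dV ~ 3 dR/R,
# and the integrated entropy grows as 3 ln(R).  Arbitrary units throughout.
import numpy as np

R = np.linspace(1.0, 100.0, 200_001)
dS_dR = (1.0 / R**3) * (3.0 * R**2)     # (p/T) * dV/dR = 3/R

# trapezoidal integration of dS/dR from R = 1 to R = 100
S = float(np.sum(0.5 * (dS_dR[1:] + dS_dR[:-1]) * np.diff(R)))

print(S)                                    # ~ 13.8
print(abs(S - 3.0 * np.log(100.0)) < 0.01)  # True: S ~ 3 ln(R)
```

The overall factor of 3 is an artifact of the chosen units (V = R^{3}); what matters is the logarithmic growth of S with R during the radiation era.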


## Figure 20 Entropy Evolution
Figure 20,a shows the evolution of entropy in free space with diffusion of particles. In Figure 20,b the system is collapsing to a black hole by the pull of gravity; the increase in entropy is via the Hawking radiation. Meanwhile the entropy of the black hole is reduced to at most three bits.

## Figure 23 Evolution of Cosmic Entropy
## Figure 24 Initial Cosmic Entropy

The scenario above assumes a quasi-equilibrium approach, which is not entirely correct with the cosmos expanding rapidly. Diagram (a) in Figure 20 is a more realistic sequence according to Boltzmann's general definition of entropy, in which the evolution of entropy is described by the phase point moving in phase space without any assumption concerning the equilibrium of the system. Diagram (b) shows another sequence of entropy evolution when the gravitational degrees of freedom are introduced to the system.