

Entropy of Many Particles (2020 Edition)


Contents

Entropy in the 19th Century and the Laws of Thermodynamics
Boltzmann's Interpretation
Information as the Flip Side of Entropy
von Neumann Entropy, the Quantum Version
Attractive Force vs Entropy

Entropy in the 19th Century and the Laws of Thermodynamics

Thermodynamics provides a macroscopic description of matter and energy. It is a branch of physics developed in the 19th century, at the beginning of the industrial revolution, alongside the invention of the steam engine (Figure 01). It was driven by the need for better sources and more efficient use of energy than one's competitors (among the English, French, and Germans). It was a case where technology drove basic research rather than vice versa.

Figure 01 Steam Engine

In modern physics, the heat Q represents "energy transfer" associated with the random motion of particles, while the work W is the "energy transfer" associated with the organized motion of a collection of particles (Figure 02).

The mathematical formula for heat transfer is :    Q = m c_p ΔT,
where m is the total mass of the particles in the system,
c_p is the specific heat capacity at constant pressure, i.e., the heat capacity of the substance per unit mass,
and ΔT is the temperature difference between the environment and the system. It has to be positive for the energy to transfer naturally.
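As a minimal numerical sketch of this formula (the 1 kg of water and the 10 K temperature difference are arbitrary illustrative values, not taken from the text):

```python
# Heat transfer Q = m * c_p * dT, illustrating the formula above.
# Example values (arbitrary, for illustration): 1 kg of water warmed by 10 K.
m = 1.0          # mass, kg
c_p = 4186.0     # specific heat of water at constant pressure, J/(kg*K)
dT = 10.0        # temperature difference between environment and system, K

Q = m * c_p * dT
print(f"Q = {Q:.0f} J")   # ~41,860 J transferred as heat
```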

Figure 02 Heat Transfer

The original version of ENTROPY was introduced as an abstract variable, not directly measurable. It arose from Rudolf Clausius's study of the Carnot cycle (Figure 03,a), which idealizes the process of a steam engine (see Figure 01 for comparison) to obtain the maximum work from heat. It involves four reversible steps as described in the following :
  1. An adiabatic process pumps cold water at temperature T_C into the boiler, heating it up to T_H. This quick step is reversible since there is no exchange of heat.
  2. The huge reservoir at T_H delivers an amount of heat Q_H to the system. The process would be isothermal if the delivery runs slowly.
  3. Another adiabatic process delivers work by abrupt expansion, dropping the temperature to T_C.
  4. The remaining heat Q_C is removed by cooling in another isothermal process.

Figure 03 Thermodynamic Entropy

According to Clausius, the maximum work that a heat engine can produce is :

W = (1 - T_C/T_H) Q_H ---------- Eq.(1)
and for a reversible process W = Q_H - Q_C ---------- Eq.(2)
Equating Eqs. (1) and (2) yields :
Q_H/T_H = Q_C/T_C ---------- Eq.(3)
For a real process such as the one shown in Figure 03,b, the efficiency would be less, i.e.,
W < (1 - T_C/T_H) Q_H ---------- Eq.(4), giving :
Q_H/T_H < Q_C/T_C ---------- Eq.(5).
This new variable is called "entropy" (en for within, trope for transformation in Greek) ΔS = ΔQ/T ---------- Eq.(6).
The formulas in Eqs.(3) and (5) together make up the original form of the
Second Law : The entropy of a closed system tends to remain constant or to increase.
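The bookkeeping in Eqs.(1)-(6) can be illustrated with a short sketch; the reservoir temperatures and the heat input below are hypothetical values chosen only for illustration:

```python
# Carnot cycle bookkeeping per Eqs.(1)-(3): maximum work and entropy balance.
T_H, T_C = 600.0, 300.0   # hypothetical hot / cold reservoir temperatures, K
Q_H = 1000.0              # heat drawn from the hot reservoir, J (illustrative)

W_max = (1 - T_C / T_H) * Q_H        # Eq.(1): maximum (reversible) work
Q_C = Q_H - W_max                    # Eq.(2): heat rejected in the reversible case

# Eq.(3): for the reversible cycle the two entropy transfers balance (S = Q/T, Eq.(6))
print(Q_H / T_H, Q_C / T_C)          # both ~1.667 J/K

# A real engine delivers less work (Eq.(4)), so more heat is rejected and
# Q_C/T_C exceeds Q_H/T_H (Eq.(5)): net entropy increases.
W_real = 0.8 * W_max
Q_C_real = Q_H - W_real
print(Q_C_real / T_C > Q_H / T_H)    # True
```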

Figure 04 Molecular Energy

Third Law : By writing Eq.(6) in differential form, i.e., dQ = T dS, it shows that dQ → 0 as T → 0, from which this law states that it is impossible to attain a temperature of absolute zero (Figure 04) - a corollary of the 2nd Law.

While the First Law is the conservation of energy (en for within, ergon for work in Greek) ΔU = Q - W ---------- Eq.(7)
where ΔU denotes the change of internal energy, which can take many forms including the kinetic energies of molecular translation, vibration, and rotation, and the potential energy of inter-molecular interaction (Figure 04). Its details are usually of no concern in thermodynamics.
ΔU is needed for bookkeeping purposes - to balance the amounts of heat and work; ΔU = 0 in a reversible cycle.

The sign convention: "+" signifies transfer from the surroundings to the system, "-" means from the system to the surroundings. For example, the process can be reversed (moving counter-clockwise) in the Carnot cycle (Figure 03). It becomes refrigeration with the exchange of "+/-" signs in Q, and of expansion/compression. Work is done on the system (-W becomes positive), resulting in the removal of heat from the system.

So far, the thermodynamic variables such as temperature, pressure, and density are assumed to be in equilibrium. Thermal equilibrium demands that these state variables be spatially uniform and temporally constant within the system; thermodynamic equilibrium requires no net macroscopic flows of matter or energy, either within a system or between systems. Changes in these variables are idealized as a succession of equilibrium states. Many important biochemical and physical processes (such as in microfluids, chemical reactions, molecular folding, cell membranes, and cosmic expansion) operate far from equilibrium, where the standard theory of thermodynamics does not apply. Figure 05 shows the cases for different kinds of thermodynamic theory. Case 1 is for overall equilibrium in the system, which is described by classical thermodynamics. Case 2 has local equilibrium in different regions; a theory of nonequilibrium thermodynamics (using the concept of flow or flux) has been developed for such situations. In case 3 the molecules become a chaotic jumble such that the concept of temperature is no longer applicable. A new theory has been formulated using a new set of variables defined within the very short timescale of the transformation.

Figure 05 Thermodynamics Theory

The second law of thermodynamics has been shown to be valid for all these cases. (See more in "Non-equilibrium thermodynamics".)



Boltzmann's Interpretation

Boltzmann's definition of entropy uses the paradigm of statistical mechanics in a 6N-dimensional phase space, where N is the number of particles in the system. The idea can be introduced easily by a simple example of 8 distinguishable particles (in different colors) distributed between two boxes in different arrangements (aka multiplicity), as shown in Figure 06.

In the application of such formalism to thermodynamics, the macrostate is specified by N (the number of particles), V (the volume of the system), and E (the energy of the particles). Each particle is represented by 6 variables - 3 spatial coordinates and 3 components of its momentum. The phase space is the orthogonal combination of the configuration and momentum spaces of all particles, having altogether 6N dimensions. Each point in this phase space constitutes a microstate of the system, and each macrostate contains a certain number of microstates corresponding to the arrangements (multiplicity), as mentioned in the simple example above. Figure 07 illustrates the different sizes of the macrostates (called "coarse grains") and the tendency of the phase point to wander off from a small region to a larger one. Once it reaches a very large region, the phase point is virtually trapped inside, i.e., it has reached a macrostate of equilibrium.

Figure 07 Equilibrium in Large Grain (Multiplicity)

For a system of ideal gas, the number of different arrangements for each macrostate (now called the Partition Function) is given by :

Z(N,V,E) = {[(1/2)(2mE)^(1/2) V^(1/3)/h]^(3N)/N!}(ΔE/E) ---------- Eq.(10a)

and the statistical entropy S(N,V,E) ~ Nk ln[(3/2)(2mE)^(3/2) V/(N h^3)] + k ln(ΔE/E) ---------- Eq.(10b)

or in terms of the temperature T : S(N,V,T) ~ Nk ln[(3/2)(3mkT)^(3/2) V/(N h^3)] + k ln(ΔT/T) ---------- Eq.(10c)

where we have used the Stirling approximation N! ~ N^N for N >> 1,
Δp = (2mE)^(1/2)(ΔE/2E) is the range of momentum corresponding to the energy range ΔE,
a factor 2^(3N/2)(2mE)^((3N-1)/2) comes from integrating up to the energy E = p^2/2m,
V is the spatial volume containing the particles,
N! removes the degeneracy related to the permutation symmetry of identical particles,
and h = 6.625x10^-27 erg-sec is the Planck constant, conveniently taken as the basic unit (minimum size) of the microscopic states.

For the case of atmospheric air at 300 K and standard atmospheric pressure of 101 kPa, the number of gas molecules N in a cube of 1000 cm^3 has been estimated to be ~ 10^25. For an averaged air-molecule mass of m = 2.5x10^-23 g, the root-mean-square velocity of the particles is v_rms = (3kT/m)^(1/2) ~ 7x10^4 cm/sec, and the corresponding momentum is p = m v_rms = (2mE)^(1/2) ~ 1.75x10^-18 g-cm/sec (or E ~ 6x10^-14 erg ~ 3.8x10^-2 eV, ΔE/E ~ 0.1). The partition function is reduced to :

Z(N) ~ 10^(2N) ---------- Eq.(11),

which shows that the number of microscopic states is enormous for the rather common conditions at the surface of the Earth, while the corresponding entropy is

S(N) = k ln[Z(N)] ~ Nk ---------- Eq.(12).

The change of entropy by varying the number of particles is : ΔS(N) ~ (N_f - N_i)k ---------- Eq.(13a),
where the subscripts "i" and "f" denote the initial and final state.

For the case of changing the volume V only, according to Eq.(10b), ΔS(V) ~ Nk ln(V_f/V_i) ---------- Eq.(13b),
and from Eq.(10c) for changing the temperature only, ΔS(T) ~ Nk ln(T_f/T_i) ---------- Eq.(13c).

Similarly, for a small change of the energy range ΔE, ΔS(ΔE) ~ k ln(ΔE_f/ΔE_i) ---------- Eq.(13d).

These examples show that entropy increases, by varying degrees, with increasing N, V, T, or ΔE.
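A minimal sketch of Eqs.(13b)-(13d), assuming an illustrative particle number and a doubling of V, T, or ΔE (the factor-of-2 ratios are arbitrary example values):

```python
import math

k = 1.380649e-16      # Boltzmann constant, erg/K
N = 1e25              # number of particles (illustrative, as in the air example below)

# Eq.(13b): entropy change when only the volume changes (V_f/V_i = 2 here)
dS_V = N * k * math.log(2.0)
# Eq.(13c): entropy change when only the temperature changes (T_f/T_i = 2 here)
dS_T = N * k * math.log(2.0)
# Eq.(13d): changing only the energy range dE contributes a single factor of k,
# which is negligible compared with the N-proportional terms above
dS_dE = k * math.log(2.0)

print(f"dS(V) ~ {dS_V:.1e} erg/K, dS(T) ~ {dS_T:.1e} erg/K, dS(dE) ~ {dS_dE:.1e} erg/K")
```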

By equating the thermodynamic and statistical definitions of entropy in Eq.(6) and Eq.(9) respectively, i.e.,
ΔS = ΔQ/T = k ln(Z) ---------- Eq.(14),
we obtain a new definition of heat in terms of the number of microscopic states :

ΔQ ~ NkT ln{[(2mE)^(1/2) V^(1/3)/h]^3/N} ---------- Eq.(15a)
For the above example of 1000 cm^3 of air molecules, this definition gives ΔQ ~ 10^12 erg.

Meanwhile, from thermodynamics, with the specific heat capacity at constant volume c_v = 20 J/mole-K for air molecules, at T = 300 K :
ΔQ = c_v T (10^25/6x10^23) ~ 10^12 erg ---------- Eq.(15b)
in good agreement between the two definitions.
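The estimates above can be reproduced in a few lines. This is a sketch using the values quoted in the text (molecule mass, volume, temperature), with E taken as the mean kinetic energy (3/2)kT:

```python
import math

# cgs units throughout
k = 1.380649e-16          # Boltzmann constant, erg/K
h = 6.625e-27             # Planck constant, erg*sec
T = 300.0                 # temperature, K
N = 1e25                  # molecules in 1000 cm^3 of air (as estimated above)
m = 2.5e-23               # averaged air-molecule mass used in the text, g
V = 1000.0                # volume, cm^3

E = 1.5 * k * T                            # mean kinetic energy per molecule, ~6e-14 erg
p = math.sqrt(2 * m * E)                   # corresponding momentum, ~1.75e-18 g*cm/sec
per_particle = (p * V**(1/3) / h)**3 / N   # microstates per particle, ~10^3

# Z ~ (per_particle)^N, so log10(Z) ~ N * log10(per_particle): an enormous number,
# of the order indicated by Eq.(11)
print("log10(Z) ~", N * math.log10(per_particle))

# Eq.(15a): statistical definition of heat vs Eq.(15b): thermodynamic estimate with c_v
Q_stat = N * k * T * math.log(per_particle)            # erg
Q_thermo = 20.0 * T * (N / 6.02e23) * 1e7              # c_v = 20 J/(mol*K), converted to erg
print(f"Q_stat ~ {Q_stat:.1e} erg, Q_thermo ~ {Q_thermo:.1e} erg")   # both ~10^12 erg
```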

Since then, the concept of statistical entropy has been applied to systems not quite related to thermodynamics. For example, the outcomes of throwing a pair of dice can be considered as microstates. A macrostate contains a certain number of microstates having the same total numerical value from the combination of the two dice (see Figure 08, where the multiplicity corresponds to Z, i.e., the partition function mentioned above).
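A minimal sketch of the two-dice example: each ordered outcome (d1, d2) is a microstate, each total is a macrostate, and the multiplicity plays the role of Z:

```python
from collections import Counter
import math

# Every ordered pair (d1, d2) is a microstate; the total d1 + d2 labels the macrostate.
multiplicity = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

for total in sorted(multiplicity):
    Z = multiplicity[total]                  # number of microstates for this macrostate
    print(total, Z, round(math.log(Z), 2))   # entropy ~ ln(Z), in units of k
# The total 7 has the largest multiplicity (6) and hence the largest entropy.
```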
The generalization has since degenerated into a merely qualitative descriptor (Figure 09a). For example, books neatly cataloged on a bookshelf are considered to have lower entropy, while those scattered around on the table have higher entropy.

Figure 08 Entropy of Pair of Dice

Figure 09a Entropy Descriptors

Figure 09b Order and Disorder of Books

Such loose association has led most laypeople to equate entropy with order/disorder (Figure 09b).
See the various facets of entropy.



Information as the Flip Side of Entropy

Information is a concept first used to resolve the paradox of Maxwell's demon. About 150 years ago, the physicist J. C. Maxwell came up with an intriguing idea. He conceived a thought experiment in which a little demon operates a friction-free trap door to separate air molecules of one type from the other (see Figure 10), finally arriving at a system with lower entropy. Such organizing ability of Maxwell's demon seems to violate the second law of thermodynamics, as the demon only selects molecules but does no work. This paradox kept physicists in suspense for half a century until Leo Szilard showed that the demon's stunt really isn't free of charge. By selecting a molecule out of the alternative of 2 types, he creates something called information, which produces an amount of entropy (through mental processing in the brain, see Integrated Information Theory of Consciousness) exactly offsetting the decrement from the re-arrangement. The unit of this commodity is the bit, and each time the demon chooses a molecule to shuffle, he shells out one bit of information for this cognitive act, precisely balancing the thermodynamic accounts. The new concept has since shown its usefulness in communications and computing, but perhaps

Figure 10 Maxwell's Demon

its greatest power lies in biology, for the organizing entities in living beings - the proteins and certain RNAs - are but the demon's trick in reverse.

As shown in Figure 10, for a simplified example of one of the two boxes containing 6 air molecules in blue and red, the number of arrangements can be computed by the combination formula, which removes the redundant counts within each kind of molecule :

Z_n = 6!/[(3+n)!(3-n)!] ---------- Eq.(16),
where n = 0, 1, 2, 3 corresponds to one of the boxes in diagrams a, b, c, d, respectively, e.g., Z_0 = 20, Z_1 = 15, Z_2 = 6, Z_3 = 1.

Information for the nth case is defined by I_n = -log_2(Z_n) = log_2(1/Z_n) ---------- Eq.(17a),
with corresponding entropy S_n = -k I_n = k log_2(Z_n) ---------- Eq.(17b),
where the negative sign in I_n is for the exact cancellation of the lowering of entropy --- the flip side of each other; the choice of logarithm base 2 is related to the original formulation of information using binary choice as an example, and log_2(x) = ln(x)/ln(2) ~ 1.44 ln(x).

According to the definition of information in Eq.(17a), the example of demonic selection gives I_0 ~ -4 bits, while I_3 = 0. One particular selection is shown at the bottom of diagram (a), Figure 10. It is unique among the 20 arrangements and carries an information of about -4 bits. In diagram (d), there is no distinct feature in the choice to mark its uniqueness; thus I_3 = 0, i.e., it provides no information but has minimum entropy, which is the original paradox.
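The counts and information values quoted above follow directly from Eqs.(16)-(17); a short sketch:

```python
import math

k = 1.380649e-16   # Boltzmann constant, erg/K

for n in range(4):                            # diagrams (a) to (d) in Figure 10
    # Eq.(16): arrangements of 6 molecules split (3+n) / (3-n) between the colors
    Z_n = math.factorial(6) // (math.factorial(3 + n) * math.factorial(3 - n))
    I_n = -math.log2(Z_n)                     # Eq.(17a): information, in bits
    S_n = k * math.log2(Z_n)                  # Eq.(17b): corresponding entropy
    print(f"n={n}: Z={Z_n:2d}, I={I_n:5.2f} bits, S={S_n:.2e} erg/K")
# n=0 gives Z=20 and I ~ -4.3 bits; n=3 gives Z=1 and I=0 (the fully sorted box).
```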

In terms of a lottery draw, you become rich if you somehow know the winning number out of the
Z = N = 13,983,816 (for Lotto 6/49, Figure 11) combinations (arrangements) beforehand, i.e., you get a lot of information. On the other hand, there is very little information when you randomly pick a ticket. In this example, information is related to the probability p = n/N, and I = log_2(p). The choice is unique and contains a lot of information for n = 1, while n = N for a random pick, which is certainly among the combinations but may not be the winning one. Since the probability is always equal to or less than 1, I is negative by definition; there is no need to place a minus sign in front. The original formulation of information made use of probability in binary choice, as explained in the following.

Figure 11 Probability in Lottery

Essentially, the alternate definition just turns the number of arrangements upside down, i.e., turning N into 1/N in the formulation, albeit with some refinements.
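As a sketch of the lottery example, knowing the single winning combination of Lotto 6/49 corresponds to about -23.7 bits under the convention I = log_2(p):

```python
import math

N = math.comb(49, 6)        # 13,983,816 possible combinations
I_win = math.log2(1 / N)    # p = 1/N: knowing the winning ticket beforehand
I_random = math.log2(N / N) # p = 1: a random pick is certainly "a" combination
print(N, round(I_win, 1), I_random)   # 13983816, about -23.7 bits, 0.0 bits
```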

The derivation uses an example of searching for the "ace" among 8 cards. It employs a scheme of acquiring information by picking the half of a binary partition containing the ace. Each inquiry provides an information of -1 bit. It needs I = -3 bits to locate the ace for this case, as shown in Figure 12. The formula to calculate the information acquired from the i-th (n_i = 1, 2, 4, or 8) partition takes the general form of Eq.(18a) shown in Figure 12,
where p_i = n_i/N is the probability of finding the subject (the "ace" in the example) among the n_i members of the partition, and N is the total number in the sample. For the special case where the partitions have equal numbers of members, as in Figure 12,
I_i = log_2(p_i) ---------- Eq.(18b)

Figure 12 Information, Definition

Note that I_8 = log_2(1/N) = -3 for a unique choice among N = 8 cards, as shown in Figure 12.
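The halving scheme of Figure 12 can be mimicked by a simple bisection; each inquiry that keeps the half containing the ace contributes log_2(1/2) = -1 bit (the ace's position below is a hypothetical choice):

```python
import math

cards = list(range(8))        # 8 cards
ace = 5                       # hypothetical position of the ace
info = 0.0

while len(cards) > 1:
    half = len(cards) // 2
    left, right = cards[:half], cards[half:]
    cards = left if ace in left else right   # keep the half containing the ace
    info += math.log2(0.5)                   # each inquiry yields -1 bit

print(info)                   # -3.0 bits, matching I = log2(1/8) for N = 8
```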


Numerically, 1 unit of information I = -1 bit corresponds to an entropy ΔS ~ 10^-16 erg/K, which is very small compared to the entropy generated in raising 1 gram of water by 1 °C at room temperature (27 °C), i.e., ΔS ~ 1.4x10^5 erg/K.
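Both numbers follow from k ln(2) for one bit and from ΔS = m c ΔT/T for the warmed water; a sketch of the arithmetic:

```python
import math

k = 1.380649e-16        # Boltzmann constant, erg/K

S_bit = k * math.log(2)                 # entropy equivalent of 1 bit of information
# 1 g of water warmed by 1 K at room temperature (~300 K):
S_water = 1.0 * 4.186e7 * 1.0 / 300.0   # m * c * dT / T, in erg/K

print(f"1 bit ~ {S_bit:.1e} erg/K, warmed water ~ {S_water:.1e} erg/K")
```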

The relentless increase of entropy holds for living as well as nonliving matter. Living beings continuously lose information and would sink to thermodynamic equilibrium just as surely as nonliving systems do. There is only one way to keep a system from sinking to equilibrium: to infuse new information (Figure 13). Organisms manage to stave off the sinking a little by linking their molecular building blocks with bonds that don't come apart easily. However, this only delays their decay; it doesn't stop it. No chemical bond can resist the
continuous thermal jostling of molecules forever. Thus, to maintain its highly non-equilibrium state, an organism must continuously pump in information. The organism takes information from the environment and funnels it inside. This means that the environment must undergo an equivalent increase in thermodynamic entropy; for every bit of information the organism gains, the entropy in the external environment must rise by a corresponding amount. In other words, information is just the flip side of entropy; while entropy is related to the number of microstates, information specifies only one (or sometimes a few) microstate out of such a configuration. The process just turns the demon's trick around, infusing information by lowering entropy.

Figure 13 Life Cycle



von Neumann Entropy, the Quantum Version

In quantum theory, the phase space in statistical mechanics is replaced by the "Density Matrix" and ultimately the "Single-Particle Reduced Density Matrix" (Figure 14).


Figure 14 Density Matrix



In the classical-physics limit, the Planck constant h → 0, the uncertainty relation vanishes (px - xp → 0), the coordinate and momentum spaces become well defined again, and the system reverts to one suitable for Boltzmann's formulation.

Anyway, here's the simplest example: 2 identical electrons in empty space (i.e., independent of the spatial coordinates), with x' in the spin-up state and x in the spin-down state respectively (see Figure 15). The density matrix ρ is then just the outer product of the spin states.

Figure 15 Superposition of Spin

The von Neumann entropy S(ρ) is the quantum version of Shannon's Measure of Information :
SMI = -Σ_i p_i log_2(p_i) (also see Eq.(18a)), i.e., S(ρ) = -Tr[ρ ln(ρ)], where Tr is the trace of the matrix.


See "Logarithm of a matrix" for the approximate formula for the log of a matrix B --- log(B) ~ (B - I).

Entanglement measures for the electrons in 5 diatomic molecules have been published in a 2011 paper entitled "Quantum Entanglement and the Dissociation Process of Diatomic Molecules". The post-Hartree–Fock computational method is employed to calculate the wave function and ultimately the entanglement measure N as a function of the inter-atomic distance R. Figure 16 shows the entanglement measure N between one electron and the rest in 5 different

Figure 16 Entanglement of Electrons in Diatomic Molecules

diatomic molecules as a function of the inter-atomic distance R. Figure 17 is a graph showing the limiting cases of N as R → 0 and R → ∞ (at different scales, in a ratio of 0.1/1).

It has been shown previously that the entanglement measure can be expressed in terms of the von Neumann entropy S(ρ). Disorder is the usual notion of entropy (von Neumann and otherwise). A more useful interpretation in the current context would be in terms of "multiplicity", the number of different arrangements that can arrive at the same configuration (state). Thus, the general trend of increasing entanglement measure at large R (as shown in Figure 16) can be understood as increasing entropy with larger volume. However, it cannot explain the bump near the united-atom limit for some of the molecules.

Figure 17 Limiting Cases



Figure 18 Quantum Dot



Figure 17 also shows that the entanglement measure decreases rapidly as the number of electrons increases. Such a trend may be related to the sharing of entanglement between electrons. Entanglement is at its maximum with monogamy (such as the case of the H2 molecule); shared entanglement is called polygamy, which produces weaker entanglement with more partners (see "Degree of Entanglement").
The study of entanglement measures in diatomic molecules helps to shed some light on its origin. Although the quantum dot is more versatile (controllable) and useful in quantum computing, it is similarly an aggregate of electrons (with or without nuclei) etched into wafers of a semiconductor (Figure 18). Note that the distance between the quantum dots is ~ 200 Å, while the equilibrium distance between the two atoms in the hydrogen molecule is 0.8 Å (1 Å = 10^-8 cm).



Attractive Force vs Entropy

Although the Second Law of Thermodynamics dictates that entropy tends to increase relentlessly, we see orderly structures all around, from galaxies to houses (Figure 19). Such regular features are derived from special properties of nature and of humans. It is the attractive forces such as gravity and electromagnetism in nature, and the information (meaning work) supplied by humans, that

Figure 19 Orderly Structures

have reversed the Second Law "locally" at the expense of the larger environment. The following are some examples showing the cause and the consequences.