

Information, SMI, and Entropy


SMI and Entropy

Before the advent of microscopes and particle detectors, scientists in the 19th century could only examine systems visible to the naked eye, such as the few examples in Figure 14. The particular example in (c), concerning heat transfer, inspired Rudolf Clausius (1822 - 1888) to propose a new thermodynamic quantity called entropy S (after the Greek word for "transformation", and deliberately coined to resemble the word "energy" because the two quantities are so analogous in physical significance). The mathematical definition, for an infinitesimal change, is :

dS = dQ/T, where dQ is the infinitesimal heat transfer and T is the absolute temperature.

His bold step was to declare that the concept of entropy also applies to other thermodynamic processes, such as those shown in (a) and (b), even though no heat transfer is discerned. Entropy has the peculiar property that it always tends to increase in a closed system. Like many other thermodynamic quantities, it is well defined only for a system at equilibrium. It is a function of the independent variables V (volume), T (temperature), and P (pressure); but V, E (energy), and N (number of particles) are more useful in connection with the information theory to be discussed presently.
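
As a minimal worked example of this definition (using round numbers: the specific heat of water, 1 cal/g·K ≈ 4.18x10^7 erg/g·K, and room temperature T ≈ 300 K), the entropy generated in warming 1 gram of water by 1°C is approximately

ΔS = ∫ dQ/T ≈ mcΔT/T = (1 g x 4.18x10^7 erg g^-1 K^-1 x 1 K) / (300 K) ≈ 1.4x10^5 erg/K.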

Figure 14 Macroscopic Examples


Ever since Clausius introduced the concept of entropy, people have endeavored to find a meaning for the term. The identifications with disorder, spreading, freedom, etc. (Figure 15) suffer from describing only the state of the system, not the entropy; they are all rather qualitative, ill-defined, and highly subjective. Boltzmann's definition of entropy in terms of the "coarse-graining volume" is valid, but he never identified it with disorder. The equating of disorder with entropy was perpetuated for over a hundred years without justification, until Shannon related the information I to the entropy S via the formula :

S = -[k ln(2)] I, where k = 1.38x10^-16 erg/K is the Boltzmann constant.

Figure 15 Entropy Descriptors

This comes close to the essence, but misses a critical qualifier, as will be explained presently.
Numerically, 1 unit of information, I = -1, corresponds to an entropy S of about 10^-16 erg/K, which is very small compared to the entropy generated in raising 1 gram of water by 1°C at room temperature (27°C), i.e., S ≈ 1.4x10^5 erg/K.
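
A two-line check of these magnitudes (a sketch; the numbers simply restate the values quoted above):

```python
import math

k = 1.38e-16                       # Boltzmann constant in erg/K
entropy_per_bit = k * math.log(2)  # entropy per unit of information, ~1e-16 erg/K

S_water = 1.4e5                    # erg/K, warming 1 g of water by 1 degree C near 300 K
print(entropy_per_bit)             # ~9.6e-17 erg/K
print(S_water / entropy_per_bit)   # ~1.5e21 information units equivalent to the warming
```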

It was once fashionable to equate any monotonically increasing function to entropy (see for example "Black Hole Entropy"). That will not do: the following example shows that only a concave-downward function is suitable to represent an ever-increasing property such as entropy. As shown in Figure 16, a movable partition reduces the volume of the left compartment by some amount, which the right compartment correspondingly gains; the system is then allowed to relax back to the equal-volume configuration. The outcome can take the form of (a), (b), or (c) in Figure 16, and only in case (c), the concave-downward function, does the "descriptor" register a net gain on returning to equal volumes (positive change in blue, negative change in red).

Figure 16 Entropy Test
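
A small numerical sketch of this test (the candidate "descriptor" functions below are illustrative assumptions, not the actual curves of Figure 16): call the displaced volume ΔV and compare the summed descriptor of the unequal configuration, f(V - ΔV) + f(V + ΔV), with the value 2f(V) after relaxing back to equal volumes. Only the concave-downward choice shows a net gain on relaxing:

```python
import math

V, dV = 1.0, 0.4   # illustrative compartment volume and displacement (arbitrary units)

# Candidate "descriptor" functions of compartment volume
candidates = {
    "linear  f(v) = v": lambda v: v,                 # case (a): no net change
    "convex  f(v) = v**2": lambda v: v**2,           # case (b): net loss
    "concave f(v) = ln(v)": lambda v: math.log(v),   # case (c): net gain
}

for name, f in candidates.items():
    unequal = f(V - dV) + f(V + dV)   # partition displaced by dV
    equal = 2 * f(V)                  # relaxed back to equal volumes
    print(f"{name}:  net change on relaxing = {equal - unequal:+.4f}")
```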

Shannon's Measure of Information (SMI) is just the negative of the information "I" defined in the previous section (the Mathematical Formulation of Information), i.e., SMI = H = -Σi pi log2(pi), where pi is the probability of finding a particle at location xi or with velocity vi in a box of volume V; SMI is always a non-negative quantity. This is not entropy, despite the fact that both Shannon and John von Neumann would have it so. Instead, thermodynamic entropy can be derived from SMI, as shown in the following.
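
As a minimal sketch of the formula (the distributions below are illustrative assumptions, not tied to any particular physical system), a particle equally likely to be found in any of M cells of the box has SMI = log2(M) bits, while a more localized distribution carries a smaller SMI:

```python
import math

def smi(probabilities):
    """Shannon's Measure of Information, H = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A particle equally likely to be found in any of M = 8 cells of the box
M = 8
uniform = [1.0 / M] * M
print(smi(uniform))          # 3.0 bits = log2(8); the maximum for 8 cells

# A sharper (more localized) distribution carries a smaller SMI
peaked = [0.9] + [0.1 / (M - 1)] * (M - 1)
print(smi(peaked))           # about 0.75 bits
```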

Consider a system of ideal monatomic gas containing N indistinguishable and non-interacting particles, with the total energy E being the sum of all the kinetic energies. In the limit of a continuous spatial distribution the sum goes over to an integral, e.g., along the x direction :
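
In outline (a sketch of the standard route, quoting rather than deriving the final result): the locational SMI per particle is

$$H_x = -\int f(x)\,\log_2 f(x)\,dx ,$$

which is maximized by the uniform distribution f(x) = 1/L over the length of the box; the velocity SMI is likewise maximized by the Maxwell-Boltzmann distribution at fixed total energy E. Adding the locational and velocity terms for all N particles, converting to natural logarithms through S = [k ln(2)] SMI, and correcting for the indistinguishability of the particles (and for the quantum-mechanical uncertainty principle) leads to the familiar Sackur-Tetrode form

$$S(E,V,N) = kN\left[\ln\frac{V}{N} + \frac{3}{2}\ln\!\left(\frac{4\pi m E}{3 N h^{2}}\right) + \frac{5}{2}\right].$$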



It can be shown that this expression for the entropy is equivalent to Boltzmann's definition via a partition function. Table 03 below lists some examples of entropy change.

Process | Change of Variable(s) | Entropy Change (in units of k')
Expansion | V → 2V | N ln(2)
Demixing + Expansion | N1 + N2 → N1 & N2, v → V + V, with N1 = N2 = N, E1 = E2 = E | 2N ln(2V/v) > 0
Heat Transfer | EA, EB → EC = (EB + EA)/2, with EB > EA > 0 (so EB > EC > EB/2) | (3/2)N [ln(EC/EA) + ln(EC/EB)] > 0
Mixing | N + N → 2N, V + V → 2V | 0
Mixing + Compression | N + N → 2N, V + V → V | -2N ln(2)
Assimilation + Compression | N + N → 2N, V + V → V | -2N ln(2)

Table 03 Examples of Entropy Change

The entropy change is negative, i.e., decreasing, for the cases involving compression because those processes require work to be done on the system. There is a net increase in entropy if the environment outside the system is also taken into consideration.
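
A quick numerical check of a few table entries (entropy in units of k'; the particle number and the energies EA, EB are arbitrary illustrative choices):

```python
import math

N = 6.0e23          # number of particles (illustrative, roughly a mole)

# Expansion, V -> 2V
dS_expansion = N * math.log(2)
print(f"Expansion:                  dS = {dS_expansion:.3e}  (positive)")

# Heat transfer between two identical bodies with energies EA < EB,
# each ending at the common energy EC = (EA + EB)/2
EA, EB = 1.0, 3.0                      # arbitrary energy units
EC = (EA + EB) / 2
dS_heat = 1.5 * N * (math.log(EC / EA) + math.log(EC / EB))
print(f"Heat transfer:              dS = {dS_heat:.3e}  (positive)")

# Assimilation + compression: two (N, V) boxes merged into one (2N, V) box
dS_assim = -2 * N * math.log(2)
print(f"Assimilation + compression: dS = {dS_assim:.3e}  (negative)")
```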

In order to show the relationship between the spontaneous increase of entropy and equilibrium, let's consider an expansion experiment of an ideal gas with a total number of particles N, as shown in Figure 19. There are n particles in the box on the left (L) and (N - n) particles in the box on the right (R). Alternatively, we can define the corresponding fractions as p = n/N and q = (N - n)/N.

Figure 19 Entropy and Probability

Figure 20 Entropy and Equilibrium
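
In outline (a sketch using the standard Stirling approximation; Figures 19 and 20 develop the same point graphically): the number of ways of placing n of the N particles in the left box is the binomial coefficient, so

$$S(n) = k\ln\binom{N}{n} \approx -kN\left[\,p\ln p + q\ln q\,\right],$$

a concave-downward function of p that reaches its maximum at p = q = 1/2, i.e., at n = N/2. Any fluctuation away from equal occupation lowers S, so the overwhelmingly probable equilibrium configuration coincides with the state of maximum entropy; this is the sense in which the spontaneous expansion in Figure 19 increases the entropy (Figure 20).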



Thus, we have come full circle on the folly of entropy. So many people use the terms entropy, information, disorder, ... without qualifying what they mean. This creates a great deal of misunderstanding and alienates the uninitiated because of the confusing usage of these terms. The definition of entropy above is a very simple and specific case, for illustration purposes only. We should be wary whenever terms like entropy or information appear in the literature (including this website). They sound scientific and impressive, but are sometimes ill-defined.
