
dS = dQ/T, where dQ is the infinitesimal heat transfer and T is the absolute temperature. His bold step was to declare that the concept of entropy also applies to other thermodynamic processes, such as those shown in (a) and (b), even though no heat transfer is discerned. Entropy has the peculiar property that it always tends to increase in a closed system. Like many other thermodynamic quantities, it is well defined only for a system at equilibrium. It is a function of the independent variables V (volume), T (temperature), and P (pressure); but V, E (energy), and N (number of particles) are more useful in connection with the information theory to be discussed presently.
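For a reversible process on a body of roughly constant heat capacity, dS = dQ/T integrates to C ln(T₂/T₁). A minimal sketch of that integration (the body and the numbers are hypothetical, not from the text):

```python
import math

def entropy_change(C, T1, T2):
    """Integrate dS = dQ/T for a body of constant heat capacity C
    heated reversibly from T1 to T2 (temperatures in kelvin):
    dQ = C dT, so dS_total = C * ln(T2/T1)."""
    return C * math.log(T2 / T1)

# Hypothetical example: ~1 kg of water (C ~ 4186 J/K) heated 300 K -> 350 K.
dS = entropy_change(4186.0, 300.0, 350.0)  # positive: entropy increases on heating
```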

## Figure 14 Macroscopic Examples

Ever since Clausius introduced the concept of entropy, people have endeavored to find a meaning for the term. The identifications with disorder, spreading, freedom, etc. (Figure 15) suffer from describing only the state of the system, not the entropy; they are all rather qualitative, ill-defined, and highly subjective. Boltzmann's definition of entropy in terms of "coarse-graining volume" is valid, but he never identified it with disorder. Equating disorder to entropy was perpetuated for over a hundred years without justification, until Shannon related the information I = -log_{2}(p) (where p is the probability of the outcome, e.g., p = 1/W for W equally likely combinations) to the entropy S via the formula S = k ln(2) I, where k = 1.38 x 10^{-16} erg/K is the Boltzmann constant.
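A small sketch of this identification, assuming W equally likely microstates so that each has probability p = 1/W:

```python
import math

k = 1.38e-16  # Boltzmann constant in erg/K

def shannon_info_bits(p):
    """Shannon information I = -log2(p) for an outcome of probability p."""
    return -math.log2(p)

def entropy_from_info(W):
    """Entropy of W equally likely microstates: S = k ln(W).
    Since I = -log2(1/W) = ln(W)/ln(2), this is S = k ln(2) * I."""
    I = shannon_info_bits(1.0 / W)
    return k * math.log(2) * I

# Two microstates: S = k ln(2), the smallest non-zero entropy increment
S = entropy_from_info(2)
```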

## Figure 15 Entropy Descriptors

This identification comes close to the essence, but it misses a critical qualifier, as will be explained presently.

Incidentally, information as the flip side of entropy can be interpreted as picking one combination out of a certain number of possibilities. For example, the winning number of a lottery draw is the information awaited by the ticket owners; all the other combinations become useless. In other words, the amount of information is just the flip side of entropy, i.e., information = log_{2}(W), where W is the number of equally likely combinations.
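The lottery example can be made concrete. Assuming a hypothetical 6-out-of-49 draw, the winning ticket picks one combination out of C(49, 6) equally likely possibilities:

```python
import math

# Hypothetical 6-out-of-49 lottery: the draw singles out one combination
# among C(49, 6) equally likely ones.
combinations = math.comb(49, 6)        # 13,983,816 possible tickets
info_bits = math.log2(combinations)    # information revealed by the draw, ~ 23.7 bits
```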

It was once fashionable to equate any monotonically increasing function to entropy (see for example "Black Hole Entropy"). This is not valid: the following example shows that only a concave-downward function is suitable to represent an ever-increasing property such as entropy. As shown in Figure 16, a movable partition reduces the volume of the left compartment by an amount V while the right compartment gains the same amount. The system then relaxes and is allowed to return to the equal-volume configuration. The final stage can take the form of (a), (b), or (c) in Figure 16. It is only in case (c), i.e., the concave-downward function, that the process produces a net gain in the "descriptor" (positive change in blue, negative change in red).

## Figure 16 Entropy Test
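The partition test can be sketched numerically. The function f below is a stand-in for the "descriptor"; relaxing back from unequal volumes to equal volumes yields a net gain precisely when f is concave downward:

```python
import math

def net_gain(f, V, dV):
    """Net change of a 'descriptor' f in the Figure 16 experiment:
    start at (V, V), shift to (V - dV, V + dV), then relax back to (V, V).
    The return trip gains 2 f(V) - f(V - dV) - f(V + dV), which is
    positive exactly when f is concave downward."""
    return 2 * f(V) - f(V - dV) - f(V + dV)

# ln is concave downward: relaxation gains (entropy-like behavior)
gain_log = net_gain(math.log, 1.0, 0.5)       # > 0
# x**2 is concave upward: relaxation loses (not entropy-like)
gain_square = net_gain(lambda x: x * x, 1.0, 0.5)  # < 0
```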

Shannon's Measure of Information (SMI) is just the negative of the information "I" defined in the previous section (the Mathematical Formulation of Information), averaged over all outcomes, i.e., SMI = H = -Σ_{i} p_{i} log_{2}(p_{i}).
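A minimal sketch of the SMI for a two-outcome distribution, showing that it is maximal when the outcomes are equally likely:

```python
import math

def smi(probs):
    """Shannon's Measure of Information: H = -sum_i p_i log2(p_i).
    Terms with p_i = 0 contribute nothing (lim p log p = 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

H_fair = smi([0.5, 0.5])    # 1 bit: the maximum for two outcomes
H_biased = smi([0.9, 0.1])  # less than 1 bit: a biased distribution is less informative
```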

Consider a system of ideal monatomic gas containing N indistinguishable and non-interacting particles, with the total energy E being the sum of all the kinetic energies. In the limit of a continuous spatial distribution, the sum goes over to an integration, e.g., along the x direction:
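As a sketch of that limit (assuming the positions are described by a probability density f(x), and omitting the additive constant fixed by the size of the discretization cell):

```latex
H \;=\; -\sum_i p_i \log_2 p_i \;\longrightarrow\; -\int_{-\infty}^{\infty} f(x)\,\log_2 f(x)\,dx
```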

It can be shown that this expression for the entropy is equivalent to Boltzmann's definition via a partition function. Table 03 below lists some examples of entropy change.

Process | Change of Variable(s) | Entropy Change (in units of k)
---|---|---
Expansion | V → 2V | N ln(2)
Demixing + Expansion | N_{1} + N_{2} → N_{1} & N_{2}, v → V + V (with N_{1} = N_{2} = N, E_{1} = E_{2} = E) | 2N ln(2V/v) > 0
Heat Transfer | E_{C} = (E_{B} + E_{A})/2, with E_{B} > E_{A} > 0 and E_{B} > E_{C} > E_{B}/2 | (3/2)N [ln(E_{C}/E_{A}) + ln(E_{C}/E_{B})] > 0
Mixing | N + N → 2N, V + V → 2V | 0
Mixing + Compression | N + N → 2N, V + V → V | -2N ln(2)
Assimilation + Compression | N + N → 2N, V + V → V | -2N ln(2)
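The signs in the table can be checked numerically. The energies below are hypothetical values chosen to satisfy the stated inequalities; all results are in units of k:

```python
import math

N = 1.0  # per-particle counting; entropy changes below are in units of k

# Expansion V -> 2V: dS = N ln(2) > 0
dS_expand = N * math.log(2)

# Heat transfer: the bodies equilibrate at E_C = (E_A + E_B)/2;
# dS = (3/2) N [ln(E_C/E_A) + ln(E_C/E_B)].  The first term (cold body)
# is positive, the second (hot body) negative, and the sum is positive
# because E_C^2 = ((E_A + E_B)/2)^2 >= E_A * E_B (AM >= GM).
E_A, E_B = 1.0, 3.0                  # hypothetical energies with E_B > E_A > 0
E_C = (E_A + E_B) / 2.0
dS_heat = 1.5 * N * (math.log(E_C / E_A) + math.log(E_C / E_B))

# Mixing + compression (N + N -> 2N, V + V -> V): dS = -2N ln(2) < 0
dS_mix_comp = -2 * N * math.log(2)
```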

In order to show the relationship between the spontaneous increase of entropy and equilibrium, let's consider an expansion experiment with an ideal gas whose total number of particles is denoted by N, as shown in Figure 19. There are n particles in the box on the left (L) and (N - n) particles in the box on the right (R). Alternatively, we can define the corresponding fractions as p = n/N and q = (N - n)/N.

## Figure 19 Entropy and Probability
## Figure 20 Entropy and Equilibrium
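Assuming the two-box entropy takes the standard mixing form S/(Nk) = -(p ln p + q ln q), which follows from ln C(N, n) for large N, a sketch showing that it peaks at p = q = 1/2, the equilibrium configuration:

```python
import math

def S_over_Nk(p):
    """Entropy per particle (in units of k) for n = pN particles on the
    left and (1 - p)N on the right: S/(Nk) = -(p ln p + q ln q), q = 1 - p."""
    q = 1.0 - p
    return -sum(x * math.log(x) for x in (p, q) if x > 0)

# Scan the fractions 0.1, 0.2, ..., 0.9: the maximum sits at p = q = 1/2,
# i.e., entropy increase drives the gas toward equal occupancy.
values = [(i / 10.0, S_over_Nk(i / 10.0)) for i in range(1, 10)]
best_p = max(values, key=lambda t: t[1])[0]
```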

Thus, we have come full circle on the folly of entropy. Many people use the terms entropy, information, disorder, etc. without qualifying what they mean. This creates a lot of misunderstanding and alienates the uninitiated through the confusing usage of these terms. The definition of entropy above is a very simple and specific case for illustration purposes. We should be wary whenever terms like entropy or information appear in the literature (including this website); they sound scientific and impressive, but are sometimes ill-defined.
