dS = dQ/T, where dQ is the infinitesimal heat transfer and T is the absolute temperature. His bold step was to declare that the concept of entropy also applies to other thermodynamic processes, such as those shown in (a) and (b), even though no heat transfer is discerned. Entropy has the peculiar property that it always tends to increase in a closed system. Like many other thermodynamic quantities, it is well defined only for a system at equilibrium. It is a function of the independent variables V (volume), T (temperature), and P (pressure); but V, E (energy), and N (number of particles) are more useful in connection with the information theory to be discussed presently.
Figure 14 Macroscopic Examples
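As a minimal numerical sketch of the Clausius definition (not part of the original text; the amount of gas, temperature, and volumes are assumed purely for illustration), consider a reversible isothermal expansion of an ideal gas, in which the heat absorbed equals the work done, so that the entropy change is dS = dQ/T integrated to S = Q/T = nR ln(V2/V1):

```python
import math

# Illustrative numbers (assumptions): 1 mole of ideal gas expanding
# reversibly and isothermally from V1 to V2 at temperature T.
R = 8.314          # gas constant, J/(mol K)
n_moles = 1.0      # amount of gas in moles
T = 300.0          # absolute temperature in K
V1, V2 = 1.0, 2.0  # initial and final volumes (only the ratio matters)

# For a reversible isothermal expansion the heat absorbed equals the work done:
# Q = n R T ln(V2/V1), so the Clausius entropy change dS = dQ/T integrates to Q/T.
delta_Q = n_moles * R * T * math.log(V2 / V1)
delta_S = delta_Q / T

print(f"Heat absorbed Q  = {delta_Q:.1f} J")
print(f"Entropy change S = {delta_S:.3f} J/K")   # ~ 5.763 J/K
print(f"n R ln(V2/V1)    = {n_moles * R * math.log(V2 / V1):.3f} J/K")
```

In units of the Boltzmann constant this is N ln(2) for N particles, the same entry that reappears in the expansion row of the table further down.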
Ever since Clausius introduced the concept of entropy, people have endeavored to find a meaning for the term. The identifications with disorder, spreading, freedom, etc. (Figure 15) suffer from describing only the state of the system, not the entropy; they are all rather qualitative, ill-defined, and highly subjective. Boltzmann's definition of entropy in terms of the "coarse-graining volume" is valid, but he never identified it with disorder. Equating disorder with entropy was perpetuated for over a hundred years without justification, until Shannon related the information I = -log2(p) (where p is the probability of the state) to the entropy S via the formula S = kI (more precisely, S = k ln(2) I when I is measured in bits), where k = 1.38x10^-16 erg/K is the Boltzmann constant.
Figure 15 Entropy Descriptors
It is close to the essence, but misses a critical qualifier, as will be explained presently.
The entropy associated with one bit of information is only about k ln(2) ~ 10^-16 erg/K, which is very small compared with the entropy generated in raising 1 gram of water by 1°C at room temperature (27°C), i.e., S ~ 1.4x10^5 erg/K.
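The comparison can be verified with a short back-of-the-envelope calculation; the sketch below simply restates the numbers quoted above in CGS units and is illustrative rather than part of the original text.

```python
import math

k = 1.38e-16      # Boltzmann constant in erg/K
c_water = 4.18e7  # specific heat of water in erg/(g K)
m = 1.0           # mass of water in grams
T = 300.0         # room temperature in K (27 C)

# Entropy equivalent of one bit of information: S = k ln(2)
S_bit = k * math.log(2)

# Entropy change for heating m grams of water by 1 K near T:
# dS = dQ/T with dQ = m * c_water * (1 K)
S_water = m * c_water * 1.0 / T

print(f"One bit of information : {S_bit:.2e} erg/K")    # ~ 9.6e-17 erg/K
print(f"1 g of water by 1 C    : {S_water:.2e} erg/K")  # ~ 1.4e+5  erg/K
print(f"Ratio (water / bit)    : {S_water / S_bit:.1e}")
```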
It was once fashionable to equate any monotonically increasing function with entropy (see, for example, "Black Hole Entropy"). This is not justified: the following example shows that only a concave-downward function is suitable to represent an ever-increasing property such as entropy. As shown in Figure 16, a movable partition reduces the volume of the left compartment by an amount ΔV, while the right compartment gains the same amount. The system is then released and allowed to relax back to the equal-volume configuration. The resulting change of the "descriptor" can take the form of (a), (b), or (c) in Figure 16. Only in case (c), the concave-downward function, is there a net gain in the "descriptor" (positive change in blue, negative change in red).

Figure 16 Entropy Test
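The test in Figure 16 can be mimicked numerically. The sketch below (an illustration with assumed functions and volumes, not the author's calculation) evaluates a concave, a linear, and a convex candidate "descriptor" over both compartments before and after the partition is released:

```python
import math

V = 1.0    # equilibrium volume of each compartment (arbitrary units)
dV = 0.3   # displacement of the movable partition (assumed)

# Candidate "descriptor" functions of a compartment's volume
candidates = {
    "concave (ln V)": math.log,
    "linear  (V)   ": lambda v: v,
    "convex  (V^2) ": lambda v: v * v,
}

for name, f in candidates.items():
    partitioned = f(V - dV) + f(V + dV)   # left and right compartments displaced
    relaxed = 2 * f(V)                    # both compartments back to volume V
    print(f"{name}: net change on relaxing = {relaxed - partitioned:+.4f}")
```

Only the concave-downward function gains from the relaxation, because for a strictly concave f one has 2f(V) > f(V - ΔV) + f(V + ΔV); a linear descriptor is unchanged and a convex one decreases.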
SMI = -Σi pi log2(pi), where pi is the probability of finding the particle at location xi or with velocity vi in a box of volume V; the SMI is always a non-negative quantity. This is not entropy, even though both Shannon and John von Neumann would have it so. Instead, the thermodynamic entropy can be derived from the SMI as shown in the following table (a numerical check of two of its rows is sketched after the table).
| Process | Change of Variable(s) | Entropy Change (in units of k') |
|---|---|---|
| Expansion | V → 2V | N ln(2) |
| Demixing + Expansion | N1 + N2 → N1 & N2, v → V + V, N1 = N2 = N, E1 = E2 = E | 2N ln(2V/v) > 0 |
| Heat Transfer | EC = (EB + EA)/2, EB > EA > 0, EB > EC > EB/2 | (3/2)N [ln(EC/EA) + ln(EC/EB)] > 0 |
| Mixing | N + N → 2N, V + V → 2V | 0 |
| Mixing + Compression | N + N → 2N, V + V → V | -2N ln(2) |
| Assimilation + Compression | N + N → 2N, V + V → V | -2N ln(2) |
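As a rough numerical check of the table (a sketch under the usual ideal-gas assumptions, with the particle number, discretization, and energies chosen arbitrarily), the locational part of the SMI reproduces the expansion row, and the tabulated heat-transfer expression is positive whenever EB > EA > 0:

```python
import math

def smi(probabilities):
    """Shannon measure of information, -sum p_i log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

N = 1000     # number of particles (assumed)
cells = 100  # equal cells discretizing the volume V (assumed)

# Expansion V -> 2V: each particle's location goes from uniform over
# 'cells' cells to uniform over 2*cells cells, an SMI gain of 1 bit per particle.
smi_V = smi([1.0 / cells] * cells)
smi_2V = smi([1.0 / (2 * cells)] * (2 * cells))
dS_expansion = N * (smi_2V - smi_V) * math.log(2)   # bits -> entropy in units of k
print(f"Expansion    : dS = {dS_expansion:.1f} k (table: N ln2 = {N * math.log(2):.1f} k)")

# Heat transfer: two boxes with energies EA < EB equilibrate at EC = (EA + EB)/2,
# and the table's formula gives the entropy change in units of k.
EA, EB = 1.0, 3.0   # assumed energies (arbitrary units)
EC = (EA + EB) / 2.0
dS_heat = 1.5 * N * (math.log(EC / EA) + math.log(EC / EB))
print(f"Heat transfer: dS = {dS_heat:.1f} k (> 0 since EC^2 > EA*EB)")
```

The positivity of the heat-transfer entry follows from EC being the arithmetic mean of EA and EB, so that EC² > EA·EB whenever EA ≠ EB and the sum of the two logarithms is positive.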
Consider an ideal gas with the total number of particles denoted by N, as shown in Figure 19. There are n particles in the box on the left (L) and (N - n) particles in the box on the right (R). Alternatively, we can define the corresponding fractions as p = n/N and q = (N - n)/N.
Figure 19 Entropy and Probability
Figure 20 Entropy and Equilibrium |
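A small sketch (with an assumed particle number, purely for illustration) ties Figures 19 and 20 together: the number of ways of placing n of the N particles in the left box is the binomial coefficient C(N, n), and both this count and the per-particle SMI, -[p log2(p) + q log2(q)], peak at the equal split n = N/2, which is why that configuration is identified with equilibrium.

```python
import math

N = 100  # total number of particles (assumed for illustration)

def per_particle_smi(p):
    """SMI of the left/right fractions p and q = 1 - p, in bits."""
    q = 1.0 - p
    return sum(-x * math.log2(x) for x in (p, q) if x > 0.0)

# The microstate count W(n) = C(N, n) is largest at the equal split.
best_n = max(range(N + 1), key=lambda n: math.comb(N, n))
print(f"W(n) = C(N, n) is maximal at n = {best_n} (i.e., n = N/2)")

# Tabulate a few occupation numbers: both W(n) and the SMI peak at p = q = 1/2.
for n in (0, 25, 50, 75, 100):
    p = n / N
    w = math.comb(N, n)
    print(f"n = {n:3d}: p = {p:.2f}, W = {w:.3e}, SMI = {per_particle_smi(p):.3f} bits")
```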