The thermodynamic entropy is given by Boltzmann's formula S = k_B ln(Ω), where Ω is the number of microstates (the multiplicity). It is related to SMI - Shannon's mathematical formula for the Measurement of Information, SMI = -Σ p_i log₂(p_i) - by S = (k_B ln 2)·SMI. Both S and SMI vanish when Ω = 1. This doesn't happen very often; it is the case of a perfect crystal at absolute zero temperature. At that temperature there is only one way of assembling the lattice structure with molecules that are all indistinguishable from one another. Thus, the information content is zero for a perfectly ordered crystal structure. As the crystal is warmed up, this quantity rises above zero. Its molecules vibrate in various ways about their equilibrium positions, and there are then several ways of assembling what is perceived as one and the same structure.
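To make the conversion concrete, here is a minimal Python sketch (the function names are illustrative, not from the original text) that tabulates S = k_B ln(Ω) and SMI = log₂(Ω) for a few multiplicities of equally probable microstates; the case Ω = 1 reproduces the zero entropy and zero information content of the perfect crystal.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(omega):
    """Thermodynamic entropy S = k_B * ln(Omega) for Omega equally probable microstates."""
    return K_B * math.log(omega)

def smi_equal(omega):
    """Shannon Measure of Information (in bits) when all Omega microstates are equally likely."""
    return math.log2(omega)

for omega in (1, 2, 36, 6.022e23):
    s, smi = boltzmann_entropy(omega), smi_equal(omega)
    # The conversion S = (k_B ln 2) * SMI should hold exactly:
    print(f"Omega = {omega:.3g}:  S = {s:.3e} J/K,  SMI = {smi:.2f} bits,  "
          f"(k_B ln2)*SMI = {K_B * math.log(2) * smi:.3e} J/K")
```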
If all Ω microstates are equally probable, i.e., p_i = 1/Ω, then the entropy is related to SMI by the above expression. Example b shows the multiplicity (arrangements) of throwing a pair of distinguishable dice, with the probability P of each outcome given by P = Ω(macrostate)/(total # of microstates). The multiplicity for each macrostate is designated by Ω(macrostate) in red.
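As a rough check on Example b, the following sketch (variable names are my own) enumerates the 36 microstates of a pair of distinguishable dice, counts the multiplicity Ω of each sum (the macrostate), and computes P = Ω/36 together with the SMI of the resulting distribution.

```python
import math
from collections import Counter
from itertools import product

# All 36 microstates of a pair of distinguishable dice.
microstates = list(product(range(1, 7), repeat=2))
total = len(microstates)                      # 36

# Multiplicity Omega(macrostate): how many microstates give each sum.
multiplicity = Counter(a + b for a, b in microstates)

# Probability of each macrostate and the SMI of the sum distribution.
probs = {s: omega / total for s, omega in multiplicity.items()}
smi = -sum(p * math.log2(p) for p in probs.values())

for s in sorted(multiplicity):
    print(f"sum = {s:2d}:  Omega = {multiplicity[s]},  P = {probs[s]:.4f}")
print(f"SMI of the sum distribution = {smi:.3f} bits  (log2(36) = {math.log2(36):.3f} bits)")
```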
An orderly structure such as a building (Figure 10a) suffers the consequence when the force holding it up is removed by the implosion - it tends to disintegrate into rubble (greater disorder). This tendency toward greater disorder is prescribed by the second law of thermodynamics and can be used to define the "arrow of time", since the sequence of events always flows one way, toward greater disorder. It is highly improbable for the reverse to happen naturally; a toy simulation of this one-way drift is sketched after the figure caption. Note that the second law of thermodynamics applies only to a closed system. For an open system, orderliness can be boosted at the expense of the surroundings (making them more disordered) by an infusion of energy.
Figure 10a Arrow of Time
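The statistical reading of the arrow of time can be illustrated with a toy Ehrenfest-style hopping model - an assumption of mine, not part of Figure 10a: N particles start fully ordered in the left half of a box, random jostling moves one particle at a time, and the system drifts toward the 50/50 macrostate simply because its multiplicity is astronomically larger than that of the ordered state.

```python
import math
import random

random.seed(1)
N = 100                  # particles in a box divided into two halves
sides = [0] * N          # 0 = left, 1 = right; start fully ordered (all on the left)

def multiplicity(n_left):
    """Omega(macrostate): number of ways to choose which n_left particles sit on the left."""
    return math.comb(N, n_left)

# Random "jostling": each step a randomly chosen particle hops to the other half.
for step in range(1, 5001):
    i = random.randrange(N)
    sides[i] ^= 1
    if step % 1000 == 0:
        n_left = sides.count(0)
        print(f"step {step:5d}: n_left = {n_left:3d},  Omega = {multiplicity(n_left):.3e}")

# The ordered macrostate has Omega = 1; the 50/50 macrostate dwarfs it:
print(f"Omega(all left) = {multiplicity(N)},  Omega(50/50) = {multiplicity(N // 2):.3e}")
```

Running it shows the particle count settling near 50 per side and never returning to the all-left arrangement, which is the sense in which the reverse is "highly improbable" rather than forbidden.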
No organism can withstand the continuous thermal jostling of its molecules forever. Thus, to maintain its highly non-equilibrium state, an organism must continuously pump in information: it takes information from the environment and funnels it inside. This means that the environment must undergo an equivalent increase in thermodynamic entropy; for every bit of information the organism gains, the entropy of the external environment must rise by a corresponding amount (an estimate of the conversion is sketched after the figure caption). In other words, information is just the flip side of entropy: while entropy is related to the number of microstates, information specifies one particular microstate (or a few) out of that collection, with a multiplying constant for the conversion between the two.
Figure 10b Life Cycle
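A back-of-envelope value for the "multiplying constant for the conversion" is k_B ln 2 of entropy per bit; the sketch below evaluates it, together with the corresponding heat shed to the surroundings at an assumed ambient temperature of 300 K and for an illustrative 1 GB of information (the temperature and the 1 GB figure are my assumptions, in the spirit of Landauer's bound).

```python
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # assumed ambient temperature, K

# Conversion factor between information and thermodynamic entropy:
# each bit corresponds to k_B * ln(2) of entropy.
entropy_per_bit = K_B * math.log(2)      # J/K per bit
heat_per_bit = T * entropy_per_bit       # minimum heat shed to the surroundings per bit

print(f"Entropy rise per bit: {entropy_per_bit:.3e} J/K")
print(f"Heat released per bit at {T:.0f} K: {heat_per_bit:.3e} J")

# Illustrative total for 1 GB (8e9 bits) of acquired information:
bits = 8e9
print(f"For {bits:.1e} bits: {bits * entropy_per_bit:.3e} J/K of entropy, "
      f"{bits * heat_per_bit:.3e} J of heat at {T:.0f} K")
```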