

Information, SMI, and Entropy


Order/Disorder and SMI

We have some intuitive notion of orderliness - we can often see at a glance whether a structure is orderly or disorderly - but this doesn't go beyond rather simple architectural or functional periodicities. That intuition breaks down as we deal with macromolecules, like DNA, RNA, or proteins. Considered individually, these are largely aperiodic structures, yet their specifications require immense amounts of information. In paradoxical contrast, a periodic structure - say, a synthetic stretch of DNA encoding nothing but prolines - may immediately strike us as orderly, though it contains a relatively modest amount of information (and is described by a simple algorithm).
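To make the point about the modest information content of periodic structures concrete, here is a minimal Python sketch (not from the original text; the sequence length and the use of compressed size as a rough proxy for algorithmic information content are assumptions for illustration only):

    import random
    import zlib

    N = 300                                    # number of codons (illustrative)
    periodic = "CCA" * N                       # a proline codon, repeated
    aperiodic = "".join(random.choice("ACGT") for _ in range(3 * N))

    for label, seq in (("periodic", periodic), ("aperiodic", aperiodic)):
        size = len(zlib.compress(seq.encode()))
        print(f"{label:9s}  length = {len(seq)}  compressed = {size} bytes")

The periodic stretch compresses to a handful of bytes - essentially the instruction "repeat CCA 300 times" - while the aperiodic one resists compression, mirroring the contrast drawn above.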

The degree of disorder of a given system can be gauged by the number of equivalent ways (the assembly) in which it can be constructed, denoted by Ω. It is related to SMI - Shannon's Measure of Information - by:

SMI = ln(Ω),

which is zero when Ω = 1. This limiting case does not occur often; it is realized by a perfect crystal at absolute zero temperature. At that temperature there is only one way of assembling the lattice structure from molecules that are all indistinguishable from one another, so the information content of a perfectly ordered crystal is zero. As the crystal warms up, this quantity rises above zero: its molecules vibrate in various ways about their equilibrium positions, and there are then several ways of assembling what is perceived as one and the same structure.
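As a quick numerical check of the formula, the following Python sketch evaluates SMI = ln(Ω) for a few values of Ω (the values chosen are illustrative):

    from math import log

    for omega in (1, 2, 10, 6.02e23):
        print(f"Omega = {omega:g}  ->  SMI = ln(Omega) = {log(omega):.3f}")

Ω = 1 (the perfect crystal at absolute zero) gives SMI = 0; any warming that opens up additional ways of assembling the structure pushes SMI above zero.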

The following two examples in Figure 09 illustrate the relationship between assembly (also known as arrangements or multiplicity) and SMI. In example a, eight distinguishable particles are distributed between two boxes in different arrangements, which constitute the macrostates labelled by the value of n (the number of particles in the box on the left). The microstates are the individual configurations within each macrostate. If the total number of configurations is denoted by Ω, then it is related to SMI by the expression above. Example b shows the multiplicity (arrangements) for throwing a pair of distinguishable dice, with the probability P of an outcome given by P = Ω/(total number of microstates). The multiplicity of each macrostate is designated by Ω(macrostate) in red.

Figure 09 SMI and Arrangements (Multiplicity)
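The counting in both examples can be reproduced directly; the sketch below assumes the setup described above (8 distinguishable particles for example a, two distinguishable dice for example b):

    from math import comb, log
    from collections import Counter

    # Example a: Omega(n) = C(8, n) microstates with n particles in the left box.
    total = 0
    for n in range(9):
        omega = comb(8, n)
        total += omega
        print(f"n = {n}:  Omega = {omega:2d},  SMI = ln(Omega) = {log(omega):.3f}")
    print(f"total microstates = {total}")      # 2**8 = 256

    # Example b: Omega(sum) for a pair of distinguishable dice; P = Omega / 36.
    counts = Counter(i + j for i in range(1, 7) for j in range(1, 7))
    for s in range(2, 13):
        print(f"sum = {s:2d}:  Omega = {counts[s]},  P = {counts[s]}/36")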

See "Entropy" for a more general definition.

An orderly system tends to become more disordered with the passage of time. The sequence of pictures in Figure 10a illustrates this consequence: when the force holding up the buildings is removed by the implosion, the structure disintegrates into rubble (greater disorder). This tendency toward greater disorder is prescribed by the second law of thermodynamics and can be used to define the "arrow of time", since the sequence of events always flows one way, toward greater disorder; it is highly improbable for the reverse to happen naturally. Note that the second law of thermodynamics applies only to a closed system. For an open system, orderliness can be boosted at the expense of the surroundings (making them more disordered) with an infusion of energy.

Figure 10a Arrow of Time
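The statistical reading of the arrow of time can be illustrated with a toy simulation (the Ehrenfest urn model, a standard stand-in not taken from the original text): start with all particles in one box and let them hop at random. The multiplicity Ω = C(N, n_left), and hence ln Ω, climbs toward its maximum and essentially never returns to the ordered start.

    import random
    from math import comb, log

    N = 100
    n_left = N                         # ordered start: all particles on the left
    for step in range(1, 501):
        # pick a particle at random and move it to the other box
        if random.randrange(N) < n_left:
            n_left -= 1
        else:
            n_left += 1
        if step % 100 == 0:
            print(f"step {step:3d}:  n_left = {n_left:3d},  "
                  f"ln Omega = {log(comb(N, n_left)):.1f}")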


The time's arrow of thermodynamic events holds for living as well as nonliving matter. Living beings continuously lose information and would sink to thermodynamic equilibrium just as surely as nonliving systems do. There is only one way to keep a system from sinking to equilibrium: to infuse new information (Figure 10b). Organisms manage to stave off the sinking a little by linking their molecular building blocks with bonds that do not come apart easily. This delays their decay but does not stop it: no chemical bond can resist the continuous thermal jostling of molecules forever. Thus, to maintain its highly non-equilibrium state, an organism must continuously pump in information. The organism takes information from the environment and funnels it inside. This means that the environment must undergo an equivalent increase in thermodynamic entropy; for every bit of information the organism gains, the entropy of the external environment must rise by a corresponding amount. In other words, information is just the flip side of entropy: while entropy is related to the number of microstates, information specifies only one (or sometimes a few) of those microstates (albeit with a multiplying constant for the conversion).

Figure 10b Life Cycle
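The "multiplying constant" mentioned above is Boltzmann's constant: gaining (or erasing) one bit of information corresponds to an entropy change of at least k_B ln 2 in the environment (Landauer's bound). A minimal Python sketch of the conversion, with illustrative bit counts:

    from math import log

    K_B = 1.380649e-23                 # Boltzmann constant, J/K

    def entropy_equivalent(n_bits):
        """Thermodynamic entropy corresponding to n_bits of information, in J/K."""
        return n_bits * K_B * log(2)

    for bits in (1, 8, 1e9):
        print(f"{bits:g} bit(s)  ->  dS = {entropy_equivalent(bits):.3e} J/K")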


§ This view of the "Life Cycle" is similar to the idea in Schrödinger's book "What is Life?". It has been criticized (in 2014) on the grounds that life is a highly non-equilibrium process, while entropy is defined only for equilibrium states (see "Information and Entropy"). The problem could be reconciled by recasting the reduction and increase in entropy as the production of orderly states and the decay into disorderly states.
