
deck - first for subsets of four cards, then of two, and finally of one. This way, we would hit upon the Ace in three steps, regardless of where it happens to be. The "yes" or "no" answers are the constraints (the rules or procedures); more questioning resolves more uncertainty and in turn yields more information. Thus, 3 is the minimum number of correct binary choices or, by our definition above, the amount of information needed to locate a card in this particular arrangement. In effect, what we have been doing here is taking the binary logarithm of the number of possibilities (N = 8), i.e., log_{2}(8) = 3. In other words, the information required to determine a location in a deck of 8 cards is 3 bits. A bit is the smallest unit of information. It contains two distinct alternatives, which can be any of the hot/cold, black/white,

## Figure 07a Information and Rules
in/out, up/down, ... pairs. The number of possible alternatives N for a series of K trials is 2^{K}. Using up/down with K = 3 trials, for example, there are N = 2^{3} = 8 possible alternatives: (up, down, up), (up, down, down), ...
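The counting above can be checked with a few lines of code: enumerate all 2^{K} alternatives for K binary trials and take the binary logarithm to recover the number of bits. (This is a minimal sketch of the arithmetic in the text, not part of the original article.)

```python
import math
from itertools import product

# Enumerate all alternatives for K = 3 binary trials (e.g. up/down).
K = 3
alternatives = list(product(["up", "down"], repeat=K))
N = len(alternatives)   # N = 2**K = 8

# Information needed to single out one alternative, in bits.
bits = math.log2(N)

print(N)      # 8
print(bits)   # 3.0
```

Three yes/no questions, each halving the remaining possibilities, therefore suffice to locate any one of the 8 alternatives.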

For any number of possibilities (alternatives) N, the information required to single out one of them is

I = log_{2}(N),

or, in terms of the probability p = 1/N of each choice,

I = -log_{2}(p).

For the case where the N choices are subdivided into subsets/partitions (i) of uniform size n_{i}, the probability of each partition is p_{i} = n_{i}/N, and the information needed to locate a member is I_{i} = -log_{2}(p_{i}) = log_{2}(N/n_{i}).

Using the example in Figure 07a with N = 8, the different values of n_{i} and p_{i} are:

| number of partitions i | 1 | 2 | 4 | 8 |
|---|---|---|---|---|
| partition size n_{i} | 8 | 4 | 2 | 1 |
| probability p_{i} = n_{i}/N | 1 | 1/2 | 1/4 | 1/8 |
For n_{i} = 8, all the members are distributed randomly in one partition; finding the member there is certain but difficult without any information. For n_{i} = 1, the members are distributed orderly in eight equal-size partitions; the chance of finding the member in each is 1/8, and 3 bits of information are required to find it. In other words, a greater amount of uncertainty is resolved (more information) if the target chosen is one of a large number of possibilities, as shown in Figure 07b.
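The partition arithmetic above can be tabulated directly from the formulas I_{i} = -log_{2}(p_{i}) with p_{i} = n_{i}/N (a small check of the numbers in the text, not part of the original article):

```python
import math

N = 8
rows = []
for n_i in (1, 2, 4, 8):
    p_i = n_i / N           # probability of landing in one partition
    I_i = -math.log2(p_i)   # bits needed to locate the member
    rows.append((n_i, p_i, I_i))

for row in rows:
    print(row)
# n_i = 1 -> p = 0.125, I = 3 bits; ... ; n_i = 8 -> p = 1.0, I = 0 bits
```

The two extreme rows reproduce the cases discussed in the text: one partition of size 8 costs no information (the member is certainly there), while eight partitions of size 1 cost the full 3 bits.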

## Figure 07b Information and Probability

Thus, for N = 8 and n_{i} = 1, p_{i} = 1/8 and I = -log_{2}(1/8) = 3 bits.

For a uniform distribution with p_{i} = 1/N, the information reduces to I = log_{2}(N).

Alternatively, n_{i} can be taken as the number of choices that satisfy a given set of requirements. In the dating example of Figure 08, if every attribute on the list is demanded, only one choice such as "C" will fit the bill, resulting in knowing a lot about the date before the rendezvous. If the requirement is relaxed to only a few attributes, then "B" or "C" or "E" may be suitable, and the

## Figure 08 Dating Information
knowledge of the dates is reduced accordingly. Finally, in desperation, a random selection will result in picking anyone on the list with absolutely no information about them.
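The dating example can be sketched as a filter: demanding more attributes shrinks the number n of matching candidates, and the information gained is log_{2}(N/n). The candidate names and attributes below are hypothetical stand-ins, since the actual contents of Figure 08 are not reproduced here:

```python
import math

# Hypothetical candidate list; names and attributes are illustrative only.
candidates = {
    "A": {"likes_music", "tall"},
    "B": {"likes_music", "funny"},
    "C": {"likes_music", "funny", "tall"},
    "D": {"tall"},
    "E": {"funny", "tall"},
}

def info_gained(required):
    """Matching candidates and bits of information, log2(N/n)."""
    N = len(candidates)
    matches = [name for name, attrs in candidates.items()
               if required <= attrs]
    n = len(matches)
    return matches, (math.log2(N / n) if n else float("inf"))

strict, bits_strict = info_gained({"likes_music", "funny", "tall"})
relaxed, bits_relaxed = info_gained({"funny"})

print(strict, bits_strict)     # only "C" fits; about 2.32 bits gained
print(relaxed, bits_relaxed)   # "B", "C", "E" fit; fewer bits gained
```

As in the text, the strict requirement singles out "C" (more information), the relaxed one leaves "B", "C", "E" (less information), and no requirement at all leaves the whole list (zero information).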

The negative of I is referred to as the SMI (Shannon Measure of Information):

SMI = -Σ_{i} p_{i} log_{2}(p_{i})

This is a positively defined mathematical entity sitting between the loosely defined "information" and the strictly defined "entropy". It can be applied to systems in equilibrium or out of equilibrium. Unfortunately, it is often misinterpreted as "entropy", as in the investigation of "Black Hole Entropy". Many references to "information" or "entropy" in the literature should be replaced by SMI.
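The SMI formula is easy to evaluate numerically; the following sketch (not part of the original article) shows that a uniform distribution over 8 outcomes recovers the 3 bits of the card example, while a biased distribution yields less:

```python
import math

def smi(probabilities):
    """Shannon Measure of Information: -sum(p * log2(p)) in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Uniform distribution over 8 outcomes: SMI = log2(8) = 3 bits.
print(smi([1/8] * 8))                     # 3.0

# A biased distribution resolves less uncertainty.
print(smi([0.5, 0.25, 0.125, 0.125]))     # 1.75
```

For the uniform case p_{i} = 1/N the sum collapses to log_{2}(N), consistent with the formulas above; any departure from uniformity lowers the SMI.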

See "Entropy of Many Particles (2020 Edition)".
