

Information, SMI, and Entropy


Complexity

Complexity is a relatively new field in science; as such, there is no unanimity in its definition. There are many different definitions, as shown in Table 01, and each has its own problem. One of the definitions is in terms of entropy/information (illustrated in the sketch after Table 01). A unified view will be universally accepted only when the field becomes mature.

Complexity As | Definition | Example(s) | Problem
Size | Larger size means higher complexity | Size of body or genome | Some simple organisms have a larger genome than humans
SMI | More variation signifies a more complex message | HHH... has no variation and zero entropy; the random sequence DXW... has a lot of variation | The most complex objects lie between complete order and complete randomness
Algorithmic Content | A shorter computer program to describe the object corresponds to lower complexity | HHH... requires a very short description; a garbled message cannot be compressed | A random object yields the highest information content
Logical Depth | Complexity is measured by how difficult the object is to construct | HHH... is very easy to construct, while a specific message requires more work | The difficulty itself is hard to measure
Fractal Dimension | A higher fractal dimension equals higher complexity | A coastline is more complex than a straight line | Other kinds of complexity are not defined by fractal dimension
Degree of Hierarchy | Complexity is equated to the number of sub-systems | Organs to cells to organelles to macro-molecules to ... | It is difficult to separate the whole into parts

Table 01 Definitions of Complexity
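
To make the SMI and Algorithmic Content rows of Table 01 concrete, the short Python sketch below computes the Shannon measure of information (SMI) of a message from its symbol frequencies, and uses zlib compression as a rough, computable stand-in for algorithmic content (true Kolmogorov complexity is uncomputable). The strings and lengths are illustrative choices, not values from the table.

    import math
    import random
    import string
    import zlib
    from collections import Counter

    def smi(message: str) -> float:
        """Shannon measure of information in bits per symbol:
        H = -sum(p * log2(p)) over the symbol frequencies p."""
        n = len(message)
        h = -sum(c / n * math.log2(c / n) for c in Counter(message).values())
        return max(h, 0.0)   # guard against -0.0 for single-symbol messages

    def compressed_size(message: str) -> int:
        """Length of the zlib-compressed message, a crude proxy for
        algorithmic content."""
        return len(zlib.compress(message.encode()))

    random.seed(0)
    ordered = "H" * 300                                               # the "HHH..." message
    garbled = "".join(random.choices(string.ascii_uppercase, k=300))  # a "DXW..." message

    for name, msg in [("ordered", ordered), ("garbled", garbled)]:
        print(f"{name}: SMI = {smi(msg):.2f} bits/symbol, "
              f"compressed to {compressed_size(msg)} bytes")

The ordered string has zero SMI and compresses to a few bytes; the random one has SMI near log2(26) ≈ 4.7 bits per symbol and compresses far less. Both measures therefore rank complete randomness as "most complex", which is exactly the problem noted in the table.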

Seth Lloyd once listed 31 measures of complexity; the list has since grown to 42.

A working definition can be taken for ordinary discourse: "it is a state of intricacy, complication, variety, or involvement, as in the interconnected parts of a structure - a quality of having many interacting, different components". Quantitatively, complexity can be measured roughly in terms of the number of components, such as the number of parts in a machine, the number of cell types in a living organism (see Figure 11), or the size of the vocabulary in a language. It is believed that complexity is created in nature by fluctuations - random deviations from some average, equilibrium value of density, temperature, pressure, etc. - also called "instabilities" or "inhomogeneities". Normally, an open system near equilibrium does not evolve spontaneously to new and interesting structures. It requires energy (~ force) to promote and maintain the off-equilibrium state.

Figure 11 Evolution to Greater Complexity
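
As a minimal sketch of this rough counting measure, applied to the vocabulary case (the sample sentence and word pattern are illustrative choices):

    import re

    def vocabulary_size(text: str) -> int:
        """Rough complexity proxy for a language sample: the number of
        distinct word types it contains."""
        return len(set(re.findall(r"[a-z']+", text.lower())))

    sample = ("Normally, an open system near equilibrium does not evolve "
              "spontaneously to new and interesting structures.")
    print(vocabulary_size(sample))   # prints the count of distinct words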

But should those fluctuations become too great for the open system to damp, the system will depart far from equilibrium and be forced to reorganize. Such reorganization generates a kind of "dynamic steady state", provided the energy flow rate exceeds the thermal relaxation rate. The feedback loops in this kind of process are positive. Complexity itself consequently creates the condition for greater instability, which in turn provides an opportunity for greater reordering.
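
The condition that the energy flow rate must exceed the thermal relaxation rate can be illustrated with a toy model (an assumed Python illustration, not a model from the text): a small fluctuation x is amplified at a pumping rate, damped at a relaxation rate, and saturated by a nonlinear term, so it settles into a nonzero dynamic steady state only when pumping wins.

    def order_parameter(pump: float, relax: float,
                        steps: int = 20000, dt: float = 1e-3) -> float:
        """Toy driven open system: dx/dt = (pump - relax)*x - x**3.
        A small fluctuation grows and saturates at a nonzero steady state
        when the pumping (energy flow) rate exceeds the relaxation rate;
        otherwise it is damped back toward zero."""
        x = 0.01                      # initial small fluctuation
        for _ in range(steps):        # simple Euler integration
            x += dt * ((pump - relax) * x - x**3)
        return x

    print(round(order_parameter(pump=0.5, relax=1.0), 3))  # flow < relaxation: ~0.0 (damped)
    print(round(order_parameter(pump=2.0, relax=1.0), 3))  # flow > relaxation: ~1.0 (steady state)

The cubic term plays the role of reorganization here: it is what turns runaway positive feedback into a sustained far-from-equilibrium state rather than unbounded growth.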

Another approach to viewing the development of complexity is through the concept of energy flow per unit mass. Figure 12(a) shows the increase of complexity as the energy flow (per unit mass) into the various systems increases over the age of the universe. Figure 12(b) depicts qualitatively the departure from equilibrium at each bifurcation point, where the energy flow has reached a critical value and thus can promote more complexity in the system. The dotted curves indicate the options that have not been taken by evolution. The bifurcation is created when the system enters a nonlinear mode beyond some energy threshold. Thus the development of complexity is a phenomenon closely related to chaos theory, i.e. nonlinearity with a positive feedback loop.

Figure 12 Complexity and Energy Flow
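
The threshold-and-bifurcation behaviour sketched in Figure 12(b) is commonly illustrated by the logistic map, a standard textbook example from chaos theory (used here as a stand-in, not as one of the systems in the figure). As its driving parameter r crosses successive critical values, the single steady state splits into 2, then 4 states, and finally becomes chaotic:

    def logistic_orbit(r: float, transient: int = 500, keep: int = 8):
        """Iterate the logistic map x -> r*x*(1 - x) and return the
        long-term behaviour after discarding the transient."""
        x = 0.5
        for _ in range(transient):    # discard the transient
            x = r * x * (1 - x)
        orbit = []
        for _ in range(keep):         # record the long-term orbit
            x = r * x * (1 - x)
            orbit.append(round(x, 4))
        return orbit

    for r in (2.8, 3.2, 3.5, 3.9):    # before, between, and beyond bifurcations
        print(f"r = {r}: {logistic_orbit(r)}")

At r = 2.8 the orbit settles on a single value; at 3.2 it alternates between two; at 3.5 it cycles among four; at 3.9 it never repeats - the chaotic regime.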
