Information, SMI, and Entropy
Complexity is a relatively new field in science, and as such there is no unanimously accepted definition of it. Many different definitions are in use, as shown in Table 01; each has its own problem. One of them is framed in terms of entropy/information. A unified view will be universally accepted only when the field matures.
| Definition | Description | Example | Problem |
|---|---|---|---|
| Size | Larger size means higher complexity | Size of body or genome | Some simple organisms have larger genomes than humans |
| Entropy | More variation signifies a more complex message | "HHH..." has no variation and zero entropy; the random sequence "DXW..." has a lot of variation | The most complex objects lie in between perfect order and complete randomness |
| Algorithmic content | A shorter computer program describing the object corresponds to lower complexity | "HHH..." requires a very short description; a garbled message cannot be compressed | A random object yields high information content |
| Logical depth | Complexity is measured by how difficult it is to construct the object | "HHH..." is very easy to construct, while a specific message requires more work | The difficulty itself is hard to measure |
| Fractal dimension | Higher fractal dimension equals higher complexity | A coastline is more complex than a straight line | Other kinds of complexity are not defined by fractal dimension |
| Degree of hierarchy | Complexity is equated to the number of sub-systems | Organs to cells to organelles to macro-molecules to ... | It is difficult to separate the whole into parts |

Table 01 Definitions of Complexity
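The entropy and algorithmic-content rows of Table 01 can be checked numerically. The sketch below (a minimal illustration added here, not part of the original text; the example strings are invented) computes the Shannon entropy of the character distribution and the zlib-compressed length, a rough proxy for program length, for an orderly, a random, and a meaningful sequence. The orderly string scores near zero on both measures, while the random and meaningful strings score similarly high, which is exactly the weakness noted in the table.

```python
import math
import random
import string
import zlib

def shannon_entropy(s: str) -> float:
    """Shannon entropy (bits per character) of the character distribution of s."""
    counts = {c: s.count(c) for c in set(s)}
    n = len(s)
    return -sum((k / n) * math.log2(k / n) for k in counts.values())

orderly = "H" * 60                                          # "HHH...": no variation
random.seed(0)
random_seq = "".join(random.choices(string.ascii_uppercase, k=60))  # "DXW...": noise
english = "THIS SEQUENCE OF SYMBOLS CARRIES A SPECIFIC MESSAGE FOR YOU"

for label, s in [("orderly", orderly), ("random", random_seq), ("english", english)]:
    compressed = len(zlib.compress(s.encode()))             # proxy for program length
    print(f"{label:8s}  entropy = {shannon_entropy(s):4.2f} bits/char, "
          f"compressed = {compressed:3d} bytes")
```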
Seth Lloyd once listed 31 measures of complexity; the list has since grown to 42.
A working definition can be adopted for ordinary discourse: "it is a state of intricacy, complication, variety, or involvement, as in the interconnected parts of a structure - a quality of having many interacting, different components". Quantitatively, complexity can be measured roughly in terms of the number of components, such as the number of parts in a machine, the number of cell types in a living organism (see Figure 11), or the vocabulary of a language. It is believed that complexity is created in nature by fluctuations - random deviations from some average, equilibrium value of density, temperature, pressure, etc. - also called "instabilities" or "inhomogeneities". Normally, an open system near equilibrium does not evolve spontaneously to new and interesting structures. But should those fluctuations become too great for the open system to damp, the system departs far from equilibrium and is forced to reorganize. Such reorganization generates a kind of "dynamic steady state", provided the energy flow rate exceeds the thermal relaxation rate. The feedback loops in this kind of process are positive: complexity itself creates the conditions for greater instability, which in turn provides an opportunity for greater reordering.

Figure 11 Evolution to Greater Complexity
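The condition "energy flow rate exceeds the thermal relaxation rate" can be made concrete with a toy linear model (an illustrative sketch of my own, not a model taken from the source): a small fluctuation x obeys dx/dt = (r_pump - r_relax) x, so it decays back to equilibrium when relaxation wins and grows without bound, forcing reorganization, when the energy input wins.

```python
# Toy model (illustrative only): growth or damping of a fluctuation x(t)
# under competing energy-input and thermal-relaxation rates.
def evolve_fluctuation(x0: float, r_pump: float, r_relax: float,
                       dt: float = 0.01, steps: int = 1000) -> float:
    """Integrate dx/dt = (r_pump - r_relax) * x with the Euler method."""
    x = x0
    for _ in range(steps):
        x += (r_pump - r_relax) * x * dt
    return x

print(evolve_fluctuation(1e-3, r_pump=0.5, r_relax=1.0))  # damped: ~6.7e-6
print(evolve_fluctuation(1e-3, r_pump=1.5, r_relax=1.0))  # grows:  ~1.5e-1
```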
Another approach to viewing the development of complexity is through the concept of energy flow per unit mass. Figure 12(a) shows the increase of complexity as the energy flow (per unit mass) into various systems increases over the age of the universe. Figure 12(b) depicts qualitatively the departure from equilibrium at each bifurcation point, where the energy flow has reached a critical value and thus can promote more complexity in the system. The dotted curves indicate the options that evolution has not taken. The bifurcation is created when the system enters a nonlinear mode beyond some energy threshold. Thus the development of complexity is a phenomenon closely related to chaos theory, i.e., nonlinearity with a positive feedback loop.
Figure 12 Complexity and Energy Flow
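The bifurcation behaviour described above can be illustrated with the logistic map, a standard toy model of nonlinearity with feedback (chosen here purely for illustration; it is not the system actually plotted in Figure 12). As the driving parameter r crosses successive thresholds, the long-term state splits from one value to two, then four, and eventually becomes chaotic.

```python
def logistic_attractor(r: float, x0: float = 0.5,
                       burn_in: int = 500, keep: int = 8) -> set:
    """Long-term states of the logistic map x -> r*x*(1-x) for parameter r."""
    x = x0
    for _ in range(burn_in):          # discard the transient
        x = r * x * (1 - x)
    states = set()
    for _ in range(keep):             # collect the settled cycle
        x = r * x * (1 - x)
        states.add(round(x, 4))
    return states

for r in (2.8, 3.2, 3.5, 3.9):        # increasing "drive" crosses bifurcations
    print(f"r = {r}: {sorted(logistic_attractor(r))}")
```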
Table 02 below compares randomness and complexity in the context of information. In the alphabetical examples, a unique choice is shown in bold font, while the regular font shows one out of the many possible choices.
Table 02 Complexity and Information
The difference between randomness and complexity in Table 02 is that, while both may look random to an untrained eye, effort has been spent and information has been created to arrange the complex sequence, similar to the case of Maxwell's demon. Only man-made and biological systems have evolved to the level of specific information, in which the units are arranged in such a way that the sequence is capable of performing a certain function. The examples of specified complexity in the table either convey a message by following a set of grammatical rules or produce proteins according to the genetic code. An orderly arrangement looks nice but is incapable of conveying a message or encoding information. In the formula for the information (in the last column of the table), N stands for the total number of possible arrangements and n is the number of choices. In the example of the ordered system both N and n equal 1, while for the other examples N = 7! = 5040 (including the blank), and n = 5040, 2, and 1 respectively for the cases of randomness, complexity, and specified complexity.
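Assuming the information formula referred to above is the usual I = log2(N/n) (the assumption is mine; it is the form that makes the stated values of N and n consistent), the four cases work out as follows: order and randomness both carry zero information, while narrowing 5040 possible arrangements down to 2 or 1 acceptable ones creates about 11.3 or 12.3 bits respectively.

```python
import math

N = math.factorial(7)                 # 5040 possible arrangements (incl. the blank)
cases = {"order": (1, 1), "randomness": (N, N),
         "complexity": (N, 2), "specified complexity": (N, 1)}

for label, (total, n_choices) in cases.items():
    info = math.log2(total / n_choices)   # I = log2(N/n), assumed form of the formula
    print(f"{label:20s}  I = {info:5.2f} bits")
```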
According to Darwinian selection, complex structures evolve from simpler ones through a gradual evolutionary process with intermediate forms along the way. In the 2010s it has been argued that natural selection gives rise to less complexity than other means; e.g., Drosophila flies raised in laboratories, where they live a pampered life without external challenges, grow many extra parts, as shown in Figure 13. Such observations define complexity by the number of parts. A more adequate definition is "specified complexity", which counts only those parts that are functional. Specified complexity can also be controlled by internal selection, which takes place within organisms. Thus, whether external or internal, natural selection is still the primary agent conferring "specified complexity" on living organisms.
Figure 13 Mutant Fly