Complexity


"Complexity," as a label of a scientific interest area, generally refers to the study of large-scale systems with many interacting components. Complex systems that are often offered up as examples include financial markets with competing firms, social insects (such as those that form ant colonies and build wasp nests), the human immune system, commodity markets in which agents buy and sell through auctions, and the neural circuits of the brain.

What makes these systems complex, aside from their raw composition, is that the most interesting ones exhibit behavior on scales above the level of the constituent components. In a superconducting metal, where electrical resistance vanishes, it doesn't make sense to ask what the resistance of a given electron is. Rather, the superconducting effect arises from a large collection of electrons interacting with the atoms in the metal's crystal lattice. Financial markets, to take an example from an entirely different realm, set prices for goods that individual agents could not determine on their own. The functioning of the human brain, or even any one of its subsystems, like the visual cortex, is a property of the neurons and their circuits operating together. Thus, the functioning of complex systems often reflects cooperative behavior and the emergence of structure. The Nobel Laureate physicist Phil Anderson summarized this two decades ago by noting that often "more is different".

"Complexity", as a character of natural processes, has two distinct and almost opposite meanings.

The first, and probably the oldest mathematically, goes back to Andrei Kolmogorov's attempt to give an algorithmic foundation to notions of randomness and probability and to Claude Shannon's study of communication channels via his notion of information. In both cases, complexity is synonymous with disorder and the lack of structure. The more random a process is, the more complex it is. An ideal gas, with the molecules bouncing around in complete disarray, is complex as far as Kolmogorov and Shannon are concerned. Thus, this sense of "complexity" refers to degrees of complication.
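To make this first sense a little more concrete, Shannon's measure can be computed directly for a sequence of symbols. The short Python sketch below is an illustration added here, not part of the original exhibit; the function name and the sample strings are arbitrary choices. It estimates the entropy, in bits per symbol, of a perfectly regular string and of a randomly generated one.

    import math
    import random
    from collections import Counter

    def shannon_entropy(text):
        # Shannon entropy in bits per symbol: -sum over symbols of p(x) * log2 p(x)
        counts = Counter(text)
        total = len(text)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    ordered = "ab" * 8                       # a perfectly regular pattern
    random.seed(0)
    disordered = "".join(random.choice("abcdefgh") for _ in range(16))

    print(shannon_entropy(ordered))          # 1.0 bit per symbol
    print(shannon_entropy(disordered))       # closer to 3 bits per symbol

By this first measure the scrambled string is the more "complex" of the two, which is exactly the sense in which an ideal gas is complex; the second sense of the word, discussed next, would rank them quite differently.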

The second sense of "complexity" refers instead to how structured, intricate, hierarchical, and sophisticated a natural process is. That is, in this sense, "complexity" is an indicator of how many layers of order or how many internal symmetries are embedded in a process. The human brain is complex in this sense due to the high degree of structure in its neural architecture, in the many different scales of information processing from perception to interpretation of stimuli, and in the intricate social behaviors it supports in human groups.

When confronted with a phenomenon, one can tell these two meanings apart by asking a simple question: Is it complex, or is it merely complicated?

The recent history of the study of complex systems can be seen as a natural and necessary follow-on to the studies of chaos and nonlinear dynamics of the late 1970's and early 1980's. During that period, the main focus was how randomness arises in simple systems. Building on the successes in answering this question, by the mid-to-late 1980's the opposite but complementary question had come to the fore: How does order emerge in large, complicated systems? So, initially we had complication arising from simplicity; then we had simplicity emerging from complication.

Complex nature, of course, is the interplay of just these sorts of tensions.


© The Exploratorium, 1996