About a year ago Stephanie Forrest of the University of New Mexico and Terry Jones of the Santa Fe Institute, the nerve center of "complexity" research, were fooling around with ECHO, an artificial evolution program developed by Chris Langton, the father of "artificial life." Like most research in the emerging field of complexity, ECHO involves creating "digital organisms," small snippets of computer code equipped with instructions to help them survive and replicate in the silicon environment. When turned loose in cyberspace, these computer creatures act very much like living organisms. They can be programmed to compete for scarce resources, trade genes through sexual interactions, fight or trade with other organisms, and, in a fast-forward of evolution, reproduce themselves over thousands of generations.
The particular program that Jones and Forrest were tuning was designed to mimic biological evolution. After a while, they began noticing an odd pattern. "No matter what organisms we started with, a few species would become predominant while the vast majority remained scarce," says Jones. "The species might change places occasionally, but the pattern always remained the same. Even programs that had bugs in them produced the same configuration."
Their curiosity piqued, Jones and Forrest showed their results to Jim Brown, an ecologist at the University of New Mexico. His eyes opened wide. "That's Preston's curve," said the startled Brown. "It's a well-known ratio in ecology which says that most species remain small while only a few species widely proliferate."
The replication of Preston's curve in the computer is typical of discoveries being made today in the field of complexity. At times, the insights seem sublime, as if the divine architecture of the universe were being revealed. At other times, the research is disappointing: "The major discovery to emerge from the [Santa Fe] Institute thus far is that 'it's very hard to do science on complex systems,'" writes John Horgan of Scientific American, quoting University of Chicago mathematical biologist Jack D. Cowan. Neither the promise nor the disappointment is surprising, however. Complexity is a young field, examining difficult problems with new tools. Its results are often remarkable, deepening our understanding of how the world works (or might work). But they are also quite preliminary and subject to conflicting interpretations, especially when applied to social systems.
Complexity theory explores how systems that are open to sources of energy are able to raise themselves to higher levels of self-organization. Oil droplets in water will arrange themselves into a sphere that is hollow inside. Snowflakes form crystals that always have a hexagonal design. Previously these patterns had been thought of as coincidences or curiosities. But beginning with Ilya Prigogine, the 1977 Nobel laureate in chemistry, theorists have been arguing that complex systems are capable of making unexpectedly large leaps in self-organization that allow them to maintain a precarious non-equilibrium.
This ordered non-equilibrium forms a temporary stay against the dreaded Second Law of Thermodynamics, which says that any order in a closed system will eventually dissipate, a phenomenon called "increasing entropy." Water and ink mixed together, for example, eventually turn a uniform light blue. Pockets of hot and cold air in a room will become uniformly lukewarm. At bottom, the logic is that there are only a few orderly states and many, many more messy or disordered ones. The chances of achieving an orderly state by accident are vanishingly small.
Popular alarmists from Henry Adams to Jeremy Rifkin have preached pessimism by misusing the Second Law to argue that life on Earth is quickly winding down. (Unaware of nuclear energy, Adams predicted that the sun's core would burn out in a few thousand years.) But Earth is not a closed thermodynamic system, and the universe may not be either. Life on Earth lives off the constant energy provided by the sun. This allows the biosphere to pile up order in the form of complex organic molecules, stored genetic information, and stable ecosystems. As for the universe itself, it is still living off the energy of the Big Bang. Just where that energy came from and whether it will eventually dissipate to thermodynamic equilibrium is still being debated by the cosmologists. But concerns about the universe winding down into a uniform state of low-grade energy are premature at best.
Says Harold Morowitz, editor-in-chief of Complexity, a bi-monthly journal that published its first issue in September: "Entropy is only defined for closed, equilibrium systems. The Earth is not a closed system and for all we know, the universe may not be, either. But as far as non-equilibrium systems are concerned, any system that is in something other than its most random state, as a result of boundary conditions or energy flowing through the system, is experiencing some kind of spontaneous order."
Enthusiasts point to examples of spontaneous order almost everywhere: the gathering of winds into a hurricane, the self-replication abilities of DNA, the interlocking relationships of an ecosystem, the immune system, the human mind, the development of language and common law, the evolution of human cultures. All are representative of the slow, cumulative development of "autocatalytic" patterns and positive feedback loops that eventually emerge as "complex, adaptive systems": organisms capable of sustaining their identity while constantly changing to meet new environments.
As Stuart Kauffman, a MacArthur fellow and Santa Fe Institute star, puts it: "I read about entropy and I expect a world filled with disorder. Yet I look outside my window and see nothing but order. It is the order that needs explaining."
In his first brilliant discovery 20 years ago, for example, Kauffman created a system of 100 computer-dwelling "genes," all turning each other on and off, just as real genes are believed to do in any living organism. Although the situation appeared completely chaotic, Kauffman discovered that when each gene was controlled by only two other genes, the on-off patterns would settle down into about 10 repeating cycles, or "basins of attraction." He posited that these basins corresponded to the different cell types that are produced by the same set of genes in every multicellular organism. In one of those sublime revelations that dot the history of complexity research, Kauffman later discovered that he had duplicated a long-known principle of genetics: that the number of cell types in an organism is roughly equal to the square root of the number of its genes.
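Kauffman's setup is what is now called a random Boolean network. The sketch below is ours, not Kauffman's original code: it wires up 100 genes, each driven by a random Boolean function of two other genes, and runs the network until a state repeats. The network size, random seed, and cycle-detection method are all illustrative choices.

```python
import random

# A minimal random Boolean network in the spirit of Kauffman's experiment:
# N genes, each updated by a random Boolean function of K = 2 other genes.
random.seed(42)
N, K = 100, 2

# Each gene i reads two randomly chosen genes...
inputs = [random.sample(range(N), K) for _ in range(N)]
# ...and maps their 2^K possible input combinations to 0 or 1 at random.
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    """Synchronously update every gene from its two inputs."""
    return tuple(
        tables[i][state[inputs[i][0]] * 2 + state[inputs[i][1]]]
        for i in range(N)
    )

def attractor_length(state, max_steps=10_000):
    """Iterate until a state repeats; return the length of the cycle."""
    seen = {}
    for t in range(max_steps):
        if state in seen:
            return t - seen[state]
        seen[state] = t
        state = step(state)
    return None  # no repeat found within max_steps

start = tuple(random.randint(0, 1) for _ in range(N))
cycle = attractor_length(start)
print(cycle)
```

Despite a state space of 2^100 configurations, a K=2 network typically falls into a short repeating cycle within a few steps, which is the surprise Kauffman reported.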
"The simplicity of this result astonished me at the time," he says. "It still astonishes me."
While everyone in complexity research agrees that spontaneous order exists throughout the cosmos, no one seems to be able to agree on exactly what that implies. "The consensus right now is that there is no one thing called complexity theory," says Michael Simmons, director of research at Santa Fe. "Talk to any person who is working in the field and they'll give you a different definition."
One area of agreement is the concept of "algorithmic information content," or AIC. This refers to the length of the instructions needed to generate the solution to a problem. AIC can be very small or very large. The instruction "Print 1 followed by a trillion zeros" produces a very large outcome with only one simple instruction. The instruction "Print 178983698068954348230..." up to a trillion randomly selected digits would require an AIC larger than the answer itself.
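True AIC is not computable, but data compression gives a rough feel for the two extremes described above: a highly patterned string compresses down to almost nothing, while random data barely compresses at all. The sketch below is our illustration, using Python's zlib as a stand-in measure; the sizes and seed are our choices.

```python
import random
import zlib

# Compressed length as a crude proxy for algorithmic information content.
# (Compression only gives an upper bound on AIC; the analogy is ours.)
random.seed(0)

# "Print 1 followed by a million zeros": huge output, tiny description.
ordered = b"1" + b"0" * 1_000_000
# A million random bytes: the shortest description is the data itself.
random_bytes = bytes(random.randrange(256) for _ in range(1_000_000))

print(len(zlib.compress(ordered)))       # tiny compared with the input
print(len(zlib.compress(random_bytes)))  # roughly as long as the input
```

The patterned string shrinks by several orders of magnitude; the random one does not, mirroring the small-AIC and large-AIC cases in the text.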
It is only in the middle range that things start to get interesting. Suppose we tell the computer: "Print a million digits at random. Scan the string and count how many times the sequence '1234567890' appears. Remove those occurrences and scramble the remaining digits. Scan and scramble again, and keep going until the sequence no longer appears. Then count the digits that remain and print the answer." The computer will have to do a lot more "thinking" to complete the instructions.
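The scan-remove-scramble procedure can be sketched directly. The version below is ours, scaled down from a million digits to ten thousand and from a ten-digit target to a four-digit one so the loop finishes quickly; both scale choices are illustrative, not part of the article.

```python
import random

# A small-scale sketch of the middle-range instruction described above.
random.seed(1)
digits = "".join(random.choice("0123456789") for _ in range(10_000))
target = "1234"  # shortened from "1234567890" so removals actually occur

while target in digits:
    # Remove every occurrence of the target sequence...
    digits = digits.replace(target, "")
    # ...then scramble the remaining digits and check again.
    chars = list(digits)
    random.shuffle(chars)
    digits = "".join(chars)

# Count and report the digits that survive.
print(len(digits))
```

Each pass may recreate the target by chance, so the loop can run many times before the string settles, which is exactly why this middle range demands more "thinking" than either extreme.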