Earlier this month, the computer scientist and psychologist John Holland passed away. Holland is known as the "father of complexity" for his pioneering work on complexity and complex adaptive systems. He studied under Arthur Burks, who helped the visionary computer scientist John von Neumann design one of the first electronic digital computers. Von Neumann also came up with the idea of the von Neumann machine, a self-reproducing automaton, before the structure of DNA was discovered.
Scott Page writes at the Washington Post:
Holland was fascinated with von Neumann's "creatures" and began wrestling with the challenge and potential of algorithmic analogs of natural processes. He was not alone. Many pioneers in computer science saw computers as a metaphor for the brain.
Holland did as well, but his original contribution was to view computation through a far more general lens. He saw collections of computational entities as potentially representing any complex adaptive system, whether that might be the brain, ant colonies, or cities.
His pursuit became a field. In brief, "complex adaptive systems" refer to diverse interacting adaptive parts that are capable of emergent collective behavior. The term emergence, to quote Nobel-winning physicist Phil Anderson's influential article, captures those instances where "more is different." Computation in the brain is an example of emergence. So is the collective behavior of an ant colony. To borrow physicist John Wheeler's turn of phrase, Holland was interested in understanding "it from bit."
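Page's notion of emergence, where simple local rules produce collective behavior that no individual part encodes, can be illustrated with a classic toy model (not Holland's own work): Conway's Game of Life. Each cell follows one rule about its eight neighbors, yet a "glider" pattern travels across the grid as a coherent whole.

```python
from collections import Counter

def step(live):
    """One Game of Life generation; `live` is a set of (x, y) cells.

    Each cell only "sees" its eight neighbors: a dead cell with exactly
    three live neighbors is born, a live cell with two or three survives.
    """
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider: nothing in the rule mentions motion, yet after four
# generations the whole pattern reappears shifted one cell diagonally.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
assert state == {(x + 1, y + 1) for x, y in glider}
```

The movement is "more is different" in miniature: the glider exists only at the level of the collective, not in the rule governing any single cell.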
Read the rest of Page's write-up at the Post.
Readers interested in an introduction to Holland's work should read Signals and Boundaries: Building Blocks for Complex Adaptive Systems, which applies the ideas of complexity to biology, markets, and even governments, and draws on those domains in turn.