All origin-of-life researchers face the baffling question of how the biochemical complexity of modern living cells emerged from a barren, primordial geochemical world. The only feasible approach is to reduce biological complexity to a comprehensible sequence of chemistry experiments that can be tackled in the human dimensions of space and time—a lab bench in a few weeks or months. George Cody, Hat Yoder, and I were eager to continue our hydrothermal experiments, but what should come next? We knew that the simplest living cell is intricate beyond imagining, because every cell relies on the interplay of millions of molecules engaged in hundreds of interdependent chemical reactions. Human brains seem ill suited to grasp such multi-dimensional complexity.
Scientists have devised countless sophisticated chemical protocols, and laboratories are overflowing with fancy analytical apparatus. Chemists have learned to synthesize an astonishing array of paints, glues, cosmetics, drugs, and a host of other useful products. Yet when confronted with the question of life’s ancient origin, it’s easy to become mired in the scientific equivalent of writer’s block. How does one begin to tackle the chemical complexity of life?
One approach to understanding life’s origin lies in reducing the living cell to its simpler chemical components, the small carbon-based molecules and the structures they form. We can begin by studying relatively simple systems and then work our way up to systems of greater complexity. In such an endeavor, the fascinating new science of emergence points to a promising research strategy.
It is unlikely that a topic as complicated as emergence will submit meekly to a concise definition, and I have no such definition.
John Holland, Emergence: From Chaos to Order, 1998
Hot coffee cools. Clean clothes get dirty. Colors fade. People age and die. No one can escape the laws of thermodynamics.
Two great laws, both codified in the nineteenth century by a small army of scientists and engineers, describe the behavior of energy. The first law of thermodynamics establishes the conservation of energy. Energy, which is a measure of a system’s ability to do work, comes in many different forms: heat, light, kinetic energy, gravitational potential, and so forth. Energy can change from any one form to another over and over again, but the total amount of energy does not change. That’s the first law’s good news.
The bad news is that nature places severe limitations on how we can use energy. The second law of thermodynamics states that heat energy, for example, always flows spontaneously from warmer to cooler regions, never the other way, so the concentrated heat of a campfire or your car’s engine gradually radiates away. That dissipated heat energy still exists, but you can’t use it to do anything useful. By the same token, all natural systems tend spontaneously to become messier—they increase in disorder, or “entropy.” So any collection of atoms—be it your shiny new shoes or your supple young body—gradually deteriorates. The second law of thermodynamics is more than a little depressing.
But look around you. You’ll find buildings, books, automobiles, bees—all of them exquisitely ordered systems. Despite the second law’s
dictum that entropy increases, disorder is not the only end point in the universe. Observations of such everyday phenomena as sand dunes, seashells, and slime mold reveal that the two laws of thermodynamics may not tell the entire story. Indeed, some scientists go so far as to claim that a fundamental law of nature, the law describing the emergence of complex ordered systems (including every living cell), is missing from our textbooks.
The discovery of a dozen or so natural laws represents the crowning scientific achievement of the past four centuries. Newton’s laws of motion, the law of gravity, the laws of thermodynamics, and Maxwell’s equations for electromagnetism collectively quantify the behavior of matter, energy, forces, and motions in almost every human experience. The power of these laws lies in their universality. Each law can be expressed as an equation that applies to an infinite number of events, from the interactions of atoms to the formation of galaxies. Armed with these laws, scientists and engineers confidently analyze almost any physical system, from steam engines to stars.
So sweeping and inclusive are these natural laws that some scholars of the late nineteenth century suggested that the entire theoretical framework of science had been deduced. All that remained to be discovered were relatively minor details, like filling in the few remaining gaps in a stamp collection. Though this turned out not to be the case—modern physics research has revealed new phenomena at the quantum scale of the very small and the relativistic scales of the very fast and the very massive—the classic laws do indeed still hold sway in our everyday lives.
Yet in spite of centuries of labor by many thousands of scientists, we do not fully understand one of nature’s most transforming phenomena—the emergence of complexity. Systems as a whole do tend to become more disordered with time, but at the local scale of a cell, an ant colony, or your conscious brain, remarkable complexity emerges. In the 1970s, the Russian-born chemist Ilya Prigogine recognized that these so-called complex emergent systems arise when energy flows through a collection of many interacting particles. The arms of spiral galaxies, the rings of Saturn, hurricanes, rainbows, sand dunes, life, consciousness, cities, and symphonies all are ordered structures that emerge when many interacting particles, or “agents”—be they molecules, stars, cells, or people—are subjected to a flow of energy. In the jargon of thermodynamics, the formation of patterns in these systems helps to speed up the dissipation of energy as mandated by the second law. Scientists and nonscientists alike tend to value the surprising order and novelty of such emergent systems.
The recognition and description of such emergent systems provides a foundation for origin-of-life research, for life is the quintessential emergent phenomenon. From lifeless molecules emerged the first living cell. If we can understand the principles governing such systems, we may be able to apply those insights to our experimental programs.
If you want to enunciate a law that characterizes emergent systems, then the first step is to examine everyday examples. You can observe emergent behavior in countless systems all around us, including the interactions of atoms, or of automobiles, or of ants. This universal tendency for systems to display increased order when lots of objects interact, while fully consistent with the first and second laws of thermodynamics, is not addressed explicitly in either of those laws. We have yet to discover if all emergent systems possess a unifying mathematical behavior, though our present ignorance should not seem too unsettling. It took more than a half-century for each of the first two laws of thermodynamics—describing the behavior of energy and entropy, respectively—to develop from qualitative ideas into quantitative laws. I suspect that a mathematical formulation of emergence will be discovered much sooner than that, perhaps within the next decade or two.
Scientists have already identified key aspects of the problem. Many familiar natural systems lie close to equilibrium—that is, they are stable and unchanging—and thus they do not display emergent behavior. Water gradually cooled to below the freezing point equilibrates to become a clear chunk of ice. Water gradually heated above the boiling point similarly equilibrates by converting to steam. For centuries, scientists have documented such equilibrium processes in countless carefully controlled scientific studies.
Away from equilibrium, dramatically different behavior occurs. Rapidly boiling water, for example, displays complex, turbulent convection. Water flowing downhill in the gravitational gradient of a river
valley interacts with sediments to produce the emergent landform patterns of braided streams, meandering rivers, sandbars, and deltas. These patterns arise as energetic water moves.
Emergent systems seem to share this common characteristic: They arise away from equilibrium when energy flows through a collection of many interacting particles. Such systems of agents tend spontaneously to become more ordered and to display new, often surprising behaviors. And as patterns arise, energy is dissipated more efficiently, in accord with the second law of thermodynamics. Ultimately, the resulting behavior appears to be much more than the sum of the parts.
Emergent patterns in water and sand may seem a far cry from living organisms, but for scientists studying life’s origins there’s a big payoff in understanding such simple systems: Of all known emergent phenomena, none is more dramatic than life, so studies of simpler emergence can provide a conceptual basis, a jumping-off point, for origin-of-life research.
Even though emergent systems surround us, a rigorous definition (much less a precise mathematical formulation) remains elusive. If we are to discover a natural law that describes the behavior of emergent systems, then we must first identify the essential properties of such systems. But what characteristics distinguish emergent systems from other less interesting collections of interacting objects?
All emergent systems display the rather subjective characteristic of “complexity”—a property that thus far lacks a precise quantitative definition. In a colloquial sense, a complex system has an intricate or patterned structure, as in a complex piece of machinery or a Bach fugue. “Complexity” may also refer to information content: An advanced textbook contains more detailed information, and is thus more complex, than an elementary one. In this sense, the interactions of ants in an ant colony or neurons in the human brain are vastly more complex than the behavior of a pile of sand or a box of Cheerios.
Such complexity is the hallmark of every emergent system. What scientists hope to find, therefore, is an equation that relates the properties of a system on the one hand (its temperature or pressure, for example, expressed in numbers), to the resultant complexity of the
system (also expressed as a number) on the other. Such an equation would in fact be the missing “law of emergence.” But before that is possible we need an unambiguous, quantitative definition of the complexity of a physical system. How to proceed?
A small band of scientists, many of them associated with the Santa Fe Institute in New Mexico, have thought long and hard about complex systems and ways to model them mathematically. But their efforts yield surprisingly diverse (some would say divergent) views on how to approach the subject.
John Holland, an ace at computer algorithms and a revered founder of the field of emergence, models emergent systems as computer programs with a fixed set of operating instructions. He suspects that any emergent phenomenon, including sand ripples, ant colonies, the conscious brain, and more, can be reduced to a set of selection rules. Holland and his followers have made great strides in mimicking natural phenomena with a few lines of computer code. Indeed, in this view the complexity of a system is closely related to the minimum number of lines of code required to mimic that system’s behavior.
A delightful example of this approach is BOIDS, a simple program written by California programmer Craig Reynolds that duplicates the movements of flocking birds, schooling fish, swarming insects, and other collective animal behaviors with astonishing accuracy. (To check it out on the Internet, just Google “BOIDS.”) Lest you think that this effort is idle play, remember that computer programmers of video games and Hollywood special effects have made a bundle on this type of simulated emergent behavior. Think of BOIDS the next time you watch dinosaur herds on the run in Jurassic Park, swarming locusts in The Mummy, or schools of fish in Finding Nemo.
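Reynolds’s actual BOIDS program is more elaborate, but its heart is three steering rules applied to every bird: separation (avoid crowding neighbors), alignment (match the neighbors’ average heading), and cohesion (drift toward the neighbors’ average position). Here is a minimal Python sketch of those three rules; the weights and distances are illustrative guesses of mine, not Reynolds’s published values:

```python
import math

def boids_step(positions, velocities, dt=0.1,
               sep_d=1.0, w_sep=0.05, w_align=0.05, w_coh=0.01):
    """One update of Reynolds-style flocking: separation, alignment, cohesion.
    The rule weights here are illustrative, not Reynolds's actual values."""
    n = len(positions)
    new_vel = []
    for i in range(n):
        xi, yi = positions[i]
        vxi, vyi = velocities[i]
        # Cohesion: steer toward the average position of the other boids.
        cx = sum(p[0] for j, p in enumerate(positions) if j != i) / (n - 1)
        cy = sum(p[1] for j, p in enumerate(positions) if j != i) / (n - 1)
        # Alignment: steer toward the average velocity of the other boids.
        ax = sum(v[0] for j, v in enumerate(velocities) if j != i) / (n - 1)
        ay = sum(v[1] for j, v in enumerate(velocities) if j != i) / (n - 1)
        # Separation: steer away from any boid closer than sep_d.
        sx = sy = 0.0
        for j in range(n):
            if j == i:
                continue
            dx, dy = xi - positions[j][0], yi - positions[j][1]
            if math.hypot(dx, dy) < sep_d:
                sx += dx
                sy += dy
        vx = vxi + w_coh * (cx - xi) + w_align * (ax - vxi) + w_sep * sx
        vy = vyi + w_coh * (cy - yi) + w_align * (ay - vyi) + w_sep * sy
        new_vel.append((vx, vy))
    new_pos = [(p[0] + v[0] * dt, p[1] + v[1] * dt)
               for p, v in zip(positions, new_vel)]
    return new_pos, new_vel
```

No boid “decides” to form a flock; iterate this step and coherent flocking emerges from the purely local rules.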
Physicist Stephen Wolfram, a mathematical prodigy who made millions in his twenties from the elegant, indispensable computer package Mathematica, provides a complementary vision of emergent complexity from simple rules. Like Holland, Wolfram was captivated by the power of simple instructions to generate complex visual patterns. Sensing a new paradigm for the description and analysis of the natural world, he has spent the past 20 years developing what he calls “a new kind of science” (NKS for short). A mammoth tome by that title published in 2002 and an elaborate Web site (www.wolframscience.com) illustrate some of the stunning ways whereby geometric complexity
may arise from simple rules. Perhaps, Wolfram argues, the complex evolution of the physical universe and all it contains can be modeled as a set of sequential instructions.
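Wolfram’s favorite illustrations are the elementary cellular automata: a single row of cells, each on or off, updated in lockstep by a rule that looks only at each cell and its two immediate neighbors. A few lines of Python suffice to generate the famously intricate triangle of his rule 30; this sketch assumes wrap-around edges for simplicity:

```python
def eca_step(cells, rule=30):
    """One step of an elementary cellular automaton (wrap-around edges).
    `rule` uses Wolfram's numbering: bit k of the rule byte gives the new
    state of a cell whose three-cell neighborhood encodes the number k."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run_eca(width=31, steps=15, rule=30):
    """Evolve from a single 'on' cell and return every row generated."""
    row = [0] * width
    row[width // 2] = 1
    history = [row]
    for _ in range(steps):
        row = eca_step(row, rule)
        history.append(row)
    return history
```

Print the rows with `'#' if c else ' '` and rule 30’s chaotic triangular lattice appears, a pattern far richer than the eight-entry lookup table that produces it.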
Many other ways to view complex systems have been proposed. The late Danish physicist Per Bak described complex systems in terms of a mathematical characteristic called “self-organized criticality.” These systems evolve by repeatedly achieving a critical point at which they falter and regroup, like a growing pile of sand that avalanches over and over again as new grains are added. Santa Fe theorist Stuart Kauffman proposes another tack, focusing on the emergence of chemical complexity via competitive “autocatalytic networks,” by which collections of chemical compounds catalyze their own formation. And Nobel laureate Murray Gell-Mann, who also works at the Santa Fe Institute, has recently championed a parameter called “nonextensive entropy”—a measure of the intrinsic complexity of a system—as a path to understanding complex systems.
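Bak’s picture is easy to make concrete. In the idealized Bak–Tang–Wiesenfeld sandpile (a mathematical caricature, not a simulation of real sand), grains drop one at a time onto a grid; any cell holding four or more grains topples, shedding one grain to each neighbor, and topplings can cascade into avalanches of all sizes. A sketch in Python:

```python
def topple(grid):
    """Relax a Bak-Tang-Wiesenfeld sandpile: any cell with 4 or more
    grains sheds one grain to each of its four neighbors (grains falling
    off the edge are lost). Returns the stable grid and the number of
    topplings, i.e. the avalanche size."""
    rows, cols = len(grid), len(grid[0])
    avalanche = 0
    unstable = True
    while unstable:
        unstable = False
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] >= 4:
                    grid[r][c] -= 4
                    avalanche += 1
                    unstable = True
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        if 0 <= r + dr < rows and 0 <= c + dc < cols:
                            grid[r + dr][c + dc] += 1
    return grid, avalanche

def drop_grains(size=11, grains=200):
    """Drop grains one at a time on the center cell, relaxing after each,
    and record the size of every avalanche."""
    grid = [[0] * size for _ in range(size)]
    mid = size // 2
    sizes = []
    for _ in range(grains):
        grid[mid][mid] += 1
        _, a = topple(grid)
        sizes.append(a)
    return grid, sizes
```

The pile self-organizes to its critical state: most drops cause nothing, while occasional drops trigger avalanches spanning much of the grid, with no tuning of any parameter.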
All these approaches and more inform the search for a law of emergence; all provide a glimpse of the answer. Yet each seems too abstract to apply to benchtop chemical experiments on the origin of life. An experimentalist needs to decide on the nitty-gritty details: What should be the starting chemicals at what concentrations; how acidic or basic the solution; what run temperatures, pressures, and times? Is there any way that the ideas of emergence can help?
A classic scientific approach to discovering general principles and laws is to examine the behavior of specific systems. The study of simple systems that display emergent behavior may well point to physical factors that lead to patterning in much more complex systems, including life. We can hope that observations of specific systems will eventually point to more general rules.
You don’t need a laboratory to observe emergent phenomena. In fact, you can’t go on a hike without seeing dozens of examples of emergence in action. Among my favorite emergent phenomena are interactions of water and sand, which provide a convenient and comprehensible example of structures arising from the energetic interactions of lots of agents (not to mention a great excuse to spend the day at the shore). When moving water (or wind, for that matter) flows across a
flat layer of sand, new patterns arise. Periodic sand ripples appear, as sand grains are sorted by size, shape, and density. The system thus becomes more orderly and patterned as energy—the flow of wind or water—dissipates.
My favorite emergent sandy system lies at the base of the fossil-rich hundred-foot-tall cliffs that border the Chesapeake Bay’s western shore in Calvert County, Maryland. Fifteen-million-year-old whale bones, razor-sharp sharks’ teeth, branching bleached corals, and robust fist-sized clamshells abound in the wash zone, where waves constantly wear away the soft sediments. Walks along those majestic formations often lead to thoughts about the factors that contribute to complexity.
At times of unusually low tide, especially near a new moon in the cold clear winter months, receding waters expose a gently sloping pavement of ancient sediments below the base of the cliff—a formation called blue marl. Treacherously slippery when wet, this firm flat surface commonly accumulates a thin layer of sand—particles that display emergent patterns when subjected to the wash of shallow water. Over the years, I’ve noticed four distinct factors that contribute to the emergence of complex sand patterning.
The first obvious factor in achieving a patterned, complex system is simply the density of sand grains—that is, the number of interacting particles per square centimeter of the blue marl’s surface. It’s easy to estimate this number by collecting almost every grain of sand from an area 10 centimeters square, about the size of a small paper napkin. I collect the sand in a plastic bag or bottle, take it back to the lab, dry it, and weigh the sample. Using a microscope, I count out 100 grains from the sample and then weigh that batch. As it turns out, the total number of grains per square centimeter is approximately equal to the total weight of sand from the 100 square-centimeter (10 × 10) area divided by the weight of 100 sand grains.
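That back-of-the-envelope arithmetic works because the two factors of 100 cancel: the grain count from the 10-by-10-centimeter patch is the total mass divided by the mass per grain, and dividing by the patch’s 100 square centimeters of area leaves simply total mass over the mass of 100 grains. In Python (the sample masses in the final comment are invented for illustration, not measurements from the text):

```python
def grains_per_cm2(total_mass_g, mass_of_100_grains_g, area_cm2=100.0):
    """Estimate grain density from the two weighings described in the text.

    total grains   = total_mass / (mass_of_100_grains / 100)
    grains per cm2 = total grains / area

    For the 10 cm x 10 cm (100 cm2) patch, the two factors of 100 cancel,
    so the density reduces to total_mass / mass_of_100_grains.
    """
    grains_total = total_mass_g / (mass_of_100_grains_g / 100.0)
    return grains_total / area_cm2

# Hypothetical numbers: a 25 g sample in which 100 grains weigh 0.005 g
# works out to 5,000 grains per square centimeter.
```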
I find that with fewer than about 100 sand grains per square centimeter, the dusting of particles is too sparse for any noticeable patterns to emerge. Given the minute size of the average sand grain, typically less than half a millimeter in diameter, 100 grains per square centimeter provides a sparse coverage over less than 10 percent of the smooth blue marl surface. Increase the sand concentration to about 1,000 grains per square centimeter, however, and an intriguing pattern of regularly spaced sand piles, each a centimeter or two across, appears on the hard blue surface. What’s more, a small circle of darker sand grains typically crowns each little tan pile. Evidently a minimum concentration of several hundred grains per square centimeter is required to initiate patterning in sand.

[Figure: Patterns in sand grains emerge as the concentration of grains increases. At about a thousand grains per square centimeter (A), small black-topped piles are observed; at a few thousand grains per square centimeter (B), discontinuous bands arise; and above 10,000 grains per square centimeter (C), continuous ripples cover the surface.]
Increase the sand concentration slightly to a few thousand grains per square centimeter and you get discontinuous short bands of sand at right angles to the gentle back-and-forth wave motion of the shallow water. As with the mini-sandpiles, each tan band is topped by a line of darker grains. And as sand concentration exceeds 10,000 grains per square centimeter, continuous, evenly spaced, black-capped ripples form across the hard pavement. I’ve seen this classic rippled surface cover hundreds of square meters of shallow water in patterns so hypnotically regular that I hesitated to disturb the symmetry by walking on it.
And that’s it. Higher concentrations of sand simply provide a deeper base for the regular ripples. Buried sand grains don’t participate in the process so no new structures arise beyond the elegant, wavelike, periodic forms on the surface.
This systematic behavior suggests that the concentration of interacting agents plays a fundamental role in the emergent complexity of a system. Below a critical threshold, no patterns are seen. As particle concentrations increase, so too does complexity, but only to a point. Above a critical saturation of agents, we find no new behaviors.
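The threshold-and-saturation behavior just described can be summarized as a simple lookup. The outer boundaries of roughly 100 and 10,000 grains per square centimeter come from my observations above; the intermediate cutoffs in this sketch are interpolated guesses, since nature draws no sharp lines between the regimes:

```python
def sand_pattern_regime(density):
    """Qualitative sand pattern as a function of grains per square
    centimeter, using the approximate thresholds observed on the blue
    marl; the cutoffs at 500 and 2,500 are interpolated guesses."""
    if density < 500:           # below a few hundred grains: too sparse
        return "no pattern"
    if density < 2500:          # around a thousand grains
        return "black-topped piles"
    if density < 10000:         # a few thousand grains
        return "discontinuous bands"
    return "continuous ripples"  # saturated: extra sand only deepens the base
```

Note the saturation: raising the density far beyond 10,000 grains per square centimeter changes nothing, just as extra buried grains add no new structure to the rippled surface.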
Similar observations have been made about other emergent systems. One ant species—Eciton burchelli, the army ant—stays close to home as long as the colony consists of fewer than about 80,000 individuals. Exceed that number of army ants, however, and the colony exhibits new emergent behavior; like a bursting dam, the ants pour out in a massive “swarm raid” to attack adjacent colonies. At higher populations, half of the ants may spontaneously leave to form a new colony. Studies of termite colonies also reveal that the construction of pillar-type mounds requires a critical density of individuals.
At a much greater scale, spiral galaxies require a minimum number of about 100 million stars to trigger development of the familiar spiral arm structure. According to theoretical models of astrophysicists, the majestic arms form as a result of gravitational instabilities caused in part by a large central mass of stars.
Human consciousness and self-awareness also emerge from the interactions of tens of billions of neurons. Sadly, as those of us who watch friends and relatives afflicted with Alzheimer’s disease must observe, when a critical number of cells and their connections are destroyed, self-awareness fades away.
These findings suggest that the emergence of life might have depended on achieving some minimal concentration of biomolecules, the essential agents of cellular life. Too few molecules, no matter how friendly the environment, and life could not arise. That’s a useful idea to bear in mind when designing origin-of-life experiments.
Sand grains influence each other by direct contact, the simplest local way to interact. A rounded grain at the surface of a sandpile typically touches about a half-dozen adjacent grains. The balance between these
stabilizing contacts and gravity on the one hand, and the restless, disruptive flow of water on the other, leads to a controlled shuffling of grains and ultimately to the rippled patterning of sand. By contrast, ants in an ant colony interact over much greater distances, by marking the ground with a variety of pheromones, which are chemical signals that point other ants to food, alert them to danger, and provide other vital information. In this way, any given ant has the potential to interact with thousands of colony mates in varied ways. These differences in interconnectedness provide part of the reason why ant colonies are more complex than water-shaped sandpiles.
The conscious brain, the most complex system we know, is also the most densely interconnected. Each of the tens of billions of neurons in your brain interacts with hundreds of other nearby cells through a branching network of dendrites. Electrical signals between any two neurons, furthermore, may be stronger or weaker, like the current controlled by the dimmer switch on your lamp. Interconnections of the brain are vastly more intricate than those of sand or ants.
These observations of emergent systems suggest that life’s origin must have relied on a wide repertoire of chemical interactions. Experiments that optimize the number and type of molecular contacts might thus be more likely to display emergent behaviors of interest.
Regardless of how many sand grains or ants or neurons are present, no pattern can emerge without a flow of energy through the system. Sand grains will not start hopping without a certain minimum water-wave speed (typically about 1/2 to 1 meter per second along the shores of the Chesapeake Bay). More energetic waves with greater speed and amplitude move grains more easily and generate sand patterns more quickly, though these patterns do not appear to differ fundamentally in their shapes.
But every complex patterned system has a limit to the magnitude of energy flow it can tolerate. During energetic storms, crashing waves obliterate sand ripples and other local sedimentary features. Black and tan sand grains become jumbled and all signs of emergent patterning disappear.
The human brain exhibits strikingly similar behavior in terms of energy flow. During normal waking hours, the brain maintains a moderate level of electrical impulses—the normal healthy flow of energy through the neural system. Deepest sleep corresponds to a sharp drop in electrical activity as we slip from consciousness, whereas the excessive electrical intensity of an epileptic seizure thwarts conscious action by scrambling the usual patterned electrical flow.
The emergence of complex patterns evidently requires energy flow within rather restrictive limits: Too little flow and nothing happens; too much flow and the system is randomized—entropy triumphs. This conclusion is important for the experimental study of life’s chemical origins. A reliable source of energy is essential, to be sure, but lightning, ultraviolet radiation, and other intense forms of ionizing energy can blast molecules apart and may be too extreme to jump-start life. We must look for gentler chemical energy sources, like the steady, reliable chemical potential energy stored in a flashlight battery, to sustain the metabolism of primitive life.
Many natural systems are subject to cycles of energy: day and night, summer and winter, high tide and low tide. Such cycles may play a fundamental role in the evolution of emergent systems, though it’s often difficult to document the effects of these subtle cycles in nature.
Laboratory wave tanks, though considerably less scenic than the Chesapeake Bay in January, facilitate the study of sand-ripple formation under controlled conditions. Recent research on natural patterned systems reveals that cycling of energy flow through a system is a fascinating and previously unrecognized fourth factor in generating complex sand patterns. In 2001, physicist Jonas Lundbek Hansen at the Niels Bohr Institute and his Danish colleagues announced this surprising wrinkle in the mechanics of ripple formation. Most previous experiments had involved fixed wave amplitudes (that is, wave height) and frequencies (how many waves pass a given point in a second). Such studies typically generate perfectly spaced, straight ripples. Instead, Hansen and his colleagues wondered what might happen if they cycled these variables. Over periods of several minutes, they increased and then decreased the amplitude or the frequency of their water waves. The results were breathtaking. Rather than simple parallel sand ripples, they produced elegant intertwined and branching sand structures. These new patterns appear remarkably similar to sand features that
commonly arise along the Chesapeake Bay when the water is only a few inches deep—conditions that apparently favor periodic fluctuations in wave amplitude.
Alert to the potential power of energy cycling, PhD student Mark Kessler and Professor Brad Werner of the University of California, San Diego, recently analyzed amazing stone circles and other so-called “patterned grounds” in Alaskan Arctic terrain that is subject to cyclical freezing and thawing. With each thaw, rounded boulders shift slightly, interacting with one another over many years to produce remarkable fields covered by natural circles of stone. [Plate 2]
The role of cycling in the emergence of patterns represents a frontier area of study that is keenly watched by some origin-of-life investigators. After all, the primitive Earth was subject to many cycles—day/night, high tide/low tide, wet/dry, and more. Perhaps such cycles, which can be duplicated in a controlled experimental environment, contributed to the emergence of life itself.
So what might a mathematical law of emergence look like? My guess is that the expression will take the form of a mathematical inequality, something like this:

C ≤ f [n, i, ∇E(t)]
That’s a short-hand way of saying that the emergent complexity of a system, denoted by the letter C (for “complexity”), is a number less than or equal to some value that is a mathematical function (f) of the concentration of interacting particles (n), the degree of those particles’ interconnectivity (i), the time-varying energy flow through the system [∇E(t)], and perhaps other variables as well.
At least two daunting impediments thwart the completion of this potentially simple formulation. First, as previously noted, we lack a precise definition of complexity. It’s impossible to quantify something when you don’t really know what that something is. And second, we are woefully ignorant of the exact mathematical relationships between complexity and the three possible key factors: the concentration of interacting agents, the interconnectivity of those agents, and the cyclical
energy flow. Simple systems yield tantalizing clues, but we are still a long way from any definitive formula.
This quest to characterize emergent phenomena, though initially couched in mathematical abstraction, is not ultimately an abstract exercise. Emergent systems frame every aspect of our experience. Our environment, our bodies, our minds, the patterns of our lives and our culture—all display emergent complexity. A comprehensive theory of emergence will foster applications to myriad problems in everyday technology: long-range weather prediction, computer network design, traffic control, the stabilization of ecosystems, the control of epidemics, perhaps even the prevention of war. Armed with such a law, we will acquire a deeper understanding of any system of many interacting agents—indeed, even of the origin of life itself.