Biology: 2001

Understanding culture, technology, and politics in "The Biological Century"

As the millennium approaches like an overloaded freight train--fat with metaphor--we shall see many attempts to peer beyond the veil of that magical number, 2000. (For purists, 2001.) Many try to do linear extrapolations from current trends. Others assume, like southern Californian weather forecasters, that tomorrow will be pretty much like today, only more crowded.

Perhaps the best approach to prognostication uses analogy: Could our century have been foretold in the 1890s?

First, recall that the 19th century was dominated by the metaphors and technological implications of two sciences: chemistry and mechanics. Wonders as striking as railroads and steamships conspired with humble revolutions like the use of artificial fertilizer to make the world new and bountiful.

To be sure, other themes were sounding through the serene Victorian atmosphere. At mid-century, the audacious Darwin-Wallace theory of evolution by natural selection began preparing the ground for modern biology and excited enormous public furor. Elsewhere in England, Michael Faraday and James Clerk Maxwell were laying the foundations of electromagnetic technology. Their discoveries promised much that could be used fairly soon in practical ways. Darwin troubled minds, but had no pragmatic implications.

While the older crafts and models of Newtonian mechanics and workaday chemistry drove the great economic and social engines of the Victorian era, in the waning decade of the century, Edison, Marconi, and others sounded the opening theme of the next, electric era. These inventors caught the public's imagination. Radio summoned rescue for the Titanic's survivors; Edison captured movement and sound. The great, unsettling conceptual shifts of relativity and quantum mechanics followed later. By 1910, the transition was obvious.

For clearly, physics has dominated our century. It has altered everything, from wars to whores. Electromagnetic theory and experiment gave us the telephone, radio, TV, computers, and made the internal combustion engine practical--thus, the car and airplane, leading inevitably to the rocket and outer-space exploration. Indeed, the fateful wedding of rockets with the other monumental product of physics, nuclear bombs, led to the end of large-scale strategic warfare--as profound a change as any in modern times.

Even now, as the century wanes, physicists remain our scientific Brahmins. They dominate government committees, holding forth on topics far beyond their nominal expertise: defense, environmental riddles, social policy. And yet, far from the physics departments of the great campuses, a clarion call is sounding through our time, one that responds to hot-button environmental problems and that incorporates rapid advances in other laboratories: Biology has turned aggressively useful.

By analogy, we stand on the threshold of the Biological Century. Just as the 1890s hummed with physical gadgetry, our decade bristles with striking biological inventions. Conceptual shifts will surely follow. Beyond 2000, the principal social, moral, and economic issues will probably spring from biology's metaphors and approach, and from its cornucopia of technology. Bio-thinking will inform our world and shape our vision of ourselves.

The Easy Era

Even as particle physicists desperately, unsuccessfully tried to get their $10 billion Superconducting Super Collider built in Texas, a smaller initiative quietly proceeded: the Human Genome Project. This vast effort, eventually costing about $3 billion, will map the human genetic code--our DNA. (Researchers have already completely sequenced a bacterial genome.)

The Genome Project's first director was James Watson, co-discoverer, with Francis Crick, of the structure of DNA. The project is the largest job ever attempted in biology, but surely not the last foray of biologists into "Big Science," where physicists have cultivated their own plantations for decades.

DNA sequencing opens vast ethical issues. We shall be able to know who has defective genes. What will it mean when we can be sure we're not all born equal? Worked out, the implications will scare a lot of people. Insurance companies will not want to cover those with a genetic predisposition to illness, for example. Here lurk myriad lawsuits.

But these are short-term ethical questions, surely. The true solution lies in fixing genes, not merely reading them. If parents-to-be can have their problem genes edited into normal ones, most of the issues may evaporate. And this is just one of many advances that portend much. Will we stop at cleaning up what we see as defects? I doubt it.

As we all saw in grade school, once you learn how to read a book, somebody is going to want to write one--that's how authors are made. Once we know how to read our own genetic code, someone is going to want to rewrite that "text," tinker with traits--play God, some would say.

True rewriting lies a few decades off, I believe. The first years of the Biological Century will probably be an Easy Era, much as physics enjoyed a period of largely uncritical acceptance of wonder after wonder, until The Bomb. Remember that radium was widely thought to be a general cure until Madame Curie died of her exposure.

The first signs of a quiet revolution in our daily lives will probably come with some fairly non-controversial commercial products. Much research has gone into cellular critters that can digest oil spills or other toxic contaminants. Some work reasonably well already. Soon enough such research will give us a spectrum of organisms which digest unpleasant substances. That should mean refineries that don't stink, rivers that don't catch on fire, streams that aren't sewers.

Plants have plenty of chemical defenses, and smart farmers will come to use them. In temperate zones, winter is the best insecticide; it keeps the bugs in check. The tropics enjoy no such respite, so plants there have developed a wide range of alkaloids that kill off nosy insects and animals. Nicotine is an excellent insect foe; the fact that humans become addicted to it is a curious side effect. Adapting such defenses to orchards and crops is an obvious path for biotech.

Consider the farm of the next century, which we might better call a "pharm"--because it may well be devoted to growing proteins, not wheat. Already researchers can synthesize proteins in animals by co-opting their own schemes for making, for example, milk.

Genetically altered goats have been made to yield in their milk a particular human protein which effectively dissolves the fibrin clots responsible for coronary occlusions. Efficiencies are low, but probably won't remain so. To get high yields, it will be a good idea to go to dairy cows, which produce 10,000 liters of milk a year.

Imagine a cow which yields insulin, the expensive lifesaver of diabetics. We could make such a beast by editing the genes that control the cow's internal chemistry. The simple way would be to make two kinds of cows, one that produced milk rich in the "alpha" chain that helps make up insulin, and a second that makes the "beta" chain. This would free the cows from having to contend with insulin in their own systems, for only when the alpha and beta chains are mixed do we get insulin itself.

Insulin grown down on the pharm would probably be much cheaper than ours today. Similarly, there seems no barrier to making many pharmaceuticals in natural systems. Sheep might be specialized to a whole range of useful drugs, for example. International giants such as Merck are studying such avenues now.

Sheep, goats, and cows would become the essential "bioreactors" that reproduce themselves in a barnyard biotech which could benefit many farmers who never heard of protein tinkering. But there will be troubles, because such animals don't breed true. A dairyman in Argentina will have to come back to Pharms Unlimited for his next calf. Indeed, Pharms Unlimited would be mad to make its cows so they can reproduce their (patented!) technology without a fat fee. So the Third World may see this as just another way to keep poorer countries on an unbreakable economic string.

Such technology will spread into the immensely profitable realm of direct consumer goods. The easiest will be items so commonplace that on television they look like simple extensions of what we already have.

Imagine a kitchen cleanser that dissolves waste in those hard-to-get places, maybe even invading the grouting of tile in pursuit of fungus. Present "enzymatic" cleaners like Tilex have to be sprayed on and rinsed off. A living variety could patrol on its own, digging very deep, then be rolled up into its holder, lying dormant until needed again. You won't run out of it.

Or ponder a bath mat that slowly tugs itself across the floor, slurping up puddles, deposits of soap and hair spray, hairs, general "human dander." It lives on the stuff, plus an occasional helpful dollop of diet supplements from the otherwise distracted homemaker--who thinks of it as a rug, not a pet.

The opening wedge of such products will be less startling. Invisible, convenient, they'll come innocently, not seeming to announce a revolution. Resident "toothpaste" that does the essential policing up after lunch, and maybe even makes your breath smell, well, not so bad. Stomach guardians that ward off Montezuma's Revenge before you notice a single symptom--permanently, because the microbes are symbiotic with you, and live throughout your digestive system.

Even insulin cows would become obsolete if we develop microbes that can do the essential task directly in the body. We now can modulate some controlling genes that supervise elaborate biochemical transactions. It seems feasible within a decade or two to tailor these controller genes exquisitely. Then we could produce a capsule containing controller microbes that directly sense the patient's blood glucose level.

This scenario has been suggested by molecular microbiologist Mark O. Martin of Occidental College. Once inserted in the body, such a capsule would respond to glucose changes by making insulin, just as the patient's body should. With a specially designed capsule wall that lets nutrients and glucose in, and insulin out, but keeps the bacteria confined, side effects would be minimal. This could be far more reliable than having patients clumsily self-sample and inject, as they do now.
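
To make the logic of that feedback concrete, here is a toy numerical sketch--every number invented for illustration, with no claim to physiology--of a capsule that secretes insulin only when the glucose it senses rises above a set point:

```python
# Toy sketch of the glucose-sensing capsule idea described above.
# Every number is invented for illustration; this is not a physiological model.

def simulate(hours=24, meal_hours=(8, 13, 19)):
    glucose = 90.0          # mg/dL, fasting level
    set_point = 110.0       # capsule stays quiet below this level
    gain = 0.15             # fraction of the excess cleared per minute
    meal_steps = {m * 60 for m in meal_hours}
    history = []
    for minute in range(hours * 60):
        if minute % (24 * 60) in meal_steps:
            glucose += 80.0                  # crude meal spike
        excess = max(0.0, glucose - set_point)
        insulin = gain * excess              # secrete only when glucose is high
        glucose -= insulin                   # insulin pulls glucose back down
        glucose += 0.01 * (90.0 - glucose)   # slow drift back toward fasting level
        history.append((minute, glucose, insulin))
    return history

if __name__ == "__main__":
    for minute, glucose, insulin in simulate()[::120]:   # report every two hours
        print(f"{minute // 60:2d}h  glucose {glucose:6.1f} mg/dL  insulin {insulin:5.2f}")
```

The shape of the loop is the whole point: nothing below the set point, a proportional response above it--a thermostat rendered in biochemistry.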

Such medical improvements face little opposition, particularly if they are hidden inside the body. Overt changes will not be so welcome. Some, though, could be attractive.

Market forces seem likely to spur imaginations. Fad blends easily with fantasy. Maybe there will be a fashion in bio-corduroy, which lives off your sloughed-off skin, perspiration--and even, if you like, some of your less agreeable excretions.

The theme in the Biological Century will be biological balance--what's waste to one creature can become food to another--with a desirable job done in the transaction. This is "homeostasis," the biological equivalent of the thermostat. Such ecology on a small scale could become a public sign of trendy virtue, as popular as recycling is today.

Biotech revisits ancient arts. For many millennia we've been breeding cows and corn, collard greens and collies, to our whim. We can expect more exotic foods, of course, but as important, we may see new and better ways of growing them.

Ants and Analogies

Consider a field of maize--corn, to Americans. At its edge a black swarm marches in orderly, incessant columns. In one column, each ant carries a kernel of corn. In another, each carries bits of husk. In still another, an entire team coagulates around a chunk of a cob. The streams split, kernel-carriers trooping off to a ceramic tower, climbing a ramp and letting their burdens rattle down into a sunken vault. Each returns dutifully to the field. Another, thicker stream spreads into rivulets which leave their burdens of scrap at a series of neatly spaced anthills: dun-colored domes with regularly spaced portals, for more workers.

These had once been leaf-cutter ants, content to slice up fodder for their own tribe. They still do, pulping the unneeded cobs and stalks and husks, growing fungus on the pulp deep in their warrens. They were already tiny farmers in their own right. But biotech had genetically engineered them to harvest and sort first, processing corn right down to the kernels.

Other talents can be added. Acacia ants already defend their mother trees, weeding out nearby rival plants, attacking other insects which might feast on the acacias. Take that ability and splice it into the corn-harvesters, and you do not need pesticides, or the drudge human labor of clearing the groves. Can the acacia ants be wedded to corn? We don't know, but it does not seem an immense leap. Ant species are closely related to one another and multi-talented. Evolution seems to have given them a wide, adaptable range.

Following chemical cues, they seem the antithesis of clanky robots, though insects are actually tiny automatons engineered by evolution, the engine that favors fitness. Why not just co-opt their ingrained programming, then, at the genetic level, and harvest the mechanics from a compliant Nature? After all, agriculture is the oldest biotech. But everything else will alter, too.

Mining is the last great, traditional industry to be touched by the modern. We still dig up crude ores, extract minerals with great heat or toxic chemicals, and bring to the surface unwanted companion chemicals. All that suggests mining must be rethought--but on what scale? "Biomining" is actually quite ancient.

Romans working the Rio Tinto mine in Spain 2,000 years ago noticed that fluid runoff from the mine tailings was blue, suggesting dissolved copper salts. Evaporating this in pools gave them copper sheets. The real work was done by a bacterium, Thiobacillus ferrooxidans. It oxidizes copper sulfide, yielding acid and ferric ions, which in turn wash copper out of low-grade ores. This process was rediscovered and understood in detail only in this century, with the first patent in 1958.
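
In rough outline--a simplified textbook cycle, not the Rio Tinto recipe itself--the chemistry works like this: ferric sulfate leaches the copper out of the sulfide ore, and the bacteria earn their keep by re-oxidizing the spent ferrous iron so the loop can turn again:

$$\mathrm{CuS + Fe_2(SO_4)_3 \;\rightarrow\; CuSO_4 + 2\,FeSO_4 + S}$$

$$\mathrm{4\,FeSO_4 + O_2 + 2\,H_2SO_4 \;\xrightarrow{\ bacteria\ }\; 2\,Fe_2(SO_4)_3 + 2\,H_2O}$$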

A new smelter can cost a billion dollars. Dumping low-quality ore into a sulfuric-acid pond lets the microbes chew up the ore, with copper caught downhill in a basin; the sulfuric acid gets recycled, at trivial cost. Already a quarter of all copper in the world, from Peru to Alaska, comes from such bio-processing.

Gold enjoys a similar biological heritage. The latest scheme simply scatters bacteria cultures and fertilizers over open ore heaps, then picks grains out of the runoff. This raises gold recovery rates from 70 percent to 95 percent. There is not much room for improvement. Phosphates for agriculture can be had with a similar, two-bacteria method. By noticing that our mining waste was food for another phylum, we close a biological loop, cleanse our human world of "pollution," extend our resources, and make a profit.

All these developments use "natural biotech." Farming began by using wild wheat--a grass. Antibiotic therapy first started with unselected strains of Penicillium. We've learned much, mostly by trial and error, since then. The next generation of biomining bacteria is already emerging. A major problem with the natural strains is the heat they produce as they oxidize ore, which can get so high that it kills the bacteria.

To fix that, researchers did not start from scratch in the lab. Instead, they searched deep-sea volcanic vents and hot springs, such as those in Yellowstone National Park. They reasoned that only truly tough bacteria could survive there, and indeed, found some which appear to do the mining job and can take near-boiling temperatures.

Bacteria also die from heavy-metal poisoning, just like us. To make biomining bugs impervious to mercury, arsenic, and cadmium requires bioengineering, currently under way. One tries varieties of bugs with differing tolerances, then breeds the best to amplify the trait. This can only take you so far. After that, it may be necessary to splice DNA from one variety into that of another, forcibly wedding across species.

We are already in the Easy Era. Around us, often without fanfare, emerge new technologies: engineered drugs, pest-resistant plants, single-gene alterations in plants and animals, genetic diagnostics (though usually only DNA testing of suspects makes the news). Within two decades we shall see "bioactive" products which work and live among us and in us, engineered organisms, "pharms," and limited genetic editing.

Alas, only in retrospect shall adjusting to these changes seem easy.

Cultural Convergence

From our blinkered perspective, a Biological Century looks like a fundamental shift in world view, with ramifications that will reach into every cultural corner.

Physics proceeds by atomizing nature, and this habit of mind has deeply penetrated realms far from science. Every fiction writer knows that the trick is immersing the reader in a world, a knitted vision. Yet some schools of literary criticism have aped physics, deconstructing literature until it is a swarm of disjoint words, each ambiguous, their author irrelevant. This stress on contradictory or self-contained internal differences in texts--jam jar labels or novels alike--rather than their link to a culture of meaning, merely leads to literature seen as empty word games.

A biologically sophisticated world view would counter this, looking for how artists and writers manage their integrative effects. Current academe often casts a cold eye on genres, whether the lyric or the epic, the western or the musical, contending that such overt formal differences don't matter much. This may be because genres don't yield to theories of atomized arts, but are best seen as cross-talk, conversations within communities, progressing down through time--evolving. Genres clearly unfold and interact, ragtime to jazz to blues to rock, suggesting biological metaphors.

What would a criticism look like, done in a biological style? Both species and genres have intense interior interactions. Seen in the large, population statistics in literature and life may follow similar laws, though of course occasional individuals can deflect the prevailing tide, in both culture and survival fitness. Weighing these seemingly contradictory thrusts is the integrative task before us, selectively using conceptual roots from modern biology.

Biologist Richard Dawkins has stressed the role of cultural ideas which propagate themselves, termed "memes," after their analogy to genes. Similar conceptual leaps could dispel the physics-laden mechanical imagery which inhabits much of economics, politics, and the humanities. Marxian ritual invocations of control and commodification, shrouded in a fog of conspiracy, proceed ultimately from misapplied classical mechanics. Self-organization through inherently chaotic interactions, the signature of our economic times, cannot emerge from the linear landscape of deterministic mechanical laws.

Replacing such habits of thought in the minds of managers requires movements such as "bionomics"--and, more generally, a reminder that biology itself is no fixed set of immutable ideas. Evolution itself evolves as an idea, in our time introducing punctuated equilibria, sociobiology, and cladistics--the method of looking through time for evolutionary relationships among creatures.

All these will have to be tested, and in turn will generate fresh analogies for our views of economics. We shall see economic analogies that use not mere competition/cooperation balances, but complex, nonlinear responses to ever-changing circumstances. Predator/prey evolve into symbiotes; nature need not be red in tooth and claw.

Once our technologies learn the trick of reproduction à la nature, not à la factory, we may see a collision between the classical economy of scarcity and one of bio-plenty. Thinkers like Freeman Dyson have been pointing out that the specters which haunt our present--strip-mining and burning up our dwindling resources--may be as narrow a vision as was Spain's obsession with taking gold out of the New World, while missing tobacco, the potato, "love apples" (tomatoes), and the rest.

Biotech opens the promise that the truly fundamental resources will be sunlight, water, organic chemicals, and land--privileging the tropical South and "green tech." This could neatly turn the tables on the industrial, "gray tech" North which will develop the biotech in the first place. Spain, after all, sent Columbus, but missed the boat conceptually in the following century.

An immense payoff for a small but self-reproducing investment of "smart" biotech is a daunting possibility. We dropped rabbits into Australia before we knew their long-range impact. This point is not lost on the Luddites of our time, the Jeremy Rifkin crowd that fears any biotech product and considers animal husbandry "slavery."

The Wild Blue Maybe

One of the troubles with such apparently open-ended future projections is that we have no firm idea of what the limitations on biotech will be. Chances are, they'll be wilder than we think. The Frenchmen who first rode hot air balloons and gazed up at the lunar crescent surely did not glimpse the centuries-long path that led through the airplane and the rocket to Tranquility Base.

The most complex riddle in biology is our own brains, possessing about 100,000 times the connections in a state-of-the-art Cray supercomputer. These connections work about a hundred thousand times faster than the comparable computer networks. This yields an organism with about 10 billion times the capabilities of our billion-dollar number crunchers. Consider what could be done by modifying some of the wiring diagram of that brain, or perhaps just some of its inherent chemistry. The potential for vast improvement--and vast damage--is immense.
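
Taking those two factors at face value, the arithmetic behind the last figure is simply

$$10^{5}\ \text{(more connections)} \times 10^{5}\ \text{(faster)} = 10^{10},$$

or about ten billion.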

Our currently common idea of software running on hardware works for machines, but not for brains. Our brains don't just store data in files. They modify themselves in response to strong inputs, laying down fresh patterns. This is why your memories of an incident can be modified by hearing another's version, or seeing a film of it. Brains form new routes for thinking--self-programming and self-hardwiring.

To reflect this, I think we will need a new category--liveware.

Like art, "living" is a property nobody can define exactly but everybody thinks they can recognize. The virtue of live technology is the same as the dray horse--it can look after itself, in its own fashion. Cropping grass, burning that grass for energy in its belly, relieving itself, the horse does a lot of its own maintenance. Liveware would similarly police its own act, and be able to make copies of itself in the bargain, just like the dray.

A bioteched piece of liveware should be patentable. Europeans generally recoil from patenting any "living invention," whether a gene, a cell, an engineered plant, or a human body part. In the United States, patents are commonly given. We shall see a major collision between voices of environmentalism, "social justice" and religion, and international corporations like Monsanto. American patent expert Rebecca Eisenberg recently observed, "In the U.S., we think of the morality issue as outside the realm of the patent system."

This fundamental division will accelerate as business tries to patent genes and plant traits extracted from the tropical world. The Department of Commerce estimates that life patents will be worth $60 billion worldwide by 2010. This is a deep, gut issue, like abortion, and it is approaching quickly.

But what's patentable is also, alas, mutable. Once made, it can undergo mutation and make something we did not intend. That's evolution, folks--biological, social, psychological. As the title to Kevin Kelly's survey of these developments suggests, in many unsettling ways, the Biological Century will be Out of Control.

This very quality collides spectacularly with the pervasive scientific illiteracy of most industrial societies, especially the United States. Throughout the early 1990s, the University of California at San Francisco waged a costly, ultimately unsuccessful battle with nearby residents simply to carry on research. They were routinely accused of letting loose infectious pathogens, toxic wastes, and radioactivity. In public hearings, one excitable citizen suggested that doing recombinant DNA work had produced the AIDS virus. Another declared on television her outrage that "those people are bringing DNA into my neighborhood."

By the time the public begins to see just how rapidly change can come, the Easy Era will be over, never to return.

Human Categories

Intention is the crux of the moral issues we will face. The abortion battles of our day will pale compared with the far more intimate and intricate capabilities that yawn just a decade or two away. In the United States, abortion hasn't gone away as an issue--and the issue is essentially unchanged. The nature of biotech is change. What will happen, then, when changes come thick and fast, as they are already starting to?

Consider: Brahmins in India use amniocentesis to determine the sex of a fetus early in pregnancy--and then preferentially abort the girls, because sons are more prized. This "genetic counseling" frames a typical conflict between our easy categories. Where does "reproductive choice" end when it systematically acts against females? Allowed to go on, it could produce harrowing imbalances far from the near-50/50 ratio of the sexes--a testosterone-steeped society with ever-more crime and war. The questions can only get tougher.

And more subtle, as well. The first genetic tuneups will be for the elimination of inheritable diseases--kidney disorders, hemophilia, and the like. Such single-gene tailoring could appear by 2000. The pope will oppose it as the opening wedge, and the battle will be joined.

But it won't be settled, only broadened. Then will come genetic cosmetics: tailoring for eye and hair color, skin tint, maybe breast size (look at the implant industry today), and height. We do not know if these are controlled by single genes, but probably some are, and the others will prove to have only a few loci.

We're already familiar with the yuppie competition to get Junior into the very best kindergarten. What expense will parents spare if, a decade or two into the next century, they can tailor their children for beauty? A firm jaw for men, firm breasts for women? We all know that good-looking people do well. What parent could resist the argument that they were giving their child a powerful leg up (maybe literally) in a brave new competitive world?

This will outrage many. Science is being perverted, they will say. From the noble elimination of a hideous disorder, like hemophilia, we will descend to the mere pursuit of transient appearances.

Somewhere, law (excruciatingly arrived at), or fashion (often underestimated, never absent), or deeper arguments (heard only by an elite) will draw the line. There will remain the familiar problem of oversubscription. Just as a bachelor's degree was once a proud emblem and is now tarnished by being commonplace, beauty--and, lest we grow smug, maybe even brains--will come to be so. Indeed, since beauty is another form of fashion, generations may sport characteristic, trendy noses and thighs, just as we currently see passing fads in children's names. But names can be changed with the stroke of a pen.

Of course, the first genetic editing and rewriting will be done for the rich. When F. Scott Fitzgerald told Ernest Hemingway that the rich were different, Hemingway could confidently reply, "Sure, they have more money." No longer will that be strictly true. Rancor arising from built-in superiority, by right of inheritance, could soar.

One of our challenges will be to spread the benefit, or else see growing class separations of frightful complexity and depth. We could reach the stage in which one could spot the rich by their looks, or even their smarts. Or their mates. Classical liberalism holds that information is good. If you can afford it, that is.

Why, then, should a prospective bride not know the precise genetic endowment she would get from a candidate swain? We are just beginning to consider whether a genetic propensity for disease should be made known to insurance companies or employers.

Those legal battles can be settled in the context of privacy rights. But how about something as intensely personal as marriage? People care deeply about their children. It seems plausible that they would want to know what they are getting before going to the altar.

Being Human

All these naturally arising problems will tend to make us think of other people as anthologies of genetic traits--to atomize, a most thoroughly modern impulse. This reflects science's tendency to slice and dice experience for convenience of analysis, but it is a poor model for knitting up the already raveled threads of a tattered society.

So somewhere, a line will be drawn. It had better be fixed by open public debate, rather than by our current method of leaving it up to lawyers in courtrooms, who usually know little and care less. Biology touches the wellsprings of our deepest emotions, and will make posturing before juries even worse than now.

Other developments, just over the horizon, will probably force us to entirely rethink present ideas of good and evil. Within a generation, we will probably be able to make cocaine from a bacterial culture. Kids will grow it--or morphine, or opium--in bathtubs, not in elaborate labs.

This will do for our current drug prohibition what home-brewed beer did for Prohibition. Even easier ways are plausible: say, a bacterium which lives in your digestive tract and makes just the right level of cocaine every day. (Something like this has happened naturally. A patient turned up who was permanently drunk, from a yeast which made alcohol in his innards. Therapy freed him of a condition others might have envied.) Far more exotic methods of eluding detection, and of making new designer drugs, will no doubt emerge.

Such a ready supply will almost certainly doom a simple "War On Drugs" approach. Legalizing, taxing, and regulating their use will come to be far cheaper than following a Prohibition mentality against an ever-improving biotechnology.

In fact, I believe it already is cheaper and smarter. We have over 1.3 million people in American prisons, the majority for drug-related crime. The average sentence for murder in California is fewer years (eight) than the average sentence for drug crimes.

Prohibition of anything is about power and imposing a uniform value system. Technology in the next century will probably act against central control. This will push our cultural boundaries, with biotech steadily increasing the friction. In the end this may force a new social solution, resembling the European programs already using partial legalization, combined with the social pressure that has reduced tobacco and alcohol use.

Dreams and Dreads

Out of playfulness--and cowardice--I've scrambled many ideas together without talking about when they might come.

To orient ourselves, I would call "mundane" the measures that have obvious market roles right away, and little social resistance. This includes pollution policers, simple bathroom cleaners, crops that resist pests and herbicides, pharm animals, "designer" plants (blue roses, low-cal fruit), bacterial mining, and the like. Even correcting human inheritable diseases will probably go through without major opposition. All this, perhaps within the first two decades of the new century.

The battles will begin in earnest with conceivable but startling capabilities. The list is long. Big changes in our own genome. Harnessing natural behaviors to new tasks (the acacia ant-corn harvesting marriage). Designer animals, like a green Siamese cat to match your furniture, or even a talking collie (and what would it say?). These may preoccupy the middle of the next century.

Even further out would be major alterations in the biosphere, and in us. Adapting ourselves to live in a vacuum or beneath the sea, or to convert sunlight directly into energy, would alter the human prospect beyond recognition. Changing Homo sapiens to something beyond will be a step fraught with emotion and peril. Such issues will loom large as the Biological Century runs out. And what could lurk beyond that horizon? The mind boggles.

All these are mere glimpses of what awaits us. A century is an enormous span, stretching our foresight to the full. Reflect that H.G. Wells's The Time Machine appeared only a century ago, in 1895. Biotech, as Aldous Huxley foresaw in Brave New World (1932), can usher in as profound a revolution as industrialization did in the early 19th century. It will parallel vast other themes--the expansion of artificial intelligence, the opening of the inner solar system to economic use, and much, much more.

The Achilles' heel of predictions is that we cannot know the limitations of a technology until we get there. A 19th-century dreamer might easily generalize from the forthcoming radio to envision sending not merely messages by the new "wireless," but cargoes and even people. Matter, after all, is at bottom a "message."

But there's more to it than that, and the awesome radio didn't develop into a matter transmitter, which is no closer to reality than it was then.

So undoubtedly I'm wrong about some of these analogy-dreams, particularly the timing. What I will bet on is that, despite the current fashion for "nanotechnology"--artifice on the scale of a nanometer, the molecular level--biotech will come first. It is easier to implement, because the tiny "programs" built into lifeforms have been written for us by Nature, and tested in her lab.

In fact, some of the most interesting prospects of nanotech-like thinking come from biological materials. The basic mystery in biology is how proteins figure out how to fold themselves, which determines myriad biological functions. An obvious long-chain molecule to fold and use as a construction material is DNA itself. A self-replicating "bio-brick" could be as strong as any plastic.

Consider adding bells and whistles at the molecular level, through processes of DNA alteration. Presumably one could then make intricately malleable substances, capable of withstanding a lot of wear and able to grow more of themselves when needed.

It isn't fundamentally crazy to think of side-stepping the entire manufacturing process for even bulky, ordinary objects, like houses. We have always grown trees, cut them into pieces, and then put the boards back together to make our homes. Maybe we will someday grow rooms intact, right from the root, customized down to the doorsills and window sizes. Choose your rooms, plant carefully, add water and step back. Cut out the middleman.

Whether such dreams ever happen, it seems clear that using biology's instructions will change the terms of social debate before nanotech gets off the ground.

The rate of change of our own conception of ourselves will probably speed up from its present already breakneck pace. The truly revolutionary force in modern times has been science, far more than "radical" politics or the like. This seems likely to be even more true in the future.

Yet the above examples underscore the implications of leaving genetic choices to individuals. Society has some voice in defining boundaries, surely. But typically we arrive at consensus slowly, whereas biotech speeds ahead. Perhaps here we see the beginnings of a profound alteration in the essential doctrine of modern liberal democratic ideology. There may be genetic paths we will choose to block. How do we recognize them, quickly?

Our species has made enormous progress through swift cultural evolution. Now that quick uptake on changing conditions can come from deep, genetic change. We will hold the steering wheel, however shaky our grip, and not pitiless, random mutation.

We will emerge from a Biological Century with a profoundly different world view. Perhaps some new technology will promise to shake our foundations in the dawn of the 2100s, too. Mercifully, we cannot see that far.

Our prospect is wondrous and troubling enough. It is as though prodigious, bountiful Nature for billions of years has tossed off variations on its themes like a careless, prolific Picasso. Now Nature finds that one of its casual creations has come back with a piercing, searching vision--and its own pictures to paint.

Gregory Benford (gbenford@uci.edu) is a professor of physics at the University of California, Irvine and the author of Timescape. This is the first article in a multi-author series on science and society.