I am a clone.

Or rather, I am better than one. Or so any identical twin surely must see the matter.

The recent media feeding frenzy about a cloned sheep, Dolly, showed us journalism in its fullest modern form. Many of those writing about this genuine watershed moment in techno-culture followed current journalistic practice: their foremost research instrument was the telephone. Of those who called me--an unlikely authority, since I am not a biologist--none realized that DNA does not solely determine the heritage a child gets from its parents.

My brother, Jim, and I shared a womb without a view for nine months. (Though not always restfully, our mother reports.) Genetically identical, we also enjoyed the same currents and chemicals of our mother. After a rather traumatic birth--we both had our appendixes removed within days--we were brought up in the same house, by constantly attentive parents, and even wore matching clothes until our late teens. (How much trauma this clothing ritual induced in our personalities I leave to others to decide; suffice to say that being seen as sugary-cute has left me with a decided prejudice against sweets in any form.)

True twins share womb chemistry and endure many fateful slings and arrows together. The fabled connection between twins is true, in my case. We are distantly dismissive of mere fraternal twins (different DNA) and regard all others as "singletons," those condemned by birth to endure the isolation of never truly sharing the intuitive grasp that we enjoy without paying a price.

Or nearly so. There is mild statistical evidence that identicals have slightly lower IQ. This might plausibly be so; the comfort of ready communication may well lead to a certain mental laziness.

Jim and I felt the opposite. Reared in rural southern Alabama, we enjoyed an idyllic Huck Finn boyhood. But education there was casual at best. Our mother and father were high school teachers, and challenged the pervasive easy-going ignorance. We attended a one-room schoolhouse, with each row of seats a separate grade. Against this my brother and I united, reading widely and enjoying the clash of cultures which paraded by. After we turned nine, our father became a career Army officer, whisking us to Japan for three years, Germany for another three, and further isolating the twins from a continuity that might have sucked us into the conventional.

So we are an odd pair even among twins. Jim got his doctorate from the same institution as I did, UC San Diego, in the same area (plasma physics) and now lives a few kilometers from where I once lived, in northern California. Such correlations appear often among twins. We grow up in a culture of sameness, so have a sense of self always shared.

Among singletons, interest in twins is enduring. Do we feel some mystical sense of connection? Of course; but whether "mystical" is the right word for it is another matter. I am writing this at 35,000 feet over Greenland, on the way back to UC Irvine from Lapland. I know without thinking about it that my brother is probably body surfing on a beach near La Jolla, though I have not spoken to him for ten days. I remember his itinerary and without conscious deliberation feel where he is likely to be. This is processing at the unconscious level, and as an experiment, when I see him in two days I shall check with him and let you know the outcome. [Later: my estimate was right to within the hour.]

But this is scarcely mystical. Instead, I attribute the innumerable similar incidents in our lives to a lot of automatic thinking, based on intuitions cooked up through more than five decades. To singletons this can look uncanny.

Speaking as a twin, clones seem a lesser form. They grow up in a later era than their genetic duplicates, with different upbringings. Would knowing that they were genetic duplicates trouble them? Surely such people would not be inherently more mentally fragile; Siamese twins are far more like each other than ordinary twins, yet suffer no higher incidence of mental illness than is usual, suggesting that even extreme parallels in nature and nurture are not damaging.

The furor over Dolly puzzled me with the emotional level of its debate. Reasonable people like political commentator George Will asked, "What if the great given--a human being is a product of the union of a man and a woman--is no longer a given?" This question properly belongs to a broader issue in biotechnology, the entire field of artificial birth in all forms, for there are no precise boundaries in this new territory.

Certainly I see no reason why society should prevent grieving parents from having a baby cloned from the cells of a dead child, if they wish. Beyond such emotionally wrenching cases, where should we erect walls? Oxford biologist Richard Dawkins asserted that he could see purely intellectual issues intriguing enough to justify cloning himself: "I think it would be mind-bogglingly fascinating to watch a younger edition of myself growing up in the twenty-first century instead of the 1940s."

Many no doubt find his position puzzling or even immoral or disgusting. Even so, why should Dawkins be prevented from having a cloned child? What is society's mandate?

The Dolly debate produced several claims that cloning violated the fundamental principle of individual dignity. Twins certainly belie that argument. Fears of armies of interchangeable people, usually pictured marching robotically onward, come from a simple-minded genetic determinism. And the grounds for a principle of uniqueness seem vague at best.

After all, why treat clones differently?--we twins and clones are all "monozygotes", as the biologists put it. In fact, clones necessarily separate in outside influences from their first moments in the womb, for the wombs are different. Another's DNA inserted into a host egg will acquire "maternal factors" from the proteins of that egg, affecting later development. The womb's complex chemical mix varies with each mother, so nine months of different "weather" will change the outcome in the fetus; the baby will not be a photocopy of its older original.

And clones will be full-fledged people with all rights attendant to that status. Nobody forces twins to serve as organ farms for their other twin; clones would have the same legal status.

The first true use of cloning will undoubtedly be in the "copying" of highly selected farm animals. These could first be excellent milk cows or racing horses. More futuristically, we shall see--and quite soon--the cloning of "pharm" animals which yield biotech products of use to us, such as insulin-rich milk from cows, and a whole array of therapeutic hormones, enzymes and proteins.

Plants have already been extensively engineered. More than three-quarters of the cotton grown in Alabama last year was genetically tuned to kill predatory insects. Already scientists are experimenting with cotton plants that contain polyester fibers, too, surely a boon for fans of leisure suits.

Still, cloning should indeed furrow the brow of long-perspective thinkers. We believe sexual reproduction holds sway over much of the kingdom of life because it provides ever-new gene mixes, allowing a species to build fresh defenses against the ever-mutating pathogens that infest the natural world. The perpetual arms race between prey and predator favors sex as a defense. Seen this way, we are men and women because the primary predators on humans have always been microbes, not tigers.

So "pharm" animals cloned over and over will face the very real threat of infectious diseases which wipe out a herd overnight. But surely nobody will clone huge numbers of humans, so such plagues will be quite unlikely. The breeds of influenza that regularly attack us genetically diverse humans will do far more damage.

As I write this, a presidential panel seems about to recommend a uniform federal ban on human cloning experiments. I believe this will be a mistake, generally, and an ineffective move anyway. The technology is fairly simple; others will pick it up. In Latin American countries or on offshore islands, clinics will offer the service at a hefty charge. Underground, without legal oversight, we will indeed see some tragedies and even horrors.

Bioethics is a field with many practitioners but few obviously qualified savants. Often the bans which spring from such federal committees prove ill-advised, their only long-term effects negative. This was the case with the two-year moratorium on recombinant DNA, which simply slowed the field without deciding anything. So did similar bans on selling organs or blood, and, I predict, so shall the recent Clinton prohibition on using human embryos in federally backed medical research. The ultimate price for these momentary interruptions--and so far they have always been momentary--is lives lost because the resultant technology arrives too late for some patients.

Bioethicists tend to see problems everywhere, and saying no gives them visible power. Letting technology evolve willy-nilly, responding to what people want -- maybe even people without advanced degrees! -- gives bioethicists no perks or prominence; unsurprising, then, that they seldom go that route. They aren't the patients clinging to life, or infertile, or stunted in some potentially fixable way.

They also tend to think collectively, omitting the inconvenient needs of real people. Bioethics professor George Annas of Boston University flatly demands, "I want to put the burden of proof on scientists to show us why society needs this before society permits them to go ahead and [do] it." Note that he does not apply this rule to his own work--for instance, by testing that very sentence against its own standard. As Virginia Postrel has noted of Annas and many others, "We will hear the natural equated with the good, and fatalism lauded as maturity. That is a sentiment about which both green romantics and pious conservatives agree."

Indeed. We would save ourselves much trouble if we could agree that the proper place for most bioethical thought lies in counseling those affected, not in dictating the spectrum of possibilities.


Cloning arouses anxieties stemming from a general uncertainty about the very concept of the self. Legally, even the mind-body unity seems shaky. In 1991 the California Supreme Court decided that a cancer patient did not have a right to share in the profits from UCLA's use of his diseased cells to produce new drugs. This meant that a patient does not even own his own body, and so his integral self is not simply bodily.

Consciousness seems to us to be slippery and yet intuitively obvious. We feel ourselves to be the same person all along our life-trajectory, unique and self-contained. Just as an ant colony or a baseball game has an integrity even as its insects or players change, we have an irreducible selfness.

Of course, such assertions are hard to prove. (Indeed, proving who you are is done by showing a partial copy of yourself--fingerprints, or a driver's license.) We all readily assent to knowing that we experience a continuous self.

Yet we fall asleep every day, a loss of conscious continuity. People who lapse into year-long comas can emerge again with the same personality. More striking still, patients in brain operations who have their heads chilled down until they are legally brain dead, with no alpha and beta rhythms at all, are still themselves when they are warmed back up and revived. Their memories and mannerisms come through intact.

Over what spans of time and condition can we keep our sense of selfness unbroken?

The bedrock issue of preserving one's selfness then intersects the increasing interest in prolonging life--if necessary by either freezing oneself after death, or even "uploading" into computers.

Making yourself into computer files and programs fits one present picture of our Self: the mind is software running on the hardware of the brain. The Self, then, is whatever program is running on your customized operating system, one developed by the rubs and rituals of your upbringing.

Of course, such an analogy is suspect, for our brains program themselves, laying down memories in chemical pathways that are not simply erased, and aren't under our conscious control.

But the uploader's central point is that one can copy a mind much as a tape copies a piece of music, without knowing how music is made. The brain, they say, is the same.

Minds are self-organizing, evolving systems, however, unlike fixed musical works; but the image is striking, still. Where does it lead us?

The first novel about uploading, Charles Platt's 1991 The Silicon Man, does not directly confront a basic problem of copying, Levinson's Paradox: To the degree that a copy approaches perfection, it defeats itself. In being an absolutely perfect copy--so that no one can tell it from the original--it transforms the original into a duplicate. This means the perfect copy is no longer a perfect copy, because it has obliterated, rather than preserved, the uniqueness of the original--and thus failed to copy a central aspect of the original.

A perfect, artificial human intelligence would inevitably have this effect on its natural original. SF author Paul Levinson pointed out this feature in the 1980s, hinting that it portended even deeper problems. While the paradox may seem a mere logical quibble, it underlines how little we know of how much fidelity to the original truly implies that the self has been preserved.

No mere technological improvement can remove this logical difficulty. Given enough memory and maintenance, we could sustain digital versions of ourselves, assuming that the recording process would not destroy our fleshy originals.

This raises great troubles, though. Termed variously Dittos, Duplicates or Copies, these digital entities lead a tenuous existence. Real, fleshy folk would decisively reject the Copy Fallacy: the belief that a digital Self was identical to the Original, and that an Original should feel that a Ditto itself somehow carried him forward into immortality. (As long as nobody pulls the plug, of course.)

Refuting this Copy Fallacy is straightforward. Imagine yourself promised that you will be resurrected digitally, immediately after your death. Assign a price tag you will pay for that, insurance of a sort. Then imagine the guy who sells you on this notion saying that, uh, well, maybe it would not be started right away, but sometime in the future...we promise. As that date recedes, people's enthusiasm for paying for Self Copies dims--demonstrating that it is the hope of continuity they unconsciously relish.

As an identical twin, I have never bought the Copy Fallacy in any form. Though my brother and I have diverged in personality and appearance, due to differing environments and histories, for the first twenty years of our lives few could tell us apart. He and I could, though, and that's the nub of the argument: the Self is defined internally, not externally.

In the end, Copies benefit themselves, not the dead; machine immortality is more like having your twin live on, not yourself.

Some thinkers about computer identities, in the years since publication of The Silicon Man, have begun to push an agenda of Copy rights -- the expansion of classical liberty into the digital wilderness. Dittos still will be people, the argument goes, with different skills and drawbacks, rather like the "differently abled."

The freedom to change your own clock speed, morph into anything, or even remake your own mind, goes along with the admitted liability of not being physically real. Unable to literally walk the streets, they will be like amputated souls.

Platt envisioned tele-presencing and some digital prosthetics that might reach in limited fashion into the concrete universe, but these would be re-creations; if a Ditto feared for its life, why lurk fully in the dangerous real world?

"Rights" for Dittos also get tied up with our own deep-seated fears--of digital immortals who amass wealth and like a fungus reach into every avenue of natural, real lives; parasites, nothing less. Platt plainly foresees issues looming over the horizon, as soon as the digital world amasses financial power. Tycoon Dittos!

Running a Ditto of your Self, then giving it autonomy, means it could get rich and also change itself. Your Ditto could shape its own motivations, goals, habits, edit away memories and tastes. It then stops mimicking your own evolution. Your Ditto could erase any liking for Impressionist Opera and overlay instead a passion for rap, enjoying rhythms that would have bored the true Self into a coma. The easy access of a Ditto to his entire underpinning -- unlike ourselves, with much of our personality lying in our subconscious and not consciously fixable -- implies constant change, personality tinkering, perhaps worse.


Is consciousness just a property of special algorithms, sliding sheets of information, digital packets jumping through conceptual hoops? How we envision our selfness depends on this huge question, now a hot topic.

Does a model simulating watching a sunset have to feel the same way its Original did? Why doubt simulated consciousness, when nobody asked the same question of programs that balance checkbooks? Such issues perplex many philosophers today, but I think feeling one's way through them in fiction is a rather more revealing path than abstract argument.

Consider that a Ditto is forcefully reminded that he is not the Original, but a mere fog of digits. All that gives him a sense of Self as continuity is the endless stepping forward of pattern. In people, the "real algorithm" computes itself by firing synapses, ringing nerves, getting the feel of continuity from the dance of cause and effect.

Dittos on the other hand are simply time-stepped forward, in processes that could just as easily run backward without the Ditto even noticing. Even time is fragile, a convention, in a digital universe.

Dittos surely would stand on shaky metaphysical ground here. Would we find that a Ditto fidgeted out of pure self-anxiety? His digital stress chem shoots up, metabolics lurch, heart-sims hammer, lungs flutter in intense uneasiness? Would typical Dittos talk incessantly, acutely uncomfortable, and make odd demands of their keepers?--that they be edited, truncated, improved, perhaps finally killed?

The dream of bodiless existence does not imply the end of the human condition, if we are still truly simulating humans.

Consider how well one would have to describe what our everyday life is like. Making a Ditto's body seem right to its critical intelligence demands sets of overlapping rules. After all, the Ditto remembers what a pleasure eating, say, used to be, back there in the gritty, real world.

As he (or she) chews, teeth have to thunk down on food, saliva squirt to greet the munched mass, enzymes start working to extract the right nutrient ratios. The program can bypass the involved stomach and colon processes, simplifying into a satisfying concentration of blood sugars, giving him a carbohydrate lift, a pleasant electrolyte balance, hormones and stabilizers all calculated with patchwork templates for the appropriate emotional levels.

The body becomes a set of recipes for seeming like oneself. No underlying physics or biology at work, just a good-enough fake, put in by hand--the unseen hand of some Programmer God. So emerges an existential angst as profound as anything Camus felt, surely.

All other detail can be discarded, once the subroutines get the right effect, simulating the tingling of nerve endings. All this is to ground a visceral sense of Self, seemingly rock-solid, though really just a patched-in slug of digits, orchestrated by a mosaic of ten thousand ad hoc rules, running together.

So much effort, just to approximate what we get for free every day!

But of course, digital selves need not age or die, as long as somebody pays the power bill and doesn't pull the plug. Ordinary fear of mortal death will become a fear of being cut off, your Self never run again. Each interruption in running the Self will come to a Ditto as a possible final end, for he cannot act in the world while not running.

Indeed, when booted up again, he might not be able to tell he had suffered a hesitation-death, or whether it was a mere second or a hundred years of real time. This is a kind of heavenly eternity, to some. To me it seems like a hell of existential anxiety. If, to us twins, singletons seem to go through life with rather rickety mental identities, think of a Ditto's lot!

But the possibilities! proclaim some of the uploading Digiterati.

With enough computing space and speed, one could be King Me the Magnanimous, endowing many proto-Michelangelos with creative time...or perhaps becoming Michelangelo oneself, with time. What if genius is just a matter of accumulating greater computing capacity?

Rebuilding yourself from the ground up then emerges as at least a hope. That which is buried in the digits might be harvested, changed. Learn to freeze-frame your own emotional states, like painting a self-portrait for study later. Perhaps that could help one understand oneself, like a botanist putting himself on a slide and under a microscope. Could slices of the Self, multiplied, be the Self? With even emotions as programs?

Such ideas run through Platt's seminal book. They provoked later writers like Greg Egan, who in Permutation City sees a special SelfHood Suite menu, eerie in its temptations. With it one could present interior-configurations as separate subroutines, elements in the modeled brain. Here, grouped under headings--Qualms, Anxieties, Aversions, Likes, Habits, Unconscious Appeals--could rest items he could edit, improve, erase entirely. Not knowing what the Self is, which irreducible kernel of menu items defines oneself, for oneself, suggests that Dittos will be tempted into rapt navel gazing.

Given the chance, which would you choose in pursuit of "immortality" -- uploading or cryonics? Forget about the probabilities of success -- each seems fraught with peril. (In an earlier column I estimated that the probability of being successfully frozen and later revived by future technology was at best one percent. A small chance, but infinitely larger than plain, flat zero...)

Uploading gives a pure Copy; cryonics yields your own brain, no doubt altered by much chemistry and microengineering necessary to pull your consciousness back out of the ice.

Which is truer? As far as I know, sf has yet to confront this question.

My intuitive choice is cryonics. At least in its perfect form, you recover the true you, the original synapses and holistic organization of the hopelessly complex brain. With uploading, you at best get a model of yourself, a rendering in 0s and 1s which reproduces for an outside audience--though not including you, the true best critic--your basic personality and memories.

Of course, having your brain frozen leaves out much: your physiology, your body instincts, will have to come from some body grown from your own cells (the reproductive ones, probably) to accompany the revival of your brain.

Here the cloning of humans is essential. Current cryonics organizations (I know of four) routinely preserve not just the head of their patients, but the reproductive organs and other body samples.

The idea is to send forward in time as much information as possible. While some patients elect to have their entire bodies frozen in liquid nitrogen, a far more expensive proposition, most take the head-only route.

They anticipate that a body can be cloned for them at some far future date when it will not only be technically possible, but even fairly inexpensive. Even more, they trust that no medical prohibitions will have halted cloning research. Further, cryonicists hope that cloning technology will have avoided the clear dangers.

But if a body is grown for a defrosted and repaired head, what becomes of the body's head? Was it deliberately stunted from "birth", so that it never developed as a conscious human? It seems unlikely that anyone could grow a body in some chemical vat, no matter how sophisticated, without using the many complex functions that the brain provides for that body.

So even to envision cryonics proceeding, one must require that future society has solved both scientific and moral questions about selfness and its implications. This is not an easy future to foresee, not at all.

But remember that the future is infinite, or at least very long indeed. Note how primitive medicine was a mere century ago. A few more centuries of steady growth could yield a social and philosophical landscape beyond our present comprehension.

Suppose cryonics could work. You would have grabbed back from time's maw the pure raw stuff of Self. Cybernetics gives a digital model, one always suspect because it has to choose how to configure the myriad data points of any brain-readout.

Choice begets the particular, and to have the whole Self, you must have the true, full general Self, in whatever deep labyrinths it lies.