You and Me Baby Ain't Nothing but Mammals


At The Chronicle of Higher Education, the authors of a forthcoming book on literature and biology note a common thread running through fictional dystopias: rejection of the idea of human nature. It got me thinking, though: O'Brien is surely mistaken when he claims that he and the Party "make human nature," and even Huxley's vat-bred good citizens seem to be pretty standard-issue Homo sapiens. But it's increasingly clear just how contingent that is: eventually we will make human nature. Which brought to mind a line from an old Bruce Sterling essay:

Human thought itself, in its unprecedented guise as computer software, is becoming something to be crystallized, replicated, made a commodity. Even the insides of our brains aren't sacred; on the contrary, the human brain is a primary target of increasingly successful research, ontological and spiritual questions be damned. The idea that, under these circumstances, Human Nature is somehow destined to prevail against the Great Machine, is simply silly; it seems weirdly beside the point. It's as if a rodent philosopher in a lab-cage, about to have his brain bored and wired for the edification of Big Science, were to piously declare that in the end Rodent Nature must triumph.

Anything that can be done to a rat can be done to a human being. And we can do most anything to rats.

Which got me wondering…if we passed the planet on to a successor species that didn't want or need freedom, say, or love, and were perfectly happy without those things, would we (now) have reason to regard that as a bad thing? Do we care about those things because they're important to us, or do we further want them to remain important to whoever mans the great gap between us and the giant telepathic cockroaches?