For the past several years, technology has been a major topic of well-received books–most of them resolutely opposed to the idea. Seeing no reason why thoughtful, well-written books should be limited to condemning the artifacts of human ingenuity, REASON asked a number of writers and scholars to recommend three books–preferably including a work of fiction–that provide a more positive, or at least more complex, view of the relationship between human beings and the made world.
Walter Truett Anderson
In recent times we have seen the emergence of a new polarization–anti-technology vs. pro-technology, Luddite vs. techie. The neo-Luddites dream of people leaving technology behind and advancing into a future that looks–well, looks a lot like the past. The techies dream of artificial intelligence–computers so brilliant that they can advance and leave people behind. (See, for example, the cyberpunk classic Neuromancer.) Along with this goes a lot of argument–much of it useless hyperbole–about whether technology is good or bad, destroyer or savior.
What we really need to do, rather than take sides in any such simplistic fistfights, is to understand how inseparable technological change is from human evolution. Technology is us.
Two books that I will mention here address this issue straight on. The third, a work of science fiction, illuminates it in more indirect ways, as fiction should.
Origins of the Modern Mind (1991), by the Canadian psychologist Merlin Donald, argues that the human species has evolved by developing new "systems of representation," and that at each stage–as people invent new ways to communicate and manage information–we become in fact a different species. The first big jump, he says, was the invention of mimesis. Then came speech, then–much later–writing. We are now in the midst of another such transition, and it is literally changing the way we think: "The growth of the external memory system," he says, "has now so far outpaced biological memory that it is no exaggeration to say that we are permanently wedded to our great invention, in a cognitive symbiosis unique in nature." What this means is that we are now evolving into computer-connected beings with a computer culture and a computer civilization.
Bruce Mazlish of MIT makes a similar point in The Fourth Discontinuity (1993), although his framework is more historical than evolutionary. He takes his title from the proposition that the human species has in recent centuries gone through a number of "discontinuities," each of which involved learning new–and disturbing–lessons about the world and our place in it. We learned the Copernican lesson that our planet is not discontinuous from the heavenly bodies, we learned the Darwinian lesson that humans are not discontinuous from the animals, and we learned the Freudian lesson that the conscious mind is not discontinuous from its preconscious origins. Now, he says, "humans are on the threshold of decisively breaking past the discontinuity between themselves and machines," discovering "that tools and machines are inseparable from evolving human nature." Mazlish's book doesn't footnote Donald's, and I don't think that is an academic oversight. Rather, I suspect he merely moved along his own disciplinary path (he is a historian) and came to a quite similar conclusion.
Kim Stanley Robinson isn't trying to make any such point in Blue Mars (1996), but he makes quite a few anyway. This is the third volume in a trilogy (the previous installments were Red Mars and Green Mars) about an expedition to Mars that results in the deliberate transformation of the planetary ecology and the growth of a new human civilization. Robinson's books reflect most of the current scientific thinking about "terraforming," and also show how that issue might lead to a new kind of technophobe-technophile argument. The major political groupings in his story are the Greens, who are eager to modify the planet, and the Reds (more or less similar to the Greens here on Earth), who prefer to leave it alone. In the books the Reds win many of the arguments, but the Greens proceed to change Mars–while human beings move on to terraform Venus, various asteroids, and the moons of the outer planets. Along the way they go through several technological revolutions and evolve some fancy artificial intelligence, but remain recognizably human. Technological change and human evolution proceed inseparably from one another, much as (according to Donald and Mazlish) they always have.
Walter Truett Anderson (email@example.com) is a political scientist, journalist, and author/editor of numerous books–most recently Evolution Isn't What It Used to Be: The Augmented Animal and the Whole Wired World (W.H. Freeman) and The Truth About the Truth: De-confusing and Re-constructing the Postmodern World (Tarcher/Putnam).
Stephen Cox
To grasp the significance of technology, it's helpful to look at a society that didn't have much of it to go around. Conquest (1993), Hugh Thomas's magisterial account of the destruction of the Aztec Empire, shows precisely how far a society could advance without wheels, nails, or candles. (The lack of firearms was a comparatively minor problem.) Thomas demonstrates what can and can't be done in such a society, and he dramatically illustrates its vulnerability to any competitor that has marginally less primitive tools.
But material technology is the child of intellectual technology, whose best conquests are peaceful ones. A remarkable example is the sudden triumph of agriculture on the North American plains–an effect of the advanced intellectual technology of free enterprise. Willa Cather's great novel O Pioneers! (1913) richly evokes the experience. Cather's protagonist is a young woman who is distinguished by her skillful use of capitalist methods. Hoping to do more than scratch out a modest living through sheer hard work, she takes the risk of thinking. She invests in real estate and farming methods that other people scorn, and her investments pay off. She transforms both her land and her life.
You can't understand technology without understanding how people think about technology, and, of course, you need to know the bad ideas as well as the good. That's why I want to insert a recommendation of just one influentially bad book, Thorstein Veblen's The Engineers and the Price System (1921). Veblen ably advocates a leading myth of the machine age: the idea that material technology "advances" because of people's collective efforts, only to be manipulated and hindered by capitalists for the sake of their private profits. This idea represents a profound misapprehension of the ways in which material technology is affected by investment, market prices, and property rights. Veblen recommended that capitalists be replaced by a "Soviet of technicians" that could "take care of the material welfare of the underlying population"–a proposal that is either chilling or comic, depending on the way you want to take it, but that is very much in the 20th-century spirit.
One of the finest books written in opposition to that spirit is Isabel Paterson's The God of the Machine (1943; republished 1993, with an introduction by me). Paterson offers a complex and compelling theory of history that explains the relationship between a dynamically developing material technology and the concepts of individual rights that are fundamental to capitalism. And she adds a warning to anyone who assumes that the industrial machine on which modern life depends will keep on humming even in a world dominated by social engineering. "For the very reason that the action of inanimate machinery is predetermined," she says, "the men who use it must be free. No other arrangement is feasible." After reading Paterson, one can hardly look at a toaster, much less a computer, without thinking of the Bill of Rights. But that's as it should be. It's not an accident that rights came first and toasters came second.
Stephen Cox (firstname.lastname@example.org) is a professor of literature at the University of California, San Diego, and the author of Love and Logic: The Evolution of Blake's Thought (University of Michigan Press).
Penn Jillette
When you're in a Spielberg state of mind, try this: Take a baby from 150,000 years ago and raise him/her in modern Manhattan. What have you got? You've got a 21st-century kid, with in-line skates. Now, take the next kid born now and send him/her back 150,000 years and what have you got? Some grub-scrounging missing link.
Technology is all that matters. Technology is all that makes us human. You want books on technology? Every goddamned book is about technology. Every conversation is technology. Technology is all we got. If you don't like technology, you don't like humans.
If you want the above premise written by authors who aren't smartasses, try Making Silent Stones Speak: Human Evolution and the Dawn of Technology (1993), by Kathy D. Schick and Nicholas Toth. They're a nutty couple that went out, lived in the bush, made stone-age tools, and used them for wacky stuff like butchering an elephant. Is that science or performance art? It's the best of both. Read it.
You want another book that'll rock your world? Try The Beak of the Finch (1994), by Jonathan Weiner. It's about another cool science-nut couple. The couple is perfect, but the author screws the pooch with biblical references. Jesus and Bible quotes in a science book are pure evil. He also misuses the phrase, "Possession is 9/10ths of the law," and yaps about Zen, global warming, and other hippie Luddite ideas. It's still a great book. This nutty couple goes and lives on one of those rocky, useless, Darwin islands and measures the length of the beaks of the finches (it's a well-named book). Again very close to performance art. They see evolution happening! Technology finding out exactly how we got here.
These books make you love goofy couples. The book I want now is the sex book about these wilderness-science couples. Never mind Pamela Anderson and Tommy Lee, I want to hear the nasty on the professionals that have nothing to do but discover our world like humans and screw like beasts. Those are the techno-videos I want to study.
The cheeses at REASON asked me to include one work of fiction. Why not include the best work of fiction in the world? The Mezzanine (1988), by Nicholson Baker, will kick your ass. I laughed, I cried. What the hell more you want? After science-sex couples use technology to learn about who we are and how we got here, let Nick think about it. His character takes a deep whiff of the madeleine of our modern world and records everything he thinks while riding the escalator. It's an importantly funny book. When your thoughts overlap his, you get that universal-one-world-one-people feel that's so great and when you don't overlap it's "oh-man-we're-all-alone," Camus-city. Both feelings are important. Both are true. The Mezzanine is true.
And it's technology that lets us think together.
Penn Jillette (email@example.com) is more than half of Penn & Teller by weight. He has co-authored Penn & Teller's Cruel Tricks for Dear Friends (Villard) and Penn & Teller's How To Play with Your Food (Villard). He also does some sort of magic show.
Jonathan Kochmer and Jeff Bezos
Jonathan Swift's 1696 satire, "Battle of the Books," describes a passionate war between armies of living books–the Ancients and the Moderns. Exactly 300 years later, equally vigorous yet often simplistic battles rage among today's Ancients and Moderns: technophobes and technophiles.
The Pinball Effect, Out of Control, and Haroun and the Sea of Stories are three books that abandon simple explanation, instead acknowledging complex interdependencies between the makers and the made. They also explore the possibility that distinctions between society and technology are becoming less pronounced.
Everyone recognizes that technological history is complex, but most authors still clutch timelines and linear paths of cause and effect like drowning sailors. In contrast, James Burke's The Pinball Effect (1996) is a stunningly original and joyously otterine swim through the sea of history: Few other books we know of so masterfully document the dizzyingly intricate symbioses of inventors and invention.
Kevin Kelly's Out of Control (1994) demonstrates that human artifacts such as machines, networks, and economies are becoming like organisms, ecologies, and societies–sometimes by design, but increasingly by emergence. Will centralized, engineering-oriented solutions be abandoned as the distinction between the made and the living dissolves? Some call Kelly a technophile, but he is in fact a syncretic biophile: His rallying cry is "life is the ultimate technology."
Finally, Salman Rushdie's Haroun and the Sea of Stories (1990) can be read as a lyrical parable of culture as a technology, with stories portrayed as the machinery of cultural replication. And few would suspect that Rushdie has written one of the best metaphorical descriptions of the Internet: "…the water…was made up of a thousand thousand thousand and one different currents, each one a different color weaving in and out of one another like a liquid tapestry of breathtaking complexity; and Iff explained that these were the Streams of Story, that each coloured strand represented and contained a single tale…as all the stories that had ever been told and many that were still in the process of being invented could be found here, the Ocean of the Streams of Story was in fact the biggest library in the universe….the stories were held here in fluid form, they retained the ability to change…unlike a library of books, the Ocean of the Streams of Story was much more than a storeroom of yarns. It was not dead, but alive….'And if you are very, very careful, or very, very highly skilled, you can dip a cup into the Ocean…and you can fill it with water from a single, pure Stream of Story.'"
Increasingly, the meaningful question may not be whether technology is good or bad, but instead, whether there are substantive differences between the makers and the made.
Jonathan Kochmer (firstname.lastname@example.org) is a senior editor at Amazon.com Books, a Web-based bookseller, and author of four Internet books and scientific articles on evolutionary theory, animal behavior, and climate change. Jeff Bezos (email@example.com) is president of Amazon.
Bart Kosko
I first began to think how technology relates to society when I read and reread Daniel Defoe's Robinson Crusoe (1719) in grade school. The stranded Crusoe is a society of one person for much of the book. He defied the claim of John Donne and Ernest Hemingway that no man is an island. Crusoe was a social island on an ocean island. He fought his way through John Locke's state of nature only to later have to fight the island natives in Thomas Hobbes's war of all against all.
Crusoe used technology to fight these battles for personal and political survival. He used ideas to shape the structure of his world. Crusoe killed goats for their meat and hides. He made weapons from bones and stones. He built a shelter from tree parts and hides. Defoe helped him with the rifle and the shipwrecked barrels of gunpowder that had floated to shore. Crusoe gladly took those gifts from his maker and used them to make himself a better world.
A second book gave me a deeper insight into how technology drives survival and vice versa. I was 18 years old and stood in line at my high school graduation in the small Kansas farm town of Lansing. My physics teacher came up and shook my hand and gave me a book as a graduation gift. He was an Army colonel at nearby Fort Leavenworth. He came to our high school as a type of pro bono effort to teach just one physics class of seven students. The book was Carl Sagan's The Dragons of Eden. It went on to win the Pulitzer Prize in 1978 but was out of step with a Bible Belt farm town with a posted population of a little over 4,000 persons.
Sagan's book showed how the ancient battle between the reptiles and mammals has shaped our brains. Dreaming often involves our reptile-like midbrain. Most of us fear touching a harmless black garter snake more than we fear touching the more dangerous rabbit or raccoon. Brains house their evolutionary history in their present structure and function.
Sagan also made a conjecture that forced me to think about just how much of personal and social behavior stems from genes. You sometimes flinch yourself awake just as you fall asleep. Sagan suggested that this was a reflex that helped keep our hominid ancestors from falling out of their trees at night. The reptiles or "dragons" down below ate those great-grandparents who did not have the reflex or who had it and still fell. This book put me on a path that led me in time to write the text Neural Networks and Fuzzy Systems on the mathematical structure of the brain.
The book also led me in time to test one of its ideas on my newborn daughter just minutes after she emerged from a C-section birth. I held my swaddled daughter as all proud fathers do. Soon I could not resist touching the side of my forefinger to her small bare foot. Sagan was right. Her foot clutched at my finger just as a young monkey's paw would clutch at a tree branch.
Arthur C. Clarke showed where survival and technology could end in his 1956 novel The City and the Stars. A billion years have passed. Earth's mountains have crumbled and its seas have dried up. Humans have been to the stars and come back. They have lived now for eons in a huge city under the control of an omnipotent but benign central computer.
Humans have lost their teeth and nails in this dull steady state. They have lost much of the hominid within them–and with it their thirst for change and adventure. Hope comes only from the rare loner who challenges the system with no chance of changing it.
Bart Kosko is a professor of electrical engineering at the University of Southern California and directs USC's Signal and Image Processing Institute. He is the author of Fuzzy Thinking (Hyperion) and the new Prentice Hall text Fuzzy Engineering. Bantam/Broadway will publish his Heaven in a Chip and Avon will publish his novel Nanotime in the spring of 1997.
Brink Lindsey
The capacity to produce technology–to remake the world around us in the image of our thoughts–is a basic aspect of human nature. It is as old as our first ancestors, who chipped stone axes 2 million years ago: We have named them Homo habilis–"handy man"–because of their toolmaking.
In the past century or so, however, this capacity has achieved an entirely new potency. A sustained, focused, and intricately integrated creative outburst on the part of millions of people has redefined the pace and possibilities of human existence in ways previously only dreamed about. Life dominated by natural rhythms and limits has given way to life mediated and liberated by artifacts.
This transformation, still unfolding, is one of the greatest quantum leaps in human development. But of course technology cannot be disentangled from the rest of human nature, and all its frailties and contradictions. And so the process of building this new, man-made world has been an exceedingly messy one–fluky and unpredictable, often tragic and misunderstood.
In this regard, it is humbling to realize this great release of creative energies was made possible by violence and predation. Throughout history, warfare has done as much as anything to drive technological development. William McNeill's fascinating The Pursuit of Power (1982) tells of the unique contributions made by the intense but inconclusive military competition that roiled medieval and modern Europe. Incessant conflict accelerated the pace of innovation directly, but the indirect effects were even more momentous. By combining domestic order with international anarchy, the fractured political landscape of Europe created spaces within which commerce could develop and flourish. The addition of profit motive to plunder motive gave technological growth an unstoppable momentum.
The resulting Industrial Revolution created unprecedented wealth and opened broad new avenues for creativity and achievement. But its promise was perverted by two grievous misconceptions widely shared among its implementers and champions: first, a belief that the logic of industrial development required ever more centralized control on ever more massive scales; and second, a withering reductionism that rejected as irrelevant or even valueless anything that is not measurable and concretely utilitarian. These two ways of thinking combined in powerfully destructive cultural and political movements that sought to bring mankind down to the level of the clanking, soulless machines it had constructed. These movements–which can be summed up with the terms "technocracy" and "social engineering," and which can be seen as a kind of Industrial Counterrevolution–inflicted deadening bureaucracy at their best, totalitarian horror at their worst.
The misbegotten ideal of technocracy was brilliantly satirized in Yevgeny Zamyatin's We, written in 1920-21, just as the Soviet Union was being established as its purest historical incarnation. The book is set in a dystopia of the far future, where in a city cordoned off from lush, unruly nature by the "Green Wall," humanity has been brought nearly to mechanistic perfection. "Numbers," as people are known in the "One State," live in glass apartments, so that all actions are visible to the ever present but unidentified "Guardians." A "Table of Hours" prescribes every action of every day, down to 50 chews per bite at mealtime, masticated in unison by all the numbers in the One State. (Interestingly, the "ancient" identified as the chief prophet of the One State is not Karl Marx, but Frederick Winslow Taylor, the father of "scientific management" and, incidentally, no small influence on Lenin.)
In Zamyatin's fiction, the parts of human nature suppressed by the One State–the spontaneous and unpredictable, the imaginative, the biological–are not eliminated: They persist outside the Green Wall, and are cherished by dissidents within who plot the tyranny's overthrow. Zamyatin's slender hope has been fulfilled in our generation, as the technocratic ideal has suffered stunning reverses–from the collapse of the Soviet Union to the restructuring of corporate bureaucracies.
Here again, however, things are messy: The abuses of the social engineers have helped to strengthen the case of Luddite reactionaries who despise and fear the man-made world. What has happened, then, is that much of the intellectual critique of technocracy has been of the baby-and-bathwater variety. Both sides have shared a common false premise: that technology is dehumanizing. The technocrats reject humanity; the Luddites reject technology.
Where to go from here? Fortunately, a body of thought that transcends this false dichotomy has been developing in recent years. The new sciences of chaos and complexity focus on the general capacity–in material objects, living creatures, and human institutions–for spontaneous order: complex behavior that is not the product of any central design, and cannot be reduced to the sum of the parts. While there are many excellent books that provide the general reader an overview of this new thinking, let me recommend Louise Young's The Unfinished Universe (1986). Young offers a lyrical vision of the universe's unfolding complexification–from physical order to chemistry, then biology, and then mind and culture.
This new way of understanding the world–non-mechanistic, non-reductionist–reconnects the technological and biological by showing that both are manifestations of spontaneous order. In this way, it can support a new kind of idealism about technology's possibilities. The ongoing creation of a man-made world can be seen, in this view, as part of the larger aspiration of mankind's continuing complexification: the development, through free institutions, of the best of human nature in all its variety and richness.
Contributing Editor Brink Lindsey (firstname.lastname@example.org) practices trade law in Washington, D.C.
David Link
I begin with "The Dynamo and the Virgin," a chapter from The Education of Henry Adams (1905). In it, Henry Adams visits the Great Exposition of 1900 and, viewing the incredible, towering power sources on display, begins to contemplate the awesomeness of the developing technologies. The only precedent he can think of is the inspiring and miraculous power the Virgin held in the history of art.
Adams was on to something about why technology has been so compelling in the 20th century. Men are every bit as moved by the might of the machines they can build as they are in thrall to the power of sexual beauty. Every little boy knows what it's like to stare wide-eyed at a fire engine or a fighter jet or a bulldozer. This is a large part of what has moved men to make bigger and more majestic bridges, buildings, and then machines. That sheer glory of force, of size and power and muscularity, is a large part of what made the film Top Gun such a hit. Sure, there was a perfunctory love story in the movie, but what it's about, what you can't forget, is the roar of those jets, the fire and the blaring sound of them, and the thrill and fun the pilots have in actually being in charge of all that force.
As our control over technology developed, though, a necessary and inevitable lesson accompanied our progress: humility. The second book on my list is Walter Lord's A Night to Remember (1956), on which the 1958 film was based. The story of the sinking of the Titanic is the stuff of pure myth–if it had not actually happened, someone would have had to make it up. The sinking of the Titanic is most useful as a story about hubris. Our technology is the manifestation of our dreams, and nothing in the story suggests we should stop trying to make our marvelous, outsized visions real. On the other hand, we ought to keep in mind that while we are godlike, we are not gods. Glorious luxury liners are well worth the effort and the cost, but that's no reason not to have enough lifeboats on board.
My third choice is The Dancing Wu Li Masters (1979), by Gary Zukav. The goal of the book was to make the obscure field of quantum mechanics comprehensible to non-scientists, and, surprisingly, it succeeds in illuminating what the attraction of all that heady stuff is. Like Adams, these men (and they're pretty much all men) experience real awe, this time around, not at size and force, but at the enigmatic elegance of the physical world. The Dancing Wu Li Masters helps explain why modern scientists are driven to explore particles so infinitesimally small that they are (sometimes quite literally) nothing but ideas.
This move from Adams to Zukav is summed up in Stanley Kubrick's 2001: A Space Odyssey. The film begins with man's fascination with the first tool; in one brilliant edit it moves dazzlingly to the end of our century, where the idea and execution of tools has been all but perfected. What is left is the realm of pure mystery. Which, in a way, brings everything back to where Adams found it in 1900–the point of wonder.
David Link (email@example.com) is a Los Angeles writer whose essays appear in Beyond Queer: Challenging Gay Left Orthodoxy (The Free Press), edited by Bruce Bawer.
Paul Lukas
Most of us tend to think of technology in terms of the macro rather than the micro. I refer here not to sheer physical size–surely Silicon Valley has taught us that the mightiest technological achievements can come in the tiniest of processing chips. I refer instead to our notions of technological complexity and, especially, technological power, whether measured in horsepower, kilowatts, megatons, or gigabytes. For the most part, our cultural mindset goes, bigger is better and biggest is best.
But technological power, not to mention technological utility, comes in many sizes. For every supercomputer, digital camera, and electronics laboratory, there's a host of products that may seem far more mundane yet are no less remarkable: office supplies, kitchen gadgets, canned goods, children's toys. Don't mistake these items' ubiquity for technological simplicity. Our tendency to take them for granted is in fact a testament to an immensely sophisticated production system, a system so vast and efficient that it can provide these things without most of us even stopping to ponder how it all gets accomplished. The mechanical engineering and industrial design processes that produce these items may be less glamorous than, say, software design, but they're no less important or impressive. Think of them collectively as inconspicuous technology.
There are a number of books that focus on inconspicuous technology. One of the best is Henry Petroski's The Evolution of Useful Things (1992), a loving examination of such minor miracles as paper clips, tin cans, pins and needles, nuts and bolts, silverware, adhesive tape, pull tabs, and the like. You don't need to be a minutiae fetishist to appreciate Petroski's examination of the subtler aspects of our everyday world, and you don't need to be an engineer to understand his prose. (The book also devotes several pages to a discussion of the zipper, which is itself the subject of an excellent book: Robert Friedel's Zipper: An Exploration in Novelty.)
Packaging is another key component of inconspicuous technology. Package-design elements like the Coca-Cola wave and the Wrigley's arrow have become subsumed into our collective cultural psyche. For a look at the history of packaging, including an examination of the formidable technological challenges packaging can present and the additional technological innovations it can facilitate, try Thomas Hine's The Total Package (1995), which examines everything from milk cartons to cereal boxes and will forever change your perceptions of your local supermarket.
These books are wonderful and instructive, but they're also a bit scholarly; the best way to appreciate inconspicuous technology is in the context of the real world. Curiously, the best example of this is a work of fiction: Nicholson Baker's The Mezzanine (1988), in which the narrator enthusiastically pursues a series of obsessive intellectual tangents on such unlikely subjects as soda straws, men's-room urinals, shoelaces, doorknobs, tear-off perforations, cigarette butts, and vending machines, all in the course of a short escalator ride. To fully appreciate the wonders of the inconspicuous world, in theory and practice, start here.
Paul Lukas (firstname.lastname@example.org) is a columnist for New York magazine, the editor of Beer Frame: The Journal of Inconspicuous Consumption, and the author of the forthcoming Inconspicuous Consumption: An Obsessive Look at the Stuff We Take for Granted, to be published by Crown in January.
Henry Petroski
To understand technology fully, it is necessary to understand the nature of engineering. The formulation and solution of technical engineering problems is, of course, at the heart of every technological endeavor, whether it be the design and production of an automobile or the generation and distribution of electricity, but dealing with technical problems within the constraints of the laws of nature is only one aspect of the total engineering enterprise. Real engineering in the real world is inextricably complicated by cultural, social, political, economic, and aesthetic goals that shape and in turn are shaped by the technical objectives.
The full story of just about any ostensibly technical project will illustrate the complex interrelationships among competing goals and constraints that engineers must deal with in the course of producing a technological artifact. Among the best of such stories is that of the building of the Brooklyn Bridge, told by David McCullough in his 1972 book, The Great Bridge. The true story of the conception and realization of a bridge that has become a cultural treasure is also the very human story of how John Roebling, his son Washington Roebling, and Washington's wife, Emily Warren Roebling, dealt with accidents and death, not to mention political corruption and greed, along with the physical and technical challenges of constructing the largest bridge in the world. It is as gripping as any novel.
A more recent book, The Innovators (1996), by David P. Billington, tells the stories of engineering pioneers who shaped modern technology and thereby made America modern. Rather than weaving an extended narrative about a single artifact, however, Billington uses famous technological achievements such as the steam engine, the distribution of electricity, the telegraph, and steel making to show how even the most technical aspects of engineering–its equations and formulas–are influenced by such factors as social, economic, and aesthetic considerations. By explaining in simple terms how technical decisions must incorporate a wide range of seemingly nontechnical considerations, Billington's book shows more explicitly than any other the true nature of engineering as a social and cultural, as well as a technical, endeavor.
Engineering is done by engineers, of course, and the 1976 book by Samuel C. Florman, The Existential Pleasures of Engineering, has become a classic for understanding the passion and enthusiasm individual engineers can feel for their work and the satisfaction they can experience as they make tangible contributions to society and culture. Florman's book has recently been reissued in a second edition (1994), which includes a new preface and chapters from some of his other books, and it is considered by many to be required reading for those wishing a full understanding of the engineer and engineering in modern society.
Henry Petroski (email@example.com) is A. S. Vesic Professor of Civil Engineering and professor of history at Duke University. His most recent books are Engineers of Dreams: Great Bridge Builders and the Spanning of America (Alfred A. Knopf) and Invention by Design: How Engineers Get from Thought to Thing (Harvard University Press).
John J. Pitney Jr.
Foundation (1951), by Isaac Asimov: In this classic of science fiction, a "psychohistorian" uses computers and advanced mathematics to predict the decline and fall of the Galactic Empire, and launches a plan to shorten the coming dark ages. In writing this novel, Asimov assumed that human nature would remain flawed but that technology would evolve to the point of producing accurate forecasts of events centuries in the future. The latter assumption is highly debatable, for as Hayek wrote, the "mind can never foresee its own advance." Asimov was on firmer ground when he included a parable about technological literacy. As the Empire declines, the planet Terminus gains power because its inhabitants are the only ones who still understand nuclear energy. In the surrounding kingdoms, people remember the operating instructions for their nuclear-powered machines, but have long forgotten how the things actually work. Terminus exploits this ignorance by creating a mystical cult around the technology, which allows it to manipulate the kingdoms with spells and sorcery. Remember this part of the novel the next time you read about the state of science education in the public schools.
The Ascent of Man (1973), by Jacob Bronowski: This companion volume to the excellent television series of the 1970s is not your typical coffee-table book. Though highly readable and handsomely illustrated, it also has a strong philosophical viewpoint. The title itself is revealing: The late Bronowski believed that the growth of scientific and technological knowledge did indeed constitute an "ascent." His book celebrates discovery and invention while bemoaning the "loss of nerve" represented by the all-too-frequent mysticism of pop culture. And it powerfully dismisses the widespread notion that technology will turn people into numbers. At one point, we see a photo of Bronowski squatting in a muddy field, and the accompanying text explains: "This is the concentration camp and crematorium at Auschwitz. This is where people were turned into numbers. Into this pond were flushed the ashes of four million people. And that was not done by gas. It was done by arrogance. It was done by dogma. It was done by ignorance."
See How They Ran: The Changing Role of the Presidential Candidate (1991), by Gil Troy: This volume may seem an odd choice in this context, but it is filled with wise observations about the ways in which technology has changed (and has failed to change) the relationship between voters and candidates. During the 1850s, for instance, rail travel and telegraphy allowed candidates to reach more people than ever before, thus allowing a hitherto obscure Illinois lawyer named Abraham Lincoln to become a national figure. Since then, technology has continued to expand the potential for public conversation on campaign issues. But the realization of that potential hinges on the substantive content of the campaigns, which is something that technology itself cannot determine.
Contributing Editor John J. Pitney Jr. (firstname.lastname@example.org) is associate professor of government at Claremont McKenna College.
Virginia I. Postrel
One of the pleasures of editing a magazine is bringing together fine, insightful writers whose work enriches and extends your own ideas. On the subject of our relation with technology, then, it's no surprise that my touchstones include Walt Anderson's Evolution Isn't What It Used to Be (1996), Henry Petroski's The Evolution of Useful Things (1992), and Fred Turner's Tempest, Flute, and Oz (1991)–very different books, all important and well written, all ably represented by the presence of their authors in these pages. So, invoking another editor's pleasure, I can pick three other books.
Few writers better explore the tensions among technological innovation, scientific evolution, and politics than physicist Freeman Dyson. And few writers more appreciate the ecologies of human societies, with their complex dynamics, odd feedback effects, and occasional paradoxes. In From Eros to Gaia (1992), Dyson collects some of his finest essays, including the priceless "Six Cautionary Tales for Scientists," originally written in 1988. It contrasts the small-scale, unglamorous, and effective Plan A approach to technical problems with the politically successful Plan B–and finds the same patterns in the First, Second, and Third Worlds. The essay should be required reading for all science policy makers.
"Technology" means not only gadgets or computers but any embodied knowledge, including business systems. Both sorts of technology combine in Joseph Nocera's A Piece of the Action: How the Middle Class Joined the Money Class (1994), which tells the story of how bank credit cards and money market funds made credit and investment mass, middle-class products. Since these instruments work only because they are, in Paul Lukas's phrase, "inconspicuous technologies," we never appreciate the absolute genius, pigheaded determination, computer power, and sometimes disastrous experiments that building such systems required. Nocera is a terrific storyteller, and he has a fascinating story to tell. Plus his chapter on the Age of Inflation is itself a remarkable piece of technology: It so captures the utter, spiraling financial panic that hit middle-class America in the late 1970s that it will give you flashbacks. Good reading, too, for GenXers who think economic insecurity began with them.
Following my own instructions, I end with a novel or, rather, a series of novels. It's been 25 years since I read Laura Ingalls Wilder's Little House books, but they left some indelible technological images: the joys of a father's fiddle playing, the advantages of blindness (you can sew when it gets dark), the preservation and uses of a slaughtered hog, the economic triumph represented by glass window panes. Ghostwritten by Wilder's daughter, the important libertarian intellectual Rose Wilder Lane, the Little House books remind us of both the value of old technologies in their time and the wonders of newer ones in ours.
Virginia I. Postrel (VPostrel@aol.com) is the editor of REASON and a columnist for the technology magazine Forbes ASAP. She is writing a book, called The Future and Its Enemies, on clashing ideas about social and economic evolution, which will be published by The Free Press.
Adam Clayton Powell III
The future is always with us, in some form or outline, so wherever technology is taking us, the clues are already here. Each of these books informs us in its own way to look for hints of the future in what is already around us.
To a man with a hammer, goes the expression, the world appears a nail. And so to Andrew Grove–the president of Intel, the world's most successful producer of microprocessors–the world is a universe of hardware and software. Yet that very narrowness gives Grove's Only the Paranoid Survive (1996) exactly the right aperture for a glimpse of the future.
Computer makers are in territory so new and changing so rapidly that they (and we) must plan for a largely unknown and unknowable future, making a path across a trackless land: "It's about finding your way through uncharted territories." Grove insists this unknown territory is suffused with the thrill of the possible–and, too, with the fear that often accompanies the unknown. But he insists that fear can be a force for positive change: "Ideally, the fear of a new environment sneaking up on us should keep us on our toes." (Now we know what to make of the book's title.) Grove provides an analogy of a fire department: Firefighters do not know exactly where the next fire will be, but they can guess a rough total and the historical distribution pattern, so they can plan accordingly.
Grove also notes the fundamental generation gap now forming between most of us industrial-age, analog-era adults and the post-industrial, under-25 digerati who have spent most of their lives learning about the world, playing games, working on homework, and socializing with each other online. Just as their grandparents abandoned afternoon newspapers for television and their parents found generational refuge in FM radio, audience data now show that the computer generation born since 1970 is abandoning print and electronic mass media to embrace the more tailored, more fragmented, computer-based media. This has profound implications for public discourse; Grove calls the widening gap "a demographic time bomb ticking away."
And so it is with Grove's vision of future media: to the industrial agers it may be science fiction, but to the under-25s it is just another upgrade–and just a few years away. "Processing power is going to be practically free and practically infinite," said Grove in a September Newsweek interview. "This will allow us to turn automatic 3-D photorealistic animation into a ubiquitous reality within two or three years."
Kevin Kelly, executive editor of Wired magazine, takes Grove's vision of the future of computers and extrapolates that vision to describe the future of society, with a fundamentally different social contract. He embraces the chaos of ultimate decentralization and individual power, writing a book that in its very title celebrates the complete absence of authority imposed by any central place or person. "There is no central keeper of knowledge in [the Internet], only curators of particular views," writes Kelly in Out of Control (1994). So he argues we are moving to a "highly connected yet deeply fragmented society" where "distributed, headless, emergent wholeness becomes the social ideal."
And if that leaves us out on the edge, it may be a challenge for Tom Stoppard's puzzle of a play, Arcadia (1993), to take us even further. But further it does, in a work that has nothing to do with computers or microchips but everything to do with research, society, the past, and the future–and also manages to encompass Lord Byron, quantum physics, and trends in British garden design in the late 18th century.
Stoppard presents alternating scenes of today's researchers and the subjects of their research in 1809, and we can see how painstaking care and rigorous procedure lead our present-day characters to exquisitely erroneous conclusions about the past. Amid this lesson in humility is one exclamation that serves as a signpost, an emblem for our sea change to the brave new digital world. The character Valentine, a twentysomething mathematician, is entranced by the possibilities of chaos theory and quantum mechanics, as excited by the fall of the old order as are his real-life twentysomething digerati counterparts.
"The future is disorder," says Valentine. "A door like this has been cracked open five or six times since we got up on our hind legs. It's the best possible time to be alive, when everything you thought you knew is wrong!"
Adam Clayton Powell III (email@example.com) is vice president, technology programs, at The Freedom Forum.
John Tierney
For critics of technology, The State of Humanity is a terribly depressing read. The 1995 book, edited by the economist Julian Simon, is a collection of essays and graphs analyzing human welfare over the past few millennia. The trends are just about all positive: Humans are enjoying longer, healthier, wealthier, and freer lives in a world with less pollution and more plentiful resources. The reason for all this good news, of course, is technology. Doomsayers claim to be taking the long view of technology and its problems, but how can they ignore evidence like this?
The answer to that question can be found in William Tucker's book, Progress and Privilege: America in the Age of Environmentalism. This 1982 work (Tucker had the misfortune of being too far ahead of his time to sell many books) anticipates the neo-Luddites and environmentalists of the 1990s. As Tucker shows, today's critics of technology–including the ones who think they're progressives fighting to help the poor–are part of an elitist, conservative tradition. Aristocrats, clerics, and intellectuals have always come up with creative reasons for opposing any technological change that threatens their comfortable status or interferes with their determination to write the rules for everyone else. Tucker offers lovely analyses of our current environmental "crises." Consider his explanation for the angst over population growth in the Third World: "There have been two lines of demagogic argument that have always gone down well in history. The first is to tell the poor that the rich have too much money. The second is to tell the rich that there are too many poor people."
If these books haven't persuaded you to ignore the Luddites and Malthusians, try a work of inadvertent fiction by an ornithologist named William Vogt: Road to Survival. It preaches against technological innovators–"the freebooting, rugged individualist" must be recognized as "the Enemy of the People"–and warns of pending famine and poverty due to pollution, overpopulation, vanishing topsoil, depleted stores of natural resources, and various environmental catastrophes. It reads exactly like a tract from Bill McKibben or the Worldwatch Institute. But this book was published in 1948–and the doomsday predictions were being made about the United States. Reading it will make you laugh, restore your faith in modern technology, and free you from any temptation to take the doomsters seriously.
John Tierney is a staff writer for The New York Times Magazine.
Frederick Turner
In some ways the most interesting book on the relationship between humans and our technology that I have read recently is William R. Jordan III's remarkable work The Sunflower Forest, but it is as yet unpublished (smart publishers, take note). Jordan is a curator at the University of Wisconsin, Madison, Arboretum, and his book is a searching analysis of the way in which our Cartesian and Puritan heritage has led us to divide humans from the rest of nature and thus, finally, to the view espoused by the likes of Bill McKibben, in which nature is by definition what is untouched by human hand, and humanity is a blight and cancer upon nature. Jordan presents an alternative view, in which the human appetite for ritual, myth, play, and gardening can actually improve the richness of the earth's living ecosystem, with beneficial economic results and without governmental invasion. We shall trade with the rest of nature, he believes, to our mutual benefit, in an exchange he sees as an extension of biological evolution.
But since the book is unpublished, I cannot make it one of my three. A science fiction work, Neal Stephenson's The Diamond Age (1995), set in industrial China a hundred years hence, is a marvelous fountain of ideas about the human technological future. It is, like some recent works of Greg Bear and Gregory Benford, a convincing vision of a nanotechnological age (nano-engineered diamond has become the most popular building material in Stephenson's future). What is especially provocative about it is that Stephenson, despite his countercultural roots, has perceived, correctly in my view, that the masters and mistresses of a future technological age will be people with solid family backgrounds, personal honor, and discipline, the new Victorians or "Vickies" as he calls them. As the subtitle of his book, A Young Lady's Illustrated Primer, suggests, however, a sufficiently sophisticated and artistic piece of educational software, based on storytelling, might serve as a substitute family. But the point is that chemical technology is more potent than our present physics-based technology, biotechnology more potent than chemical technology, and nanotechnology is the most potent of all, exerting the greatest leverage upon the physical world.
My second bona fide book is Homer's Odyssey. This poem is among other things a good introduction to the brilliant simplicity of ancient Greek technology, and an exemplary demonstration of what constitutes user-friendliness in both software and hardware. Homer shows us how a bow, a door, a garden, a ship, a narrative verse tradition, an olive grove, an adze, a polytheism, a viticulture can work together to the enrichment of the world.
My last book is Martyn Fogg's Terraforming (1995), the current bible of those who believe we must begin to become a spacefaring civilization. He makes a compelling argument that we can colonize Mars using contemporary technology, and that we should do so. Reading Fogg, who is the president of the British Interplanetary Society, one feels a blush of shame at the pusillanimity and malingering of our times, the most prosperous in history, and of our generation, which would rather bicker over nasty little social prizes and slights than take up once again the heroic mantle of our destiny.
Frederick Turner is a poet, a theorist of the links between the sciences and the humanities, and Founders Professor of the Arts and Humanities at the University of Texas at Dallas. His most recent books include April Wind (University Press of Virginia) and The Culture of Hope (The Free Press).