Policy

TEOTWAWKI!

Or, the end of the world as we know it at the Global Catastrophic Risks conference


Oxford, England—People have long been fascinated by the end of the world. Some interpretations of Hindu scripture suggest that the world will end with the imminent conclusion of the current Kali Yuga cycle. Some New Agers believe that the world will undergo apocalyptic changes as the Maya Long Count calendar comes to an end on December 21, 2012. Some Christian End Timers believe that the period preceding the Day of Judgment described in the Book of Revelation is now upon us. Religious believers are not alone in their fascination with doomsday. Secular catastrophists predict environmental doom or worry about calamity raining down on us from outer space.

This week the Future of Humanity Institute at Oxford University, headed by bioprogressive philosopher Nick Bostrom, is convening a conference on Global Catastrophic Risks. The Institute's work focuses on how radical technological developments such as nanotechnology, artificial intelligence, and life-extension treatments will affect the human condition. One of the Institute's research programs focuses on global catastrophic risks, mulling questions like: What are the biggest threats to global civilization and human well-being? Will the human species survive the 21st century?

The savants gathered here in Oxford will consider a wide variety of potentially apocalyptic risks. For example, Michelangelo Mangano from the European Organization for Nuclear Research (CERN) will explore the possibility that certain scientific research—e.g., the Brookhaven Lab's high-energy experiments that might produce a black hole—could inadvertently destroy the world. Mike Treder and Chris Phoenix from the Center for Responsible Nanotechnology will discuss how the advent of molecular manufacturing could lead to massive economic and social disruptions, including a new arms race, the spread of tyranny, and dangerous environmental degradation. At the cosmic level, the Technion Institute's Arnon Dar will look at the devastation that a nearby supernova could wreak, and astronomer and author William Napier will evaluate the chances that the earth might soon suffer an asteroid strike. Whether future advanced artificial intelligences will think of us as pets or pests will be pondered by Singularity Institute for Artificial Intelligence research fellow Eliezer Yudkowsky.

In addition to the more exotic risks noted above, the conferees will also be discussing the prospects for nuclear war and nuclear terrorism. More reassuringly, Princeton University Program on Science and Global Security fellow Ali Nouri is apparently set to argue that trends in biotechnology are making it less likely that bad guys could unleash a man-made plague. On an even happier note, technoprogressive bioethicist James Hughes will discuss how to avoid cognitive biases toward over-pessimism and over-optimism. And Steve Rayner, director of Oxford's James Martin Institute (which is co-sponsoring the conference), will point out that much contemporary doomsaying shares cultural characteristics with earlier superstitious predictions of imminent catastrophe.

The whole cheery conference kicks off this evening with a talk by Sir Crispin Tickell entitled, "Humans: Past, Present and Future." Apparently Tickell buys into the whole litany of environmentalist doom. However, he thinks that doom can be avoided if we "radically change our thinking on global governance" and pursue some "interesting" technological options.

This is the first dispatch from the Oxford conference on Global Catastrophic Risks. Since the conference runs through the weekend, future dispatches will report on the various gloomy presentations, chiefly as blog posts at reason online. I will finish up coverage of the conference with my science column next Tuesday.

Ronald Bailey is reason's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.

Disclosure: The Future of Humanity Institute is covering my travel expenses for the conference; no restrictions or conditions were placed on my reporting.