First, I want to thank the Foundation for the Future for inviting me to give this talk. I am more than a little apprehensive giving it in front of such a distinguished group of thinkers. But here goes: the state of the planet–that is, the environmental health of the Earth and of its citizens–is surprisingly good and likely to get better over the next century. Of course, demonstrating this in the course of a half-hour talk is going to be difficult.
Since this seminar is devoted to trying to foresee the opportunities and problems that lie ahead for humanity over the next millennium, I think that looking back at forecasts of what the state of the planet was supposed to be at the end of the last millennium might be a good place to start. I will be looking chiefly at (1) global population, (2) non-renewable resources, (3) pollution, and (4) climate change. And then I'm going to explain why there remain some environmental problems and how they are likely to be solved.
POPULATION–First, then, population. In 1968, Stanford University biologist Paul Ehrlich famously predicted in his best-selling The Population Bomb: "The battle to feed all of humanity is over. In the 1970s, the world will undergo famines–hundreds of millions of people are going to starve to death in spite of any crash programs embarked upon now." For the first Earth Day in 1970, Ehrlich, in an article entitled "Eco-Catastrophe" in The Progressive magazine, offered a scenario in which 4 billion people would starve to death between 1980 and 1989, 65 million of whom would be Americans. Ehrlich was far from alone; Lester Brown, the founder of the Worldwatch Institute, Denis Hayes, one of the organizers of Earth Day, and many others concurred in this grim forecast.
What happened? Well, first, why did population increase so dramatically in the 20th century, rising from around 1.6 billion in 1900 to 6 billion today? As Harvard University demographer Nicholas Eberstadt puts it, "Global population increased not because people started breeding like rabbits, but because they stopped dying like flies." What happened is that babies stopped dying shortly after birth. The global infant mortality rate dropped from a couple of hundred per thousand to below 50 per thousand today. The result is that human life expectancy has more than doubled, from an average of around 30 years in 1900 to 46 years by 1950 and 66 years in 2001. The World Health Organization expects that to rise to an average of 73 years by 2020. Surely this is evidence for the greatest improvement in the human condition in all of history.
What about future population trends? One still hears from activists that world population will rise to 12 to 15 billion by 2050–or, if they're more cautious, by 2100. But there's a lot of evidence to suggest that that is very unlikely. First, let's look back at the predictions made in the 1970s. If famine were somehow miraculously avoided, both Paul Ehrlich and Lester Brown predicted that world population in 2000 would be more than 7 billion. In actual fact, world population, as noted earlier, reached only 6 billion by the year 2000.
Keep in mind that there is no predictive theory of demography–but everyone now acknowledges that the rate of world population growth is rapidly decelerating.
The United Nations World Population Prospects, 2000 revision has dropped its medium world population projection for 2025 to 7.8 billion: Only four years earlier in 1996, it had projected a world population of 8.04 billion by 2025. According to many demographers, this impressive drop indicates that world population trends are likely to track the low variant trends in which world population will top out at about 8 billion in 2040 or so and then begin to drop. Even Nature magazine earlier this month published work which concludes that it is likely that world population will never exceed 9 billion or so.
What has happened of course is that women are having many fewer children than they did–dropping from nearly 6 children per woman in the 1960s to around 2.6 and falling today. Replacement is 2.1 children per woman–a rate that all developed countries have already fallen below.
Of course, the reason that Ehrlich and others predicted demographic disaster was their devotion to Malthusian theory–the idea that population would always outstrip the available food supplies. What confounded their predictions of doom was the advent of the Green Revolution nurtured by Nobel Peace Prize winner Norman Borlaug. The Green Revolution has been summed up as "making two blades of grass grow where only one could grow before." By the way, Ehrlich completely dismissed the Green Revolution, predicting in 1970 that it would "soon turn brown." What has been the result of the Green Revolution?
Since 1970, the amount of food per person globally has increased by 26%, and as the International Food Policy Research Institute recently reported, "World market prices for wheat, maize and rice, adjusted for inflation, are the lowest they have been in the past century." According to the World Bank, food production increased 60 percent between 1980 and 1997. In fact, food is cheaper and more abundant than it has ever been in all of human history. And more good news: the amount of land devoted to growing crops has barely increased over the past 30 years, meaning millions of square miles of land have been spared for nature, with concomitant benefits for biodiversity. So Malthus and Ehrlich were spectacularly wrong: more food does not mean more people; in fact, it turns out that more food means more old, fat people.
NON-RENEWABLE RESOURCES–Another area of concern has been the depletion of so-called non-renewable resources. This thesis was most famously propounded in the 1972 Limits to Growth report to the Club of Rome and later in President Jimmy Carter's Global 2000 report. The Limits to Growth thesis got a big boost when the Arab countries unleashed their oil embargo in 1973. The Limits to Growth report, in a handy table in the middle of the book, projected that at the exponential growth rates the report's authors expected, world supplies of gold, tin, zinc, copper, oil and natural gas would be completely exhausted by 1992. The Limits to Growth authors were not alone. In 1970, Harrison Brown, a respected member of the National Academy of Sciences, published a chart in Scientific American that looked at metal reserves and estimated that humanity would totally run out of copper by 2000 and that lead, zinc, tin, gold and silver would be gone by 1990.
Of course this didn't happen. Even the generally alarmist Worldwatch Institute acknowledges in its Vital Signs report, published this past May, that "nonfuel commodities now fetch only about 46 percent as much as in the mid-1970s." Indeed, Worldwatch admits that "food and fertilizer prices are about one-fourth their 1974 peak." Even the price of crude oil, which has risen lately, "nevertheless remains at about half the zenith reached in 1980." In fact, overall non-fuel commodities cost only a third of what they did in 1900.
As everyone knows, lower prices mean that things are becoming more abundant, not more scarce. The U.S. Geological Survey estimated two years ago that, at present rates of mining, reserves of copper will last 54 years; zinc, 56 years; silver, 26 years; tin, 55 years; gold, 30 years; and lead, 47 years. What about oil? The survey estimates that global reserves could be as high as 2.1 trillion barrels of crude oil–enough to supply the world for the next 90 years. These reserve figures are moving targets–as they get drawn down, miners and drillers find new sources of supply or better ways to exploit old sources, and technologists find more efficient ways to use resources or substitute for those that are becoming less abundant. More on the role technology plays in protecting the natural world a bit later.
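For those who like to see the arithmetic, the reserve-life figures quoted above are simple ratios, and their limitation can be made explicit in a few lines. This is only an illustrative sketch with made-up tonnages (the talk does not give the underlying USGS quantities):

```python
# Static reserve index: years of supply left at current extraction rates.
# The numbers below are hypothetical placeholders, not USGS data.

def years_of_supply(reserves, annual_production):
    """Naive static index. It deliberately ignores new discoveries,
    substitution, and efficiency gains -- exactly the dynamics that
    make such figures 'moving targets,' as the talk notes."""
    return reserves / annual_production

# e.g. a metal with 540 units of known reserves mined at 10 units/year
print(years_of_supply(reserves=540.0, annual_production=10.0))  # 54.0 years
```

The point of the sketch is that the index changes whenever either input changes, which is why the 1972 exhaustion dates proved so fragile.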
POLLUTION–Another concern is pollution. As one contemplates the declining air quality of Shanghai or Bangkok, one is naturally concerned. Of course, these concerns were present at the birth of the global environmental movement. On the occasion of the first Earth Day in 1970, smog choked many American cities, sludge coated the banks of many rivers, and people were also worried that we were poisoning the biosphere with new synthetic chemicals, most especially pesticides. In 1970, Life magazine reported that "Scientists have solid experimental and theoretical evidence to support … the following predictions: In a decade, urban dwellers will have to wear gas masks to survive air pollution…by 1985 air pollution will have reduced the amount of sunlight reaching the earth by one half." The Limits to Growth report in 1972 declared that "Virtually every pollutant that has been measured as a function of time appears to be increasing exponentially." Of course, Paul Ehrlich chimed in, declaring that "air pollution…is certainly going to take hundreds of thousands of lives in the next few years alone." He offered a scenario in which 200,000 Americans living in New York and Los Angeles would die in smog disasters in 1973.
So has air pollution gotten worse? Interestingly, when Americans are polled on this question, the majority believe that air pollution in the U.S. is much worse than it was 30 years ago. I believe most of us in this room are old enough to know that this simply isn't so. Environmental Protection Agency data show that, instead of rising exponentially as the Limits to Growth report predicted, ambient levels of the so-called criteria air pollutants have fallen: sulfur dioxide and carbon monoxide are down by 75 percent, while total suspended particulates–smoke, soot and dust–have been cut by 50 percent since the 1950s. The chief constituents of smog, nitrogen oxides, are down 37 percent, and ozone is down by 27 percent. In 1988, the particulate standard was changed to account for smaller particles, and even under this tougher standard particulates are down an additional 15 percent. Keep in mind that the U.S. population grew from 218 million in 1976 to 281 million today, the economy grew from $4.3 trillion to $10 trillion, and vehicle miles traveled increased by 121 percent. The good news is that similar pollution trends are clearly evident in all developed countries.
It is now evident that countries undergo various environmental transitions as they become wealthier. A World Bank analysis found that the first environmental transition–the availability of clean drinking water–occurs when a country's average annual per capita income reaches around $1300. Next, the concentrations of particulates and sulfur dioxide peak at per capita incomes of $3300 and $3700 respectively. Once these income thresholds are crossed, societies start to purchase increased environmental amenities such as clean water and air. By the way, some researchers believe that they may have identified another environmental transition point recently: it appears that when average per capita incomes reach about $16,000 per year, a country's forests begin expanding. Over the last decade, forest expansion has picked up in Europe, at around 750,000 acres per year, and in the U.S., at around 1 million acres per year.
So what about the more subtle and insidious pollution caused by ingesting synthetic chemicals? Rachel Carson's 1962 book Silent Spring brought this issue forcefully forward. Carson declared that dangerous chemicals "have entered the environment of everyone—even children as yet unborn. It is hardly surprising, therefore, that we are now aware of an alarming increase in malignant disease." By which she meant cancer. By 1970, Paul Ehrlich, once again, painted a scenario in his Eco-Catastrophe article in which he envisioned a Department of Health, Education, and Welfare study finding in 1973 that, because of contamination by synthetic chemicals, "Americans born since 1946…now had a life expectancy of only 49 years, and predicted that if current patterns continued this expectancy would reach 42 years by 1980, when it might level out." Last year, Worldwatch Institute founder Lester Brown ominously noted, "Every human being harbors in his or her body about 500 synthetic chemicals that were non-existent before 1920." First, in reply to Brown: so what? Considering that American life expectancy has increased by 20 years, from an average of 56 years in 1920 to 71 years in 1970 to 76 years today, one might be tempted to argue that those synthetic chemicals are prolonging our lives.
There is a broad consensus that exposure to synthetic chemicals, even pesticides, does not seem to be a major health problem. In 1996, the National Research Council of the National Academy of Sciences, issued a comprehensive report called Carcinogens and Anti-Carcinogens in the Human Diet which concluded that levels of both synthetic and natural carcinogens are "so low that they are unlikely to pose an appreciable cancer risk." The National Cancer Institute reports that "increasing exposure to general environmental hazards seems unlikely to have had a major impact on the overall trends in cancer rates." The American Institute for Cancer Research concludes that "There is no convincing evidence that eating foods containing trace amounts of chemicals such as fertilizers, pesticides, herbicides and drugs used on farm animals changes cancer risk. Exposure to all manufactured chemicals in air, water, soil and food is believed to cause less than 1 percent of all cancers." In other words, Rachel Carson was wrong.
By the way, what is happening to cancer rates? Both incidence and mortality are falling, and have been for the last decade in the United States. Most of that decline can be traced to better health habits, most especially declines in cigarette smoking. What causes the bulk of cancer? According to Sir Richard Doll, head of the Clinical Trial Service & Epidemiological Studies Unit in Britain, smoking is associated with 30 percent of cancers; diet, especially eating lots of animal fats, is associated with between 20 and 50 percent; infections are responsible for between 10 and 20 percent; and exposure to our own natural reproductive hormones accounts for about another 10 to 20 percent. Of course, living longer means that you have a better chance of getting cancer.
GLOBAL WARMING–Now on to the pressing issue of global warming. Earlier this year the Intergovernmental Panel on Climate Change, the IPCC, issued its Third Assessment Report along with its famous Summary for Policymakers. That summary laid out scores of scenarios, based on computer climate and econometric models, for the next 100 years, in which global average temperatures were projected to rise by between 1.4 degrees centigrade and 5.8 degrees centigrade. Naturally, the press seized on the upper end of that temperature range and portrayed a world that would soon bake us all. President Bush asked a panel of climatologists under the auspices of the National Academy of Sciences to consider what is scientifically certain and what is scientifically uncertain about global warming.
The NAS report found that "Greenhouse gases are accumulating in Earth's atmosphere as a result of human activities, causing surface air temperatures to rise and subsurface ocean temperatures to rise." No one argues with any of the above claims, but they do not in themselves mean that the world is headed toward a climate disaster due to humanity's activities.
Let's briefly review the data on global temperature trends. Earth's average temperature has apparently gone up by about 0.6 degrees Centigrade (C) plus/minus 0.2 C in the past century, concludes the IPCC. According to the highly accurate measurements taken from satellites of the entire troposphere (the bulk of the atmosphere from the surface to five miles up) the earth's temperature since 1979 is increasing at a rate of 0.038 degrees C per decade. Weather balloon data from the same period show an increase of only 0.03 degrees C per decade. Taking the weather balloon data from 1958 to today, the average per-decade increase in temperature is 0.1 degrees C. Interestingly, weather balloon temperature data are essentially flat between 1958 and 1976. Between 1976 and 1978, global temperatures measured by weather balloons jumped dramatically, after which they essentially became flat again. No one knows why temperatures spiked in the mid-1970s, but such a spike is not consistent with human causation. Finally, the surface temperature measures on which those most concerned about global warming rely show an increase of 0.16 degrees per decade over the same period.
These actual data on temperature increases are interesting, since to reach an average global temperature increase of 3.0 degrees C by the end of the 21st century would imply that global average temperatures must increase by at least 0.3 degrees C per decade. (The U.N.'s Intergovernmental Panel on Climate Change projected an increase ranging from 1.4 to 5.8 degrees C by 2100.) That 0.3 degree per decade increase is evidently not happening. Those concerned about global warming claim that is because aerosols like soot and sulfur dioxide are shading the planet, causing a cooling countertrend. Perhaps, but even the National Research Council report admits, "The monitoring of aerosol properties has not been adequate to yield accurate knowledge of the aerosol climate influence."
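The per-decade arithmetic here is simple enough to lay out explicitly. The sketch below is just back-of-envelope bookkeeping on the figures quoted in this talk, not any part of the IPCC's own analysis:

```python
# Back-of-envelope check: what average warming per decade is implied by
# a given total temperature rise over the 21st century (10 decades)?

def implied_rate_per_decade(total_rise_c, decades=10):
    """Average warming rate (degrees C per decade) implied by a total rise."""
    return total_rise_c / decades

# IPCC 2001 projection range for 2100: 1.4 to 5.8 degrees C
low = implied_rate_per_decade(1.4)   # about 0.14 C per decade
mid = implied_rate_per_decade(3.0)   # about 0.30 C per decade
high = implied_rate_per_decade(5.8)  # about 0.58 C per decade

# Observed per-decade trends cited in the talk:
observed = {"satellite": 0.038, "balloon": 0.03, "surface": 0.16}

print(f"implied: {low:.2f} to {high:.2f} C/decade; observed: {observed}")
```

Comparing the implied rates against the observed trends is the whole of the argument in the paragraph above: the observed series all run below even the low end of the implied range.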
What is the empirical data telling us? Climatologist John Christy says that "the data are telling us that the actual climate is not as sensitive to CO2 forcing as the computer models say it is."
But what about all those freakish weather events of late?
The IPCC itself, in its third assessment report, notes that there is no evidence that hurricanes, tornadoes, floods or droughts are getting worse or more violent than in the past, despite what you might see on the nightly news reports. There is some small evidence that the amount of rain has increased slightly–and why not? A warmer world is a wetter world. What about the glaciers at Mt. Kilimanjaro, in Peru and in Glacier National Park in the U.S.? Aren't they all melting away because of higher temperatures? Actually, in all those cases temperatures have not gone up–in both Peru and Glacier National Park average temperatures have been going down. By the way, glaciers are growing in Norway.
Then what about sea level rise? Sea level rose by 6 inches in the 20th century and even the IPCC recently lowered its estimate for sea level rise in the 21st century to about 8 inches. Certainly not a catastrophe.
Isn't the polar ice melting? Actually, temperatures in Antarctica have been cooling for several decades and the sea ice area there has expanded. In the Arctic, the picture is more complicated: at the highest latitudes temperatures have been declining slightly; some parts of Greenland are warming, but others are cooling, and the net effect seems to be no great change in Greenland's ice cover.
As climatologist John Christy said, "It appears that the climate models are too sensitive." What accounts for their inaccuracies? Most likely it's because they get the clouds wrong. Clouds tend to act like shades during the daylight and like blankets at night–the result seems to be that whatever warming is occurring is being channeled into the winter nights. Daytime high temperatures do not appear to be going up much, if at all. Vast uncertainties about aerosols, clouds, the ocean and fossil fuel consumption trends all together make the model predictions a very chancy tool with which to plan the energy future of humanity for the next century. And make no mistake, that's what climate treaties like the Kyoto Protocol amount to.
The absurdity (not to say sheer arrogance) of that type of planning becomes clear when one imagines the same exercise taking place in 1900. The best scientific panel available in 1900–let's say including Lord Kelvin, Albert Einstein, Madame Curie–would simply not have been able to plan for today's 560 million automobiles and trucks, the ubiquitous electric lighting in hundreds of millions of houses and office buildings, fuel for thousands of jet planes, the electricity to run the world's 160 million computers, hundreds of millions of refrigerators, central heating, air conditioners, and telephones. Virtually none of these devices on this nearly endless list had even been invented by 1900. Given the increasing rate of technological innovation, we undoubtedly stand in an even worse position for trying to plan the world's energy needs of the year 2100.
SOLUTIONS–Despite the foregoing good news about the current state of the planet, it could certainly be a lot better. First, despite abundant food supplies, 800 million people are still malnourished. 1.2 billion live on less than a dollar a day. 2 billion still don't have access to clean water, sanitation and electricity. In the meantime, tropical forests are still being cut at unsustainable rates, global biodiversity appears to be imperiled, and ocean fisheries are overfished. What to do? Well, make polluters and other environmental despoilers pay. But not in the usual way.
I want to maintain two propositions that point the way to solving most environmental problems: first, that all environmental problems occur in open-access commons; and second, that richer is cleaner, and therefore slowing economic growth and development will delay eventual environmental cleanup.
COMMONS–First, what is an open-access commons? Ecologist Garrett Hardin made this concept famous with his Science magazine article, "The Tragedy of the Commons," in the 1960s. An open-access commons exists when any member of the public may use a resource without paying anyone else for it. In a commons, if a person wants to leave some of a resource behind–say, a fish to breed so that he can harvest more later–he won't, because he knows that someone else will simply come along, take the fish, and gain all of the benefits for themselves. Open-access commons exist generally as relics of a time when the resource was abundant and users few. Airsheds, rivers and lakes, and fisheries are prototypical examples. Unfortunately, Hardin and many others came to the conclusion that only top-down bureaucratic management could resolve the tragedy of the commons. True, such management can score quick gains initially–imposing catalytic converters on cars, for example, has reduced their polluting emissions by 95 percent. But top-down management can also discourage innovation that leads to more environmentally sound ways of doing things. For example, in the United States, the Clean Air Act effectively mandated that electric utilities use smokestack scrubbers to reduce their sulfur dioxide emissions, when other alternatives, such as a switch to burning cleaner coal, would have reduced emissions even further–and more cheaply, too.
A far more effective way to protect environmental goods is to take them out of the commons altogether by privatizing them. Contrast the political top-down management of the U.S. and Canadian Atlantic cod fisheries, which have essentially collapsed, with the growing and healthy privatized fisheries of Iceland and New Zealand. Or consider the market in sulfur dioxide emissions permits in the U.S., which greatly reduced those emissions, at much less cost, once it was instituted. The fact is that if you own a resource, you're far more likely to use it efficiently.
Perversely, many environmental activists fault markets for not properly valuing natural capital or ecosystem services, and they continue to call for placing more resources in public hands. In effect, they want more open-access commons. But if no one has to pay for a resource, users treat it as free. The way to take environmental goods into account is exactly the way we take all other goods into account: we put them into the market, where people, and most especially corporations, have to pay for what they use.
My second proposition is that richer is cleaner, which flies in the face of Paul Ehrlich's infamous I=PAT equation published in the 1970s in Science magazine. I stands for environmental Impact, P stands for Population, A stands for Affluence (consumption, nowadays) and T stands for Technology. On this view, all of them–Population, Affluence and Technology–are bad for the environment. I declare now that that equation gets it almost exactly backwards. Wealthy, technologically advanced market economies are seeing rapid declines in fertility, clearing air and water, expanding forests, and increasing life expectancies. I hope by now it is clear that Paul Ehrlich is a reverse Cassandra: in Greek myth, Cassandra made true predictions that nobody believed, while Ehrlich makes false predictions that nearly everybody believes.
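The objection to I=PAT can actually be stated in the identity's own terms. The sketch below uses purely illustrative numbers (none come from Ehrlich or from this talk) to show that if the technology factor T–impact per unit of consumption–falls faster than P and A rise, total impact declines even as a society grows richer and more populous:

```python
# Ehrlich's identity: Impact = Population x Affluence x Technology,
# where T is understood as environmental impact per unit of consumption.

def impact(p, a, t):
    """I = P * A * T, treated as a simple multiplicative identity."""
    return p * a * t

# Illustrative, made-up values: population up 50%, affluence doubled,
# but cleaner technology cuts impact per unit of consumption by 75%.
before = impact(p=1.0, a=1.0, t=1.0)
after = impact(p=1.5, a=2.0, t=0.25)

print(after < before)  # True: impact fell despite growth in P and A
```

In other words, the equation only condemns growth if T is assumed fixed or rising; the environmental transitions described earlier suggest T falls as countries get richer.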
NEW GROWTH THEORY–How has humanity avoided the Malthusian trap?
Stanford University economist Paul Romer describes the situation as follows: "We now know that the classical suggestion that we can grow rich by accumulating more and more pieces of physical capital like fork lifts is simply wrong. The problem an economy faces is what economists call 'diminishing returns.' In handling heavy objects, a fork lift is a very useful piece of equipment. When an economy has only a few fork lifts, an additional one is valuable; but as fork lifts accumulate, the return on an investment in an additional fork lift drops rapidly and eventually becomes negative–extra fork lifts come to have no value and become a nuisance. As a result, an economy cannot grow merely by accumulating more and more of the same kind of capital goods."
This neoclassical model of economic growth was incorporated into The Limits to Growth computer model and accounts for why the MIT researchers predicted eventual collapse as the inevitable result of continued economic and population growth.
In the last two decades, economists, largely ignored by ecologists and environmental activists, have made a conceptual breakthrough that has given them a way to rigorously and accurately describe how economic growth occurs and how, with the proper social institutions, it can go on for the foreseeable future.
"New growth theorists now start by dividing the world into two fundamentally different types of productive inputs that can be called "ideas" and "things. Ideas are nonrival goods that could be stored in a bit string. Things are rival goods with mass (or energy). With ideas and things, one can explain how economic growth works. Nonrival ideas can be used to rearrange things, for example, when one follows a recipe and transforms noxious olives into tasty and healthful olive oil. Economic growth arises from the discovery of new recipes and the transformation of things from low to high value configurations," explains Romer.
Decoding the clunky economic terminology, "rival" goods are simply things that cannot be used by two or more persons at once, e.g., cars, drill presses, computers, even human bodies and brains, hence "rivalry" to use them. "Nonrival" goods can be used by any number of people simultaneously, e.g., recipes for bread, blueprints for houses, techniques for growing corn, formulas for pharmaceuticals, scientific principles like the law of gravity, and computer programs.
To understand the potency of ideas, just consider that a few decades ago, the only thing humanity could use iron oxide (ordinary rust) for was as pigment in paintings. Now scientists and engineers have developed an elaborate set of instructions that tell manufacturers how to apply iron oxide to plastic tapes to store video and audio information, and to computer diskettes to preserve digital information. Fifty years ago, silicon was used to make glass; now we have created designs that also show us how to use it to make very complicated and high-value microchips to run our computers and optical fibers to transmit vast amounts of data over the Internet. In fact, the Internet is a perfect example of how technology permits very rapid exponential growth–the number of people using the Internet is doubling every hundred days.
Ideas are a different kind of resource-they are immaterial and can be shared at nearly no cost among a great many people. Once a new technique or new discovery is made, it can be passed along to other people without much effort. It is easy to see how this works. Take the situation 10,000 years ago when some neolithic woman noticed that delicious grass seeds could be collected and covered in soil, and that she could come back later to more easily harvest the concentrated numbers of nutritious seeds. With this new knowledge, she greatly increased the capacity of the ecosystem to feed more people than hunting and gathering could support. The finite resources of the earth were not increased, just rearranged. The breakthrough was not the seeds-human beings had been gathering and eating grass seeds for tens of thousands of years-the breakthrough was the idea that they could be deliberately increased by planting them. Thus farming grass seeds resulted in increasing returns, while simply gathering and eating them immediately led to diminishing returns. As one wag put it, "both jayhawks and men eat chickens, but whilst more jayhawks mean fewer chickens, more men means more chickens."
Romer explains it this way: "Economic growth occurs whenever people take resources and rearrange them in ways that are more valuable. A useful metaphor for production in an economy comes from the kitchen. To create valuable products, we mix inexpensive ingredients together according to a recipe. The cooking one can do is limited by the supply of ingredients, and most cooking in the economy produces undesirable side effects. If economic growth could be achieved only by doing more and more of the same kind of cooking, we would eventually run out of raw materials and suffer from unacceptable levels of pollution and nuisance. Human history teaches us, however, that economic growth springs from better recipes, not just from more cooking. New recipes generally produce fewer unpleasant side effects and generate more economic value per unit of raw material."
We make ourselves better off not by increasing the amount of stuff on planet earth–that is, of course, fixed–but by rearranging the stuff we have available so that it provides us with more of what we want: food, clothing, shelter, and entertainment. The cleverer we become about rearranging material, the more goods and services we can get from relatively less stuff. This process of improvement has been going on ever since the first members of our species walked the earth. We have moved from heavy earthenware pots to protect our food and drink to more effective and safer ultrathin plastics and lightweight aluminum cans. We have shifted from smoky, dangerous, wood-intensive campfires to clean, efficient natural gas to cook our food. If our technologies had remained stuck in the past, and if somehow the world's population had nevertheless been able to grow to its current level, the impact of humanity on the natural environment would have been calamitous.
But by using better and better recipes, humanity has avoided the Malthusian trap while at the same time making the world safer, more comfortable and more pleasant for both larger numbers of people as well as for a larger proportion of the world's people.
What modern Malthusians who worry about the depletion of resources miss is that people don't want oil, they want to cool and heat their homes; they don't want copper telephone lines, they want to communicate quickly and easily with friends, family, and businesses; they don't want paper, they want a convenient and cheap way to store written information. If oil, copper, and paper become scarce, humanity will turn to other sources of energy, other methods of communication, and other ways to store information.
Brown University demographer Robert Kates notes that technological discoveries have "each transformed the meaning of resources and increased the carrying capacity of the Earth." History has clearly confirmed that "no exhaustible resource is essential or irreplaceable," adds economist Gale Johnson. And economist Dwight Lee points out that "The relevant resource base is defined by knowledge, rather than by physical deposits of existing resources." In other words, even the richest deposit of copper ore is just a bunch of rocks without the know-how to mine, mill, refine, shape, ship, and market it.
The surest way to get humanity to tread more lightly on the earth, clear the air, clean up the rivers and lakes and cherish forests and fisheries is to encourage rapid economic development among the world's poorest people. What new growth theory tells us is that misery and vice are not the inevitable lot of humanity, nor is the ruin of the planet a foregone conclusion. Two centuries after Malthus and 30 years after the first Earth day, we now know that the exponential growth of knowledge, not our numbers, is the real key to understanding the promising future that lies ahead for humanity and for the earth.