Earth Day, Then and Now
The planet's future has never looked better. Here's why.
Thirty years ago, 20 million Americans participated in the first Earth Day on April 22, 1970. Fifth Avenue in New York City was closed to automobiles as 100,000 people joined in concerts, lectures, and street theater. More than 2,000 colleges and universities across America paused their anti-war protests to rally instead against pollution and population growth. Even Congress recessed, acknowledging that the environment was now on a political par with motherhood. Since that first Earth Day, the celebrations have only gotten bigger, if somewhat less dramatic: The organizers of Earth Day 2000, to be held April 22, expect 500 million people around the globe to participate in celebrations, workshops, and demonstrations. This year's theme is "clean energy" and the master of ceremonies for the big celebration on the Washington Mall is none other than Leonardo DiCaprio.
The first Earth Day was the brainchild of Gaylord Nelson, the Democratic senator from Wisconsin. The moment was obviously ripe. Nelson had proposed a national "teach-in" on the environment in September 1969 and only eight months later, everything was in place for the single largest national demonstration in American history. Dramatic events such as the Cuyahoga River bursting into flame in 1969, the blowout of an oil well off Santa Barbara, and the "death" of Lake Erie due to pollution all fed Americans' concerns. The sorry state of America's environment hit home for me when, as a 16-year-old high school student from the mountains of Virginia, I visited George Washington's home, Mt. Vernon, on a marching band trip. Bobbing in the nearby Potomac was a sign warning visitors not to come in contact with the water.
Earth Day 1970 provoked a torrent of apocalyptic predictions. "We have about five more years at the outside to do something," ecologist Kenneth Watt declared to a Swarthmore College audience on April 19, 1970. Harvard biologist George Wald estimated that "civilization will end within 15 or 30 years unless immediate action is taken against problems facing mankind." "We are in an environmental crisis which threatens the survival of this nation, and of the world as a suitable place of human habitation," wrote Washington University biologist Barry Commoner in the Earth Day issue of the scholarly journal Environment. The day after Earth Day, even the staid New York Times editorial page warned, "Man must stop pollution and conserve his resources, not merely to enhance existence but to save the race from intolerable deterioration and possible extinction." Very Apocalypse Now.
Three decades later, of course, the world hasn't come to an end; if anything, the planet's ecological future has never looked so promising. With half a billion people suiting up around the globe for Earth Day 2000, now is a good time to look back on the predictions made at the first Earth Day and see how they've held up and what we can learn from them. The short answer: The prophets of doom were not simply wrong, but spectacularly wrong.
More important, many contemporary environmental alarmists are similarly mistaken when they continue to insist that the Earth's future remains an eco-tragedy that has already entered its final act. Such doomsters not only fail to appreciate the huge environmental gains made over the past 30 years, they ignore the simple fact that increased wealth, population, and technological innovation don't degrade and destroy the environment. Rather, such developments preserve and enrich the environment. If it is impossible to predict fully the future, it is nonetheless possible to learn from the past. And the best lesson we can learn from revisiting the discourse surrounding the very first Earth Day is that passionate concern, however sincere, is no substitute for rational analysis.
Soylent Greens
Imminent global famine caused by the explosion of the "population bomb" was the big issue on Earth Day 1970. Then–and now–the most prominent prophet of population doom was Stanford University biologist Paul Ehrlich. Dubbed "ecology's angry lobbyist" by Life magazine, the gloomy Ehrlich was quoted everywhere. "Population will inevitably and completely outstrip whatever small increases in food supplies we make," he confidently declared in an interview with then-radical journalist Peter Collier in the April 1970 Mademoiselle. "The death rate will increase until at least 100-200 million people per year will be starving to death during the next ten years."
"Most of the people who are going to die in the greatest cataclysm in the history of man have already been born," wrote Ehrlich in an essay titled "Eco-Catastrophe!," which ran in the special Earth Day issue of the radical magazine Ramparts. "By…[1975] some experts feel that food shortages will have escalated the present level of world hunger and starvation into famines of unbelievable proportions. Other experts, more optimistic, think the ultimate food-population collision will not occur until the decade of the 1980s." Ehrlich sketched out his most alarmist scenario for the Earth Day issue of The Progressive, assuring readers that between 1980 and 1989, some 4 billion people, including 65 million Americans, would perish in the "Great Die-Off."
Although Ehrlich was certainly the most strident doomster, he was far from alone in his famine forecasts. "It is already too late to avoid mass starvation," declared Denis Hayes, the chief organizer for Earth Day, in the Spring 1970 issue of The Living Wilderness. In that same issue, Peter Gunter, a professor at North Texas State University, wrote, "Demographers agree almost unanimously on the following grim timetable: by 1975 widespread famines will begin in India; these will spread by 1990 to include all of India, Pakistan, China and the Near East, Africa. By the year 2000, or conceivably sooner, South and Central America will exist under famine conditions….By the year 2000, thirty years from now, the entire world, with the exception of Western Europe, North America, and Australia, will be in famine" (emphasis in original). Ehrlich and others were openly contemptuous of the "Green Revolution," underway in countries such as India and Pakistan, that had already nearly doubled crop yields in developing nations between 1965 and 1970. Ehrlich sniffed that such developments meant nothing, going so far as to predict that "the Green Revolution…is going to turn brown." Such fears took form in such popular Zeitgeist movies as Soylent Green (1973), which envisioned a future of hungry masses jammed into overcrowded cities.
The Soylent Green crowd didn't simply predict mass starvation. They argued that even trying to feed so many people was itself a recipe for disaster. As Lester Brown, a former U.S. Department of Agriculture agronomist who would later become far more prominent as the founder of the Worldwatch Institute, put it in Scientific American, "There is growing doubt that the agricultural ecosystem will be able to accommodate both the anticipated increase of the human population to seven billion by the end of the century and the universal desire of the world's hungry for a better diet. The central question is no longer 'Can we produce enough food?' but 'What are the environmental consequences of attempting to do so?'"
Even if somehow famine were avoided, what would the world's population be in 2000? Peter Gunter predicted 7.2 billion. Ehrlich foresaw that "by the end of the century we'll have well over 7 billion people if something isn't done." Brown agreed that "world population at the end of the century is expected to be twice the 3.5 billion of today." In the April 21, 1970, Look, Rockefeller University biologist and Pulitzer Prize-winning writer Rene Dubos made the shocking suggestion that, "To some overcrowded populations, the bomb may one day no longer seem a threat, but a release."
Time has not been gentle with these prophecies. It's absolutely true that far too many people remain poor and hungry in the world–800 million people are still malnourished and nearly 1.2 billion live on less than a dollar a day–but we have not seen mass starvation around the world in the past three decades. Where we have seen famines, such as in Somalia and Ethiopia, they are invariably the result of war and political instability. Indeed, far from turning brown, the Green Revolution has never been so verdant. Food production has handily outpaced population growth and food today is cheaper and more abundant than ever before. Since 1970, the amount of food per person globally has increased by 26 percent, and as the International Food Policy Research Institute reported in October 1999, "World market prices for wheat, maize, and rice, adjusted for inflation, are the lowest they have been in the last century." According to the World Bank's World Development Report 2000, food production increased by 60 percent between 1980 and 1997. At the same time, the amount of land devoted to growing crops has barely increased over the past 30 years, meaning that millions of acres have been spared for nature–acres that would have been plowed down had agricultural productivity lagged the way Ehrlich and others believed it would.
What's the world population? Rather than 7 billion people inhabiting the earth by 2000, there are 6 billion–nearly 30 percent fewer than predicted. That's because total fertility (the number of children a woman has over the course of her lifetime) has been dropping nearly everywhere on the planet since 1970. In fact, it has dropped from around 6 children per woman in the 1960s to around 2.8 today–and shows no signs of stopping. Total fertility rates for 79 countries, including the United States, are below the replacement level of 2.1 children per woman. If present trends continue, the U.N.'s low-variant population growth projection appears the most plausible, which means that world population will likely peak at around 8 billion in 2040 and then begin to decline. It is true that the AIDS pandemic has cut average life expectancy in more than 30 countries since 1990, most of them in sub-Saharan Africa. Despite AIDS, however, the World Health Organization expects life expectancy in the developing countries to increase from 65 years to 73 years by 2020. (It's worth noting that any effective treatment for AIDS–a vaccine, say–will most likely emerge from laboratories and pharmaceutical companies, two stock villains in the standard environmentalist morality play, in the rich countries.)
Where did the doomsters go wrong? They assumed that overpopulation drives world hunger. To the extent that such conditions exist in certain places, the real culprit was–and is–poverty. "The images evoked by the term overpopulation–hungry families, squalid, overcrowded living conditions, early death–are real enough in the modern world, but these are properly described as problems of poverty," explains Harvard population researcher Nicholas Eberstadt. "Poverty, like all other possible human attributes, is represented in individual members of a population. It is an elementary lapse in logic to conclude that poverty is a 'population problem' simply because it exists."
Polluted Thinking
Pollution was the other big issue on Earth Day 1970. Smog choked many American cities and sludge coated the banks of many rivers. People were also worried that we were poisoning the biosphere and ourselves with dangerous pesticides. DDT, which had been implicated in the decline of various bird species, including the bald eagle, the peregrine falcon, and the brown pelican, would soon be banned in the United States. Students wearing gas masks buried cars and internal combustion engines as symbols of our profligate and polluting consumer society. The Great Lakes were in bad shape and Lake Erie was officially "dead," its fish killed because oxygen supplies had been depleted by rotting algae blooms that had themselves been fed by organic pollutants from factories and municipal sewage. Pesticides draining from the land were projected to kill off the phytoplankton in the oceans, eventually stopping oxygen production.
In January 1970, Life reported, "Scientists have solid experimental and theoretical evidence to support…the following predictions: In a decade, urban dwellers will have to wear gas masks to survive air pollution…by 1985 air pollution will have reduced the amount of sunlight reaching earth by one half…." Ecologist Kenneth Watt told Time that, "At the present rate of nitrogen buildup, it's only a matter of time before light will be filtered out of the atmosphere and none of our land will be usable." Barry Commoner cited a National Research Council report that had estimated "that by 1980 the oxygen demand due to municipal wastes will equal the oxygen content of the total flow of all the U.S. river systems in the summer months." Translation: Decaying organic pollutants would use up all of the oxygen in America's rivers, causing freshwater fish to suffocate.
Of course, the irrepressible Ehrlich chimed in, predicting in his Mademoiselle interview that "air pollution…is certainly going to take hundreds of thousands of lives in the next few years alone." In Ramparts, Ehrlich sketched a scenario in which 200,000 Americans would die in 1973 during "smog disasters" in New York and Los Angeles.
So has air pollution gotten worse? Quite the contrary. In the most recent National Air Quality Trends report, the U.S. Environmental Protection Agency–itself created three decades ago partly as a response to Earth Day celebrations–had this to say: "Since 1970, total U.S. population increased 29 percent, vehicle miles traveled increased 121 percent, and the gross domestic product (GDP) increased 104 percent. During that same period, notable reductions in air quality concentrations and emissions took place." Since 1970, ambient levels of sulfur dioxide and carbon monoxide have fallen by 75 percent, while total suspended particulates like smoke, soot, and dust have been cut by 50 percent since the 1950s.
In 1988, the particulate standard was changed to account for smaller particles. Even under this tougher standard, particulates have declined an additional 15 percent. Ambient ozone and nitrogen dioxide, prime constituents of smog, are both down by 30 percent since the 1970s. According to the EPA, the total number of days with air pollution alerts dropped 56 percent in Southern California and 66 percent in the remaining major cities in the United States between 1988 and 1997. Since at least the early 1990s, residents of infamously smogged-in Los Angeles have been able to see that their city is surrounded by mountains.
Why has air quality improved so dramatically? Part of the answer lies in emissions targets set by federal, state, and local governments. But these need to be understood in the twin contexts of rising wealth and economic efficiency. As a Department of Interior analyst concluded after surveying emissions in 1999, "Cleaner air is a direct consequence of better technologies and the enormous and sustained investments that only a rich nation could have sunk into developing, installing, and operating these technologies." Today, American businesses, consumers, and government agencies spend about $40 billion annually on air pollution controls.
It is now evident that countries undergo various environmental transitions as they become wealthier. Fortune's special "ecology" edition in February 1970 was far more prescient than the doomsters when it noted, "If pollution is the brother of affluence, concern about pollution is affluence's child." In 1992, a World Bank analysis found that concentrations of particulates and sulfur dioxide peak at per capita incomes of $3,280 and $3,670, respectively. Once these income thresholds are crossed, societies start to purchase increased environmental amenities such as clean air and water.
In the U.S., air quality has been improving rapidly since before the first Earth Day–and before the federal Clean Air Act of 1970. In fact, ambient levels of particulates and sulfur dioxide have been declining ever since accurate records have been kept. Between 1960 and 1970, for instance, particulates declined by 25 percent; sulfur dioxide decreased by 35 percent between 1962 and 1970. More concretely, it takes 20 new cars to produce the same emissions that one car produced in the 1960s.
Similar trends can be found when it comes to water pollution. The warning sign is gone from the Potomac and I can swim and fish in that river again. Lake Erie once again supports a $600 million fishing industry, and an upscale shopping and entertainment district now lines the Cuyahoga River in Cleveland. The EPA estimates that between 60 percent and 70 percent of lakes, rivers, and streams meet state quality goals. That's up from about 30 percent to 40 percent 30 years ago.
Since 1972, the United States has invested more than $540 billion in water pollution control efforts, according to the Pacific Research Institute. In 1972, only 85 million Americans were served by sewage treatment plants. Since then, some 14,000 municipal waste treatment plants have been built and 173 million Americans are served by them. Similar air and water quality trends can be found in other developed countries as well.
Most environmental problems occur in what are called "open-access commons"–that is, any member of the public may use the resource without paying anyone else for it. Typically, open-access commons still exist as relics of a time when the resource was abundant relative to the number of people using it. If only you and a couple of neighbors lived along a river, you could all dump your sewage in the river because it would naturally purify itself. The same goes for forests–homesteaders could chop them down because there were millions of acres more to be had.
With open-access commons, if you don't use the resource for your own benefit, other people will and you'll simply lose out. The prototypical example of an open-access commons is the old-fashioned village sheep meadow. Because everyone in the village has the right to put sheep on the meadow, each villager has an incentive to put extra sheep on the meadow in order to enrich himself. However, if every villager chooses to add sheep, then the meadow will be destroyed by overgrazing and all villagers will suffer the consequences.
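The arithmetic behind that sheep-meadow story is worth making concrete. The sketch below uses invented numbers, not data from any study, but it shows why adding a sheep pays for each individual villager even as it impoverishes the village as a whole.

```python
# Illustrative tragedy-of-the-commons arithmetic (all numbers are assumptions).
VILLAGERS = 10
VALUE_PER_SHEEP = 100        # private gain from grazing one more sheep
DAMAGE_PER_SHEEP = 300       # total loss of meadow value caused by that sheep

# The damage is spread across everyone who uses the meadow,
# but the gain goes entirely to the sheep's owner.
private_gain = VALUE_PER_SHEEP - DAMAGE_PER_SHEEP / VILLAGERS   # 100 - 30 = +70
social_gain = VALUE_PER_SHEEP - DAMAGE_PER_SHEEP                # 100 - 300 = -200

print(f"Gain to the owner of the extra sheep: {private_gain:+.0f}")
print(f"Gain to the village as a whole:       {social_gain:+.0f}")
# Each villager faces the same +70 private payoff, so everyone adds sheep
# and the meadow is overgrazed, even though the village loses overall.
```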
In a related way, people dump sewage into rivers or pump smoke into the air because no one "owns" a river or the air in a traditional sense. We might say that the public "owns" rivers and airsheds, but none of us individually has much of an incentive (or an ability) to stop others from emitting excessive pollutants. Such open-access commons are at the center of most instances of environmental problems today, from the deforestation of tropical rainforests to the potential loss of biodiversity to the depletion of open-sea fisheries.
There are two basic ways to address the environmental problems caused by open-access commons. The favored way has been traditional, top-down political regulation, in which an agency prescribes specific pollution-control technology and monitors output. Depending on the situation, this method can score some quick improvements–the shift from leaded to unleaded gasoline had a huge impact on air quality, for instance. But it's more typically slow, costly, and subject to the endless wrangling of interest groups seeking special exemptions and protections. What's more, because it enforces a single standard, it discourages the innovation and experimentation that often lead to new, more environmentally sound ways of doing things. For example, the Clean Air Act effectively mandated that electric utilities use smokestack scrubbers to reduce their sulfur dioxide emissions when other alternatives, such as a switch to burning cleaner coal, would have reduced emissions even further and more cheaply, too.
The other approach to open-access commons harnesses both the creativity of markets and the power of privatization. An overall level of acceptable pollution is set, a market is created through tradeable permits, and then firms are allowed to pursue various means to reach the goal. We find fast, cheap, and efficient environmental improvements where this approach has been tried. In the U.S., for instance, sulfur dioxide emissions have been cut much faster and at less cost since the creation of a (very imperfect) market for such emissions (see "Selling Air Pollution," May 1996). Fisheries in New Zealand and Iceland have dramatically rebounded since they were essentially privatized. And one of the chief reasons that forests are expanding in the U.S. and Europe is because landowners have secure property rights to them. Such gains are not mysterious: If you own a resource, you're far more likely to use it efficiently.
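To see why the permit approach is cheaper, consider a toy calculation. The two firms and their abatement costs below are hypothetical, not drawn from the actual sulfur dioxide program, but the logic is the same: trading shifts the cleanup to whoever can do it at the lowest cost.

```python
# Why tradeable permits cut pollution more cheaply than uniform mandates.
# Two hypothetical firms, each emitting 100 tons; regulators want 100 tons cut.
# Abatement costs are invented for illustration.
cost_per_ton = {"Firm A": 200.0, "Firm B": 50.0}   # dollars per ton abated

# Uniform command-and-control: each firm must cut 50 tons.
uniform_cost = 50 * cost_per_ton["Firm A"] + 50 * cost_per_ton["Firm B"]

# Permit market: Firm B (the cheap abater) sells its spare permits to Firm A
# and does the whole 100-ton reduction itself.
market_cost = 100 * cost_per_ton["Firm B"]

print(f"Uniform mandate cost: ${uniform_cost:,.0f}")   # $12,500
print(f"Permit market cost:   ${market_cost:,.0f}")    # $5,000
# Same total emissions cut, lower total cost: abatement migrates to whoever
# can do it most cheaply, which is the point of emissions trading.
```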
Perversely, many environmental activists still fault markets for not properly valuing "natural capital" or "ecosystem services," and they continue to call for placing more resources in public hands. In effect, they want more open-access commons. But if no one has to pay for the use of a resource, then users will treat it as free. The way to take environmental goods into account is exactly the way we take all other goods into account–we put them into the market where people have to pay for what they use.
Synthetic Arguments
At Earth Day 1970, many Americans feared that synthetic chemicals, especially pesticides, were killing them. No culprit was more singled out than DDT, a pesticide that had been first used in 1946. The World Health Organization originally hailed it as a miracle that had drastically reduced deaths from malaria; its inventor, Swiss chemist Paul Hermann Muller, was honored with a Nobel Prize in 1948.
By 1970, however, DDT had emerged as the symbol of all that was wrong with the modern world. DDT had been implicated in the decimation of several bird species due to egg-shell thinning. It was also alleged to cause several human cancers, including breast cancer. DDT was banned in the U.S. by the EPA in 1972; other countries soon followed suit.
Paul Ehrlich warned in the May 1970 issue of Audubon that DDT and other chlorinated hydrocarbons "may have substantially reduced the life expectancy of people born since 1945." In his "Eco-Catastrophe!" scenario, Ehrlich put a finer point on these fears by envisioning a 1973 Department of Health, Education, and Welfare study which would find "that Americans born since 1946…now had a life expectancy of only 49 years, and predicted that if current patterns continued this expectancy would reach 42 years by 1980, when it might level out."
Keying off of Rachel Carson's claims about the dangers of synthetic chemicals in Silent Spring (1962), Look claimed that many scientists believed that residual DDT would lead to an increase in liver and other cancers. Cornell University ecologist Lamont Cole warned an Earth Day audience at Kearney State College in Nebraska that, "We are releasing into the environment more than 500,000 different chemicals." "There is one good thing about the blighting of our environment, that is, that Americans don't have to worry about cannibals anymore," said social critic Herbert Muller in The New York Times. "We've all become inedible, there's too much DDT in us."
Contrary to the conventional wisdom at Earth Day 1970, a broad consensus has since emerged that exposure to synthetic chemicals, even pesticides, poses little threat to human health. In 1996, the National Research Council of the National Academy of Sciences, in a comprehensive report called Carcinogens and Anticarcinogens in the Human Diet, concluded that levels of both synthetic and natural carcinogens are "so low that they are unlikely to pose an appreciable cancer risk." The National Cancer Institute reports that "increasing exposure to general environmental hazards seems unlikely to have had a major impact on the overall trends in cancer rates." "Pollution appears to account for less than 1 percent of cancer," concludes University of California biologist and cancer researcher Bruce Ames.
To be sure, the total number of cancer cases in the population did go up from 1973 to 1990, but cancer death rates declined owing to better medical treatments. Cancer incidence went up for some very prosaic reasons: We smoke too much tobacco, we eat too much fat, and we sunbathe excessively. We also live longer and cancer is primarily a disease of old age. In the U.S. since the early 1990s, both the incidence of cancer and deaths from cancer have been declining, not rising. Some analysts, such as Gregg Easterbrook, have recently hinted that this decline in cancer rates is the result of reductions in the amount of toxins released into the environment. Actually, a good bit of the improvement in cancer rates can be attributed to the decline in the number of smokers in the U.S.
Never mind. Cancer is scary enough (and ubiquitous enough–about one-third of Americans will get some sort of cancer during their lifetimes) that it still serves as a good tool for frightening people about alleged environmental contamination. Just this past January, Worldwatch Institute founder Lester Brown ominously noted, "Every human being harbors in his or her body about 500 synthetic chemicals that were nonexistent before 1920." So what? Considering that American lifespans have increased by 20 years, from an average of 56 years in 1920 to 71 years in 1970 to 76 years today, one might be tempted to argue that those synthetic chemicals are prolonging our lives. Certainly, they're not causing damage. Just last year, the National Research Council issued yet another report that found no evidence that synthetic chemicals are causing higher rates of cancer, birth defects, and other problems alleged by Brown.
Meanwhile, banning DDT allowed a resurgence of malaria-carrying mosquitoes worldwide. The Malaria International Foundation estimates that there are between 600 million and 900 million cases of malaria a year and that about 2.7 million people die of it annually. Spraying DDT had cut malaria deaths from 4 million annually in the early 1940s to 1 million in the 1960s.
Nonrenewable Anxiety
Beyond anxiety over population, pollution, and pesticides, even more speculative concerns were on display at the first Earth Day. Many of these fears–especially the supposed depletion of nonrenewable resources, ostensibly disappearing biodiversity, and apparent global climate change due to human activity–have come to figure far more prominently in our current environmental debates.
The depletion of nonrenewable resources wouldn't take center stage until the publication of the infamous Limits to Growth report to the Club of Rome in 1972. The limits-to-growth thesis got a huge boost when oil prices spiked during the Arab oil embargo. But on Earth Day 1970, there were already intimations that this would become a major theme of subsequent celebrations.
"We are prospecting for the very last of our resources and using up the nonrenewable things many times faster than we are finding new ones," warned Sierra Club director Martin Litton in Time's February 2, 1970, special "environmental report." Ecologist Kenneth Watt declared, "By the year 2000, if present trends continue, we will be using up crude oil at such a rate…that there won't be any more crude oil. You'll drive up to the pump and say, `Fill 'er up, buddy,' and he'll say, `I am very sorry, there isn't any.'" Later that year, Harrison Brown, a scientist at the National Academy of Sciences, published a chart in Scientific American that looked at metal reserves and estimated the humanity would totally run out of copper shortly after 2000. Lead, zinc, tin, gold, and silver would be gone before 1990.
Of course this didn't happen. The prices of all metals and minerals have dropped by more than 50 percent since 1970, according to the World Resources Institute. As we all know, lower prices mean that things are becoming more abundant, not less. The U.S. Geological Survey estimates that at present rates of mining, reserves of copper will last 54 years; zinc, 56 years; silver, 26 years; tin, 55 years; gold, 30 years; and lead, 47 years. What about oil? The survey estimates that global reserves could be as much as 2.1 trillion barrels of crude oil–enough to supply the world for the next 90 years. These reserve figures are constantly moving targets–as they get drawn down, miners and drillers find new sources of supply or develop more efficient technologies for exploiting the resources.
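Those "years remaining" figures are simple reserves-to-production ratios: estimated reserves divided by the current rate of extraction. The sketch below works the arithmetic with made-up numbers rather than actual USGS data, and shows why such deadlines keep receding as new reserves are found.

```python
# Reserves-to-production ratio: a static "years remaining" estimate.
# All figures below are illustrative placeholders, not USGS data.
def years_remaining(reserves: float, annual_production: float) -> float:
    """Years of supply left if reserves and extraction rates never changed."""
    return reserves / annual_production

copper_reserves = 650.0      # million metric tons (hypothetical)
copper_production = 12.0     # million metric tons per year (hypothetical)

print(f"Copper: {years_remaining(copper_reserves, copper_production):.0f} years")

# The ratio is a moving target: discoveries raise the numerator and
# efficiency gains or substitution lower the denominator, which is why
# such "deadlines" keep receding instead of being reached.
new_reserves = copper_reserves + 200.0    # hypothetical new discoveries
print(f"After new finds: {years_remaining(new_reserves, copper_production):.0f} years")
```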
Worries about declining biodiversity have become popular lately. On the first Earth Day, participants were concerned about saving a few particularly charismatic species such as the bald eagle and the peregrine falcon. But even then some foresaw a coming holocaust. As Sen. Gaylord Nelson wrote in Look, "Dr. S. Dillon Ripley, secretary of the Smithsonian Institute, believes that in 25 years, somewhere between 75 and 80 percent of all the species of living animals will be extinct." Writing just five years after the first Earth Day, Paul Ehrlich and his biologist wife, Anne Ehrlich, predicted that "since more than nine-tenths of the original tropical rainforests will be removed in most areas within the next 30 years or so, it is expected that half of the organisms in these areas will vanish with it."
There's only one problem: Most species that were alive in 1970 are still around today. "Documented animal extinctions peaked in the 1930s, and the number of extinctions has been declining since then," according to Stephen Edwards, an ecologist with the World Conservation Union, a leading international conservation organization whose members are non-governmental organizations, international agencies, and national conservation agencies. Edwards notes that a 1994 World Conservation Union report found known extinctions since 1600 encompassed 258 animal species, 368 insect species, and 384 vascular plants. Most of these species, he explains, were "island endemics" like the Dodo. As a result, they are particularly vulnerable to habitat disruption, hunting, and competition from invading species. Since 1973, only seven species have gone extinct in the United States.
What mostly accounts for relatively low rates of extinction? As with many other green indicators, wealth leads the way by both creating a market for environmental values and delivering resource-efficient technology. Consider, for example, that one of the main causes of extinction is deforestation and the ensuing loss of habitat. According to the Consultative Group on International Agricultural Research, what drives most tropical deforestation is not commercial logging, but "poor farmers who have no other option for feeding their families than slashing and burning a patch of forest." By contrast, countries that practice high yield, chemically assisted agriculture have expanding forests. In 1920, U.S. forests covered 732 million acres. Today they cover 737 million acres, even though the number of Americans grew from 106 million in 1920 to 272 million now. Forests in Europe expanded even more dramatically, from 361 million acres to 482 million acres between 1950 and 1990. Despite continuing deforestation in tropical countries, Roger Sedjo, a senior fellow at the think tank Resources for the Future, notes that "76 percent of the tropical rain forest zone is still covered with forest." Which is quite a far cry from being nine-tenths gone. More good news: In its State of the World's Forests 1999, the U.N.'s Food and Agriculture Organization documents that while forests in developing countries were reduced by 9.1 percent between 1980 and 1995, the global rate of deforestation is now slowing.
"The developed countries in the temperate regions appear to have largely completed forestland conversion to agriculture and have achieved relative land use stability. By contrast, the developing countries in the tropics are still in a land conversion mode. This suggests that land conversion stability correlates strongly with successful economic development," concludes Sedjo, in his chapter on forestry in The True State of the Planet, a collection of essays I edited. In other words, if you want to save forests and wildlife, you had better help poor people become wealthy.
Of course, the biggest environmental crisis facing humanity nowadays is supposed to be global warming. Not surprisingly, worries about the future climate were a common theme among alarmists on the first Earth Day. However, they couldn't agree on what direction the earth's temperature was going to take.
"The greenhouse theorists contend the world is threatened with a rise in average temperature, which if it reached 4 or 5 degrees, could melt the polar ice caps, raise sea level by as much as 300 feet and cause a worldwide flood," explained Newsweek in its special January 26, 1970, report on "The Ravaged Environment." In the service of balance, however, the magazine also noted that many other scientists saw temperatures dropping: "This theory assumes that the earth's cloud cover will continue to thicken as more dust, fumes, and water vapor are belched into the atmosphere by industrial smokestacks and jet planes. Screened from the sun's heat, the planet will cool, the water vapor will fall and freeze, and a new Ice Age will be born."
Kenneth Watt was less equivocal in his Swarthmore speech about Earth's temperature. "The world has been chilling sharply for about twenty years," he declared. "If present trends continue, the world will be about four degrees colder for the global mean temperature in 1990, but eleven degrees colder in the year 2000. This is about twice what it would take to put us into an ice age."
Watt was wrong. Global temperatures didn't fall, and fears of a new ice age dissolved like frost on an early-autumn morning. Since 1988, when government climatologist James Hansen testified before the Senate Energy and Natural Resources Committee that he had detected global warming, climate doomsters have switched almost entirely to worrying about global warming. The theory is straightforward–burning fossil fuels like coal and oil puts excess carbon dioxide in the atmosphere; the carbon dioxide traps heat from the sun and re-radiates it, heating up the atmosphere.
It's generally agreed that the earth's average temperature has indeed gone up by 1 degree Fahrenheit or so in the past century. The question now is, How much man-made warming can we expect in the 21st century? Computer climate models originally predicted that atmospheric temperatures might increase by 3 to 5 degrees centigrade by 2100. However, as the models have been refined, their estimates of how much warming might occur have been declining–the range is now down to 1.5 to 3.5 degrees centigrade by 2100. A recent report from the National Research Council noted that "the surface apparently warmed by 0.25 C to 0.4 C since 1979." Remarkably, the NRC panel also estimates the change in the temperature of the atmosphere as between 0 C and 0.2 C during the same period. In other words, the atmosphere may not have warmed at all since 1979. This is an odd conclusion because the climate computer models have never predicted that the surface would warm first or faster than the atmosphere–in fact, they predict the opposite. Consequently, this gap between surface temperatures and atmospheric temperatures calls the predictive accuracy of the models into serious question.
That doesn't give many doomsters pause. In February, climatologist Tom Karl of the National Climate Data Center issued a study suggesting that global warming is speeding up. In 1997 and 1998, argues Karl, there were 16 consecutive months in which "we were breaking the previous year's all-time global high temperature record." However, University of Virginia climatologist Patrick Michaels (who receives some funding from fossil fuel companies) points out that those 16 months of record high temperatures occurred during the big 1997-1998 El Niño in the Pacific Ocean. During El Niños, water from the western Pacific Ocean spreads eastward, dramatically warming the normally cold waters off the coast of South America and thus boosting average global temperatures. Temperatures have now dropped back to where they were before the El Niño occurred. El Niños are not predicted to be affected by any man-made global warming.
In any case, whatever global warming is occurring is apparently being channeled into winter nights. Summer daytime temperatures do not appear to be warming. Warmer winter nights are far less of a threat to the natural world and humanity than higher summer temperatures. Are our coasts about to be inundated by rising seas due to melting ice caps? The best guess from the U.N.'s Intergovernmental Panel on Climate Change is that the sea level might rise about 8 inches by 2100. While this may seem troubling, keep in mind that sea levels rose by about 6 inches over the last century.
Indeed, a far greater threat for the next century comes from environmental activists. To counteract global warming, they essentially want to plan the energy future of the entire world for the next 100 years. They are enacting the plan through the U.N. Convention on Climate Change and the Kyoto Protocol. The absurdity (and arrogance) of that type of planning becomes clear when one imagines the same exercise taking place in 1900. The best scientific panel available in 1900 would simply not have been able to plan for millions of automobiles and trucks, ubiquitous electric lighting in millions of houses and office buildings, fuel for thousands of jet planes, and millions of refrigerators, air-conditioners, and the like. Virtually none of the devices on this nearly endless list had even been invented by 1900. Given the increasing rate of technological innovation, we undoubtedly have even less chance of foreseeing the future than people in 1900.
Why So Wrong?
How did the doomsters get so many predictions so wrong on the first Earth Day? Their mistake can be handily summed up in Paul Ehrlich and John Holdren's infamous I=PAT equation. Impact (always negative) equals Population x Affluence x Technology, they declared. More people were always worse, by definition. Affluence meant that rich people were consuming more of the earth's resources, a concept that was regularly illustrated by claiming that the birth of each additional baby in America was worse for the environment than 25, 50, or even 60 babies born on the Indian subcontinent. And technology was bad because it meant that humans were pouring more poisons into the biosphere, drawing down more nonrenewable resources and destroying more of the remaining wilderness.
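The I=PAT identity is nothing more than multiplication, which makes the error easy to spot: if the technology term falls faster than population and affluence rise, total impact goes down. The figures below are invented purely for illustration.

```python
# I = P x A x T, with invented numbers purely for illustration.
# P: population, A: consumption per person, T: impact per unit of consumption.
def impact(population: float, affluence: float, technology: float) -> float:
    return population * affluence * technology

# Hypothetical "1970" baseline.
i_1970 = impact(population=3.7e9, affluence=1.0, technology=1.0)

# Hypothetical "2000": more people, richer, but much cleaner per unit of output
# (e.g., emissions per mile driven fell far faster than miles driven rose).
i_2000 = impact(population=6.0e9, affluence=1.6, technology=0.3)

print(f"Relative impact, 2000 vs. 1970: {i_2000 / i_1970:.2f}")  # about 0.78
# Treating T as something that can only grow is what made I=PAT forecasts
# of ever-worsening impact look inevitable when they were not.
```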
We now know that Ehrlich and his fellow travelers got it backwards. If population were necessarily bad, then Brazil, with less than three-quarters the population density of the U.S., should be the wealthier society. As far as affluence goes, it is clearly the case that the richer the country, the cleaner the water, the clearer the air, and the more protected the forests. Additionally, richer countries also boast less hunger, longer lifespans, lower fertility rates, and more land set aside for nature. Relatively poor people can't afford to care overmuch for the state of the natural world.
With regard to technology, Ehrlich and other activists often claim that economists simply don't understand the simple facts of ecology. But it's the doomsters who need to update their economics–things have changed since the appearance of Thomas Malthus' 200-year-old An Essay on the Principle of Population, the basic text that continues to underwrite much apocalyptic rhetoric. Malthus hypothesized that while population increases geometrically, food and other resources increase only arithmetically, leading to a world in which food was always in short supply. Nowadays, we understand that wealth is not created simply by combining land and labor. Rather, technological innovations greatly raise positive outputs in all sorts of ways while minimizing pollution and other negative outputs.
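Malthus' premise takes only a few lines to write down: population multiplies each generation while food grows by a fixed increment, so food per person must fall. The numbers in the sketch below are arbitrary; the point is the shape of the two curves, and why the forecast collapses once productivity itself compounds.

```python
# Malthusian model with arbitrary illustrative numbers (not historical data).
population = 100.0   # index
food = 100.0         # index

print("gen  population   food    food per person")
for generation in range(6):
    print(f"{generation:>3}  {population:>10.0f}  {food:>6.0f}  {food / population:>8.2f}")
    population *= 2.0   # geometric growth: doubles each generation
    food += 50.0        # arithmetic growth: fixed increment each generation

# Food per person collapses under these assumptions. The 20th-century record
# broke the model because yields (the food term) also grew geometrically,
# while falling fertility slowed the population term.
```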
Indeed, if Ehrlich wants to improve his sorry record of predictions and his understanding of how to protect the natural world, he should walk across campus to talk with his Stanford University colleague, economist Paul Romer. "New Growth Theory," devised by Romer and others, shows that wealth springs from new ideas and new recipes. Romer sums it up this way: "Every generation has perceived the limits to growth that finite resources and undesirable side effects would pose if no new recipes or ideas were discovered. And every generation has underestimated the potential for finding new recipes and ideas. We consistently fail to grasp how many ideas remain to be discovered. The difficulty is the same one we have with compounding. Possibilities do not add up. They multiply." In other words, new ideas and technological recipes grow exponentially at a rate much faster than population does.
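Romer's point is combinatorial: the number of ways to recombine existing ideas grows exponentially with the stock of ideas, not linearly. A toy count makes the contrast concrete (the figures are illustrative, not an estimate of real innovation).

```python
# Toy illustration of Romer's point: combinations of ideas grow exponentially.
# Counting every possible subset of n ideas as a potential "recipe" is a
# deliberate simplification, not a model of real research.
for n_ideas in (10, 20, 40, 80):
    recipes = 2 ** n_ideas - 1          # non-empty combinations of n ideas
    print(f"{n_ideas:>3} ideas -> {recipes:,} possible combinations")

# Doubling the stock of ideas roughly squares the number of combinations,
# which is why the pool of possible solutions outruns linear resource
# arithmetic and simple population growth.
```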
"I'm scared," confessed Paul Ehrlich in the 1970 Earth Day issue of Look. "I have a 14 year old daughter whom I love very much. I know a lot of young people, and their world is being destroyed. My world is being destroyed. I'm 37 and I'd kind of like to live to be 67 in a reasonably pleasant world, and not die in some kind of holocaust in the next decade." Ehrlich didn't die in a holocaust, and the world is far more pleasant than he thought it would be. It is probably too much to hope that abashed humility will strike him and he'll desist in bedeviling the world with his dire and consistently wrong predictions. He's like a reverse Cassandra –Cassandra made true prophecies but no one would listen to her. Ehrlich makes false prophecies and everyone listens to him.
There's much to celebrate on the 30th anniversary of Earth Day. Indeed, one of the chief things to get happy about is that the doomsters were so wrong. Civilization didn't collapse, hundreds of millions didn't die in famines, pesticides didn't cause epidemics of cancer, and the air and water didn't get dirtier in the industrialized countries.
On the occasions when they admit that things have gotten better, doomsters claim that whatever environmental progress has been made over the past 30 years came about only because people heeded their warnings, not because of longstanding trends and increased efficiencies. One of the more annoying characteristics of activists such as Ehrlich and Lester Brown is this habit of getting out ahead of a parade that has already started. As a result, there is always the danger that governments may actually enact their policies, thereby stifling technological progress and economic growth–and making the world worse off. Then the doomsters would be able to say "I told you so." So good or bad, they get to claim that they were right all along.
What will Earth look like when Earth Day 60 rolls around in 2030? Here are my predictions: As the International Food Policy Research Institute projects, we will be able to feed the world's additional numbers and to provide them with a better diet. Because they are ultimately political in nature, poverty and malnutrition will not be eliminated, but economic growth will make many people in the developing world much better off. Technological improvements in agriculture will mean less soil erosion, better management of freshwater supplies, and higher productivity crops. Life expectancy in the developing world will likely increase from 65 years to 73 years, and probably more; in the First World, it will rise to more than 80 years. Metals and mineral prices will be even lower than they are today. The rate of deforestation in the developing world will continue to slow down and forest growth in the developed economies will increase.
Meanwhile, as many developing countries become wealthier, they will start to pass through the environmental-transition thresholds for various pollutants, and their air and water quality will begin to improve. Certainly air and water quality in the United States, Europe, Japan, and other developed countries will be even better than it is today. Enormous progress will be made on the medical front, and diseases like AIDS and malaria may well be finally conquered. As for climate change, concern may be abating because the world's energy production mix is shifting toward natural gas and nuclear power. There is always the possibility that a technological breakthrough–say, cheap, efficient, non-polluting fuel cells–could radically reshape the energy sector. In any case a richer world will be much better able to cope with any environmental problems that might crop up.
One final prediction, of which I'm absolutely certain: There will be a disproportionately influential group of doomsters predicting that the future–and the present–never looked so bleak.