In February, upon introducing the Family Smoking Prevention and Tobacco Control Act, Rep. Henry Waxman (D-Calif.) said the legislation “would give FDA broad powers to regulate tobacco products and protect public health.”
In 2004 Sen. Hillary Clinton (D-N.Y.) urged us to think about children’s entertainment “from a public health perspective.” In that light, she said, “exposing our children to so much of this unchecked media is a kind of contagion,” a “silent epidemic” that threatens “long-term public health damage to many, many children and therefore to society.”
In 2003 Surgeon General Richard Carmona, declaring that “obesity has reached epidemic proportions,” offered “a simple prescription that can end America’s obesity epidemic”: “Every American needs to eat healthy food in healthy portions and be physically active every day.”
In 1999 Thom White Wolf Fassett, general secretary of the United Methodist General Board of Church and Society, applauded the work of the National Gambling Impact Study Commission, saying its report “uncovers the hidden epidemic of gambling addiction.” Later that year, two addiction specialists, David Korn and Howard Shaffer, published a paper in the Journal of Gambling Studies calling for “a public health perspective towards gambling.”
What do these four “public health” problems—smoking, playing violent video games, overeating, and gambling—have in common? They’re all things that some people enjoy and other people condemn, attributing to them various bad effects. Sometimes these effects are medical, but they may also be psychological, behavioral, social, or financial. Calling the habits that supposedly lead to these consequences “public health” problems, “epidemics” that need to be controlled, equates choices with diseases, disguises moralizing as science, and casts meddling as medicine. It elevates a collectivist calculus of social welfare above the interests of individuals, who become subject to increasingly intrusive interventions aimed at making them as healthy as they can be, without regard to their own preferences.
This tendency to call every perceived problem affecting more than two people an “epidemic” obscures a crucial distinction. The classic targets of public health were risks imposed on people against their will, communicable diseases being the paradigmatic example. The more recent targets are risks that people voluntarily assume, such as those associated with smoking, drinking, eating junk food, exercising too little, watching TV too much, playing poker, owning a gun, driving a car without wearing a seat belt, or riding a bicycle without wearing a helmet. The difference is the one John Stuart Mill urged in his 1859 book On Liberty: “The sole end for which mankind are warranted, individually or collectively, in interfering with the liberty of action of any of their number is self-protection.…The only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others. His own good, either physical or moral, is not a sufficient warrant.” Mill’s “harm principle” is obviously important to libertarians, but public health practitioners also should keep it in mind if they do not want to be seen as moralistic busybodies constantly seeking to expand the reach of government.
Under Mill’s principle, there is a strong case for government intervention to prevent the spread of a deadly microbe, extending even to such highly coercive measures as forcible quarantine or legally mandated medication. The case for intervention to prevent people from placing bets, eating ice cream, or playing Grand Theft Auto is much weaker. It requires demonstrating that such activities harm not only the people engaged in them but other people as well. And although Mill was imprecise on this point in On Liberty, harm to others has to be understood as a necessary but not sufficient condition for government intervention. To justify the use of force, the alleged harm has to be the sort that the government has a duty to prevent—that is, the sort that violates people’s rights.
The Transformation of Public Health
Public health used to mean keeping statistics, imposing quarantines, requiring vaccination of children, providing purified water, building sewer systems, inspecting restaurants, regulating emissions from factories, and reviewing medicines for safety. Nowadays it means, among other things, banning cigarette ads, raising alcohol taxes, restricting gun ownership, forcing people to buckle their seat belts, redesigning cities to discourage driving, and making illegal drug users choose between prison and “treatment.” In the past, public health officials could argue that they were protecting people from external threats: carriers of contagious diseases, fumes from the local glue factory, contaminated water, food poisoning, dangerous quack remedies. By contrast, the new enemies of public health come from within; the aim is to protect people from themselves—from their own carelessness, shortsightedness, weak will, or bad values—rather than from each other.
Although this sweeping approach is a relatively recent development, we can find intimations of it in the public health rhetoric of the 19th century. In the introduction to the first major American book on public health, U.S. Army surgeon John Shaw Billings explained the field’s concerns: “Whatever can cause, or help to cause, discomfort, pain, sickness, death, vice, or crime—and whatever has a tendency to avert, destroy, or diminish such causes—are matters of interest to the sanitarian.” Despite this ambitious mandate, and despite the book’s impressive length (nearly 1,500 pages in two volumes), A Treatise on Hygiene and Public Health had little to say about the issues that occupy today’s public health professionals. There were no sections on smoking, alcoholism, drug abuse, obesity, vehicular accidents, mental illness, suicide, homicide, domestic violence, or unwanted pregnancy. Published in 1879, the book was instead concerned with things like compiling vital statistics, preventing the spread of disease, abating public nuisances, and assuring wholesome food, clean drinking water, and sanitary living conditions.
A century later, public health textbooks were discussing the control of communicable diseases mainly as history. The field’s present and future lay elsewhere. “The entire spectrum of ‘social ailments,’ such as drug abuse, venereal disease, mental illness, suicide, and accidents, includes problems appropriate to public health activity,” explained Principles of Community Health in 1977. “The greatest potential for improving the health of the American people is to be found in what they do and don’t do to and for themselves. Individual decisions about diet, exercise, stress, and smoking are of critical importance.” Similarly, the 1978 edition of Introduction to Public Health noted that the field, which once “had much narrower interests,” now “includes the social and behavioral aspects of life—endangered by contemporary stresses, addictive diseases, and emotional instability.” (Emphasis in the original.)
In a sense, the change in focus is understandable. After all, Americans are not dying the way they once did. The chapter on infant mortality in A Treatise on Hygiene and Public Health reports that during the late 1860s and early 1870s two-fifths to one-half of children in major American cities died before reaching the age of 5. The major killers included measles, scarlet fever, smallpox, diphtheria, whooping cough, bronchitis, pneumonia, tuberculosis, and “diarrheal diseases.” Beginning in the 1870s, the discovery that infectious diseases were caused by specific microorganisms made it possible to control them through vaccination, antibiotics, better sanitation, water purification, and elimination of rats, mosquitoes, and other carriers. At the same time, improvements in nutrition and living conditions increased resistance to infection.
Americans no longer live in terror of smallpox or cholera. Despite occasional outbreaks of infectious diseases such as rabies and tuberculosis, the fear of epidemics that was once an accepted part of life is virtually unknown. The one major exception is AIDS, which is not readily transmitted and remains largely confined to a few high-risk groups. For the most part, Americans are dying of things you can’t catch: cancer, heart disease, trauma. Accordingly, the public health establishment is focusing on those causes and the factors underlying them. Having vanquished most true epidemics, it has turned its attention to metaphorical “epidemics” of unhealthy behavior.
In 1979 Surgeon General Julius Richmond released Healthy People: The Surgeon General’s Report on Health Promotion and Disease Prevention, which broke new ground by setting specific goals for reductions in mortality. In the introduction Joseph Califano, then secretary of the Department of Health, Education, and Welfare, warned that “we are killing ourselves by our own careless habits” and called for “a second public health revolution” (the first being the triumph over infectious diseases). Healthy People, which estimated that “perhaps as much as half of U.S. mortality in 1976 was due to unhealthy behavior or lifestyle,” advised Americans to quit smoking, drink less, exercise more, fasten their seat belts, stop driving so fast, and cut down on fat, salt, and sugar. It also recommended motorcycle helmet laws and gun control.
Healthy People drew on a “national prevention strategy” developed by what is now the U.S. Centers for Disease Control and Prevention (CDC). Established during World War II as a unit of the U.S. Public Health Service charged with fighting malaria in the South, the CDC today includes eight different centers, only two of which deal with the control of infectious disease. The National Center for Chronic Disease Prevention and Health Promotion, for example, includes the Office on Smoking and Health and the Division of Nutrition and Physical Activity.
The CDC’s growth can be seen as a classic example of bureaucratic empire building. Although it is easy to dismiss public health’s ever-expanding agenda as a bid for funding, power, and status, the field’s practitioners argue with evident sincerity that they are simply adapting to changing patterns of morbidity and mortality. In doing so, however, they are treating behavior as if it were a communicable disease, which obscures some important distinctions. Contrary to the impression left by all the warnings about a “methamphetamine epidemic” that is supposedly sweeping the country, or by CDC maps that show obesity spreading like a plague from state to state, behavior cannot be transmitted to other people against their will. People do not choose to be sick, but they do choose to engage in risky behavior. The choice implies that the behavior, unlike a viral or bacterial infection, has value. It also implies that attempts to control the behavior will be resisted.
Healthy People noted that “formidable obstacles” stand in the way of improved public health. “Prominent among them are individual attitudes toward the changes necessary for better health,” it said. “Though opinion polls note greater interest in healthier lifestyles, many people remain apathetic and unmotivated.…Some consider activities to promote health moralistic rather than scientific; still others are wary of measures which they feel may infringe on personal liberties. However, the scientific basis for suggested measures has grown so compelling, it is likely that such biases will begin to shift.” In other words, people engage in risky behavior because they don’t know any better. Once they realize the risks they are taking, they will change their ways.
Surely there is a measure of truth to this. The publicity surrounding the first surgeon general’s report on the health hazards of smoking, published in 1964, was followed by a more or less steady decline in the prevalence of smoking among Americans, from 42 percent of adults in 1965 to about half that today. Some of that decline, especially in recent years, probably has been due to coercive measures such as smoking bans and cigarette tax hikes, but most of it was seen in the first couple of decades, when the main tactics of the anti-smoking movement were education and persuasion. Likewise, the dramatic increase in seat belt use by Americans, from 15 percent of drivers and front-seat passengers in 1984 to 80 percent two decades later, may have been partly due to the threat of fines, but greater awareness of the safety benefits also has played an important role.
Still, some people, even after they understand the risks they’re taking, obstinately continue to take them. To Richard Carmona, it may be obvious that “every American needs to eat healthy food in healthy portions and be physically active every day.” But what if some Americans, or many, or most, refuse to get with the program?
An Unspoken Moral Premise
As early as June 1975, in its Forward Plan for Health, the U.S. Public Health Service was suggesting “strong regulations to control the advertisement of food products, especially those of high sugar content or little nutritional value.” Since then, as Americans have gotten fatter, calls for restrictions on ads, especially those aimed at children, have intensified. Anti-fat crusaders such as the Yale obesity expert Kelly Brownell are pushing “junk food” taxes to discourage consumption of cheeseburgers and potato chips, along with subsidies to encourage consumption of fruits and vegetables. You could extend that idea to the energy expenditure side of the equation, taxing products associated with sloth, such as books and TV sets, and subsidizing products associated with exercise, such as bicycles and treadmills. Other proposals include lawsuits to compel restaurant menu changes, bans on fast food near schools, and an “equal time” rule requiring stations that air ads for fast food and sugary breakfast cereals to carry propaganda urging people to eat better and exercise more.
None of this is likely to work. There is little evidence that kids like candy and ice cream, or eat more of it, because of advertising; that they see more food advertising now than they did when they were thinner; or that bans on ads aimed at children, which have been imposed in Sweden and Quebec, make kids slimmer. The price control system envisioned by Kelly Brownell and like-minded activists raises insoluble calculation problems, and since tax rates would be the same for every buyer, it would either overdeter moderate eaters or underdeter gluttons—probably both. Restaurants can sell only what people are willing to eat, and litigation will not change that. Establishing fast-food-free zones near schools would not prevent students from bringing their own fattening food to school, and it would not affect most of their meals in any case. And even if a policy of forcing stations to carry anti-obesity messages survived a First Amendment challenge (which it wouldn’t), it is doubtful that telling people what they already know—that exercise and a balanced diet are important to good health—would have much of an impact.
But I can think of a couple of policies that would make a difference. Instead of a “junk food” tax, which is inefficient and unfair because it is paid by the thin as well as the fat, why not tax people for every pound over their ideal weight? People would be required to get weighed once a year at an approved station, which would send its report to the Internal Revenue Service. If the tax were set high enough, I’m sure many people would lose weight. If that seems too complicated, how about mandatory calisthenics in the town square every morning? Assuming these policies are feasible and cost-effective, is there any basis for objecting to them “from a public health perspective”?
If not, I’d suggest that the public health perspective leaves out some important considerations. Maximizing health is not the same as maximizing happiness. The public health mission to minimize morbidity and mortality leaves no room for the possibility that someone might accept a shorter life span, or an increased risk of disease or injury, in exchange for more pleasure or less discomfort. Motorcyclists, rock climbers, and sky divers make that sort of decision all the time, and not all of them are ignorant of the relevant injury and fatality statistics. With lifestyle choices that pose longer-term risks, such as smoking and overeating, the dangers may be easier to ignore, but it is still possible for someone with a certain set of tastes and preferences to say, “Let me enjoy myself now; I’ll take my chances.” The assumption that such tradeoffs are unacceptable is the unspoken moral premise of public health. When the surgeon general declares that “every American needs to eat healthy food in healthy portions and be physically active every day,” where does that leave a guy who prefers to be fat if it means he can eat what he likes and relax in his spare time instead of looking for ways to burn calories?
It’s true that, as the anti-smoking activist William Cahan pointed out on a CNN talk show several years ago, “People who are making decisions for themselves don’t always come up with the right answer.” They don’t necessarily make tradeoffs between health and other values in an informed or carefully considered manner. Sometimes they regret their decisions. But they know their own tastes and preferences, and they have access to myriad pieces of local information about the relevant costs and benefits that no government regulator can possibly know. They will not always make good decisions, but on balance they will make better decisions, as measured by their own subsequent evaluations, than any third party deciding for them. Leaving aside the question of who is better positioned to decide whether a given pleasure is worth the risk associated with it, there is an inherent value to freedom: When it comes to how people feel about their lives, they may well prefer to make their own bad choices rather than have better ones imposed on them.
Needless to say, people make mistakes—sometimes expensive, hard-to-correct mistakes—in many areas of life. If that fact is reason enough for the government to second-guess their decisions about dangerous activities such as smoking cigarettes and riding motorcycles, why on earth should the government let people make their own choices when it comes to such consequential matters as where to live, how much education to get, whom to marry, whether to have children, which job to take, or what religion to practice? These decisions are at least as important, and the government is at least as well equipped to make them as it is to decide which health risks are acceptable.
While people are not perfect judges of their own interests, they are better judges, by and large, than government officials are apt to be. That is the utilitarian case against paternalism and in favor of individual freedom. But what if people making health-related decisions are not truly free? Some paternalists claim that surrendering to certain habits, such as drug use or gambling, is akin to selling yourself into slavery, which Mill himself said was not an acceptable use of freedom.
It’s pretty clear from Mill’s condemnation of alcohol prohibition that he did not share this view of addiction. In any case, there is abundant evidence that addiction is a pattern of behavior shaped by a complex interaction of personal and situational variables, not an automatic process in which people become “hooked” without regard to their own choices or desires. In that sense it is quite different from being clapped in chains and forced to follow another person’s commands.
Mill also made an exception for minors, of course; and so-called public health threats, such as the violent entertainment that vexes Hillary Clinton, are often described as menaces to children. People who oppose paternalism vis-à-vis adults can nevertheless support measures, such as a legally enforced cigarette purchase age, that are narrowly targeted at preventing minors from making risky decisions that are properly reserved for grownups. But child protection often becomes an excuse for restricting the freedom of adults. The aforementioned fast-food-free zones, for example, probably would not make kids noticeably thinner but certainly would make life harder for adults looking for a quick and convenient lunch while working or running errands near schools.
Sometimes politicians and activists who claim to be fighting for parents are actually complaining about the appalling stuff that parents let their kids see, do, or eat. They want to override parental prerogatives instead of reinforcing them. Sen. Clinton, for example, worries that parents do not make enough use of the “V chip” her husband championed as a way to prevent children from seeing inappropriate TV shows. But perhaps they simply are not as alarmed as she is about the state of popular culture. Likewise, activists who want to ban food marketing aimed at children do not seem to trust parents to say no when their kids demand SpongeBob SquarePants Pop-Tarts or Dora the Explorer cookies.
The Dictatorship of Health Care
Since naked appeals to paternalism do not go over very well in the U.S., and since the child protection argument is not always plausible, advocates of enlisting the government to discourage unhealthy behavior often argue that risky habits are a legitimate matter of public concern because they cost taxpayers money. This is not necessarily true. Smoking, for example, raises the cost of treating certain illnesses, but it reduces other costs, since smokers tend to die sooner than nonsmokers and therefore do not use as much health care in old age, do not spend as much time in nursing homes, and do not draw as much on Social Security. A 1997 analysis in The New England Journal of Medicine concluded that total medical spending would go up, not down, if everyone stopped smoking. And that does not even consider the increase in Social Security outlays when smoking becomes less common. The Harvard economist Kip Viscusi, among others, concludes that the long-term taxpayer savings from smoking outweigh the short-term costs. If so, the public health community’s fiscal argument suggests that the government should be encouraging smoking. Something similar might be true of obesity.
But even if certain habits do, on balance, increase taxpayer costs, the problem is not that some people do risky things; it’s that the government forces other people to pay their medical bills. Mill’s harm principle certainly would allow the government to prevent someone from picking your pocket. But in this case, it’s the government that is picking your pocket by requiring you to subsidize the health care of the poor and the elderly. It’s inevitable that some of those patients will have risky habits that contribute to the cost of providing health care for them—habits such as overeating, underexercising, smoking, drinking too much, eating soft-cooked eggs or rare hamburgers, working too hard, sleeping too little, skiing, sunbathing, riding motorcycles, and neglecting to brush and floss, to name just a few. People who don’t want to pay for the consequences of such dangerous behavior shouldn’t support taxpayer-funded health care. If we were to keep these programs but try somehow to prevent people from doing anything that would make them more likely to need medical care, we would once again be facing the prospect of a government with an open-ended license to meddle in what used to be considered private decisions. We would be creating a society of resentful busybodies, where everyone’s personal habits are everyone else’s business.
If the cost of taxpayer-funded health care were counted as the sort of harm that justifies government intervention under Mill’s principle, the exception would swallow the rule. The same goes for harms to other people that do not constitute crimes or torts, such as the emotional pain of losing a friend to smoking-related lung cancer or the burden of assisting a relative who is so fat he has trouble getting around. If costs like these were grounds for government intervention, no one anyone cared about would ever be free to run his own life. Likewise, the fact that some people who engage in a particular activity commit crimes or torts, such as gamblers who steal or drinkers who drive while intoxicated and get into crashes, is not sufficient grounds for prohibiting the activity. Such a position would, like the determination to avoid any possible impact on the public treasury, obliterate the distinction between self-harm and harm to others.
The Corruption of Medicine by Morality
Because the public health field developed in response to deadly threats that spread from person to person and place to place, its practitioners are used to enlisting the state in their cause. Writing in 1879, John Shaw Billings put it this way: “All admit that the State should extend special protection to those who are incapable of judging of their own best interests, or of taking care of themselves, such as the insane, persons of feeble intellect, or children; and we have seen that in sanitary matters the public at large are thus incompetent.”
Billings was defending measures aimed at traditional public health targets, such as infectious diseases and toxic pollution. It’s reasonable to expect that such efforts will be welcomed by the intended beneficiaries once they understand the aim. The same cannot be said of public health’s new targets. Even after the public is informed about the relevant hazards, many people will continue to behave in ways frowned upon by the public health establishment. This is not because they misunderstood; it’s because, for the sake of pleasure, utility, or convenience, they are prepared to accept the risks. When public health experts assume these decisions are wrong, they do indeed treat adults like children.
One such expert, Boston public health official Leon S. White, reflected on the recalcitrance of risktakers in 1975, the year the federal government published its Forward Plan for Health, which set forth a “prevention strategy” that included various laws aimed at stopping people from harming themselves. “The real malpractice problem in this country today,” White wrote in The New England Journal of Medicine, “is not the one described on the front pages of daily newspapers but rather the malpractice that people are performing on themselves and each other.…It is a crime to commit suicide quickly. However, to kill oneself slowly by means of an unhealthy life style is readily condoned and even encouraged.” White’s article prompted a response from Robert Meenan, a professor at the University of California School of Medicine in San Francisco, who observed: “Health professionals…have no personal attributes, knowledge, or training that qualifies them to dictate the preferences of others. Nevertheless, doctors generally assume that the high priority that they place on health should be shared by others. They find it hard to accept that some people may opt for a brief, intense existence full of unhealthy practices. Such individuals are pejoratively labeled ‘noncompliant’ and pressures are applied on them to reorder their priorities.”
This is what H.L. Mencken had in mind when he remarked that “hygiene is the corruption of medicine by morality. It is impossible to find a hygienist who does not debase his theory of the healthful with a theory of the virtuous. The whole hygienic art, indeed, resolves itself into an ethical exhortation. This brings it, at the end, into diametrical conflict with medicine proper. The true aim of medicine is not to make men virtuous; it is to safeguard and rescue them from the consequences of their vices. The physician does not preach repentance; he offers absolution.” Whether or not you agree with Mencken’s view of the physician’s proper role, the danger of transforming a doctor’s orders into the government’s orders should be clear.
But not to everyone. Some public health theorists explicitly recognize that their aims are fundamentally collectivist and cannot be reconciled with the American tradition of limited government. In 1975 Dan Beauchamp, then an assistant professor of public health at the University of North Carolina, presented a paper at the annual meeting of the American Public Health Association in which he argued that “the radical individualism inherent in the market model” is the biggest obstacle to improving public health. “The historic dream of public health that preventable death and disability ought to be minimized is a dream of social justice,” Beauchamp said. “We are far from recognizing the principle that death and disability are collective problems and that all persons are entitled to health protection.”
Not only are all persons entitled to health protection, but they’re going to get it, whether they want it or not. Beauchamp rejected “the ultimately arbitrary distinction between voluntary and involuntary hazards” and complained that “the primary duty to avert disease and injury still rests with the individual.” He called upon public health practitioners to challenge “the powerful sway market-justice holds over our imagination, granting fundamental freedom to all individuals to be left alone.” So the right to be left alone—the right Supreme Court Justice Louis Brandeis considered “the most comprehensive of rights and the right most valued by civilized men”—turns out to be the leading risk factor for disease and injury.
Beauchamp may be unusually candid, but his vision is implicit in the way public health is currently understood. According to John Hanlon’s Public Health Administration and Practice, “public health is dedicated to the common attainment of the highest levels of physical, mental, and social well-being and longevity consistent with available knowledge and resources at a given time and place.” The textbook Principles of Community Health tells us that “the most widely accepted definition of individual health is that of the World Health Organization: ‘Health is a state of complete physical, mental, and social well-being and not merely the absence of disease or infirmity.’ ” Especially in light of this definition, the WHO’s goal of “Health for All,” if it is meant to be a prescription for state action, has a chilling sound to it.
“Over himself, over his own body and mind, the individual is sovereign,” Mill insisted. The mandate “Health for All” replaces that principle with a legally enforceable duty to be well, a demand by the collective to keep one’s body and mind in optimal condition. A government empowered to maximize health is not a government under which anyone who values liberty would want to live.
Senior Editor Jacob Sullum is the author of For Your Own Good: The Anti-Smoking Crusade and the Tyranny of Public Health (Free Press). This article is adapted from a talk he gave at a conference on “Public Health and the Legacy of John Stuart Mill” at Columbia University’s Mailman School of Public Health in December.