Politics

Survey Says What?

The problem with health-care polling

William Hazlitt once said that "nothing is more unjust or capricious than public opinion." When it comes to health care, our nation's pundits and activists seem determined to prove Hazlitt right. Some days it seems like everyone has a poll on their side. Participants in the health-care debate have variously claimed that the public supports a public plan and that it doesn't; that comprehensive health reform is overwhelmingly popular and that it isn't; that people have a dim view of insurance companies and are happy with their own insurance plans; and that a majority of Americans favor a single-payer health-care system while also supporting deregulation of insurance markets.
Are pundits and pollsters to blame? Or is the public just crazy? The best answer is that it's a little of both. 
Part of the problem is that the public is fickle, and opinions change over time. Cherry-picking from different months or years—as I did in the previous paragraph—makes it easy to find a poll to support almost any position. But a larger concern is that many prominent polls are conducted and interpreted in ways that are sure to skew the results.
The most garish examples are what are known as push-polls. These polls, typically conducted by interested parties, artificially inflate public support or disapproval with loaded questions. At their most blatant, these blundering attempts at manipulation come off as laughable. And in the health-care debate, Republicans have been the worst offenders. One RNC survey, for example, asked whether respondents were concerned about the possibility that the government might deny health care to Republicans because of their political affiliation. Another GOP poll ominously asked, "Do you think Democrats in Congress should pick your doctor?" (Even more embarrassing: The largest percentage of respondents said "yes.")
Other polls are not so explicitly designed to bias the results, but they still push respondents toward a particular answer. One NBC/Wall Street Journal poll, for example, asked whether interviewees favored or opposed health-care reform, describing the plan in borderline flack-like language:

The plan requires that health insurance companies cover people with pre-existing medical conditions. It also requires all but the smallest employers to provide health coverage for their employees, or pay a percentage of their payroll to help fund coverage for the uninsured. Families and individuals with lower and middle incomes would receive tax credits to help them afford insurance coverage. Some of the funding for this plan would come from raising taxes on wealthier Americans.

It's not quite a push-poll, but the description comes right out of the reformers' playbook: Play up the potential benefits and say that only the rich will have to pay. The pollsters might as well have asked whether the respondents had warm feelings towards puppy dogs and ice cream.

Another problem is deciding which questions to reference—and what they actually mean. When public-plan advocate Robert Reich referred to a poll showing that "76 percent of respondents said it was important that Americans have a choice between a public and private health-insurance plan," he wasn't making things up. But that answer came when respondents were asked how "important" it was to "give people a choice of both a public plan administered by the federal government and a private plan for their health insurance." The key words are "important" and "choice." There's a big difference between asking people whether they think having a choice is important and asking them whether they actually prefer a public plan. And as it turns out, when asked straightforwardly about support for a government plan, 47 percent said they opposed it.
The key, then, is to ask better questions, and that means getting specific. Kristen Soltis, a pollster with the Republican-affiliated Winston Group, says that "what you're actually looking for is a question that at least gives a little bit of context." The most revealing polls, she says, are those that ask "whether people agree or disagree with what a policy intends to do." So rather than ask whether people support health-care reform generally, or whether they like Obama's plan, effective poll questions seek to assess support for specific proposals—and tend to explain what those proposals are designed to do.
Indeed, questions that have asked about "Obama's plan" are particularly egregious violators of the tenets of good polling: As yet, no such thing as an "Obama plan" exists. So when CNN asked respondents whether they "favor or oppose Obama's plan to reform health care" and the NBC/Wall Street Journal polling team asked whether "Barack Obama's health care plan" is a good idea, they were asking respondents to voice an opinion on an imaginary concept.
Of course, the larger issue is with the notion that "the public" has a preference at all. Groups don't have preferences; individuals do. And when you try to ascertain what a group's preferences are, the result is often nonsense. One of the key insights of public choice economics is that, although every individual in a group may hold rational, reasonable preferences, when those individuals express themselves as a group, the results are anything but reasonable. Economist Richard McKelvey's chaos theorem explains how voting (or survey answering) can leave the public expressing paradoxical preferences: Given three potential choices, A, B, and C, majority voting can reveal that the public prefers A over B, B over C, and C over A. The public, in other words, is crazy!
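To make that paradox concrete, here is a minimal sketch in Python of the classic three-voter cycle. The voters and rankings are hypothetical, chosen only to illustrate the kind of result McKelvey's theorem generalizes; nothing here comes from the polls discussed above.

    # Three voters, each with a perfectly rational individual ranking of
    # options A, B, and C (most preferred first). The rankings are
    # hypothetical, chosen to illustrate the cycle described above.
    voters = [
        ["A", "B", "C"],
        ["B", "C", "A"],
        ["C", "A", "B"],
    ]

    def majority_prefers(x, y):
        """Return True if a majority of voters rank x above y."""
        votes_for_x = sum(
            1 for ranking in voters if ranking.index(x) < ranking.index(y)
        )
        return votes_for_x > len(voters) / 2

    for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
        print(f"Majority prefers {x} over {y}: {majority_prefers(x, y)}")

    # All three lines print True: the group "prefers" A to B, B to C,
    # and C to A, even though no individual voter holds a cyclic preference.

Each pairwise vote is perfectly decisive, yet chaining them together produces a cycle that no individual voter holds.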
Of course, for anyone following public reaction to the health-care debate, the fact that public opinion doesn't make much sense won't come as much of a surprise. And no matter how unreasonable it is, public opinion informs how Washington works. Franklin Roosevelt once quipped that "a government can be no better than the public opinion which sustains it." Explains a lot, doesn't it?
Peter Suderman is an associate editor at Reason magazine.