
Did I Accidentally Prove that Obamacare is a Success?


New York's Jonathan Chait needles me for having "accidentally" shown "how Obamacare is succeeding." To make his point, Chait summarizes and briefly quotes several posts I wrote between January and the second week of March of this year noting low or overstated enrollment figures in various parts of Obamacare, and then jumps forward to a post I wrote earlier this week arguing that a recent estimate combining all of Obamacare's coverage expansion provisions is almost certainly too high. 

"We have gone from learning that the law has failed to cover anybody to learning it would cover a couple million to learning it would cover a few million to learning that it has probably insured fewer than 20 million people halfway through year one," he concludes, as if the items he links to are all discussing comparable components of the law's coverage expansion.

But Chait's argument relies on selective quotations from posts of mine that examine different, not really comparable aspects of the law, strung together in a way that implies they are all referring to the same thing. And he fails to mention other posts I wrote, between March and July, noting the law's late-March enrollment surge and its effects on covering the uninsured.

Mostly, then, what his item demonstrates is that enrollment in Obamacare's insurance exchanges was, as was widely reported, sluggish from the beginning of the year until the middle of March, and that it grew afterwards. (Until the very end, it lagged behind the administration's own projections.)

Throughout the year, I noted the low enrollment figures as they came in, arguing that the numbers weren't promising. And when the enrollment did pick up at the end of March, I also wrote items on the surge and its effects on multiple occasions—including a March 31 post noting a "last-minute sign-up surge," an April 1 post noting the White House's announcement that 7 million had signed up for coverage, an April 17 post noting the administration's report that 8 million had signed up for coverage, and a May post specifically saying that early speculation that the law might have no or negative effect on coverage was not plausible.

Chait does not mention any of these items in his post, which does not quote or link to any post I wrote between March 11 and this month. He also says my posts are useful as a "lagging indicator of Obamacare's progress." But these posts were written within a day of the information becoming public.

Chait also fails to quote my own caveats in the posts he does cite, or to provide important context for comparing the posts he quotes. For example, he quotes me saying on January 21 that "it appears possible that there has been no net expansion of private coverage at all." Note that this is presented as a possibility, not a certainty, and that I also wrote in the same post that even if this were true, "there's still time for that to change. As the administration is keen to remind us, people who want coverage have until the end of March to sign up for coverage this year."

The March 11 post Chait quotes from, meanwhile, dealt specifically with the question of how much effect Obamacare was having on the uninsured rate, based on a Gallup survey, which turns out to be a completely different metric from the one examined in the next post he selects.

The Gallup survey, I said, was the "best evidence" that Obamacare was reducing the rate of the uninsured. This effect, which was measured prior to the late-March surge, is separate from the question of how many people now have coverage through some coverage-expanding provision in the law. But the total-coverage question is what the next post he quotes from (a July item taking issue with a New England Journal of Medicine (NEJM) study estimating 20 million total covered by Obamacare, through the exchanges, Medicaid, and other provisions) discussed.

The authors of that study specifically note that, unlike the Gallup survey, it was not an attempt to judge the law's effect on the uninsurance rate. As the authors of the study say, "We do not know yet exactly how many of these people were previously uninsured…"

Chait pairs the two posts without much context, implicitly suggesting that they are looking at the same thing. They are not.

The July post on the NEJM's total-coverage estimate is also not really comparable with the February 24 post that Chait quotes from, which specifically deals with President Obama's claim that almost 7 million people had gained coverage under Obamacare just through the law's Medicaid expansion.

Obama's claim was wrong at the time. It's still wrong. I was not the only journalist to say so at the time. (The Washington Post's fact checker gave Obama's statement four Pinocchios.) By now even the administration has backed off that number for the Medicaid expansion, saying only that total enrollment in the program (which additionally counts the surge month after Obama made the 7 million claim) has increased by 6 million following Obamacare's coverage expansion, and that not all of that increase is directly attributable to the law. If you read that post on the day that it was published, you got an accurate and real-time impression that President Obama was wrong. If you read it now, you get an accurate impression that Obama was overstating the health law's coverage effects.

Obamacare has, without question, enrolled far more people since January, and the evidence is pretty strong that it has cut the rate of the uninsured by several points. But even as the coverage figures have increased, what you also see is that the president, his administration, and the law's backers have consistently relied on dubious, misleading, or incomplete metrics to overstate what can be known about the law's impact based on the information available.

Chait may see my posts noting this as inadvertent proof that Obamacare is actually a success, but to me it looks more like evidence that the administration and its supporters are desperate to convince people that it is more of one than the evidence supports.