It's been a pretty good run for pollsters this year. With the notable exception of Iowa, most of the projections about who would win 2016's caucuses and primaries have been approximately right—until Democratic voters went to the polls in Michigan yesterday, that is.
All the polling averages and forecasters expected Hillary Clinton to take the Great Lakes State in a romp. RealClearPolitics had her 21 percentage points ahead, while HuffPost Pollster had her 18 percentage points up. FiveThirtyEight put her odds of victory at 99 percent.
Instead, Bernie Sanders clinched the shocking win:
In the famous New Hampshire upset in '08, Clinton trailed Obama by 8 points. Big upset. But today, Sanders trailed Clinton by *21* and won.
— Nate Silver (@NateSilver538) March 9, 2016
There are a lot of potential explanations for what happened. One possibility is that the polls failed to get a clean read on where the true electorate was at the time the surveys were conducted. I've written a lot (including a long feature in the February issue, "Why Polls Don't Work") on what's caused the art and science of accurately measuring public opinion to get so much harder in recent years.
It's also possible that people changed their minds late in the game, either deciding at the last second whether to turn out to vote at all, or deciding at the last second to support Sanders over Clinton. Because Michigan has an open primary, it's even conceivable that a significant number of Hillary supporters, thinking their candidate was a lock (hey, that's what the polls were telling them!), opted to cross over and vote for or against Trump in the Republican primary rather than cast a ballot on the Democratic side.
It's not necessarily a polling failure when people choose to behave differently than they were genuinely planning to at the time they were interviewed, even though we tend to treat it like it is.
As other commentators have noted, Sanders managed to capture a higher proportion of black voters in Michigan (31 percent, according to the exit polls) than pre-election polls were finding. It also appears that young people—with whom Sanders is extraordinarily popular—turned out in higher numbers (voters under 30 made up 21 percent of the electorate, according to the exit polls) than pre-election polls predicted.
Are those discrepancies because the pre-election polls were wrong? It's certainly possible. Of the seven surveys completed in March and included in the RCP average, the three that found Clinton leading by the largest margins were all conducted by the same outfit, Mitchell Research & Communications, which uses "IVR," or robo-polling, instead of live interviewers. A fourth came from CBS, which partners with YouGov, a company that surveys people via the Web and therefore isn't able to use the random sampling methodology that is the basis of most modern survey research. A fifth was a product of Monmouth University, which does traditional live telephone polling but spoke to just 302 likely Democratic voters in the state.
Unlike in Iowa, where the exit polls found people breaking late and hard away from Donald Trump, Michigan voters who made up their minds in the last week were only barely more likely to support Sanders than those who decided earlier. This at least suggests the problem here has more to do with a failure by pollsters to get an accurate read of the electorate (again, no easy feat!) than it does with a late change of heart among voters.
The thing is, the amount and quality of the polling leading up to the Michigan primary were on par with, or even better than, the polling in a number of recent states that didn't see shocking upsets this year. Many of the Super Tuesday contests had barely any pre-election polls from which to make predictions, and yet the predictions people made weren't that far off.
Which brings us back to the conclusion we arrived at after the New Hampshire primary: Not all polls are bad, but it's nearly impossible to predict ahead of time whether the polls in a state will end up having been wrong or right.
An aside regarding Idaho: Some people have lumped the miss in the Gem State on the Republican side in with the much bigger and more significant miss in Michigan on the Democratic side. I wouldn't do that.
In Idaho it would be more accurate to say the pollster missed than the polls missed, and even that is an uncharitable interpretation of the facts. Only one guy was surveying voters there and the last poll he took happened pre–Super Tuesday. Again, a lot can change in the final days of a campaign, as Iowa should have taught us. This isn't necessarily an example of the polls failing. It might just be yet another example of the polling community walking away from a race before it's over.