"The momentum [is] all behind #Remain," tweeted out the gambling website Betfair yesterday at 10 a.m. EST. At 4:24 p.m., shortly before the last polls closed, the site was putting the odds that British voters would elect to stay in the European Union at 88 percent. An hour later, the U.K. bookmaker Ladbrokes went even further, tweeting that the odds were 12–1 against Brexit.
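Fractional odds translate directly into implied probabilities, which is what makes 12–1 against so lopsided. A minimal sketch of that conversion, using the figures quoted above (note that real bookmaker prices include a margin, so the implied probabilities are slightly overstated):

```python
# Convert fractional "against" odds to an implied probability.
# 12-1 against Brexit means a 12-to-1 payout, so the implied
# probability of Leave is 1 / (12 + 1).

def implied_probability(numerator, denominator):
    """Implied probability of an outcome quoted at
    numerator-to-denominator odds against it."""
    return denominator / (numerator + denominator)

leave = implied_probability(12, 1)  # Ladbrokes' quoted odds against Brexit
print(f"Implied P(Leave)  = {leave:.1%}")      # 7.7%
print(f"Implied P(Remain) = {1 - leave:.1%}")  # 92.3%
```

In other words, Ladbrokes' 12–1 line was even more confident than Betfair's 88 percent.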
The bookies were wrong. By a margin of four percentage points, the country opted to withdraw from Europe's economic zone.
Proponents of betting markets as an alternative to increasingly questionable old-fashioned opinion polls have some soul-searching to do, and they know it: "Those of us who do this for a living will have to face up to some tough questions today," reads a statement tweeted out by Ladbrokes late last night. "Is this just one of the inevitable, normal occasions where an outsider wins, or a fatal blow to the idea of betting markets as being a useful forecasting tool? Maybe unsurprisingly, I tend to think the former, but that doesn't mean we don't have to reflect on all of their potential flaws and decide how we best interpret them in the future."
PredictWise, a prediction aggregator run by Microsoft Research Labs' David Rothschild, noted on Twitter that "one-shot elections" like this one "really should not affect models for" regularly occurring American general elections like the one that will happen this November. Still, Rothschild too admitted this particular miss by the betting markets is "certainly a good time to reflect."
One-off elections are indeed notoriously difficult to make predictions about, and events with a 75 percent probability of happening one way will nonetheless go the other way a quarter of the time. But the betting markets' showing here is troubling in part because many people (myself included) have long been hopeful that their ability to synthesize data from disparate sources makes such markets a more efficient alternative to traditional survey research precisely in those moments where normal polls are most likely to fail.
Instead, the polls were much closer than the markets in this case. The Huffington Post had Remain ahead of Leave by one-half of one percentage point, which is too close to call if ever a prediction was. Not only that, but it was the online-only (as opposed to "gold-standard" telephone-based) surveys that fared best of all. As the HuffPollster data scientist and my friend Natalie Jackson wrote:
Throughout the campaign, the polls diverged, based on whether they were conducted online or by telephone. The internet poll average estimated a 1.2-point lead for "leave," while live phone polls had "remain" up by 2.6 percentage points. … In this case, the internet polls were clearly more indicative of the victory.
Does this mean we should put all our eggs in the web survey basket from now on? Anyone who thinks so hasn't been paying attention. The real lesson here is that no two elections are alike, and no one way of predicting what will happen is always going to come out on top. For the same reason that savvy politicos look at poll averages (and superstars like FiveThirtyEight's Nate Silver make use of highly sophisticated models with far more inputs than just raw survey numbers), the only good way forward is to look at the picture that all the very different information sources are collectively painting, and not to assume too much even from that.
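The averaging idea can be sketched in a few lines. The two margins below are the poll-family averages from the quoted passage (positive means a "remain" lead, in percentage points); the equal weighting and the "too close to call" threshold are arbitrary simplifications, nothing like a Silver-style model:

```python
# Naive aggregation across disparate sources: average the margins,
# then treat a near-zero blend as too close to call.
# Margins in percentage points; positive = "remain" lead.

margins = {
    "internet poll average": -1.2,  # 1.2-point lead for "leave"
    "phone poll average": 2.6,      # 2.6-point lead for "remain"
}

blended = sum(margins.values()) / len(margins)
print(f"Blended margin: {blended:+.1f} points")

TOO_CLOSE = 2.0  # arbitrary threshold for illustration
if abs(blended) < TOO_CLOSE:
    print("Verdict: too close to call")
```

The blend lands at roughly +0.7 for "remain," close to HuffPost's half-point estimate, and well inside any honest margin of error.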