Riffing off Bryan Caplan's excellent Cato Unbound piece on voter ignorance, Ezra Klein and Matt Yglesias both suggest that this is not a terribly big problem, since policy is determined by elites anyway. In other words, political "slack"—one of Caplan's proposed solutions—is already ample, a situation that has plenty of its own drawbacks.
There's something to this, of course: Immigration and trade are two issues where agreement among people who do know some economics has kept policy more liberal than would be chosen by voters who, mostly, don't.
On the other hand, the elites who are relevant to politics are political elites, not policy experts. That means a lot of people who do, as you may recall, get chosen by voters. And this time around, they've chosen a bunch of people like Sherrod Brown (D-Ohio). (It also means lobbyists, of course, who have their own consensus driven by interests, not expertise.) We don't get a pure exercise of the general will, but if voters are receptive to misguided populist messages, no number of harrumphing editorials in East Coast newspapers will stop misguided populist candidates from getting elected.
Slightly tangentially, Ezra adds:
Bryan singles out religion as a place "where irrationality seems especially pronounced." This gets said occasionally, and it's poppycock. It might be factually wrong to believe in God, but it's certainly not irrational (using rational here in the economic sense, as an action that maximizes your utility). Studies universally find that religious belief and participation offer positive returns for individual health, happiness, finances, personal satisfaction, etc.
First, believing in God typically comes bundled with a lot of other ancillary beliefs which might or might not make sense—that the world is 6,000 years old, say. So that stuff is at best neutral in terms of economic rationality and less than compelling in terms of epistemic rationality.

But secondly, "rationality" characterizes processes, not outcomes, even when the processes are outcome-oriented. If I believe a stock I own will rise because a psychic told me so, my belief may be true, but it will not therefore be justified—even if I know other facts which (had I thought about them) could have more justifiably led me to the same belief. By the same token, you can at least make a case that it's rational to somehow make yourself believe in religion for the health benefits. But if that's not your reason for believing—as I expect it isn't for the vast majority of people—then whether or not belief is otherwise rational, those benefits don't enter into assessing the rationality of the belief. If I stab myself in the abdomen for no good reason, the act does not become more rational if it just happens that I've lanced a swollen appendix that (unbeknownst to me) was on the verge of bursting.