Which polls (and pollsters) got it right?

We heard a lot of criticism of the polls running up to Tuesday night's election. Remember how conservative pundits demonized the "skewed polls" showing President Obama with a substantial lead in the days before the first debate?

But in the end, the biggest surprise was how accurately the math predicted the results.

As of Wednesday afternoon, the results of Tuesday's election (with 99% of precincts reporting) show President Obama (the winner) with 50.4% and Mitt Romney with 48.1% of the popular vote, a margin of 2.3%.

As returns continue to come in, that may slightly change, but let's go with 2.3% for now for the sake of argument.

So who got it right?

Among a dozen national polls published on the day before the election, as tracked by statistician/writer Nate Silver for his FiveThirtyEight blog, Obama led by an average of 1.6%. Of those, the most accurate was Google Consumer Surveys, which gave Obama a 2.3% lead (right on the money!).

And who got it wrong? The two least accurate were Rasmussen and Gallup, which both gave Romney a one-point lead.

Meanwhile, over at Real Clear Politics, an average of nine national polls tracked in the final few days before the election gave Obama a slim 0.7% lead, 48.8% to 48.1%. Of those, the two that came closest to the actual results were ABC News/Wash Post and Pew Research, both of which favored Obama 50% to 47%.
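A quick way to see the rankings above is to measure each poll's final Obama-minus-Romney margin against the actual 2.3-point result. Here's a minimal sketch using only the figures cited in this article (positive numbers mean an Obama lead, negative a Romney lead):

```python
# Rank the polls mentioned in this article by how far their final
# margin (Obama % minus Romney %) missed the actual 2.3-point result.
ACTUAL_MARGIN = 2.3  # Obama 50.4% - Romney 48.1%

polls = {
    "Google Consumer Surveys": 2.3,   # Obama +2.3
    "ABC News/Wash Post": 3.0,        # Obama 50% to 47%
    "Pew Research": 3.0,              # Obama 50% to 47%
    "Rasmussen": -1.0,                # Romney 49% to 48%
    "Gallup": -1.0,                   # Romney 50% to 49%
}

# Sort from most to least accurate by absolute error in points.
ranking = sorted(polls, key=lambda name: abs(polls[name] - ACTUAL_MARGIN))
for name in ranking:
    print(f"{name}: off by {abs(polls[name] - ACTUAL_MARGIN):.1f} points")
```

Google Consumer Surveys comes out with zero error, ABC/Post and Pew miss by 0.7 points, and Rasmussen and Gallup miss by 3.3 points each (and in the wrong direction).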

Again, Rasmussen and Gallup were the least accurate, both giving Romney a one-point lead: Rasmussen 49% to 48% and Gallup 50% to 49%.

I would argue that Gallup was the worse of the two because it gave Romney the magical 50% majority mark, which usually indicates a candidate is headed for victory. In addition, Gallup's daily national tracking poll had put Romney ahead by FIVE points until it was temporarily suspended for Hurricane Sandy.

Overall, the big winner was not a pollster, but Nate Silver.

Using his own "secret sauce" statistical model to analyze the polls that conservatives tried to discredit, Silver gave Obama the race all along.

In his final forecast before the election, Silver said Obama had a 90.9% chance of winning, with a popular vote of 50.8% to 48.3% (a 2.5% margin) and 313 electoral votes. Assuming the president carries Florida, Silver correctly predicted the electoral college outcome in all 50 states.

Hard to see how you can call that "skewed."