One of the most fascinating things about the final month of the 2012 US presidential campaign was how the accuracy of political polling became a key storyline.

Reporters shifted from covering what the polls meant to focusing on whether the polls were even right in the first place. There were poll “truthers” who claimed that pollsters were intentionally skewing the results towards Obama.

There was a war between number-crunchers like Nate Silver – who said the polls showed Obama held a clear advantage – and pundits like Joe Scarborough, who claimed the race couldn’t be tighter.

While the debate raged over public polling, a similar battle was being fought between the private campaign pollsters. Democratic polling firms were producing data similar to many of the public polls – showing Obama with slight leads in key swing states. Meanwhile, Republican polling firms said Romney was winning.  

After Obama prevailed, a number of post-mortems were written on how the GOP pollsters missed the mark. A recent article in POLITICO titled “The GOP polling debacle” opened with the following line:  “For Republicans, one of the worst parts of the GOP’s 2012 trouncing was that they didn’t see it coming.”

What happened? As Democratic pollsters, our take is twofold. First, many GOP pollsters made a classic “Moneyball” mistake, just like the old-time baseball scouts in Michael Lewis’s best-selling book: they trusted their own intuition over the data.

They assumed that, after four years of what they viewed as a rocky Obama presidency, traditional Democratic constituencies would be demoralized and lack the energy that brought them to the polls in 2008.

They ignored Census data showing a growing Latino population. They ignored past electoral data showing that young voters and African-American voters turn out at relatively consistent rates. And they ignored polling data showing that Obama voters were more enthusiastic about their candidate than Republicans were about Romney.

Democratic pollsters and many public pollsters went with historical precedent and the data over intuition and gut. That turned out to be the right call.

Second, many GOP pollsters made a methodological mistake. One of the big challenges facing political pollsters is that the hardest-to-reach populations (those with cell phones and no landline in the home) also happen to be core Democratic constituencies (young, minority, and low-income voters). Some pollsters assumed that because these groups weren’t answering the polls, they weren’t going to vote.

That was a big mistake. It turns out these populations voted at rates similar to past elections; they were simply very hard to reach by telephone. Pollsters who did not control for this fact, for example by weighting their samples to match the expected electorate, got it wrong.
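To make that concrete, here is a minimal sketch of the kind of demographic weighting involved. Every category, sample share, and support number below is invented for illustration; none of it comes from an actual 2012 survey:

```python
# Minimal sketch of demographic weighting (all numbers are hypothetical).
# If young voters are 13% of the raw sample but an expected 19% of the
# electorate, their responses must count for more, not be written off.

raw_sample_share    = {"18-29": 0.13, "30-49": 0.30, "50-64": 0.32, "65+": 0.25}
expected_electorate = {"18-29": 0.19, "30-49": 0.33, "50-64": 0.28, "65+": 0.20}

# Weight for each group = expected share / observed share.
weights = {g: expected_electorate[g] / raw_sample_share[g] for g in raw_sample_share}

# Hypothetical candidate support by age group.
support = {"18-29": 0.60, "30-49": 0.52, "50-64": 0.47, "65+": 0.44}

# Topline estimate with and without the adjustment.
unweighted = sum(raw_sample_share[g] * support[g] for g in support)
weighted   = sum(expected_electorate[g] * support[g] for g in support)

print(f"Unweighted topline: {unweighted:.1%}")  # 49.4% -- looks like a loss
print(f"Weighted topline:   {weighted:.1%}")    # 50.5% -- looks like a win
```

In this toy example, the same raw interviews produce a losing number unweighted and a winning number weighted. The entire difference is whether you believe the hard-to-reach groups will show up.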

Getting the data wrong had clear consequences for Mitt Romney’s communications strategy and, ultimately, his electoral fate. In the final presidential debate, Romney played it safe, refusing to disagree with the President on many issues and acting like a candidate who thought he was ahead.

The Gallup poll that day showed Romney up five points. But Gallup got it wrong; it was among the pollsters who incorrectly showed a consistent Romney lead. On that day, Romney made a data-driven decision to play it safe. The data was wrong, and he blew his last opportunity to change the course of the campaign.

To the business community, this may seem a narrow issue – one confined to the world of “likely voter screens” and electoral prognostication. But every day, firms make critical data-driven decisions based on their own market or public opinion research that affect their business strategies and the way that they communicate with stakeholders.

Those decisions are only as valuable as the data that goes into them, and faulty assumptions can be as costly to a business as to a presidential campaign. The easiest thing is to gloss over methodology, or to choose the cheaper, less rigorous research option. Next time you hire a researcher, resist that temptation and ask the hard questions about methodology, about research assumptions, and about your research audience, to ensure that your data is on the mark.

And when it comes to the presidential race, or the upcoming races in 2014, we will say what we’ve said all along: “Trust Nate Silver.” This is not to place Nate Silver on his well-earned pedestal, but merely to say: look at the averages first and foremost. Don’t rely on just one pollster’s numbers. Or one poll.

The internet has given us access to a tremendous amount of data from multiple sources. So look at Real Clear Politics. Check out Mark Blumenthal at Pollster.com. Moreover, it’s time we made a true effort to discount the folks who have made the data so clearly fit their own ideological or business needs. Rasmussen, you’ve been warned.
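As a toy illustration of why the average matters, here is a short sketch; the poll names and margins are hypothetical, not actual 2012 results:

```python
# Toy illustration of averaging several polls instead of trusting one.
# Positive numbers are one candidate's margin in points; all figures invented.
polls = {
    "Poll A": 1.0,
    "Poll B": 3.0,
    "Poll C": -2.0,  # an outlier pointing the other way
    "Poll D": 2.0,
}

average = sum(polls.values()) / len(polls)
print(f"Average margin: {average:+.1f} points")  # +1.0
```

Reading Poll C alone would point in the opposite direction from the average; house effects and sampling noise wash out only when you look across pollsters.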

Jefrey Pollock, president, and Nick Gourevitch, senior vice president and director of research, are pollsters and researchers at Global Strategy Group.  In 2012, Pollock and Gourevitch polled for a wide range of political clients including Priorities USA Action, the main Super PAC in support of President Obama’s re-election.