Study: PPP Most Accurate Pollster Of 2012, Rasmussen And Gallup Among Least Accurate

Dr. Costas Panagopoulos, director of Fordham University’s Department of Political Science, has released the results of his study of poll accuracy in the 2012 election. The rankings are as follows:

1. PPP (D)*

1. Daily Kos/SEIU/PPP*

3. YouGov*

4. Ipsos/Reuters*

5. Purple Strategies

6. NBC/WSJ

6. CBS/NYT

6. YouGov/Economist

9. UPI/CVOTER

10. IBD/TIPP

11. Angus-Reid*

12. ABC/WP*

13. Pew Research*

13. Hartford Courant/UConn*

15. CNN/ORC

15. Monmouth/SurveyUSA

15. Politico/GWU/Battleground

15. FOX News

15. Washington Times/JZ Analytics

15. Newsmax/JZ Analytics

15. American Research Group

15. Gravis Marketing

23. Democracy Corps (D)*

24. Rasmussen

24. Gallup

26. NPR

27. National Journal*

28. AP/GfK

Panagopoulos based his rankings on each pollster’s final pre-election numbers compared with the results of the election. Nate Silver will likely conduct his own study in the near future, and it will be interesting to compare his findings with this list. Nonetheless, I find it interesting to see where the various pollsters turned up, and I wonder what the right will make of the fact that Rasmussen was among the least accurate pollsters of the 2012 cycle.
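
The post does not spell out the exact error metric Panagopoulos used, so here is a minimal sketch of one common approach: score each pollster by the absolute error of its final pre-election margin against the actual popular-vote margin and rank by that score. The pollster names and poll margins below are hypothetical placeholders, not the real 2012 figures.

```python
# Minimal sketch of one plausible accuracy score: absolute error on the
# final Obama-minus-Romney margin. The metric Panagopoulos actually used
# is not spelled out in the post; the poll margins below are hypothetical
# placeholders, not real 2012 numbers.

ACTUAL_MARGIN = 3.9  # Obama minus Romney, certified national popular vote, in points (approx.)

final_poll_margins = {      # pollster -> final pre-election margin (illustrative only)
    "Pollster A": 2.0,
    "Pollster B": 4.5,
    "Pollster C": -1.0,     # a Romney-favoring miss
}

errors = {name: abs(margin - ACTUAL_MARGIN) for name, margin in final_poll_margins.items()}

for rank, (name, err) in enumerate(sorted(errors.items(), key=lambda kv: kv[1]), start=1):
    print(f"{rank}. {name}: off by {err:.1f} points")
```

Whatever the actual metric was, it was presumably rounded or bucketed before ranking, since the list above contains ties (eight pollsters share the #15 spot).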

FILED UNDER: 2012 Election, Democracy, Public Opinion Polls, US Politics
About Doug Mataconis
Doug Mataconis held a B.A. in Political Science from Rutgers University and a J.D. from George Mason University School of Law. He joined the staff of OTB in May 2010 and contributed a staggering 16,483 posts before his retirement in January 2020. He passed far too young in July 2021.

Comments

  1. john personna says:

    Needs a scatter chart.

  2. Jr says:

    PPP FTW!

  3. James Joyner says:

    @john personna: Yes. Ordinal rankings are not that useful without some indicator of dispersion. 15 of these firms could have produced essentially identical results.

  4. David M says:

    It’s a little early for exact positions on the chart, as the gap between Obama and Romney will keep getting larger as the vote from the west coast comes in. (In 2008 Obama gained 1% between the day after the election and the final counts.)

  5. john personna says:

    @James Joyner:

    It doesn’t seem anyone is linking back to Fordham, which might have such a chart.

    Poor netiquette too.

  6. Katharsis says:

    What do the asterisks mean? I even clicked on the link and could not find an explanation.

  7. Geek, Esq. says:

    More instructive would be state poll comparisons. Missing the popular vote by 3 points could be explained by sampling error.

    But getting every single swing state except North Carolina wrong, as Rasmussen did, should put him out of business. As should his arbitrarily selected R+4 party ID weighting.
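
The sampling-error point above checks out as rough arithmetic: for a typical national sample of around 1,000 respondents (an assumed figure, not one given in the post), the 95% margin of error on a single candidate's share is roughly ±3 points, and roughly double that on the gap between two candidates, so a 3-point miss on the national popular vote sits within ordinary sampling noise. A quick sketch:

```python
import math

# Back-of-the-envelope 95% margin of error for one candidate's share in a
# simple random sample of n respondents, using the worst case p = 0.5.
def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    return z * math.sqrt(p * (1 - p) / n)

n = 1000                             # assumed typical national sample size
moe_share = margin_of_error(n)       # error on one candidate's share
moe_margin = 2 * moe_share           # rough error on the Obama-minus-Romney gap

print(f"about ±{moe_share * 100:.1f} points on a share, ±{moe_margin * 100:.1f} on the gap")
```

Systematically missing in the same direction across nearly every swing state, by contrast, is not something sampling error explains; that pattern points to a house effect such as the party-ID weighting mentioned above.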

  8. Rob in CT says:

    You know, I knew Ras had a GOP “house effect” from years ago b/c Silver talked about it. I basically ignored Ras since. But R+4 is hilarious. I can sorta understand thinking that D+6 was optimistic. But to go 10 points in the other direction? Wow. Just wow.

  9. Tsar Nicholas says:

    Rasmussen was way off. Not even ship-to-shore close to accurate. Epic fail. His party ID call was laughably wrong. Had that one been correct, Romney would have won in an overwhelming landslide.

    There needs to be some context given to the “accurate pollsters” adage. It’s all relative. It’s like talking about which radar gun or which bathroom scale is the most accurate.

    PPP nailed Florida. I’ll give them that. But they missed by 2-5 net points in various noteworthy states (OH, NC, WI, AZ, and others), most often skewing well in favor of Obama, which is exactly what some of us would have predicted.

    But that said, if they were the most accurate, they were the most accurate. That’s pretty f’n scary, actually. The demographics of their polls didn’t match up to reality, whether using foresight or hindsight. I guess that means they did a good job of weighting, or at least a better relative job of weighting than the others. Color me gobsmacked. I would have bet the farm that PPP would have missed by miles. Oh, well, c’est la vie.

  10. mantis says:

    @Tsar Nicholas:

    The demographics of their polls didn’t match up to reality the way I wish things were.

    FTFY

  11. PJ says:

    Don’t worry, Rasmussen is back pushing their lies.
    Today we learn that, weighted by its failed LV model, a plurality believes both campaigns were negative.

  12. PJ says:

    Dr. Costas Panagopoulos, Director of Fordham University’s Department Of Political Science

    After Googling for the full report, it seems that Dr. Panagopoulos is the person responsible for “Rasmussen being the most accurate pollster in 2008”.

    I’ll await other studies.

  13. KariQ says:

    What’s really fascinating to me is that two of the best polls were internet polls, instead of phone polls. Zogby’s 2004 internet polling gave internet polls a bad name, but YouGov has shown that internet polling works, when done right. I have long suspected that internet polling is the future for the polling industry, and this bodes well for the future of companies like YouGov.

  14. Tillman says:

    Huh. NPR has crappy pollsters.

  15. mattb says:

    @Tsar Nicholas:
    Given the fact that you spent the last few days constantly pointing out how “toast” Obama was in all the swing states, I’m having a hard time grokking your complaints about anyone’s accuracy, sir.

  16. Console says:

    @PJ:

    To be fair, the guy published a full report a year later that showed Rasmussen was nowhere near the top pollster. But his initial report (issued the day after the election, before all the votes were counted, like the report in this post) showed Rasmussen in the lead. Which is how the zombie lie about Rasmussen being the most accurate pollster of 2008 was born.

  17. dmhlt says: