Poll Reading 101

BluePhantom

Nov 11, 2011
As the general election heats up between Romney and Obama, I am seeing a lot of posts pointing to this poll and that poll, and I am noticing an equal amount of misunderstanding from all sides. As I am an admitted "poll geek" (to the point that I have a spreadsheet I wrote to analyze polling data - I know, "get a life") I want to take some time to explain how to read polls and get the most from the information they offer. While I imagine there are threads on this topic from 2008 or 2010, it appears it's time for a refresher course at the very least.

Rule #1: Consider the Sample

You will generally see polls sampled in three ways. The first is "adults" (A). Polls that sample "adults" are the least reliable because only about 50%-55% of the eligible population actually turns out to vote. Polls of A might give you an idea of public perception, but they don't tell you a lot about who is in the best position to win an election.

The second (and most common) is "registered voters" (RV). This is better than A polls because they disregard anyone who is not eligible and in a position to vote. Still, they are not the best, because only about 70% of registered voters actually go and cast a ballot. So it's better, but still slightly problematic. Quinnipiac and Gallup are examples of firms that use RV sampling.

The third (and best) is "likely voters" (LV). This considers only people who are registered to vote and meet statistical criteria indicating they actually will go vote. LV polls are, with only a few rare exceptions (Quinnipiac being one), the ones to pay the most attention to. Rasmussen and SurveyUSA are examples of firms that use an LV sampling method.

Rule #2: Understand Margin of Error

I see people get excited all the time about a poll that shows their candidate up by 3%. In reality, from a statistical perspective, that's a tie. Every poll will have a slightly different margin of error, but a good rule of thumb is 4%. If a poll shows a lead of 4% or less, it's a statistical tie and could go either way.
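To put a number on that rule of thumb: for a simple random sample, the 95% margin of error works out to roughly 1.96 * sqrt(p(1-p)/n). A minimal Python sketch (the sample sizes are hypothetical, just to show why a typical ~600-person poll lands near 4 points):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case 95% margin of error for a simple random sample of n.
    p=0.5 maximizes p*(1-p), which is the conservative figure
    pollsters usually report."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(600) * 100, 1))   # ~4.0 points for a 600-person poll
print(round(margin_of_error(1000) * 100, 1))  # ~3.1 points for a 1000-person poll
```

Note this is only the sampling error; it says nothing about a bad sample or a bad turnout model, which the later rules cover.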

Rule #3: Pay Attention to Timing

A poll in April about an election in November doesn't mean a whole lot. Too much can happen. The economy could dramatically recover or totally tank between those times. A scandal may break. We could get attacked and forced into a military confrontation. A candidate may get a temporary bounce from their party's convention, the selection of a running mate, or even a human interest story that captures the nation's attention. All of these things will influence the polls and voter preference. The closer to the election, the more valuable a poll becomes. This is why we experience the "October Surprise" (the dirty secret that a candidate exposes about their opponent a week or two before the election). Knowing what is happening and when can help you identify the difference between a trend that is likely to stick and a temporary bounce.

Rule #4: Know the Polling Agencies' Affiliations and History

Any agency can luck out in a given year. It's important to know which firms show a history of accuracy over multiple election cycles. For example, the Washington Post was great on a few selected state polls in 2010. In 2008, however, they were absolutely dreadful. Gallup has a great reputation, but over the last several years they have been getting less and less accurate. Rasmussen had a surprisingly weak 2010, but in 2008 and for years prior they were absolutely deadly accurate. What changed? Sometimes their methodology, sometimes nothing....they just had a bad year or a good year.

Also keep in mind that some firms are affiliated with a given party. Public Policy Polling (PPP), for example, is funded by and affiliated exclusively with the DNC. Magellan Strategies, the RNC. Usually, on RealClearPolitics, those agencies are noted (D) or (R) for Democratic-affiliated firms or Republican-affiliated firms respectively. It's wise to keep in mind who is paying their bills when you consider the validity of their data. That's not to say these firms should be completely disregarded...just that it should be kept in mind.

Rule #5: READ THE FUCKING CROSSTABS

The crosstabs are information about the specifics of the polling demographics in that sample. They are usually at the very beginning or the very end of a polling report. Many liberals might be excited as hell with a poll that shows Obama with an 11% lead until they look more closely and notice that (simply by sheer chance) the polling agency reached a sample in which 47% identified as Democrats compared to only 23% who identified themselves as Republicans. This creates what is known as the dreaded "outlier": by sheer random chance the agency reached a demographic mix that is out of proportion with the United States as a whole, and it skewed the results to the point where the data is unreliable.
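To see how much a skewed party mix can move the topline, here's a hypothetical rebalancing sketch in Python. Every number below (party shares and within-party support) is invented for illustration; the point is that the same crosstabs produce very different toplines depending on the party mix you assume:

```python
def blended_support(within_party, party_mix):
    """Overall support per candidate, given each candidate's support
    within each party and an assumed party-ID mix for the electorate."""
    totals = {}
    for party, support in within_party.items():
        for candidate, share in support.items():
            totals[candidate] = totals.get(candidate, 0.0) + share * party_mix[party]
    return totals

# All numbers below are invented for illustration.
within_party = {
    "D": {"Obama": 0.90, "Romney": 0.10},
    "R": {"Obama": 0.07, "Romney": 0.93},
    "I": {"Obama": 0.48, "Romney": 0.52},
}
skewed   = blended_support(within_party, {"D": 0.47, "R": 0.23, "I": 0.30})
balanced = blended_support(within_party, {"D": 0.36, "R": 0.32, "I": 0.32})
# Skewed sample: roughly Obama 58 / Romney 42; rebalanced: dead even at 50/50.
```

Same underlying crosstabs, two very different headlines, which is exactly why the party-ID line in the crosstabs is the first thing to check.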

Rule #6: Trends and Averages Are More Important Than Snapshots

A poll is basically a snapshot: "at this precise moment in time, and according to the sample we reached, this is what the feeling is". The best way to read polls is to look at a collection of reliable polls and average the results. RealClearPolitics does this with the "RCP average", but that average does not consider all the points I have discussed. If it's a recent poll, it gets counted whether it's a good poll or a bad poll.

Tracking the trends associated with the averages shows more than just what the snapshot is; it shows where there is momentum toward one side or the other. It's the trends that matter more when election day is distant. Snapshots only have real relevance within a week or so of election day, because things can happen so fast that even a historically accurate poll can show a dramatic change in its data within a very short period of time, depending on what happens to be going on at one moment compared to the next.
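The averaging idea can be sketched in a few lines of Python. The spreads below are hypothetical; the point is that a rolling average smooths out any single outlier poll:

```python
def rolling_average(spreads, window=3):
    """Rolling mean of the most recent `window` poll spreads.
    `spreads` is a list of (day, lead_in_points) pairs, oldest first."""
    averages = []
    for i in range(len(spreads)):
        recent = spreads[max(0, i - window + 1): i + 1]
        averages.append(sum(lead for _, lead in recent) / len(recent))
    return averages

# Hypothetical daily spreads: one 5-point "outlier" day barely moves the average.
polls = [(1, 2.0), (2, 5.0), (3, 1.0), (4, 3.0), (5, 4.0), (6, 2.0)]
print(rolling_average(polls))  # [2.0, 3.5, 2.67, 3.0, 2.67, 3.0] (rounded)
```

Plotting those averages over time, rather than any one poll, is what reveals a genuine trend versus a temporary bounce.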

So with all that said let me list in order the common agencies that, through my research and tracking on my spreadsheet, are the most valuable and the most accurate.

The Deadly Accurate Duo (First Tier)
1. SurveyUSA
2. Quinnipiac

Damned Accurate (Second Tier)
3. Rasmussen (slipped from First Tier after a shaky 2010)

Pay Close Attention To (Third Tier)
4. Mason-Dixon
5. PPP

Worth Consideration (Fourth Tier)
6. Gallup
7. Magellan Strategies
8. Strategic Vision

Consider With Care
9. ABC/Washington Post (one good year in a history of disaster does not establish confidence)
10. Fox News (historically getting more and more accurate but not there quite yet)

Best to Ignore
Pretty much everything else

By keeping these above points in mind a true "student of the polls" will be able to get a much more solid understanding of who is winning and losing, how they are winning, why they are winning, and will be able to distinguish between what is important and what is irrelevant. The ability to effectively analyze the polling data can also mean the difference between making a strong argument on a thread or being exposed as a complete tool. These concepts are vital to understand whether you are simply looking for ammo in a debate or you really want a true understanding of the political landscape and your candidate's chances for victory.
 
I'd like to say very nice job on this. Accurate, informative, and non-partisan. I have a couple of things I see differently, but for the most part agree with you 100%.
I tend to ignore ABC and Fox; they are just too inaccurate for me.
Also, I like the RCP average. I didn't follow everything in the 2010 election, but one part I followed very closely, and that was the Senate races. From what I remember, RCP picked every winner except two races they labelled too close to call. They were the only one that I remember doing that well.
 

Yeah ABC and Fox....I will have a look at them because they have shown some hints of promise. For the most part media polls or media sponsored polls are just complete trash and I usually ignore them completely as well. Fox and ABC/Washington Post...I will have a peek at the crosstabs but I don't put a whole lot of faith in them either.

You know, the other thing to keep in mind that I really wanted to include (but the post was too long as it was) is that accuracy is historically harder to achieve depending on the type of election that is going on. During midterm years every polling agency suffers a bit in accuracy, whereas in presidential years they all tend to improve. Also, the narrower a poll's focus, the less accurate it tends to be. So a poll for a given state in a midterm year is historically less accurate than one for the nation as a whole in a presidential year.

It can get really damned complicated. :lol: But I love statistics so I crunch the numbers and the spreadsheet I wrote to do that crunching has proven to be very accurate I am proud to say. :lmao:
 
My main place for a poll is Rasmussen.

Rasmussen is historically very solid. In 2008 they were astonishingly dead on. As I pointed out in post #5 midterm years are generally less accurate and as such they had a bad 2010. Their results were correct but their margins were off by more than their historical average. That's why I moved them from the first tier to the second tier this year. It's only one bad year and it was a midterm year...but I have to drop them a notch from Quinnipiac and SurveyUSA who have no such little glitches in recent history.
 
My main place for a poll is Rasmussen.

Your sig line makes that statement deliciously ironic.
Really, why? Because their polls are not always what you want to hear? Did you have a problem with Rasmussen in 2008, when their polls had Obama leading by about seven points (which is what he won by, I believe), or when Obama was sworn in and their polls showed him with approval ratings in the low 60s? What I find deliciously funny is people who call Rasmussen a biased or right-wing poll now that Obama's numbers have slid, but never said any of this when his numbers were high.
 
I find polling to be somewhat overrated, or at least that people put way too much stock in them.
 
I would add: party sampling based on turnout models, past and projected.

example-

The WaPo/ABC poll last week had a D/R/I sample of 34/23/34, based on what WaPo and ABC saw as the 2012 election turnout model. That's ridiculous; hence Obama got a +7 Dem sample, taking his approval from 46 to 50...
 
I find polling to be somewhat overrated, or at least that people put way too much stock in them.

They can be valuable if you know what to look for and how to read them. That's the point of the OP. All too often though people just look at the spread and leap with joy or wallow in depression. There's a lot more to the story than just the final spread.
 
I would add: party sampling based on turnout models, past and projected.

example-

The WaPo/ABC poll last week had a D/R/I sample of 34/23/34, based on what WaPo and ABC saw as the 2012 election turnout model. That's ridiculous; hence Obama got a +7 Dem sample, taking his approval from 46 to 50...

Bingo!!! I pointed that out in Rule #5 as well. I saw the flaw in that poll too and laughed my ass off when someone tried to quote it. Good call.
 
I find polling to be somewhat overrated, or at least that people put way too much stock in them.

They can be valuable if you know what to look for and how to read them. That's the point of the OP. All too often though people just look at the spread and leap with joy or wallow in depression. There's a lot more to the story than just the final spread.

True a lot of times people do just pick out what they like and ignore the rest.
 

Well to be fair, let's be honest....it can be a pain in the ass reading crosstabs for every single poll, crunching numbers independently, etc. I do that but, as I said, I am a "poll geek" and my profession leaves me a lot of spare time to undertake such an effort. So it's hard for me to bust people's balls too much on the issue, but if someone really wants an understanding of the data, unfortunately that's what they must do.
 
I would add: party sampling based on turnout models, past and projected.

example-

The WaPo/ABC poll last week had a D/R/I sample of 34/23/34, based on what WaPo and ABC saw as the 2012 election turnout model. That's ridiculous; hence Obama got a +7 Dem sample, taking his approval from 46 to 50...

Just as an afterthought, Trajan....I am not sure if it was that poll or another one, but I was reading the crosstabs on one of them and noticed that when you added up some of the demographics it came out to something like 104%. I thought, "well, that's a neat little trick now isn't it? I think we can just toss this poll in the crapper." :lol:
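That 104% problem is easy to screen for mechanically. A trivial sanity check in Python (the tolerance value is my own choice, allowing a point of rounding slack):

```python
def crosstab_sums_ok(shares, tolerance=1.0):
    """True if a list of percentage shares adds up to ~100
    (a point of rounding slack allowed)."""
    return abs(sum(shares) - 100.0) <= tolerance

print(crosstab_sums_ok([47, 23, 30]))  # True
print(crosstab_sums_ok([40, 33, 31]))  # False: adds up to 104
```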
 
I'll add:

Margin of error is exactly that. If a poll reports a 3% margin of error, the true value could just as easily be 3 points above the reported result as 3 points below it.

Most political polls use a 95% confidence level, which means there is a 1-in-20 chance that the true result falls outside the stated margin of error.

For those reasons, I prefer sites that look at multiple polls, such as RCP and Pollster, to view trends.
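That 1-in-20 figure is easy to verify by simulation. A quick Python sketch (the poll size, trial count, and seed are arbitrary choices):

```python
import random

def share_outside_moe(true_p=0.5, n=600, trials=5000, z=1.96, seed=1):
    """Simulate `trials` polls of n respondents each and report the
    fraction whose result lands outside the stated 95% margin of error."""
    random.seed(seed)
    moe = z * (true_p * (1 - true_p) / n) ** 0.5
    misses = 0
    for _ in range(trials):
        hits = sum(1 for _ in range(n) if random.random() < true_p)
        if abs(hits / n - true_p) > moe:
            misses += 1
    return misses / trials

print(share_outside_moe())  # typically lands near 0.05, i.e. about 1 poll in 20
```

Which is another argument for averaging: a single poll being "outside the MOE" is expected now and then, but an average of several independent polls is far less likely to be.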
 
Phantom, I just posted the most recent Fox poll. What do you take away from it?

It's an outlier. The party identification flies in the face of other, larger surveys. The result also flies in the face of other FOX polls. To me it looks like FOX chose not to weight the results in line with known party identification.
 

Thanks idiot, but we wanted an educated response.
 


Yup, and this is far from the first one. That's why (and this is a partial answer to Grampa Murked U's post, even though she asked you) I don't quote or depend on media polls: not Fox, Wall St., ABC, CBS, CNN, AP, etc.

Gallup and Ras do polls for a living, and they do it every 4 days. They have to be above question, so they have a dog in that fight, not a dog in the political fight.

Just for shits and grins, if you look at the first quarter of Gallup and Ras side by side, there have been some wild swings and opposition. Gallup had Obama lower than Ras for a while, big time (and Gallup uses anonymous sampling compared to Ras's LV)....of course you didn't hear any of that posted, because in a lot of lefties' minds Ras leans right....so when Gallup has Obama at a lower approval than Ras, :eusa_shhh:
 
