When does a poll sway opinion?

 
 
Reply Thu 25 Sep, 2008 11:58 am
Many conservatives believe there is a liberal slant in the mainstream media, while libs/progressives tend to dismiss this argument.

But looking into the recent Washington Post/ABC News poll, one has to wonder whether these 'opinion polls' are actually being used to measure opinion or to sway it.

I, like most others, read the news articles about the poll results, which indicated that Obama had opened up a 9-point lead over McCain. But I always like to look at the sampling information in these polls; in this case, I'm glad I did.

First off, the poll was conducted with a "random sampling" of 916 registered voters. Fair enough. But the sample the Post interviewed ultimately self-identified as leaning 54% Democratic and 38% Republican.

Huh? That was the net total including leaners; on the straight "generally speaking, how do you think of yourself" question, it was 38%-28% Dem.

This certainly isn't reflected in national party registration figures. Maybe in California.

The kicker? The Post had an oversample of 163 black voters. Now, isn't this far above the 13.4% African-American share of the general population? Isn't it far above the share of the African-American population registered to vote? Other polls have been pretty consistent in measuring overwhelming black support for Obama, as was in evidence during the Dem primary.

What area codes did the Post use for the "random sampling"?

Seriously, doesn't anyone else, regardless of who you are backing, see anything wrong with this?

The Post/ABC poll is the only one showing Obama with such a lead; most of the others have Obama up by about 3%.

Where is the objective editor or publisher or producer here who is supposed to ensure a lack of bias in reporting? None of these figures were mentioned in any of the news articles I read.

Are we being informed or manipulated here?

Here is the info; look for yourself:

http://www.washingtonpost.com/wp-srv/politics/polls/postpoll_092308.html?sid=ST2008092303897&s_pos=list
 
CoastalRat
 
  2  
Reply Thu 25 Sep, 2008 12:04 pm
@A Lone Voice,
This is why I generally don't pay attention to polls. If you give me the result you want, I can craft a poll that would return that result.

I think polls are often used to help sway opinion rather than to inform; I suspect that is what they were created to do in the first place.
Cliff Hanger
 
  1  
Reply Thu 25 Sep, 2008 12:07 pm
@A Lone Voice,
I see your point, but I don't see surveying 163 black people as being too far above the norm.

Generally speaking, I'd say these polls are geared toward most of us, who are too busy or distracted to actually find out whom they surveyed or how. We've become trained seals when it comes to ingesting this information -- all we do is accept it and go about our busy lives until the next poll comes out a day later.

0 Replies
 
cicerone imposter
 
  1  
Reply Thu 25 Sep, 2008 12:08 pm
@CoastalRat,
Exactly how did you arrive at that conclusion? Polls have been pretty accurate, and most tell us how accurate they believe they are. Comparing polls with actual results shows that polls can be relied upon more often than not.

Polls are static snapshots, so to that extent they cannot be 100% accurate in predicting the future, but there's no better way to gauge trends.
0 Replies
 
Foxfyre
 
  3  
Reply Thu 25 Sep, 2008 12:31 pm
@A Lone Voice,
IMO, there is absolutely a frequent bias, and since most political polls are conducted by left-leaning organizations/entities, that bias will lean left more often than right. I think our phone is flagged for polls or something, because we seem to get an awful lot of them. A scientific poll is carefully designed not to 'weight' questions so as to encourage a particular response, and it is obvious when you get one of these. Some political polls are scientific polls, but many are not, and those are also obvious.

A while back, a pollster asked me whether I strongly approved, somewhat approved, somewhat disapproved, or strongly disapproved of immigrants. I advised that I would have to know the immigrant. She said, "Excuse me?" I said I would have to know the person to know whether I approved of him or her. The pollster said the question didn't pertain to individual immigrants but to immigration. I explained that it wasn't asked that way. So the pollster changed the question to immigration (this was a dead giveaway that we were dealing with a political poll and not a scientific poll). Again I advised that I would have to know whether she meant legal immigration or illegal immigration.

She thanked me and hung up. Didn't even ask for my demographics. Smile

Both parties and all high-level candidates are doing internal polling all the time, and I can't be sure, but I'm pretty sure both tend to skew the polls they put out for public consumption in their favor -- or they publish ONLY those polls that put their guy in a favorable light. And, IMO, it is pretty much a given that Democrat surrogates in the MSM do so too. The rationale is to persuade those who jump on whatever bandwagon appears to be winning.
cicerone imposter
 
  1  
Reply Thu 25 Sep, 2008 02:07 pm
@Foxfyre,
Can you support your claim that "there is absolutely a frequent bias"?

Not with your opinion, which continues to lose credibility, but with outside, reliable sources.
0 Replies
 
A Lone Voice
 
  2  
Reply Thu 25 Sep, 2008 09:34 pm
I wonder....

If Fox had conducted a poll that underrepresented Dems and black voters and showed McCain leading by 4%, would the other networks have examined the sampling and reported on it?

To all you libs/progressives from whom I'm always trying to wrest a bit of intellectual honesty:

Doesn't this bug you just a bit?

I noticed this topic had a score of 5 earlier and was up high in the topics for the day, and now it is a 0. I guess that answers my question...

cicerone imposter
 
  2  
Reply Fri 26 Sep, 2008 10:14 am
@A Lone Voice,
For yourself? Amen.
A Lone Voice
 
  3  
Reply Fri 26 Sep, 2008 11:51 am
@cicerone imposter,
No, it answers my question about the intellectual honesty libs/progressives show here. They all agree with one another and sing the praises of the left, yet run and hide from -- or ignore -- any evidence that confronts the storybook they have written for one another.

The right does it too, but that's why I stay away from those websites, as I stay away from the extreme left sites.

I just expected better here...
A Lone Voice
 
  1  
Reply Sat 27 Sep, 2008 04:35 pm
Here were the polls at the time. Note how far off the Post/ABC poll is.

C'mon, libs/progressives. Don't be afraid to address this.

There has to be some intellectual honesty amongst some of you...

Polling Data
Poll Date Sample Obama (D) McCain (R) Spread
RCP Average 09/21 - 09/26 -- 47.9 43.6 Obama +4.3
Gallup Tracking 09/24 - 09/26 2759 RV 49 44 Obama +5
Rasmussen Tracking 09/24 - 09/26 3000 LV 50 44 Obama +6
Hotline/FD Tracking 09/24 - 09/26 914 RV 48 43 Obama +5
GW/Battleground Tracking 09/21 - 09/25 1000 LV 46 48 McCain +2
CBS News/NY Times 09/21 - 09/24 LV 48 43 Obama +5
FOX News 09/22 - 09/23 900 RV 45 39 Obama +6
Marist 09/22 - 09/23 689 LV 49 44 Obama +5
NBC News/Wall St. Jrnl 09/19 - 09/22 1085 RV 48 46 Obama +2
ABC News/Wash Post 09/19 - 09/22 780 LV 52 43 Obama +9
LA Times/Bloomberg 09/19 - 09/22 838 LV 49 45 Obama +4
Ipsos-McClatchy 09/18 - 09/22 923 RV 44 43 Obama +1
CNN/Opinion Research 09/19 - 09/21 697 LV 51 47 Obama +4
F&M/Hearst-Argyle 09/15 - 09/21 1138 LV 45 47 McCain +2

Link: http://www.realclearpolitics.com/epolls/2008/president/us/general_election_mccain_vs_obama-225.html

(Scroll down, I just went back to Sept 21.)
0 Replies
 
cicerone imposter
 
  2  
Reply Sat 27 Sep, 2008 04:42 pm
@A Lone Voice,
It's because you really don't know what you are talking about. Here's what I have said:

1. The debate yesterday was a toss-up.
2. I wish we had two different people running for president.

That smashes your whole post about "intellectual honesty." The ball's back in your court.
A Lone Voice
 
  2  
Reply Sat 27 Sep, 2008 04:46 pm
@cicerone imposter,
Huh?

This thread has nothing to do with the debate. The poll I'm citing is already a few days old.

I think you have your posts mixed up?

I don't want to confuse you any more than you are, but here is my original post:

Quote:

Many conservatives believe there is a liberal slant in the mainstream media, while libs/progressives tend to dismiss this argument.

But looking into the recent Washington Post/ABC News poll, one has to wonder whether these 'opinion polls' are actually being used to measure opinion or to sway it. [..]

Are we being informed or manipulated here?

You'll need to explain yourself with more than a couple of sentences if you hope for me to follow your train of thought...
cicerone imposter
 
  1  
Reply Sat 27 Sep, 2008 05:46 pm
@A Lone Voice,
Here's a clue: they've already taken polls on the debate. Wonder of wonders, and it's been less than 24 hours since.
0 Replies
 
engineer
 
  2  
Reply Sat 27 Sep, 2008 06:34 pm
@Foxfyre,
Foxfyre wrote:

IMO, there is absolutely a frequent bias, and since most political polls are conducted by left-leaning organizations/entities, that bias will lean left more often than right.

FiveThirtyEight.com has a breakdown of "house effects," consistent biases for specific polling outfits. There are roughly equal numbers of sites biased in each direction, so it's not a "liberal" bias as you believe. It's more about how questions are phrased, how they handle independents, how they adjust for demographics, etc.
ebrown p
 
  3  
Reply Sat 27 Sep, 2008 06:49 pm
Lone Voice,

Your reasoning doesn't tell us that the polls are biased; rather, it tells us that you lack a basic understanding of statistics and are generally detached from reality.

First, having a one-day sample that is about 16% African American out of just over 1,000 adults is nowhere near out of the ordinary. We are talking about 3 percentage points.
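
(If you want to check the arithmetic yourself, here is a rough back-of-the-envelope sketch. It assumes a simple random sample, which is a simplification -- real polls weight their samples and sometimes deliberately oversample a group, as discussed later in the thread.)

Code:
# Rough check: how much can a subgroup's share swing by chance in a
# simple random sample? Assumes simple random sampling -- real polls
# weight their samples and sometimes deliberately oversample groups.
import math

n = 1000    # approximate number of adults interviewed
p = 0.134   # assumed population share of African Americans

se = math.sqrt(p * (1 - p) / n)            # standard error of the sample share
low, high = p - 1.96 * se, p + 1.96 * se   # ~95% interval under these assumptions

print(f"standard error: {se:.1%}")
print(f"~95% of random samples land between {low:.1%} and {high:.1%}")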

Second, it is a fact that more Americans self-identify as Democrats this year (as they have for a while). More than that, most Americans express values that you would consider liberal (although the average American is, by definition, "moderate").

cicerone imposter
 
  1  
Reply Sat 27 Sep, 2008 06:56 pm
@ebrown p,
Here are some poll numbers that seem to contradict what we've been hearing lately:

Method                  Obama   McCain
Latest Poll Per State     238      266
Poll of Polls             270      265
Survey USA                159      269
Rasmussen Reports         228      259
Quinnipiac                131       51
Research 2000              42       95
Zogby                     335      131
National Average        45.2%    43.5%
Weighted Nat'l Avg      44.8%    46.0%

Do we need to "read between the lines" here? LOL
0 Replies
 
nimh
 
  1  
Reply Sat 27 Sep, 2008 08:55 pm
@A Lone Voice,
A Lone Voice wrote:

First off, the poll was conducted with a "random sampling" of 916 registered voters. Fair enough. But the sample the Post interviewed ultimately self-identified as leaning 54% Democratic and 38% Republican.

Huh? [..] This certainly isn't reflected in national party registration figures. Maybe in California.

Yeah. If you dial a random thousand people, that doesn't necessarily mean you always get a representative thousand people. There are always going to be outliers.

On the party ID in particular, though, note that pollsters usually don't ask about party registration (what are you registered as), but just about what the respondent considers him/herself to be, making for a much more fluid indicator. If the climate is very favourable to one party -- or even if things are going very favourably for one party in this particular week -- the number of self-described partisans of that party may swing up too.

But again, there's also just the random statistical variation, which means you might get an oversampling of one party's supporters in your sample this week -- and an oversampling of the other party's backers in another week.

That's where weighting comes in. That's a whole debate among pollsters: to weight or not to weight? Proponents of weighting the data argue that a pollster has the responsibility to do his best to ensure his sample is representative of the overall population, and to nip statistical aberrations in the bud that way. From what I understand, they do this regularly enough for other indicators (age, gender, etc.). So it doesn't matter if their sample has an apparent oversample of 65+ voters, because they weight the results of that age group and the other age groups to better represent what their actual share of the voters will be.
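
To make the age example concrete, here is a minimal sketch of that kind of weighting. The group shares and support numbers are invented for illustration; they are not taken from the ABC/WaPo poll or any other real poll.

Code:
# Minimal sketch of demographic weighting (post-stratification).
# All shares and support figures below are invented for illustration.

sample_share = {"18-34": 0.18, "35-64": 0.52, "65+": 0.30}  # what the dialing produced
target_share = {"18-34": 0.28, "35-64": 0.52, "65+": 0.20}  # expected share of actual voters

# Each group's responses are multiplied by target / sample, so an
# overrepresented group counts for less and an underrepresented one for more.
weights = {g: target_share[g] / sample_share[g] for g in sample_share}

support = {"18-34": 0.60, "35-64": 0.48, "65+": 0.44}       # hypothetical candidate support

raw      = sum(sample_share[g] * support[g] for g in support)
weighted = sum(sample_share[g] * weights[g] * support[g] for g in support)

print(f"unweighted estimate: {raw:.1%}")   # skewed by whoever happened to pick up
print(f"weighted estimate:   {weighted:.1%}")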

Even for something as unambiguous as age, though, this already has pitfalls: what if turnout among young voters is much higher, proportionally, this year than in previous years? Then weighting the data on the basis of track records will make you miss the impact of an important development this year.

OK, move that discussion to the question of party ID and you can see the controversy. Weighting the results by party ID, so that the preferences of the Democrats in your sample are weighted on the basis of what you'd expect the share of Democrats to be this year in the actual elections, will help you smooth out any volatile deviations if your sample ends up having a disproportionate number of Dems. But what if a lot more people simply are identifying themselves as Dems right now; what if the apparent oversampling of Dems in your sample signals that hey, there's a ground swell towards Democratic self-identification going on? Then weighting your data makes you miss that.

I don't know that there's a definitive answer as to who is right. I don't even know whether the ABC/WaPo poll weights its results. But keep these things in mind next time you look at the party ID breakdown of a poll and rush to conclude that it must be a fixed poll of some sort. There's a fair chance of standard statistical variation yielding occasional weird party ID samples; and on top of that, the pollster might well already have taken that into account in calculating its public numbers by weighting accordingly.

Quote:
The kicker? The Post had an oversample of 163 black voters. Now, isn't this far above the 13.4% African-American share of the general population? Isn't it far above the share of the African-American population registered to vote?

This is really the same question.

On the one hand, you can speculate that African-Americans will in fact come out to vote in much higher proportion than usual: in higher proportion than their share of all registered voters, since fewer registered black voters than registered white voters will fail to turn out on election day; and in higher proportion than their share of the population.

On the other hand, if the sample includes a number of African-Americans far surpassing their share of the population, the pollster might well already have taken that into account, by weighting down the impact of that share of respondents in the total end result. Or it might not, and the public numbers you're looking at are marked by that statistical aberration and constitute an outlier - which still doesn't necessarily involve some kind of political plot.

What you might even come across is that a pollster deliberately oversamples respondents from a minority group, in order to make the total from that group large enough to be able to make statistically reliable separate conclusions about their voting behaviour; and then weights the impact of their share of the total sample of respondents down again to fit the likely share of that group among actual voters.
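
A tiny numeric illustration of that last point, with all numbers invented: say a group makes up 13% of voters, so a 900-person sample would contain only about 120 of them -- too few for reliable subgroup numbers. A pollster might interview 240 instead, report subgroup results from those 240, and then count each of those interviews for roughly half a respondent in the topline.

Code:
# Invented numbers illustrating a deliberate oversample that is weighted
# back down for the topline. Not any pollster's actual methodology --
# just the general idea.

population_share = 0.13   # assumed share of the group among voters
n_total = 900             # interviews overall
n_group = 240             # deliberately oversampled interviews from the group
n_rest  = n_total - n_group

# Weight so the group counts for 13% of the topline even though it is
# ~27% of the interviews.
w_group = (population_share * n_total) / n_group        # ~0.49 per interview
w_rest  = ((1 - population_share) * n_total) / n_rest   # ~1.19 per interview

support_group, support_rest = 0.90, 0.45                # hypothetical candidate support

topline = (w_group * n_group * support_group +
           w_rest * n_rest * support_rest) / n_total

print(f"each oversampled interview counts as {w_group:.2f} of a respondent")
print(f"weighted topline support: {topline:.1%}")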

All of which is to say that, where you see signs of deliberate rigging of opinion polls, there could be any number of legitimate methodological explanations at work.
0 Replies
 
nimh
 
  1  
Reply Sat 27 Sep, 2008 09:04 pm
A Lone Voice wrote:
The Post/ABC poll is the only one showing Obama with such a lead; most of the others have Obama up by about 3%. [..]

Are we being informed or manipulated here?

You are being informed, BUT ... it is true that you do need to keep your mind alert when reading the breathless news media stories about polls. You need to keep a number of things in mind, and most of them have nothing to do with political bias, but with the character of news reporting. E.g.:

  • Each TV station and newspaper wants to tout its own poll, so they devote whole articles to the results of their poll, pretending like theirs is the only one that got it right. In fact, it's a rare case that any comparison with other contemporary polls is made at all; it's as if theirs is the only one that exists. A question of pure business interest.

  • But polls have margins of error, and when it comes to the margin between the two candidates, that MoE doubles. E.g., when the poll says Obama leads by 49% to 45% and the MoE is +/-3%, that means that even just within the parameters of this poll, Obama could be anywhere between 46% and 52%, and McCain anywhere between 42% and 48%. Meaning that the poll is in fact indicating something in between a 2% McCain lead and a 10% Obama lead. (Am I doing this right? There's a quick numeric check after this list.)

  • And that's still just the MoE as calculated on a 95% probability basis, or whatever the statistical term is (I'm a layman, obviously). There's also a 5% chance that either individual candidate's number is more than 3% off, up or down.

    Basically, polls simply suffer from random statistical variation. Meaning that their numbers will to some extent fluctuate randomly from iteration to iteration.

  • But despite all that, the journalist writing the news story about the poll will pretend that 49% to 45% is The Deal, Man! That's Where The Race Stands. Rarely will he reference the MoE-related relativity of those numbers, let alone the possibility that their current poll is just an outlier.

  • So what to do? Well, there are different opinions about that too, and I'm no expert. But I follow the much-given advice to always look at any one individual poll's results in context. What did the previous couple of polls from that pollster say? Does this one suddenly say something totally different from the previous ones? Does the same sudden shift occur in other, contemporary polls? If not, then it could just be more or less of an outlier. That still doesn't mean that the pollster was deliberately trying to fool or manipulate you - outliers just happen. They can't help that either.
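
For anyone who wants to check the margin-of-error arithmetic from the second bullet above, here is the back-of-the-envelope version. The 49% / 45% / +-3 figures are the same hypothetical ones used in that bullet, and this is just the simple worst-case reading of the per-candidate MoE, not a full statistical treatment.

Code:
# Back-of-the-envelope check of the margin-of-error bullet above.
# Uses the hypothetical 49% / 45% / +-3 numbers from that bullet and the
# simple worst-case reading of the per-candidate margin of error.

obama, mccain, moe = 49.0, 45.0, 3.0

obama_range  = (obama - moe, obama + moe)     # 46% to 52%
mccain_range = (mccain - moe, mccain + moe)   # 42% to 48%

# The spread can plausibly be anything from (low Obama - high McCain)
# to (high Obama - low McCain):
spread_low  = obama_range[0] - mccain_range[1]
spread_high = obama_range[1] - mccain_range[0]

print(f"Obama:  {obama_range[0]:.0f}% to {obama_range[1]:.0f}%")
print(f"McCain: {mccain_range[0]:.0f}% to {mccain_range[1]:.0f}%")
print(f"spread: McCain +{-spread_low:.0f} to Obama +{spread_high:.0f}")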

In short, before assigning devious motivations to the ABC/WaPo pollsters, or to any one pollster, you should check out how they do in general. Do they generally deviate from all the other pollsters, or is it just this one poll that stood out? And if they deviate from the other pollsters, is it a steady pattern or are the deviations erratic? If they are erratic, it could mean that their methodology is just very sensitive; that they have opted against weighting in any way, making their results more volatile; or that they are just not delivering serious work. If the deviations are systematic, always showing a certain number three or four percent higher than where other pollsters have it, it could mean they are putting a finger on the scale for reasons of politics, but it could just as easily mean that their methodology is a tad different.

For example, Fox News polls on presidential job approval always have Bush's approval a couple of percent higher than most other polls, and some combative liberals have cried "bias". But a look back in the archives shows that Fox News also systematically polled President Clinton's approval rating a couple of percent higher than the other polls. So probably they just ask a slightly different question or the like; no bias involved.

Comparing this ABC/WaPo poll with previous ones, I don't see any particular sign of the ABC/WaPo polls disproportionally favouring Obama in general. As far as I know it was just this one that really stood out, so a more likely explanation is simply that it was something of an outlier.

Again, how to deal with this kind of noise? Everyone will suggest their own way, but my solution is to always check the sites that aggregate all the different polls: Pollster.com, Realclearpolitics.com, or Fivethirtyeight.com -- that's a neutral, a conservative and a liberal one, but they're all collecting the same polling data, and all making their own kind of effort to deduce overall trends. It's important because, in my experience, this week it may be the ABC/WaPo poll that seems off-kilter, but next week it will be the CBS/NYT one, or the Fox one, or the Quinnipiac one. Avoiding getting all worked up about one or another poll lurching off is easy if you keep your eye on such aggregators. It's definitely, in my opinion anyhow, a more practical and sensible solution than trying to identify nefarious plots behind the numbers, ID'ing "right" polls and "wrong" polls, or throwing up your hands and concluding that they're all ****.
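
To illustrate what those aggregators are doing in the simplest possible form, here is a sketch that just averages the Obama-minus-McCain spreads from the RCP table quoted earlier in this thread. A real aggregator does much more (time windows, weighting by sample size, trend estimation), so don't expect this to reproduce RCP's published average exactly.

Code:
# Simplest-possible "aggregator": average the spreads from the RCP table
# quoted earlier in the thread. Real aggregators use time windows,
# sample-size weighting and trend estimation, so this will not match the
# published RCP average exactly.

spreads = {
    "Gallup Tracking": 5, "Rasmussen Tracking": 6, "Hotline/FD Tracking": 5,
    "GW/Battleground": -2, "CBS News/NY Times": 5, "FOX News": 6,
    "Marist": 5, "NBC News/Wall St. Jrnl": 2, "ABC News/Wash Post": 9,
    "LA Times/Bloomberg": 4, "Ipsos-McClatchy": 1, "CNN/Opinion Research": 4,
    "F&M/Hearst-Argyle": -2,
}

average = sum(spreads.values()) / len(spreads)
print(f"average spread across {len(spreads)} polls: Obama {average:+.1f}")

# How far each poll sits from that simple average:
for name, s in sorted(spreads.items(), key=lambda kv: kv[1] - average):
    print(f"{name:24s} deviation {s - average:+.1f}")
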
0 Replies
 
nimh
 
  2  
Reply Sat 27 Sep, 2008 09:19 pm
@A Lone Voice,
A Lone Voice wrote:

No, it answers my question about the intellectual honesty libs/progressives show here. They all agree with one another and sing the praises of the left, yet run and hide from -- or ignore -- any evidence that confronts the storybook they have written for one another.

Dude. I just hadn't seen the thread earlier. And considering that Engineer and Ebrown both also just posted valid points, I'm sure I'm not the only one.

A Lone Voice wrote:

I noticed this topic had a score of 5 earlier and was up high in the topics for the day, and now it is a 0. I guess that answers my question...

My "Electing President Biden" thread, which goes into the whole complicated obscurity of the election rules in case of an even split in the Electoral College has a score of only +1 -- despite having had 15 replies, which means it should have been somewhere at +10 or so. I guess that means that either the left or the right or both are running and hiding from the evidence about how messed up their electoral system is? Or could it be that, you know, that kind of inside baseball is just not a question that engages a lot of people?

My "Campaign commercials: the good, the bad and the ugly" thread must also have had a lot of thumbs down, cause it's also back down to +1. Must that mean that you people are just all running and hiding from the evidence of the dirty attacks that are waged in your name? (Define "you" according to preference.) Or could it be that, you know, most people are just not interested in the unpleasant stuff?

(Little do they know it's got lots of funny vids too..)

Anyway. People don't like inside baseball, and questioning the motivations and methodologies behind polls, let alone one individual poll, falls under that. People don't like partisan finger-pointing, and they like conspiracy theories even less -- and accusing the liberal MSM of fiddling the polls to sway voters to Obama falls under both. So you're gonna get a low score even just because of that alone.

Bottom line, again: Less conspiracy thinking, more level-headed reality checks.
nimh
 
  1  
Reply Sun 28 Sep, 2008 08:51 am
@engineer,
engineer wrote:
FiveThirtyEight.com has a breakdown of "house effects," consistent biases for specific polling outfits. There are roughly equal numbers of sites biased in each direction, so it's not a "liberal" bias as you believe. It's more about how questions are phrased, how they handle independents, how they adjust for demographics, etc.

Great point; just wanted to add to it. That fivethirtyeight.com page summarises the house effects of different pollsters when it comes to their state polls -- there's no chart, alas, it's all narrative -- but to cut to the chase, it observes that:

  • Rasmussen's polls have a slight, Republican-leaning house effect.
  • Strategic Vision has a pretty recognizable Republican-leaning house effect.
  • Mason-Dixon tends to have a fairly consistent lean toward McCain as well.
  • Washington Post / ABC and New York Times / CBS have both had a little bit of a Dem-leaning effect.
  • Quinnipiac's polls have been fairly Obama-friendly, but not enough to show up as statistically significant.

Again, to underscore all the stuff I wrote here last night, the post also emphasizes that:

Quote:
[It] is VERY important to distinguish house effects from either "bias" or "partisanship". Those things can cause house effects, but far more often they are, in Franklin's words: "[D]ifferences ... due to a variety of factors that represent reasonable differences in practice from one organization to another." [..]

We don't know whether Mason-Dixon is right or wrong -- and they very well could be right, since they are a pretty good pollster! But it is the case that, in states where you have a Mason-Dixon poll, the numbers are going to lean more toward McCain [..]. Likewise, say you have a pollster like Selzer, which is a very good polling firm, but has had a pretty strong Obama-leaning house effect so far. [..]

To repeat, house effects are not necessarily bad -- but we can make our model [nimh: substitute that by "our understanding of polls and our ability to interpret them"] even more robust by understanding and accounting for them.


House effects in national polls

Meanwhile, that post in turn links to a post on pollster.com. While the 538 post focused on state polls, the pollster.com one focuses on national polls. The house effects it finds there are somewhat different (underscoring how they are primarily just related to methodological questions); plus it's got a chart!


http://4.bp.blogspot.com/_MRs_Nt465oE/SLHIxiiuBjI/AAAAAAAADAM/FWLKcTVUql0/s400/HouseEfx.png


The post is very much worth reading in full, but here's some relevant excerpts:

Quote:
Who does the poll affects the results. Some. These are called "house effects" because they are systematic effects due to survey "house" or polling organization. It is perhaps easy to think of these effects as "bias" but that is misleading. The differences are due to a variety of factors that represent reasonable differences in practice from one organization to another.

For example, how you phrase a question can affect the results, and an organization usually asks the question the same way in all their surveys. This creates a house effect. Another source is how the organization treats "don't know" or "undecided" responses. Some push hard for a position even if the respondent is reluctant to give one. Other pollsters take "undecided" at face value and don't push. [..] And organizations differ in whether they typically interview adults, registered voters or likely voters. The differences across those three groups produce differences in results. [..] Not to mention the vagaries of identifying who is really likely to vote.

Finally, survey mode may matter. Is the survey conducted by random digit dialing (RDD) with live interviewers, by RDD with recorded interviews ("interactive voice response" or IVR), or by internet using panels of volunteers who are statistically adjusted in some way to make inferences about the population. [..]

The chart above shows the house effect for each polling organization that has conducted at least five national polls on the Obama-McCain match-up since 2007 [until the date of posting, August 24]. The dots are the estimated house effects and the blue lines extend out to a 95% confidence interval around the effects.

The largest pro-Obama house effect is that of Harris Interactive, at just over 4 points. The poll most favorable to McCain is Rasmussen's Tracking poll at just less than -3 points. [..] We are looking at effects on the difference between the candidates, so that +4 from Harris is equivalent to two points high on Obama and two points low on McCain. Taking half the estimated effect above gives the average effect per candidate. [..]

The house effects are calculated so that the average house effect is zero. This [..] doesn't mean the pollster closest to zero is the "best". It just means their results track [the average trend most closely]. That can also happen if a pollster gyrates considerably above and below our trend, but balances out. [..]

[To give an example], the Democracy Corps poll is conducted by the Democratic firm of Greenberg Quinlan Rosner Research in collaboration with Democratic strategist James Carville. Yet the poll has a negative house effect of -1. Does this mean the Democracy Corps poll is biased against Obama? No. It means they use a likely voter sample, which typically produces modestly more pro-Republican responses than registered voter or adult samples do. Assuming that the house effect necessarily reflects a partisan bias is a major mistake.
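
If you want to see the mechanics behind numbers like these in their most stripped-down form: take each pollster's polls, compare each poll's margin to the average of all polls over the same period, and average the differences per pollster. The figures below are invented, and the real pollster.com estimates come from a regression against a fitted trend line, so treat this only as a sketch of the idea.

Code:
# Stripped-down illustration of a "house effect": for each pollster,
# average how far its Obama-minus-McCain margins sit from the all-poll
# average over the same period. All figures are invented; the actual
# pollster.com estimates come from a regression against a trend line.

polls = [  # (pollster, margin in points)
    ("Pollster A", 6), ("Pollster A", 5), ("Pollster A", 7),
    ("Pollster B", 1), ("Pollster B", 2), ("Pollster B", 0),
    ("Pollster C", 4), ("Pollster C", 3), ("Pollster C", 4),
]

overall = sum(m for _, m in polls) / len(polls)

by_house = {}
for name, margin in polls:
    by_house.setdefault(name, []).append(margin - overall)

for name, devs in by_house.items():
    effect = sum(devs) / len(devs)
    print(f"{name}: house effect {effect:+.1f} points")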


Back to basics

Now let's get back to basics. You come across a poll - like that ABC/WaPo poll - that diverges sharply from what other polls are saying, or that has a candidate lurching upward for no apparent reason. You are suspicious - is it fixed? The breakdown of the sample reveals some statistical anomalies. Further evidence that it's all just an attempt to manipulate you? Or just - well - a statistical anomaly? An example of the kind of random statistical variation you're going to get in any poll from iteration to iteration?

Well, when you're pondering this, look at this graphical representation, from the above-quoted pollster.com post, of how different individual polls bounce around the overall trendline (based on all polling results) with some degree of randomness. Here's how the ABC/WaPo polls stacked up, for example, until late August:


http://1.bp.blogspot.com/_MRs_Nt465oE/SLHIn-eyn1I/AAAAAAAAC_8/P0zbgpXC-g8/s400/HouseFXPollsterCompare-1.png


That's some bouncing around, huh? The ABC/WaPo poll is obviously one of the more volatile ones. But compare the red line indicating the results of the ABC/WaPo poll with the blue line indicating the overall trend in the polls. Of the 8 polls it did in this timeframe, it came out showing Obama doing better than in most polls 5 times, and McCain doing better 3 times. Somewhere near the end, it had Obama cratering right when other polls had him up. So deliberate partisan manipulation doesn't seem like a logical explanation.

Bottom line, I guess: if you come across an individual poll that seems all out of whack, don't freak; just wait till the next one. And in the meantime use sites like pollster.com, realclearpolitics.com, etc., to check out what the overall trend of all polls says. That way you won't ever be taken hostage by some aberrant sample in one or another poll.

Below are graphs showing how some other polls have bounced around.

http://3.bp.blogspot.com/_MRs_Nt465oE/SLHGosuhAqI/AAAAAAAAC98/GR1-24-4-u0/s400/HouseFXPollsterCompare-22.png

http://2.bp.blogspot.com/_MRs_Nt465oE/SLHGoX2B-9I/AAAAAAAAC90/ZhfXy4mIKjU/s400/HouseFXPollsterCompare-21.png

http://4.bp.blogspot.com/_MRs_Nt465oE/SLHGoBLUSQI/AAAAAAAAC9k/DjVeXcAlbXw/s400/HouseFXPollsterCompare-19.png

http://2.bp.blogspot.com/_MRs_Nt465oE/SLHHednNnUI/AAAAAAAAC-U/SCaXOXa7jq8/s400/HouseFXPollsterCompare-15.png

http://4.bp.blogspot.com/_MRs_Nt465oE/SLHHeD-Um5I/AAAAAAAAC-M/ZSUppTM3UhI/s400/HouseFXPollsterCompare-14.png

http://4.bp.blogspot.com/_MRs_Nt465oE/SLHID8NRK6I/AAAAAAAAC-8/iIpxs2JEDBs/s400/HouseFXPollsterCompare-10.png

http://3.bp.blogspot.com/_MRs_Nt465oE/SLHIDwdMD3I/AAAAAAAAC-0/shDVatFQB2U/s400/HouseFXPollsterCompare-9.png

http://2.bp.blogspot.com/_MRs_Nt465oE/SLHIZYG9U_I/AAAAAAAAC_c/AyWeMy0XS-w/s400/HouseFXPollsterCompare-4.png

http://2.bp.blogspot.com/_MRs_Nt465oE/SLHIZCEOpWI/AAAAAAAAC_U/mvE2ik1EI8o/s400/HouseFXPollsterCompare-3.png
0 Replies
 
 
