Ha! That was funny. What kind of movies did he say in that funny line? "Even if 100% of the people polled like f*cking [something] movies, they still suck!"
And yeah, Luntz... he's up there with Mark Penn in the hall of fame of sleazy pollsters. Pollsters who see polls as a political tool rather than as a measurement instrument.
That's the point though, isn't it? Luntz, the intro says, has "run more than 500 polls for corporations and lobbying groups". And quizzed about that work, he says, "the key in survey research is to [..] ask a question in the way that you get the right answer [..]". Which is indeed exactly what corporations and lobbyists will use polls for.
And the video shows, in a very funny way, exactly how that can be done. Or rather, they have Luntz explain it: "What you will find is that two virtually the same questions, with just a single change of wording, you get a very different reaction".
But then the guys in the video make a big jump, and conclude: "polling is bullshit". Period. All polling, apparently. No distinction, end of story, f*ck em already.
That's a pity, because in another dimension, they could have taken Luntz's words and the little street experiment as the basis for a kind of manual on how to use polls. How to read them and evaluate them.
For example: don't ever go on just the one poll. If you can never really know for sure about a single poll, why trust that one company about what people think? But if 10 polls by 10 different polling companies show the same numbers, or the same trends, then you can be fairly sure you're not looking at concocted numbers. I mean, if you're not living in Russia. Even hypothetically, there's just no way they can all be "bought", and bought by the same party, too.
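The logic of looking at many polls instead of one can be sketched with a toy example. The numbers below are made up purely for illustration; the point is that the average of many independent polls is a steadier signal than any single poll, and that an outlier would stick out against the spread:

```python
import statistics

# Hypothetical toplines for one candidate from ten different pollsters
# (made-up numbers, for illustration only).
polls = [47, 49, 46, 48, 50, 47, 48, 46, 49, 48]

average = statistics.mean(polls)   # the "consensus" figure
spread = statistics.stdev(polls)   # how much pollsters disagree

print(f"average: {average:.1f}%, spread: {spread:.1f} points")
# A new poll far outside average +/- 2*spread would deserve scrutiny.
```

A bought poll would have to beat not just one number but the whole consensus, which is exactly why ten agreeing pollsters are harder to fake than one.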
Second example: follow the money. Who commissioned the poll? Any poll that's commissioned, and thereafter publicized, by a corporation with questions touching on its own work or products is suspect. Any poll commissioned by a lobbying group is suspect. Because yes, there are plenty of pollsters willing to get the wanted results for a fee.
Zogby, for example, has been criticized for mixing independent work with polls done for lobbies, forcing the reader to pay attention to which kind of poll each new one is.
But polls by the network media about the presidential elections? Do CBS, CNN or NBC push their pollsters to get a favourable result for one or the other candidate? Hardly. If only because it's much too dangerous a game. There's a ton of pollsters out there polling the same questions - it's not like doing a poll about the quality of L'Oreal shampoo, where you can lie your head off because there are no rival polls to reveal how far off you are. There's a lot of directly comparable material, and if one of the networks consistently got numbers that diverged from everyone else's, it would lose credibility, people would pay less attention to it, and down the line that hurts the bottom line. The same goes if it consistently turned out to have one candidate or party up higher than the actual results end up being.
And what about polls done by the pollsters independently, not commissioned at all? That's another chunk of the polls coming out now. The commercial rationale behind them is to acquire prestige, and thus new business, for the pollster - and that is better achieved the closer they get to the actual outcome. Again, it's simply against their bottom line to 'push' their polling numbers in a certain direction.
This is of course most true for established, respected pollsters - Gallup, say. If Gallup were found to be manipulating its numbers to achieve political ends, it would be a big blow to its prestige, and thus a money-loser. A small, relatively unknown pollster, on the other hand - say, ARG - has a lot less to lose. For an obscure pollster, the potential profit in manipulating results on behalf of a client can more easily outweigh the potential loss from risking its reputation, since there isn't much of one to lose anyhow. So there's a rationale behind trusting blue-chip polls over more obscure ones.
The vid could have made, in another dimension, other points too. For example: look at sample size. The video makes an overly simplistic point, concluding that 10% of passing car drivers think Luntz is an ass based on just 10 passing cars. But no serious poll is based on fewer than several hundred randomly contacted respondents. And of course you'd still be more comfortable with a poll of 1,000 respondents than with one of 350.
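The sample-size point can be made concrete with the standard margin-of-error formula for a simple random sample, MoE = z * sqrt(p(1-p)/n), taken at the worst case p = 0.5 and the usual 95% confidence level (z = 1.96) - a textbook sketch, not anything from the video:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n,
    evaluated at the worst-case proportion p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (10, 350, 1000):
    print(f"n = {n:4d}: +/- {margin_of_error(n) * 100:.1f} points")
# n =   10: +/- 31.0 points
# n =  350: +/-  5.2 points
# n = 1000: +/-  3.1 points
```

Ten passing cars give you a margin of error of over 30 points, which is why the video's street poll proves nothing; at 1,000 respondents the uncertainty shrinks to about 3 points.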
Question wording - again, an important point. But here too, the poll reader isn't powerless - he can evaluate the worth of a poll rather than just assume that all polls are BS. Some pollsters publish their whole questionnaire online. You can review the exact questions they used and the choices people were given. Other pollsters publish only topline results. That has partly to do with commercial interest - it's easy for a university to be fully open about polls that are only a side activity for it anyhow, while a commercial polling company may be edgy about betraying "the secrets of its success" to rivals. But one inspires more trust than the other, of course.
Blumenthal and Franklin have done great work at pollster.com with their "Disclosure Project", in which they pushed all pollsters polling on the primaries in IA, NH, SC and nationally to reveal their basic methodological data and answer questions about how they work. And they publicized who has co-operated and who hasn't, too (naming and shaming..).
See the project's results for Iowa late last year. The results also show that one determinant of transparency is whether a pollster is a member of the American Association for Public Opinion Research (AAPOR). Members have to abide by a Code of Professional Ethics & Practice, and transparency is part of that.
So the short story is: yes, polls can be manipulated. They can easily be made to say what you want them to say. But that doesn't mean that most polls are - certainly not in as crowded a field as nation-wide election polling. In a competitive field where many pollsters are asking the same questions, the temptation to manipulate polls for a fee is counterbalanced by the pressures of competition, which force a pollster to stay in line with the mainstream of polling. And the consumer has ways to weigh the risks, too - by ignoring commissioned polls and looking only at independent and news media-sponsored ones, for example. Or by prioritising polls by AAPOR members. Or just by looking at pollsters' track records. Polling is an approximate science, so any single poll can fail to even remotely approach this or that election result, but if a pollster is consistently among those furthest off, that's a warning flag.
Polling is not BS; it's just an economy like any other - like, say, banking: there are predatory lenders trawling for prey, there are blue-chip banks that, held in check by competition and brand prestige, can usually be counted on to be reliable, and there's stuff in between.