Getting it wrong

The recent election results in the UK have called into question the accuracy of opinion polling, given how badly the polls missed the outcome. Is polling a legitimate science, or just some black art? Do people often say one thing and do another? How does the enforcement of politically correct speech skew polling, and why does the skew always seem to run in one direction?

You’ll hear my thoughts, but don’t forget to let me know your thoughts.

Mentioned links:

Conservative Voters Give Pollsters Politically Correct Answers . . . and Then They Vote

Massive NBC Prediction Fail: Network Wrong as Conservatives Surge to Power

Getting some shopping done? If you're going to shop at Amazon, please consider clicking on my affiliate link. Thanks!

On Apple devices, you can subscribe to the podcast via iTunes.

If you're on Android, listen with Google Podcasts.

Stitcher Radio is another possibility for both Apple and Android devices. If you do download Stitcher to your phone, please use the promo code “ConsiderThis” to let them know where you heard about it.

Browser-based options are the Blubrry Network and Player.fm.

And if you have some other podcatcher or RSS reader, click here to get the direct feed and paste it wherever you need it.

I would love it if you would spread the word about the podcast! Click the Facebook, Twitter, and other icons (or all of them!) at the bottom of this post to recommend "Consider This!" to your social media audience.

Show transcript

The science of polling the general public has had its good and bad times, and it appears it’s going through one of those rough patches at the moment. A friend of mine refers to polls as “cricket races”: a snapshot of where things stand in a particular race, with about as much bearing on our lives as a race among crickets. If it’s a slow news day, release the results from a poll and call it news.

Some might put the word “science” in the phrase “science of polling” in scare quotes, unconvinced that it’s much of a science at all. I do have some respect for those who work in statistical occupations. It can seem like a black art, but, for example, one pharmaceutical client I worked for years ago had a Quality Assurance group that tested products coming into the warehouse before they could be shipped out, and they explained quite a bit to me. I couldn’t repeat it all now, but the upshot was that, given a good random sample, they could tell you with confidence whether the batch that just came in was good enough to ship. The only way to be totally sure was to test everything, but to get close to certainty without going overboard, there was a lot of science backing up their procedures.
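The same basic math underlies opinion polling. As a rough illustration only (a minimal sketch using the standard normal approximation, not that QA group’s actual procedure, and the function name is my own), here is how a poll’s familiar “margin of error” falls out of the sample size:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a sample proportion p
    drawn from a simple random sample of size n, using the normal
    approximation (z = 1.96 for 95% confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 1,000 people finding 52% support is really "52%, give or take ~3 points".
moe = margin_of_error(0.52, 1000)
print(f"+/- {moe * 100:.1f} points")

# Quadrupling the sample only halves the margin, which is why polls
# rarely sample far beyond a thousand or so respondents.
print(f"+/- {margin_of_error(0.52, 4000) * 100:.1f} points")
```

Note that this formula assumes a truly random sample of people who answer honestly; as the rest of this episode argues, that second assumption is exactly where polling runs into trouble.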

Sampling people, on the other hand, is nowhere near as straightforward as sampling pharmaceuticals. People can say one thing, and yet do another. Which apparently happened in a big way over in the UK recently, when the conservative Tories trounced the liberal Labour Party in national elections, winning their first outright majority since 1992. This even though Nate Silver, the US polling expert, had looked at all the UK polls and declared that the chance of a Tory majority of seats in Parliament was “vanishingly small when the polls closed – around 1 in 500.”

So much for that prediction. But the predictive value of polls is lessened when the pollsters themselves hide some of their results, which happened in the UK and, apparently, happens quite a bit. No pollster wants to publish results that wind up way out of line with those from other polls. No one wants to be the outlier, but that’s exactly what happened in the UK: a last-minute poll by one group got the percentages virtually dead on to the actual vote, but they didn’t publish it, “chickening out,” as the group’s CEO put it. It’s a herd mentality we see in news coverage as well.

What this herd mentality gets us is reporters who cover stories because other reporters are covering them. And because journalists tend to lean liberal and vote Democratic, stories on the Left get more coverage. That’s where the herd migrates. As I’ve mentioned before, that situation is fertile ground for a network like Fox News, which gets the scoop on stories on the Right because the herd is busy elsewhere.

So then, do pollsters migrate that way? It turns out they do. In the UK, polling companies have consistently exaggerated the liberal side of the equation since the 1970s. Try as they might to correct for it, it keeps happening. Is this a liberal bias, exactly? Well, my question would be: if it isn’t bias, wouldn’t you expect errors like this to happen in both directions and cancel out? Yet they so often tilt the same way. If it’s not bias, then what is it?

In the UK, they have a group of voters they call “shy Tories”: people who give politically correct answers to pollsters but then vote conservative anyway. This is one of the problems with speech police who shun those who don’t toe the PC line: polls get skewed. Nor is this just a UK problem; Silver notes that it’s happening more and more across the globe. Not surprisingly, the Western world’s culture keeps getting more liberal, and while those on the Right may say the “right” thing to avoid the hassle, at least voting is still done by secret ballot.

But can “shy conservatives” really explain decades of error so lopsidedly favoring the Left? The big question is: if it isn’t pollster bias, how would all this look any different if it were?

Given such problems with polling, whatever the reason, how do they affect our politics? James Taranto of the Wall Street Journal has speculated that liberal bias in news reporting can cause liberal politicians to underestimate the problems their policy decisions are causing. The same goes for polling that tends to hide conservative displeasure with those policies. Liberal politicians get an overly rosy view of how things are going and thus, as Taranto suggests, make unforced errors in policy or campaigning because they don’t have a true picture.

Skewed polling can cut both ways, and is a disservice to all voters. An inflated poll number for a liberal politician could cause liberal voters to stay home, thinking their guy is inevitable, or it could cause conservative voters to stay home, thinking their guy is already doomed. Or both these conditions could occur, depressing voter turnout in general, and giving the few who do show up control over the many.

Of course, the lesson there is: forget the polls and get out there and vote.

Another lesson is that commentators aren’t the best people to be telling political parties what lessons to learn. Here’s NBC’s Chuck Todd noting some of them.

[Chuck Todd from the video above]

If the Republicans were supposed to learn a lesson had the UK conservatives lost, what should they learn now that the UK conservatives won a major victory? And since the UK liberals lost, should the Democrats learn something? Anyone? Anyone? Bueller?

One problem I have with polls is when they ask people questions for which those people have no expertise. “What is the current state of the national economy?” is a pointless question to ask of those of us who have no economic expertise. It’s about like asking “How far is the Sun from the Earth?” In either case, the poll result doesn’t change reality, and reality can be discovered by means much more reliable than a poll.

Filed under: Elections, Media, Partisanship, Polling