
There are Problems with Self-reported Data that Our Industry Needs to Address


We need more than blue-ribbon commissions to “explore” the problem

How many times do we have to be shown there’s something wrong with self-reported data before we actually do something about it?

Below are just three instances of failure in self-reported polling data (in which respondents tell an interviewer, or mark in an online or mobile survey, how they intend to vote).

[Image: polling failures]

The examples are from the 2016 US Presidential Election, the 2015 UK Parliamentary General Election, and the recent Ossoff-Handel Special Election for the US House of Representatives in Georgia.  But there are many more examples.

Spectacular failures like these inevitably lead to blue-ribbon panels of high-paid industry executives who promise solemnly to “get to the bottom of the problem” and “address waning public confidence in surveys.”  The most recent example is the weighty-sounding Evaluation of 2016 Election Polls in the United States by AAPOR (American Association for Public Opinion Research).  What does that report say?

Predictably, it says (in essence) “we were more accurate than we are given credit for” (see pages 2-4 of the report).  But when you actually read the analysis (which nobody does), there’s a curious phenomenon mentioned on page 25.  We explain this finding below.

[Image: page from the AAPOR report on 2016 political polling]

A critical finding is treated as an artifact

It’s a busy page, but let us tell you what it means.  It shows the result of a “callback study” by Pew.  A callback study is essentially an opportunity to ask survey respondents, “Hey!  Why didn’t you vote the way you told us you were going to vote?”  The tall black bar on the right shows clearly that Trump benefited far more than Clinton did from this self-reporting error, and by a historically large margin.

But there the matter is dropped, as if to say, “Move along, folks…nothing to see here.” Nobody asks why errors in self-reporting led to a bigger benefit for Trump than for Clinton.

Why nobody asks such a damning question

The reason nobody asks is that the answer is too frightening: current survey methods do a poor job of accurately reading a deeply felt public mood.  It simply hurts industry leaders too much to say it.  It means, fundamentally, that their daily efforts and their livelihoods are directed toward something that doesn’t work.

What works better?

What the industry needs is not more bromides and blue-ribbon panels, but a new methodology that actually reads the public mood accurately.

Biometric measurement accurately reads the public mood

We’ve already discussed in this blog what biometric measurement discovered in the 2016 election.  But why did biometric measurement make that discovery while standard methods did not?

[Image: biometric measurement]

It’s because biometric measurement avoids self-reporting error, which is precisely the problem the AAPOR report mentions and then ignores.  We will keep saying this to resistant audiences in the hope that somebody, somewhere will remove the wax from their ears.

It seems that a few are starting to listen.  While we have shown the quotation from Katja Cahoon of the Beacon Insight Group before, it’s appropriate to do so again because of its gravity and its relevance to this debate:

[Image: Katja Cahoon quotation]

 

Tell us what you think

What’s your view? If you’d like to comment on this topic, please click on “Add Comments” below. We’d love to hear from you!

