
Burning Questions

There are Problems with Self-reported Data that Our Industry Needs to Address

jerry9789
2 comments
Burning Questions

We need more than blue-ribbon commissions to "explore" the problem

How many times do we have to be shown there's something wrong with self-reported data before we actually do something about it? Below are just three instances of failure in self-reported polling data (wherein the respondent tells the interviewer, or marks in an online or mobile survey, how he or she is going to vote).

[Figure: three polling failures]

The examples are from the 2016 US Presidential Election, the 2015 UK Parliamentary General Election, and the recent Ossoff-Handel Special Election for the US House of Representatives in Georgia. But there are many more.

Spectacular failures like these inevitably lead to blue-ribbon panels of high-paid industry executives who promise solemnly to "get to the bottom of the problem" and "address waning public confidence in surveys." The most recent example is the weighty-sounding Evaluation of 2016 Election Polls in the United States by AAPOR (the American Association for Public Opinion Research). What does that report say? Predictably, it says (in essence) "we were more accurate than we are given credit for" (see pages 2-4 of the report). But when you actually read the analysis (which nobody does), there's a curious phenomenon mentioned on page 25. We explain this finding below.

[Figure: page 25 of the AAPOR report on 2016 political polling]

A critical finding is treated as an artifact

It's a busy page, but let us tell you what it means. It shows the result of a "callback study" by Pew. A callback study is basically an opportunity to ask survey respondents "Hey! Why didn't you vote the way you told us you were going to vote?" The high black bar on the right indicates clearly that Trump benefited far more than Clinton did from this error in self-report, and by a historically large margin. But there the matter is dropped, as if to say "keep moving, folks... nothing to see here." Nobody asks why errors in self-reporting delivered a bigger benefit to Trump than to Clinton.
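To make the mechanics concrete, here is a minimal sketch of the arithmetic behind a callback study. All of the counts below are invented for illustration; they are not Pew's actual panel figures.

```python
# Minimal sketch of callback-study arithmetic.
# All counts are invented for illustration -- NOT Pew's actual data.

pre_election = {"Clinton": 460, "Trump": 410, "Other/Undecided": 130}
callback     = {"Clinton": 455, "Trump": 450, "Other/Undecided": 95}

n = sum(pre_election.values())      # same panel, interviewed twice
assert n == sum(callback.values())

for candidate in pre_election:
    stated = pre_election[candidate] / n  # share who said they would vote this way
    actual = callback[candidate] / n      # share who later said they actually did
    drift = (actual - stated) * 100       # net self-report error, in points
    print(f"{candidate:>16}: stated {stated:.1%} -> actual {actual:.1%} ({drift:+.1f} pts)")
```

A positive drift means a candidate's validated vote ran ahead of his or her stated support, which is exactly the asymmetry the Pew callback data show for Trump.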

Why nobody asks such a damning question

The reason nobody asks is that the answer is too frightening: current survey methods do a poor job of accurately reading a deeply felt public mood. It simply hurts industry leaders too much to say it. It means, fundamentally, that their daily efforts and their livelihoods are directed toward something that doesn't work.

What works better?

What the industry needs is not more bromides and blue-ribbon panels, but a new methodology that actually does a good job of identifying the public mood.

Biometric measurement accurately reads the public mood

We've already discussed in this blog what biometric measurement discovered in the 2016 election. But why did biometric measurement make that discovery while standard methods did not?

[Figure: biometric measurement]

It's because biometric measurement avoids self-reporting error, which is precisely the problem the AAPOR report mentions and then ignores. We will keep saying this to resistant audiences in the hope that somebody, somewhere will remove the wax from their ears. It seems that a few are starting to listen. While we have shown the quotation from Katja Cahoon of the Beacon Insight Group before, it's appropriate to do so again because of its gravity and its relevance to this debate:

[Figure: Katja Cahoon quotation]

Tell us what you think

What's your view? If you'd like to comment on this topic, please click on "Add Comments" below. We'd love to hear from you!


National attitudes changed massively from 2014 to 2016

jerry9789
2 comments
Burning Questions

Many of the responses to our last blog ("How Has the Mood of the Country Changed in the Past Two Years?") included comments like "a cheery mood is not what I'm hearing" and "please run your national survey again." Fair enough. We will indeed run the survey again this year. But our major point was not about outcomes; it was about a large-scale tendency to avoid a simple, dispassionate reading of social science data. If you do that, you can clearly see that the warning signs were there, and abundant. Our industry (market research) simply chose not to read the data as they naturally fell. There's a marvelous mea culpa from Katja Cahoon of Beacon Insight Group that really should be read by all. Very rarely do market researchers own up to how they are interpreting data; cognitive dissonance is hardly mentioned at all. But Katja's confession is remarkable. (A snippet appears below.)

Magic happens when you interpret data dispassionately

Let us give you a quick example of the magic that happens when you read social science data dispassionately. Trends start to move in the same direction, and there is cross-validation of findings from disparate sources. (Imagine that!) An NBC News/WSJ tracking poll asked respondents whether they agreed with the statement "the economic and political systems in the country are stacked against people like me." The degree of agreement with this statement declined significantly from 2014 to 2016 (see below).

[Figure: NBC News/WSJ poll trend]

What our research on the mood of the country found

We saw the same trend in our own tracking research. We asked respondents how strongly they agreed with the statement "As far as getting ahead financially, the game is rigged." As in the NBC News/WSJ poll, overall agreement with this statement declined from 2014 to 2016 (see below).

[Figure: bar chart for the "game is rigged" question]

(Our poll uses a 10-point scale and the NBC News graphic removes the "I'm not sure" responses, but the trend is in the same direction.) The identification of this trend does not mean, of course, that there was no mood of disgruntlement in the nation at all; it simply means that there was a building undercurrent of expectation that things would somehow improve in the near future. The Washington Examiner went into a little more depth on the trend, saying:

"The pollsters asked, "During the next twelve months, do you think that the nation's economy will get better, get worse, or stay about the same?" Forty-one percent said they expect the economy to get better, versus just 21 percent who expect it to get worse and 36 percent who expect the economy to stay the same. That 41 percent, plus 42 percent who expected better times in the Journal's poll last month, are the highest expectation numbers in the Journal's polling since October 2012, right before Barack Obama was re-elected."

(The Washington Examiner, February 26, 2017)

Many other fair-minded researchers saw the trend

Our firm is not unique in seeing a trend like this. Actually, quite a number of pollsters and market researchers saw it. It's just that media reporting on their findings was light, so the reading and viewing public never really knew.
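A practical note on comparing sources that use different scales, as we did above: below is a minimal sketch, with invented response distributions, of how a 10-point agreement item like ours can be collapsed into an agree/disagree split comparable to a categorical poll like NBC News/WSJ's. The cutoff of 6 is an assumption for illustration.

```python
# Collapse a 10-point agreement scale (1 = strongly disagree, 10 = strongly agree)
# into an agree/disagree split for cross-source trend comparison.
# The response lists are invented for illustration; the cutoff is a judgment call.

def pct_agree(responses, cutoff=6):
    """Share of respondents at or above the agreement cutoff."""
    return sum(1 for r in responses if r >= cutoff) / len(responses)

rigged_2014 = [8, 9, 7, 3, 10, 6, 2, 8, 7, 5, 9, 4, 8, 6, 7, 2, 9, 8, 3, 7]
rigged_2016 = [4, 6, 3, 7, 2, 5, 8, 3, 4, 6, 2, 7, 3, 5, 9, 4, 2, 6, 3, 5]

for year, data in [("2014", rigged_2014), ("2016", rigged_2016)]:
    print(f"{year}: {pct_agree(data):.0%} agree the game is rigged")
```

Once both series are reduced to the same binary, the direction of movement can be compared even when the levels cannot.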

Our view

We will certainly keep an eye on this trend, especially for any notable departure from it. But it's almost undeniable that a roughly optimistic trend is there. In our next post we will look at even more attitudes that changed from 2014 to 2016. We still cling to the notion that, while the political barbs are flying back and forth, it's possible for sober analysts and social science researchers to read the data honestly. But at the same time it seems that when researchers interpret the data dispassionately, there's no scribbling reporter nearby ready to present their story to the world. There seems to be more media appetite for piling onto an existing narrative, regardless of what the data say.

[Figure: quotation from Katja Cahoon of Beacon Insight Group]

It's pretty much as Titus Bond of the research firm Remington Research Group put it: "A lot of pollsters, they want to be at the same number as everyone else."

Tell us what you think

What's your view? If you'd like to comment on this topic, please click on "Add Comments" below. We'd love to hear from you!


How Has the Mood of the Country Changed in the Past Two Years?

jerry9789
7 comments
Burning Questions

There's been a spate of articles recently indicating that the country's in a sour mood. Major left and right factions seem to be in unyielding disagreement, leading to a sense of deadlock, hopelessness, and dread. Hmm... that's not what our research reveals. Just as our mid-2016 research was at variance with what the major polls were telling us, our research once again differs from the norm. Let us explain.

[Figure: Optimists increased]

What our research on the mood of the country found

In 2014 we asked a panel of 500 US respondents 28 questions that in effect invited them to be cynical (if they wanted to). For example, we asked them how strongly they agreed with certain assertions such as:

"This country is going down the tubes."

"There's been a serious breakdown of basic moral codes in this country."

In that year just short of half (44%) agreed with those negative propositions. We called them "Cynics." This was the largest group among the four we identified using an advanced cluster analysis. The other groups were:

Optimists      26%
Modernists     15%
Libertarians   15%

As you'd expect, the Optimists largely disagreed with the dark statements. (We'll tell you in a subsequent blog what the Modernists and Libertarians are all about.)
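For readers curious about the mechanics: our published segmentation came from an advanced cluster analysis of all 28 items, but a simplified stand-in, k-means on two simulated items, is enough to show how respondents get sorted into segments. The data below are simulated, not our panel.

```python
# Illustrative k-means segmentation of survey respondents.
# Simulated data -- the actual study used 28 items and 500 respondents.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Each row is a respondent; each column is agreement (1-10) with one
# pessimistic statement, e.g. "This country is going down the tubes."
cynics    = rng.normal(8.0, 1.0, size=(220, 2))   # high agreement
optimists = rng.normal(3.0, 1.0, size=(130, 2))   # low agreement
others    = rng.normal(5.5, 1.0, size=(150, 2))   # mixed
X = np.clip(np.vstack([cynics, optimists, others]), 1, 10)

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

for k in range(4):
    members = X[km.labels_ == k]
    print(f"segment {k}: {len(members) / len(X):.0%} of sample, "
          f"mean agreement {members.mean():.1f}")
```

Each segment's share of the sample and its mean agreement score are what let us attach labels like "Cynic" or "Optimist" to otherwise anonymous clusters.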

The Twist in 2016

In 2016 we asked 500 US respondents the same questions and got a surprising result. The Optimist group (those who disagreed with the negative statements) expanded to 42%, while the Cynic group shrank to 23%. This suggested a positive shift in the overall mood of the country, more consistent with the findings of the Conference Board on overall Consumer Confidence than with the findings of large-scale political polls. (See the far right of the chart below.)

[Figure: Consumer Confidence Index. Copyright 2016 Bloomberg News, The Conference Board]
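With two independent waves of 500 respondents, a swing of this size is far too large to be sampling noise. Here is a minimal sketch of the standard two-proportion z-test, applied to the Cynic figures reported above:

```python
# Two-proportion z-test: did the Cynic share really change from 2014 to 2016?
# Shares and sample sizes are the ones reported above (n = 500 per wave).

from math import erf, sqrt

n1, n2 = 500, 500
p1, p2 = 0.44, 0.23                 # Cynic share in 2014 and 2016

p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se

# Two-sided p-value from the normal CDF
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

print(f"z = {z:.2f}, p = {p_value:.2g}")    # z is about 7 -- far beyond chance
```

The same test applied to the Optimist shift (26% to 42%) is similarly decisive.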

Our Interpretation

We don't believe there's a somber mood in this country to begin with. We didn't believe it in the summer of 2016, either, as our research did not indicate such a dark or negative mood. Instead, we think properly interpreted social research findings (i.e., findings interpreted without bias) indicate that citizens and consumers are quite upbeat.

[Figure: agreement with negative statements]

Who's talking, the media or the people?

A lot of this comes down to "who's reporting" when you hear about a mood of anger, bitterness, and divisiveness. Is the grim mood being reported by the news media or the people themselves? A properly executed survey analyzed without bias is a way to faithfully sample the mood of the people. You can't necessarily count on the news media to have the same sense of responsibility as an objective researcher when analyzing the results of social surveys, as many media organizations have strong agendas.

Seek Truth

It's possible to read social survey results fairly and accurately. Organizations like Cambridge Analytica and Dornsife/USC have proven in the last election cycle that survey results interpreted in an unbiased way can be quite accurate and predictive. Our results are closer to what these two research organizations found and are quite far away from what's being reported by pollsters tied to news media. What's your view? If you'd like to comment on this topic, please click on "Add Comments" below. We'd love to hear from you!


Do voters choose candidates like breakfast cereal?

jerry9789
2 comments
Burning Questions

In ordinary conversation people say they do.  But is there any science behind that idea? Two professors at Texas Tech University have adapted Cascade's BioNimbus Virtual Shopping System to "Biopolitics."  The idea is to see if the candidates' subtle inflections (gestures, mannerisms, etc.) can move undecided voters to their side -- much the same way a promotion, a new price, a game, or a cartoon character can cause consumers to buy the product.  Using BioNimbus, they have made a remarkable discovery, summarized here. Feel free to comment below on this interesting development in the emerging science of Biopolitics.


Are You Using Biometrics for Market Research?

jerry9789
2 comments
Burning Questions

Whether it's brainwaves, skin temperature and conductance, heart rate, perspiration, pupil dilation, facial muscular contraction, or something else, the various methods of measuring biometric response for market research seem to be here to stay. Are you using any of these methods for market research?  If not, tell us you're not (and maybe why). If you are, tell us which methods you're using and for what purpose (e.g., ad testing?  website testing?).  Are you satisfied that you're learning what you wanted to learn...and at a reasonable cost?  Let us know!


Are you un-siloing your marketing data?

jerry9789
3 comments
Burning Questions

These days many people are trying to rescue their various forms of marketing data from the separate, often uncommunicative systems where they reside ("silos") and to integrate these data streams into a coherent framework for decision making. Is your company doing that -- un-siloing your marketing data? If so, please tell us about that experience. Are the efforts to create marketing data integration running smoothly? Have unexpected problems arisen? Are you building a master marketing dashboard for monitoring and adjustment? Is that working out well? What can others who may be planning such marketing data integration projects learn from your experience? Please tell us below. We welcome your comments!
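To ground the question, here is a minimal sketch of what un-siloing can look like at the simplest level, assuming hypothetical CSV exports from a CRM and a web-analytics system that share a customer ID. The file names and columns are invented for illustration.

```python
# Minimal sketch of joining two marketing-data "silos" on a shared key.
# File names and column names are hypothetical.

import pandas as pd

crm = pd.read_csv("crm_export.csv")            # customer_id, segment, lifetime_value
web = pd.read_csv("web_analytics_export.csv")  # customer_id, visits, last_campaign

# An outer join keeps customers that appear in only one system,
# which is usually where the silo problems surface.
merged = crm.merge(web, on="customer_id", how="outer", indicator=True)

print(merged["_merge"].value_counts())  # how much of each silo actually matched?
print(merged.groupby("segment", dropna=False)[["visits", "lifetime_value"]].mean())
```

A real integration project adds identity resolution, deduplication, and refresh scheduling on top of this, but a join on a shared key is where most efforts start.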


Do Customer Satisfaction Surveys Contain Too Many Ringers?

jerry9789
2 comments
Burning Questions

Some people conducting customer satisfaction surveys say they can't get accurate information because of the dominance of "ringers" - people who are either very angry or very pleased with their experience with the brand, product, or service. This leaves out the great middle - people who've had an experience with the brand but don't have extreme feelings one way or the other. This can significantly harm the accuracy of the customer satisfaction study. Have you had such an experience? If so, please tell us about it. Even if you haven't had a personal experience of this type, what's your opinion? Do you think customer satisfaction surveys tend to be dominated by ringers? We look forward to hearing your point of view!


Are biometrics worth the trouble?

jerry9789
4 comments
Burning Questions

The world of biometric research for marketing is exciting, in that it holds the promise of understanding consumers' emotional connection to products and brands. But is it worth the trouble? There's disagreement. Many say the insights are intriguing but ultimately not worth the additional cost in time and money, or the black-box obscurity of some of the methods. Others say the benefits outweigh these costs, particularly since we are developing a superior science for the future (i.e., some sacrifice is worth it). What's your view? Please tell us below. We invite your opinion!


Has big data benefited you?

jerry9789
2 comments
Burning Questions

Sure, big data's a buzzword, an object of interest, and a serious discipline -- all three are true.  But does big data produce practical, understandable, common-sense wins for you and your company?  If so, tell us about those wins.  If not, tell us why you think big data doesn't produce tangible benefits at the simple, practical level.  We'd like to hear your thoughts!


Are Physical Focus Groups Still Relevant?

jerry9789
7 comments
Burning Questions

With the advent of so many alternative market research methods -- qualitative and quantitative -- there’s certainly room for debate on this topic.  Tell us what you think!  If you think yes, physical focus groups are still relevant, tell us why, and what are their best uses.  If you think no, physical focus groups are no longer relevant, tell us what’s better than a physical focus group and why that’s better.  Thanks for commenting!


Welcome to Cascade Strategies

A highly innovative, award-winning market research and consulting firm with over 24 years’ experience in the field. Cascade provides consistent excellence in not only the traditional methodologies such as mobile surveys and focus groups, but also in cutting-edge disciplines like Predictive Analytics, Deep Learning, Neuroscience, Biometrics, Eye Tracking, Virtual Reality, and Gamification.