In my last blog I referenced an article about design elements that no longer serve a purpose, and I argued that techniques like Max-Diff and conjoint can help determine whether such elements are really necessary. Today I’d like to ask: what do we as researchers still use that is no longer useful?
 
For many years the answer would have been telephone interviewing. We continued to use it long after it became clear that the web was a better answer. The common defense was “it is not representative,” which was true, but telephone data collection was no longer representative either. I’m not saying that we should abandon telephone interviewing…there are certainly times when it is the better option (for example, when talking to your clients’ customers and you don’t have email addresses). I’m just saying that the notion that we need a phone sample to make a study representative is unfounded.
 
I think, though, we need to go further. We still routinely use cross tabs to ferret out interesting information. The fact that these interesting tidbits might be nothing more than noise doesn’t stop us. Further, the many “significant differences” we uncover are often not significant at all…they are statistically discernible, but not meaningful from a business decision-making standpoint. Still, the automatic significance testing makes us pause to ponder them.
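To make the statistical-versus-practical distinction concrete, here is a small sketch (with made-up numbers, not from any real study) showing how a trivial 1.5-point gap between two large cells passes a standard two-proportion z-test:

```python
# A difference can be statistically "significant" yet trivial for decisions.
# Hypothetical example: 51.5% vs 50.0% preference in two large sample cells.
from math import sqrt, erf

def two_prop_z(p1, n1, p2, n2):
    """Two-proportion z-test; returns (z, two-sided p-value)."""
    p = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))   # pooled standard error
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal tail
    return z, p_value

# A 1.5-point gap with 10,000 respondents per cell
z, p = two_prop_z(0.515, 10_000, 0.500, 10_000)
print(round(z, 2), round(p, 3))  # z ≈ 2.12, p ≈ 0.034 -- "significant"
```

The test flags the difference at the conventional 0.05 level, yet few business decisions should turn on a gap that small; the asterisk in the cross tab says nothing about whether the difference matters.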
 
Wouldn’t it be better to dig into the data and see what it tells us about our starting hypothesis? Good design means we thought about the hypothesis and the direction we needed during questionnaire development, so we know which questions to start with and can then follow the data wherever it leads. While in the past this was impractical, we now live in a world where analysis packages are easy to use. So why are we wasting time paging through decks of tables?
 
There are of course times when having a deck of tables could be a time saver, but like telephone interviewing, I would argue we should limit their use to those times and not simply produce tables because “that’s the way we have always done it”.  

My daughter was performing in The Music Man this summer, and after seeing the show a number of times, I realized it speaks to the perils of poor planning…in forming a boys’ band and in conducting complex research.

For those of you who have not seen it, the show is about a con artist who gets a town to buy instruments and uniforms for a boys’ band, in exchange for which he promises to teach all the boys to play. When they discover he is a fraud they threaten to tar and feather him, but (spoiler alert) his girlfriend gets the boys together to march into town and play. Despite the fact that they are awful, the parents can’t help but be proud, and everyone lives happily ever after.

It is to some extent another example of how good we are at rationalizing. The parents wanted the band to be good and so they convinced themselves that they were. The same thing can happen with research…everyone wants to believe the results so they do…even when perhaps they should not.

I’ve spent my career talking about how important it is to know where your data have been. Bias introduced by poor interviewers, poorly written scripts, unrepresentative samples, and so on will impact results, and yet these flawed data will still produce cross tabs and analytics. Rarely will they be so far off that the results can be dismissed out of hand.

The problem only gets worse when using advanced methods. A poorly designed conjoint will still produce results. Again, more often than not these results will be such that the great rationalization ability of humans will make them seem reasonable.

...

Some months ago, Lily Allen mistakenly received an email containing harsh test-group feedback on her new album. Select audience members believed the singer to be retired and threw in some comments that I won’t quote; if you are curious, the link to her Popjustice interview will let you see them in rawer form. Allen returned the favor with some criticism of market research itself:

“The thing is, people who take part in market research: are they really representative of the marketplace? Probably not.” –Lily Allen

The singer raises a valid concern, and it is one of the many questions I pondered five months ago when I took my current researcher-in-training position with TRC. Researchers are responsible for engaging a representative sample and delivering insights. How do we uphold those standards to ensure quality? Now that I have put in some time and have a few projects under my belt, I have assembled a starter list to address those concerns:

Communicate: All Hands on Deck

In order to complete any research project, there needs to be a clear objective. What are we measuring? Are we using one of our streamlined products, such as Message Test Express™, or will there be a conjoint involved? This may seem obvious, but it is also critical. A team of people is behind each project at TRC, including account executives, research managers, project directors, and various data experts. More importantly, the client should also be on the same page and kept in the loop. Was the artist the main client for the research done? My best guess is no; the feedback was not meant to be a tool for reworking the album.

Purpose

Was the research done on Lily Allen’s album even meant to be representative? Qualitative interviews can produce deep insights among a small, non-representative, group of people. This can be done as a starting point or a follow-up to a project, or even stand alone, depending on the project objectives.

...

As most anyone living on the East Coast can attest, the winter of 2013-2014 was, to put it nicely, crappy. Storms, outages, freezing temperatures…. We had a winter the likes of which we haven’t experienced in a while. And it wasn’t limited to the East Coast – much of the US had harsher conditions than normal.

Here in the office we did a lot of complaining. I mean a lot. Every day somebody would remark about how cold it was, how their kids were missing too much school, how potholes were killing their car’s suspension… if there was a problem we could whine about, we did.

Now that it’s spring and we’re celebrating the return of normalcy to our lives, we wonder… just what was it about this past winter that was the absolute worst part of it? Sure, taken as a whole it was pretty awful, but what was the one thing that was the most heinous?

Fortunately for us, we have a cool tool we could use to answer this question. We enlisted the aid of our consumer panel and our agile, rigorous product Message Test Express™ to find the answer. MTE™ uses our proprietary Bracket™ tool, which takes a tournament approach to prioritizing lists. Our goal: find out which item associated with winter was the most egregious.
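Bracket™ itself is proprietary, but the general tournament idea can be sketched. The toy below (my own illustration, not TRC’s actual algorithm) runs single-elimination rounds over a list, asking a `choose` function to pick the winner of each head-to-head pairing:

```python
# Sketch only: a tournament-style prioritization of a list via
# head-to-head choices. Not the actual Bracket(TM) methodology.
import random

def run_bracket(items, choose):
    """Single-elimination rounds; `choose(a, b)` returns the preferred item."""
    round_items = list(items)
    random.shuffle(round_items)              # random seeding
    while len(round_items) > 1:
        winners = []
        for i in range(0, len(round_items) - 1, 2):
            winners.append(choose(round_items[i], round_items[i + 1]))
        if len(round_items) % 2:             # odd item out gets a bye
            winners.append(round_items[-1])
        round_items = winners
    return round_items[0]

winter_woes = ["cold", "snow shoveling", "potholes", "school closings",
               "power outages", "high heating bills"]
# Stand-in for a real respondent: always prefer the alphabetically later item.
winner = run_bracket(winter_woes, lambda a, b: max(a, b))
print(winner)  # "snow shoveling"
```

In practice a respondent, not a lambda, makes each choice, and aggregating many respondents’ brackets yields a prioritized list rather than a single winner; the appeal of the format is that each question is a simple A-versus-B comparison.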

Our 200 participants had to live in an area that experiences winter weather conditions, believe that this winter was worse or the same as previous winters, and have hated, disliked or tolerated it (no ski bums allowed).

...

Critics vs. TV Viewers

Posted in A Day in a (MR) Life

In the last episode of my blog, we compared the list of best TV shows of 2012 for two groups: 45 TV critics, as compiled by Metacritic, and 542 average TV viewers who ranked shows using our Bracket™ prioritization tool. The two groups had six shows in common on their Top 20 lists, including two from AMC: “Breaking Bad” and “The Walking Dead.”

We wondered whether access to more content (through having basic cable or premium channels) would correlate with viewers’ opinions of the top shows.

Before we get to that, it’s interesting to note that the TV critics didn’t favor premium channel or basic cable programming. In fact, fully half of their 20 “best” shows of 2012 aired on the “standard” networks. TV viewers only had one more network show in their own top 20 than the critics did.

                                                        Critics' Top 20   TV Viewers' Top 20
Standard network shows (ABC, CBS, NBC, Fox, PBS)              10                 11
Basic cable shows                                              6                  6
Premium channel shows (HBO, Cinemax, Starz, Showtime)          4                  3
Total                                                         20                 20

 

So, do TV viewers with premium channels choose more premium channel shows than those without? Well, that’s a little complicated.

...
