So I certainly do not follow politics closely, even during a presidential election year, which I suppose could also be read as: I don't know very much about politics. But that small disclaimer aside, watching the news coverage of the recently completed Iowa caucuses and the upcoming New Hampshire primary, something struck me as peculiar about the process. These events happen in succession, not simultaneously. First comes the Iowa caucus, then the New Hampshire primary, followed by the Nevada and South Carolina contests, and so on through the other states. And after each event is held, the results are (almost) immediately known. So the folks in New Hampshire know the outcome from Iowa, and the folks in Nevada and South Carolina know the outcomes from both Iowa and New Hampshire.

Doesn't this lead to inherent and obvious bias? That's the market researcher in me talking. In implementing questionnaires, we wouldn't typically reveal the results from previous respondents to those taking the survey later; doing so would surely influence their answers in ways we don't want. We need as clean and pure a read on consumer opinions and attitudes as surveys allow, and any deviation from that would compromise our data.

But then again, is this always the case? Could there be situations in which some purposely introduced informational bias is beneficial? I say yes! Granted, one needs to be cautious and thoughtful when exposing respondents to prior information, but sometimes, in order to get the specific type of response we want, a little bias is helpful. If we are asking about a particular product or product feature, we may provide an example or guide so respondents can fully understand it: for instance, that 10 GB of storage is good for X number of movies and X number of songs.

But circling back to the notion of letting respondents see the answers from previous respondents, even within the same survey, this can be quite helpful in priming people to start thinking creatively. If we wish to gather creative ideas from consumers, it's easy enough to ask them outright to jot something down. But it's difficult to come up with new and creative ideas on the fly without much help, and the responses we get from such tasks bear that out: many are nonsense, or short, dull answers. So instead, we can show a respondent several ideas that have come up previously, either internally or from earlier respondents, to jump-start the thinking process. They can then edit or add to an existing idea, or be stimulated enough to come up with their own unique idea. And the truth is, it works! We at TRC use this exact new product research technique with great success in our Idea Mill™ solution, and end up with many creative and unique ideas that our client companies use to move forward.

So while the presidential nominating process strikes me as odd, since any votes cast in states following the Iowa caucus may be inherently biased, there are situations where this sort of exposure to prior information can work in our favor.


December and January are full of articles that tell us what to expect in the New Year. There is certainly nothing wrong with thinking about the future (far from it), but it is important that we do so with a few things in mind. Predictions are easy to make, but hard to get right, at least consistently.


First, to some extent we all suffer from the "past results predict the future" model. We do so because quite often they do, but there is no way to know when they no longer will. As such, be wary of predictions that say something like "last year neuro research was used by 5% of Fortune 500 companies…web panels hit the 5% mark and then exploded to more than 50% within three years." It might be right to assume the two will have similar outcomes, or it might be that the two situations (both in terms of the technique and in terms of the market at the time) are quite different.


Second, we all bring a bias to our thinking. We have made business decisions based on where we think the market is going, so it is only natural that our predictions might line up with that. At TRC we've invested in agile products to aid the early-stage product development process. We did so because we believe the market is looking for rigorous, fast, and inexpensive ways to solve problems like ideation, prioritization, and concept evaluation. Quite naturally, if I'm asked to predict the future, I'll tend to see these as having great potential.


Third, some people will be completely self-serving in their predictions. For example, we do a tremendous amount of discrete choice conjoint work. I certainly would like to think this area will grow in the next year, so I might be tempted to make that prediction in the hope that readers will suddenly start thinking about doing a conjoint study.


Fourth, an expert isn't always right. Hearing predictions is useful, but ultimately you have to consider the reasoning behind them, seek out your own sources of information, and weigh what you already know. Just because someone has published a prediction doesn't mean they know the future any better than you do.

...

I recently finished Brian Grazer's book A Curious Mind and I enjoyed it immensely. I was attracted to the book both because I have enjoyed many of the movies he made with Ron Howard (Apollo 13 being among my favorites) and because of the subject…curiosity.

I have long believed that curiosity is a critical trait for a good researcher. We have to be curious about our clients' needs, about new research methods and, most important, about the data itself. While a cursory review of cross tabs will produce some useful information, it is digging deeper that allows us to make the connections that tell a coherent story. Without curiosity, analytical techniques like conjoint or max diff don't help.

The book shows how Mr. Grazer's insatiable curiosity has brought him into what he calls "curiosity conversations" with a wide array of individuals, from Fidel Castro to Jonas Salk. He had these conversations not because he thought there might be a movie in them, but because he wanted to know more about these individuals. He often came out of the conversations with a new perspective and, yes, sometimes even ideas for a movie.

One example concerns Apollo 13. He had met Jim Lovell (the commander of that fateful mission) and found his story interesting, but he wasn't sure how to make it into a movie. The technical details were just too complicated.

Later he was introduced by Sting to Veronica de Negri. If you don't know who she is (I didn't), she was a political prisoner in Chile for eight months, during which she was brutally tortured. To survive, she had to create an alternate reality for herself. In essence, by focusing on the one thing she still had control of (her mind), she was able to endure the things she could not control. Mr. Grazer used that logic to help craft Apollo 13. Instead of being a movie about technical challenges, it became a movie about the human spirit and its ability to overcome even the most difficult circumstances.

...

In new product market research we often discuss the topic of bias, though typically these discussions revolve around issues like sample selection (representativeness, non-response, etc.). But what about methodological or analysis bias? Is it possible that we affect results by choosing the wrong market research methods to collect the data or to analyze them?


A recent article in The Economist presented an interesting study in which the same data set and the same objective were given to 29 different researchers. The objective was to determine whether dark-skinned soccer players were more likely to get a red card than light-skinned players. Each researcher was free to use whatever methods they thought best to answer the question.


Both the statistical methods (Bayesian clustering, logistic regression, linear modeling...) and the analysis decisions (some, for example, reasoned that players in certain positions might be more likely to get red cards, and that the data needed to be adjusted for that) differed from one researcher to the next. No surprise, then, that the results varied as well. One researcher found that dark-skinned players were only 89% as likely to get a red card as light-skinned players, while another found dark-skinned players were three times MORE likely to get one. So who is right?
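To make the point concrete, here is a minimal sketch of how a single, defensible-sounding analysis choice can move the answer. It uses Python with simulated data that is purely hypothetical; it is not the study's actual data set or any team's actual model, just an illustration of how adjusting (or not adjusting) for player position changes the estimated effect.

```python
# Toy illustration (hypothetical, simulated data): how one analysis choice --
# adjusting or not adjusting for player position -- shifts the estimated effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 20000

# Assume defenders draw red cards more often, and (in this toy world) dark-skinned
# players are somewhat over-represented among defenders. Skin tone itself has no
# direct effect on red cards in the simulation.
position = rng.choice(["defender", "forward"], size=n)
dark_skin = np.where(position == "defender",
                     rng.binomial(1, 0.6, n),
                     rng.binomial(1, 0.4, n))
red_card = rng.binomial(1, np.where(position == "defender", 0.08, 0.03))

df = pd.DataFrame({"red_card": red_card, "dark_skin": dark_skin, "position": position})

# Researcher A ignores position; Researcher B adjusts for it.
unadjusted = smf.logit("red_card ~ dark_skin", data=df).fit(disp=0)
adjusted = smf.logit("red_card ~ dark_skin + C(position)", data=df).fit(disp=0)

print("Unadjusted odds ratio:        ", round(float(np.exp(unadjusted.params["dark_skin"])), 2))
print("Position-adjusted odds ratio: ", round(float(np.exp(adjusted.params["dark_skin"])), 2))
# The unadjusted model suggests an effect (odds ratio above 1); the adjusted model
# shows essentially none -- same data, different but defensible-looking answers.
```

Neither choice is obviously "wrong" on its face; the point is simply that a reasonable-sounding modeling decision can change the headline number.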


There is no easy way to answer that question. I'm sure some of the analyses can easily be dismissed as too superficial, but in other cases the "correct" method is not obvious. The article suggests that when important public policy decisions are being considered, the government should contract with multiple researchers and then compare and contrast their results to gain a fuller understanding of what policy should be adopted. I'm not convinced this is such a great idea for public policy (it seems like it would only lead to more polarization as groups pick the results they most agreed with going in), but the more important question is: what can we as researchers learn from this?


In custom new product market research the potential for different results is even greater. We are not limited to existing data. Sure, we might use such data (customer purchase behavior, for example), but we can and will supplement it with data that we collect ourselves. These data can be gathered using a variety of techniques and question types, and once they are collected we have the same potential to come up with different results as the study above.

...

About a decade ago, if someone had mentioned the words "mobile app," most people would have looked at them with a very puzzled expression. Nowadays, we hear about these apps everywhere: there are commercials for them on television, ads in magazines, billboard posts, and so on. It's truly amazing to see how advanced technology has become and what can be accomplished by using it.

In this technology-based era, the smartphone is becoming increasingly popular across a wide range of ages. In my opinion, the biggest perk of smartphones is that we almost always have access to the Internet. Since the Internet is one of the most efficient tools retailers and businesses use to create, retain, and win business, why wouldn't they capitalize on the popularity and functionality of smartphones to do even more of that creating, winning, and refining? One of the best ways for a company to remain competitive in this smartphone era is to create its own mobile app.

Take Wawa, for example. For those who are not on the East Coast and may be unfamiliar with it, Wawa is a wonderful place that offers gasoline, freshly prepared foods, snacks, coffee and more. Okay, yes, ultimately it's a convenience store and gas station. However, to many of us on the East Coast, it's much more. If you download the Wawa app, you can link it to your credit card or a Wawa gift card, which means you don't even have to bring your wallet into the store. The app includes a rewards system in which you receive points for your purchases, which can be redeemed for a free coffee or tea, or something of similar value. While Wawa offers many benefits to its customers through the app, such as locating a nearby store, checking gasoline prices, or having easy access to nutrition info, it also gives app users the chance to provide feedback through an open-ended suggestion form. It would benefit the company to implement a survey within the app instead of an open-ended feedback form, to gain insights about customers' transactions, experiences, and overall opinions.

Fielding surveys within mobile apps provides a quick and easy way to reach customers and gain useful feedback. So how do you get app users to actually participate in the survey? Simple. When the app is first opened or closed, add a pop-up message with a link to the survey that encourages the user to take it, and add the survey as an item on the app's navigation menu. While it's not ideal to field something as intricate as a conjoint exercise on a mobile device, companies can still create a simple survey to gain valuable insights about current products, potential products, customer satisfaction and an abundance of other consumer-related topics.

In order to create the best experience for the app user and get the most out of the data that is collected, companies should consider these five tips when developing a mobile survey:

...
