
New Product Research

When Not to Use Conjoint

At the beginning of my research career I grew accustomed to clients asking us for proposals using a methodology they had pre-selected. In many cases, the client would send us the specs of the entire job (this many completes, that length of survey) and simply ask us for pricing. While this is certainly an efficient way for a client to compare bids across vendors, it didn't allow for any discussion of whether the proposed method was appropriate.
Today most research clients are looking for their research suppliers to be more actively involved in formulating the research plan. That said, we are often asked to bid on a “conjoint study.”  Our clients who’ve commissioned conjoint work in the past are usually knowledgeable about when a conjoint is appropriate, but sometimes there is a better method out there. And sometimes the product simply isn’t at the right place in the development “chain” to warrant conjoint.
Conjoint, for the uninitiated, is a useful research tool in product development. It is a choice-based method in which participants choose among different products based on each product's make-up. Each product comprises various features and levels within those features. What keeps respondents from choosing only products made up of the "best" features and levels is some type of constraint – usually price.
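
To make that concrete, here is a minimal sketch (in Python) of how a single choice task might be assembled from features, levels, and a price constraint. The feature names, levels, and prices are invented for illustration and are not drawn from any real study.

```python
import random

# Hypothetical features, levels, and prices -- invented for illustration only.
features = {
    "Screen size": ["5.5 in", "6.1 in", "6.7 in"],
    "Battery life": ["12 hours", "18 hours", "24 hours"],
    "Storage": ["64 GB", "128 GB", "256 GB"],
    "Price": ["$399", "$499", "$599"],  # the constraint that forces trade-offs
}

def random_profile(rng):
    """Build one product profile by drawing a level for each feature."""
    return {feature: rng.choice(levels) for feature, levels in features.items()}

rng = random.Random(42)

# One choice task: three competing profiles plus a "none of these" option,
# roughly what a single choice-based conjoint screen looks like.
for i in range(1, 4):
    profile = random_profile(rng)
    print(f"Product {i}: " + ", ".join(f"{k}: {v}" for k, v in profile.items()))
print("Option 4: None of these")
```
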
We look to conjoint to help determine an optimal or ideal product scenario, to help price a product given its features, or to suggest whether a client could charge a premium or require a discount.  It has a wide range of uses, but it isn’t always a good fit:  

  1. When the features haven't been defined yet. One problem product developers face is having to "operationalize" something the market hasn't seen yet. You need to be able to describe a feature, its benefits, and its associated levels in layman's terms. We can't recommend conjoint if the features are still amorphous.
  2. When there are a multitude of features with many levels or complex relationships between the features. The respondent needs to be able to absorb and understand the make-up of the products in order to choose between them. If the product is so complex that it requires varying levels of a great many features, it's probably too taxing for respondents (and may tax the design and resulting analysis as well); see the back-of-the-envelope sketch after this list. Conjoint could still be the answer, but the task may need to be broken into pieces.
  3. When there are a limited number of features with few levels. In this case, conjoint may be overkill. A simple monadic concept test or price laddering exercise may suffice.
  4. When pricing is important, but you have absolutely no idea what the price will be. Conjoint works best when the price levels range from slightly below to slightly above where you expect to price the product. If your range is huge, respondents will gravitate toward the lower-priced scenarios and you won't get much data on the higher end. It may also confuse respondents to see similar products offered at such widely different prices.
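
As a rough illustration of point 2, the sketch below uses purely hypothetical feature and level counts to show how quickly the full-factorial design space grows as features and levels are added, which is what ends up taxing both respondents and the analysis.

```python
from math import prod

# Hypothetical designs: a handful of features vs. many features with many levels.
simple_design = {"Brand": 3, "Price": 3, "Warranty": 2}
complex_design = {f"Feature {i}": 5 for i in range(1, 13)}   # 12 features, 5 levels each

for name, design in [("Simple", simple_design), ("Complex", complex_design)]:
    profiles = prod(design.values())   # size of the full-factorial design space
    print(f"{name} design: {len(design)} features -> {profiles:,} possible profiles")

# Simple design: 3 features -> 18 possible profiles
# Complex design: 12 features -> 244,140,625 possible profiles
```
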

Bias in Market Research

In new product market research we often discuss the topic of bias, though typically these discussions revolve around issues like sample selection (representativeness, non-response, and so on). But what about methodological or analysis bias? Is it possible that we affect results by choosing the wrong market research methods to collect the data or to analyze the results?


A recent article in the Economist presented an interesting study in which the same data set and the same objective were given to 29 different researchers. The objective was to determine whether dark-skinned soccer players were more likely to get a red card than light-skinned players. Each researcher was free to use whatever methods they thought best to answer the question.


Both the statistical methods (Bayesian clustering, logistic regression, linear modeling and so on) and the analysis decisions (some researchers, for example, reasoned that certain positions might be more likely to draw red cards and adjusted the data accordingly) differed from one researcher to the next. No surprise, then, that the results varied as well. One researcher found that dark-skinned players were only 89% as likely to get a red card as light-skinned players, while another found dark-skinned players were three times MORE likely to get a red card. So who is right?
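
To see how reasonable analysis choices alone can move the answer, here is a toy simulation (not any of the 29 actual analyses) in which playing position confounds the skin-tone comparison: a crude analysis suggests dark-skinned players draw more red cards, while a position-adjusted analysis of the same data does not. All rates and relationships in the simulation are invented.

```python
import random

rng = random.Random(0)

def simulate_player():
    """One simulated player: (position, dark_skinned, got_red_card)."""
    position = rng.choice(["defender", "forward"])
    dark_skinned = rng.random() < (0.6 if position == "defender" else 0.3)
    red_card_rate = 0.12 if position == "defender" else 0.04  # position, not skin tone, drives cards
    return position, dark_skinned, rng.random() < red_card_rate

players = [simulate_player() for _ in range(200_000)]

def odds_ratio(rows):
    """Odds of a red card for dark- vs. light-skinned players within `rows`."""
    dark = [red for _, d, red in rows if d]
    light = [red for _, d, red in rows if not d]
    def odds(group):
        return sum(group) / (len(group) - sum(group))
    return odds(dark) / odds(light)

# Analyst A: crude comparison across all players (ignores position).
print(f"Crude odds ratio: {odds_ratio(players):.2f}")          # noticeably above 1

# Analyst B: compares within each position, then averages (adjusts for position).
adjusted = [odds_ratio([p for p in players if p[0] == pos]) for pos in ("defender", "forward")]
print(f"Position-adjusted odds ratio: {sum(adjusted) / len(adjusted):.2f}")   # close to 1
```
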


There is no easy way to answer that question. I'm sure some of the analyses can easily be dismissed as too superficial, but in other cases the "correct" method is not obvious. The article suggests that when important public policy decisions are being considered, the government should contract with multiple researchers and then compare and contrast their results to gain a fuller understanding of what policy should be adopted. I'm not convinced this is such a great idea for public policy (it seems like it would only lead to more polarization as groups pick the results they most agreed with going in), but the more important question is: what can we as researchers learn from this?


In custom new product market research the potential for different results is even greater. We are not limited to existing data. Sure, we might use such data (customer purchase behavior, for example), but we can and will supplement it with data we collect ourselves. These data can be gathered using a variety of techniques and question types, and once they are collected we have the same potential to arrive at different results as the study above.

...

Last year Time Magazine featured a cover story about fat… specifically, that fat has been unfairly vilified and that carbs and sugars are, in fact, the real danger. They were not the first with the story, nor will they be the last. The question is, how will this impact the food products on the market?

The idea that carbs and sugar were the worst things you could eat would not have surprised a dieter in, say, 1970. It was in the 1980s that conventional wisdom shifted toward the notion that fat caused weight gain, and with it heart disease, and thus should be avoided. Over time the public came to accept this wisdom (after all, the idea that fat causes fat isn't hard to accept) and the market responded with a raft of low-fat products. Unfortunately, those products were higher in sugar and carbs, and the net result is that Americans have grown heavier.

If the public buys into this new thinking, we should expect the market to respond. To see how well the message has gotten out, we conducted a national survey with two goals in mind:

  • Determine awareness of the idea that sugar and carbs are worse than fat.
  • Determine whether that awareness would change behavior.

About a third of respondents said they were aware of the new dietary thinking. While still a minority, a third is nothing to sneeze at, especially when you consider that the vast majority of advertising still focuses on the low-fat message and food nutrition labels still highlight fat calories at the top. It took time for the "low fat" message to take hold, and clearly it will take time for this one as well.

Already there is evidence of change. Those aware of the message prior to the survey were far more likely to recommend changes to people's diets (38%) than those who were not aware prior to the survey (11%). Clearly it takes more than being informed in a survey to change 30 years of conventional wisdom, but once the message takes hold, expect changes. In fact, two thirds of those aware of the message before taking the survey have already made changes to their behavior:

...

Message Testing Advice

"Become the known." My parents have given me plenty of great advice over the years, but this is my dad's favorite. If a new restaurant opens in town, he's on a first-name basis with the owner within a week; at a large social gathering, he'll make a new friend in no time. While in these situations I usually prefer to remain just a face in the crowd, he encourages me to step forward and make myself known.

Recently, someone sent me a list of 50 pieces of advice being shared on social media (although "become the known" didn't make the cut!). This got me thinking – what are the best pieces of advice out there?

I set out to answer this question using TRC's online panel and our message testing Bracket™ technique. Through this tournament-style approach, we asked 500 respondents ages 25+ to choose the best (and worst) pieces of advice from the list of 50 items. Our results were calculated at the respondent level, then aggregated and normalized on a 100-point scale.
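
For readers curious about the mechanics, the sketch below shows a generic tournament-style scoring scheme: each simulated respondent makes head-to-head choices, wins are tallied at the respondent level, aggregated, and then scaled so the top item scores 100. It is only an illustration of the general idea, not TRC's proprietary Bracket™ algorithm, and the item texts and appeal scores are abbreviated, invented examples.

```python
import random
from collections import defaultdict

# Hypothetical population-level appeal scores; each respondent's choices are
# these scores plus personal noise. Item texts are abbreviated for the example.
appeal = {"Count your blessings": 1.0, "Don't burn bridges": 0.8,
          "Loosen up. Relax.": 0.5, "Never give up on anybody": 0.3}

def simulate_respondent(rng):
    """Tally head-to-head wins for one respondent across a handful of matchups."""
    personal = {item: score + rng.gauss(0, 0.5) for item, score in appeal.items()}
    wins = defaultdict(int)
    for _ in range(6):
        a, b = rng.sample(list(appeal), 2)
        wins[max(a, b, key=personal.get)] += 1
    return wins

rng = random.Random(7)
totals = defaultdict(int)
for _ in range(500):                      # 500 simulated respondents
    for item, n in simulate_respondent(rng).items():
        totals[item] += n

# Scale so the top item scores 100 (one reasonable 100-point normalization).
top = max(totals.values())
for item, n in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{100 * n / top:5.1f}  {item}")
```
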

So, what advice did our participants like best overall? The top 10 pieces of advice, in order of relative performance, were:

1. Show respect for everyone who works for a living, regardless of how trivial their job.
2. Remember, no one makes it alone. Have a grateful heart and be quick to acknowledge those who helped you.
3. Never waste an opportunity to tell someone you love them.
4. Never deprive someone of hope; it might be all they have.
5. Take charge of your attitude. Don't let someone else choose it for you.
6. Don't burn bridges. You'll be surprised how many times you have to cross the same river.
7. Count your blessings.
8. Choose your life's mate carefully. From this one decision will come 90 percent of all your happiness or misery.
9. Never give up on anybody. Miracles happen every day.
10. Loosen up. Relax. Except for rare life-and-death matters, nothing is as important as it first seems.

...

Big League Research

While there is so much bad news in the world of late, here in Philly we've been captivated by the success of the Taney Dragons in the Little League World Series. While the team was sadly eliminated, they continue to dominate the local news. It got me thinking about what it is that makes a story like theirs so compelling and, of course, how we could employ research to sort it out.

There are any number of reasons why the story is so engrossing (especially here in Philly). Is it the star player Mo'ne Davis, the most successful girl ever to compete in the Little League World Series? Is it the fact that the Phillies are doing so poorly this year? Or do we just like seeing a team drawn from various ethnicities and socio-economic levels working together and achieving success? Of course, it might also be that we are tired of bad news and enjoy having something positive to focus on (even in defeat the team fought hard and showed tremendous sportsmanship).

The easiest thing to do is simply to ask people why they find the story compelling. This might get at the truth, but it is also possible that people will not be totally honest (for example, the disgruntled Phillies fan might not want to admit that the Phillies' poor season is part of the appeal) or that they don't really know what has drawn them in. Direct questioning might also identify the most important factor while overlooking other critical factors.

We could employ a technique like Max-Diff and ask people to choose which features of the story they find most compelling. This would provide a fuller picture, but it is still open to the kinds of biases noted above.

Perhaps the best method would be to use a discrete choice approach. We take all the features of the story and either include or exclude them in a series of "story descriptions," then ask people which story they would be most likely to read. We can then use analytics on the back end to sort out what really drove the decisions.
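
As a sketch of how that might work, the code below randomly includes or excludes (hypothetical) story features in short descriptions, simulates respondents picking one description per screen, and then compares how often each feature appears in chosen versus rejected stories. The feature list and the hidden preference weights are invented for illustration only.

```python
import random

# Hypothetical story features and a hidden "truth" used only to simulate choices.
features = ["star girl pitcher", "Phillies' poor season", "diverse team", "feel-good story"]
true_appeal = {"star girl pitcher": 2.0, "Phillies' poor season": 0.2,
               "diverse team": 0.8, "feel-good story": 1.2}

rng = random.Random(1)

def random_story():
    """A story description is just the set of features we chose to mention."""
    return frozenset(f for f in features if rng.random() < 0.5)

def choose(stories):
    """Simulated respondent: picks the story whose mentioned features appeal most."""
    return max(stories, key=lambda s: sum(true_appeal[f] for f in s) + rng.gauss(0, 1))

chosen_with = {f: 0 for f in features}
shown_with = {f: 0 for f in features}
for _ in range(5_000):                      # 5,000 simulated choice screens
    screen = [random_story() for _ in range(3)]
    pick = choose(screen)
    for story in screen:
        for f in story:
            shown_with[f] += 1
            if story is pick:
                chosen_with[f] += 1

# Features chosen far more often than chance are the ones driving the decision.
for f in features:
    rate = chosen_with[f] / shown_with[f] if shown_with[f] else 0
    print(f"{f:25s} chosen {rate:.0%} of the times it was shown (chance ≈ 33%)")
```
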

...
