Pets and Pricing Research

In a recent survey we conducted among pet owners, we asked about microchip identification. We found that cat owners and dog owners are equally likely to say that having their pet microchipped is a necessary component of pet ownership. That’s the good news.

The bad news is that when it comes time to actually do it, the majority haven’t taken that precaution: 69% of the cat owners and 64% of the dog owners we surveyed say they haven’t microchipped their companion.

Why is microchipping so important? Petfinder reports that the American Humane Association estimates more than 10 million dogs and cats are lost or stolen in the US every year, and that 1 in 3 pets will become lost at some point in their lifetime. ID tags and collars can be lost or removed, which makes the microchip the best tool shelters and vets have to reunite pets with their owners.

One barrier to microchipping is cost – it runs in the $25 to $50 range for dogs and cats. That’s not a staggering amount, but pet ownership can get expensive – with all the “stuff” you need for your new friend, this is a cost some people aren’t willing to bear. Vets, shelters and rescue groups sometimes discount their pricing when the animal is receiving other services, such as vaccines. Which raises the question: if vets want their patients to be microchipped, how should they price their services to make this important service more likely to be included?

It seems that pet microchipping would benefit from some pricing research. Beyond simply lowering the price, bundled offers may hold more appeal than à la carte pricing. Then again, a single package price may be so high that it dissuades action altogether. Perhaps financing or staggered payments would help. And of course, discounts on other services, or on the microchipping itself, may influence an owner’s decision. All of these possibilities could be addressed in a comprehensive pricing survey, using one of our pricing research tools, such as conjoint, to arrive at a solid answer.
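To make the conjoint idea concrete, here is a minimal sketch of how part-worths could be read out of rated offer profiles. The attributes (vaccine bundle, payment plan, discounted price) and the ratings below are invented for illustration; a real conjoint study would use a designed experiment and choice data from many respondents, not this crude difference of means.

```python
# Each profile: (bundled with vaccines, payment plan, discounted price)
profiles = [
    (0, 0, 0),
    (1, 0, 0),
    (0, 1, 0),
    (0, 0, 1),
    (1, 1, 1),
]
# Hypothetical mean appeal ratings (1-10) for each profile
ratings = [3, 5, 4, 6, 9]

def part_worth(profiles, ratings, i):
    """Crude part-worth: mean rating with feature i minus mean without it."""
    with_f = [r for p, r in zip(profiles, ratings) if p[i] == 1]
    without = [r for p, r in zip(profiles, ratings) if p[i] == 0]
    return sum(with_f) / len(with_f) - sum(without) / len(without)

for i, name in enumerate(["vaccine bundle", "payment plan", "discount"]):
    print(f"{name}: {part_worth(profiles, ratings, i):+.2f}")
```

With these invented numbers, the discount carries the largest part-worth, which is the kind of signal a real study would use to decide which pricing lever to pull.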

...

Brand Perceptions

Is the Mini Cooper seen as an environmentally friendly car? What about Tesla as a luxury car? The traditional approach to understanding these questions is to conduct a survey among Mini and Tesla buyers (and perhaps non-buyers too, if budget allows). Such studies have been conducted for decades and often involve ratings of multiple attributes and brands. While certainly feasible, they can be expensive, time consuming and can get outdated over time. Is there a better way to get at attribute perceptions of brands that can be fast, economical and automated?

Aron Culotta and Jennifer Cutler describe such an approach in a recent issue of the INFORMS journal Marketing Science, and it involves the use of social media data – Twitter, in this case. Their method is novel because it does not use conventional (if one can use that term here) approaches to mining textual data, such as sentiment analysis or associative analysis. Sentiment analysis (social media monitoring) provides reports on positive and negative sentiments expressed online about a brand. In associative analysis, clustering and semantic networks are used to discover how product features or brands are perceptually clustered by consumers, often using data from online forums.

Breaking away from these approaches, the authors use an innovative method to understand brand perceptions from online data. The key insight (drawn from well-established social science findings) is that proximity in a social network can be indicative of similarity. That is, by measuring how closely a brand is connected to exemplar organizations for a given attribute, it is possible to devise an affinity score that shows how strongly the brand is associated with that attribute. For example, when a Twitter user follows both Smart Car and Greenpeace, it likely indicates that Smart Car is seen as eco-friendly by that person. This does not have to be true for every such user, but at “big data” levels there is likely to be a strong enough association to extract signal from the noise.

What is unique about this approach to using social media data is that it does not really depend on what people say online (as other approaches do). It relies only on who is following a brand while also following another (exemplar) organization. The strength of the social connection becomes a signal of the brand’s strength on a specific attribute. “Using social connections rather than text allows marketers to capture information from the silent majority of brand fans, who consume rather than create content,” says Jennifer Cutler, who teaches marketing at the Kellogg School of Management at Northwestern University.
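As a rough illustration of the idea (not the authors’ exact scoring method), an affinity score could be sketched as the overlap between a brand’s follower set and an exemplar’s follower set. The follower sets below are hypothetical; real data would involve millions of Twitter users.

```python
def affinity(brand_followers, exemplar_followers):
    """Jaccard-style overlap between a brand's and an exemplar's follower sets."""
    union = brand_followers | exemplar_followers
    if not union:
        return 0.0
    return len(brand_followers & exemplar_followers) / len(union)

# Toy follower sets (hypothetical user IDs)
smart_car = {"u1", "u2", "u3", "u4"}
greenpeace = {"u2", "u3", "u5"}   # exemplar for eco-friendliness
luxury_hotel = {"u8", "u9"}       # exemplar for luxury

print(affinity(smart_car, greenpeace))    # 2 shared of 5 total -> 0.4
print(affinity(smart_car, luxury_hotel))  # no overlap -> 0.0
```

The higher the score against an attribute’s exemplars, the stronger the inferred perception of the brand on that attribute.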

Sounds great in theory, right? But how can we be sure that it produces meaningful results? By validating it against the trusted survey data that has been used for decades. When tested across 200+ brands in four sectors (Apparel, Cars, Food & Beverage, Personal Care) and three perceptual attributes (Eco-friendliness, Luxury, Nutrition), the method achieved an average correlation of 0.72 with survey measures, showing that social connections can provide very good information on how brands are perceived. Unlike survey data, this approach can run continuously and at low cost, with results delivered in real time. And there is another advantage. “The use of social networks rather than text opens the door to measuring dimensions of brand image that are rarely discussed by consumers in online spaces,” says Professor Cutler.
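Validation of this kind boils down to correlating the two measurements across brands. A minimal sketch, with invented scores for five hypothetical brands:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented eco-friendliness scores for five hypothetical brands
survey_scores = [2.1, 3.4, 4.0, 5.2, 6.8]       # from traditional surveys
social_scores = [0.10, 0.22, 0.25, 0.41, 0.55]  # from follower-based affinity

print(round(pearson(survey_scores, social_scores), 2))
```

A correlation near 1 would mean the cheap, automated measure tracks the expensive survey measure closely; the 0.72 average reported by the authors is strong by that yardstick.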

...

Where the Pollsters Went Wrong

The surprising result of the election has lots of people questioning the validity of polls…how could they have so consistently predicted a Clinton victory? Further, if the polls were wrong, how can we trust survey research to answer business questions? Ultimately, even sophisticated techniques like discrete choice conjoint or max-diff rely upon these data, so this is not an insignificant question.

As someone whose firm conducts thousands and thousands of surveys annually, I thought it made sense to offer my perspective. So here are five reasons I think the polls were “wrong” and how that problem could impact our work.

5 Reasons Why the Polls Went 'Wrong'


1) People Don’t Know How to Read Results
Most polls had the race in the 2-5% range and the final tally had it nearly dead even (Secretary Clinton winning the popular vote by a slight margin). At the low end, this range is within the margin of error. At the high end, it is not far outside of it. Thus, even if everything else were perfect, we would expect that the election might well have been very close.  
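The margin of error follows from the standard formula for a proportion, z * sqrt(p * (1 - p) / n). For a typical national poll of about 1,000 respondents, that works out to roughly plus or minus 3 points, which is why a 2-5% polling lead is consistent with a near-even result:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p, sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# n = 1,000 respondents at p = 0.5 (the worst case for precision)
print(round(100 * margin_of_error(0.5, 1000), 1))  # -> 3.1 points
```

Note the square root: quadrupling the sample only halves the margin of error, which is why polls rarely get much tighter than this.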

...

The 2016 Election and Sample Representativeness

I always dread the inevitable "What do you do?" question. When you tell someone you are in market research you can typically expect a blank stare or a polite nod, so you must be prepared to offer further explanation. Oh, to be a doctor, lawyer or auto mechanic – no explanation necessary!

Of course, as researchers, we grapple with this issue daily, but it is not often we get to hear it played out on major news networks. After one of the debates, I heard Wolf Blitzer on CNN arguing (yes, arguing) with one of the campaign strategists about why the online polls being quoted were not "real" scientific polls. Wolf's point was that because the Internet polls being referenced were from a self-selected sample, their results were not representative of the population in question (likely voters). Of course, Wolf was correct, and it made me smile to hear this debated on national TV.

A week or so later I heard an even more in-depth consideration of the same issue. The story was about how the race was breaking down in key swing states. The poll representative went through the results for key states one by one. When she discussed Nevada, she raised a red flag about interpreting the poll (which had one candidate ahead by 2 - % points). She explained that it is difficult to obtain a representative sample in Nevada due to a number of factors (odd work hours, a transient population, a large Spanish-speaking population). Her point was that they try to mitigate these issues, but any results must be viewed with a caveat.

Aside from my personal delight that my day-to-day market research concerns are newsworthy, what is the takeaway here? For me, it reinforces how important it is to do everything in our power to ensure that each study's sample is representative. The advent of online data collection, the proliferation of cell phones and do-it-yourself survey tools may have made the task more difficult, but no less important. When doing sophisticated conjoint, segmentation or max-diff studies, we need to keep in mind that they are only as good as the sample that feeds them.


Recycling Market Research

In my previous blog, we determined that people with access to recycling services don’t necessarily recycle. And men were far less likely to recycle regularly than women.

One problem potential recyclers face is that there is no federal standard for what is collected and how. Services vary from one contractor to the next, and items deemed recyclable in one municipality may not be recyclable the next town over. As a general rule, bottles, cans, and newspapers are curbside-recyclable. Also as a general rule, prescription drugs, electronic devices, CFL bulbs and batteries are not – and they shouldn’t go in the trash either; they require special handling. But does the average consumer know this? We asked our online panelists who have access to recycling services how they believe their trash/recycling haulers would like them to handle certain items. Here’s what we learned:

  • Knowledge of recycling the Big-3 (glass bottles – aluminum cans – newspapers) is quite high. At least 80% of our panelists with access to recycling services know each of these should be recycled as opposed to trashed. And men and women are equally knowledgeable.
  • Word has spread that electronics do not belong in the trash. But our consumers are divided as to where they should go – 35% believe their contractor wants them in the recycling bin, while 46% believe electronics require special arrangements.
  • When we get to other items, things get a bit murky:
    1. Our panelists are as likely to believe that batteries can go out in the trash or recycling (45%) as believe batteries require special arrangements (41%). The rest aren’t sure.
    2. 19% aren’t sure what to do with compact fluorescent light bulbs.
    3. 22% believe that prescription drugs can be put out in the trash. 17% aren’t sure.
  • Meanwhile, some items that are traditionally “trashed” give consumers pause – 26% of our consumers believe their hauler wants them to recycle linens and towels.

Focusing solely on those who say they recycle, women are more likely than men to know what goes where…

[Chart: Recycling Market Research, part 2]

Ladies, you may want to re-think having your gents handle the trash and recycling - or give them a quick lesson on what you've learned!

Recent Comments
  • Sheridan: How many people were in the "panelists"? I mean, 80% of the panelists know the Big-3 but 80% of how many? Thanks!
  • Michele Sims: Thanks for your question! We surveyed 507 adults in the US.