Playing To Win

Should We Listen to Customers?

You Need to Ask a Better Question

Roger Martin

--

Copyright: Roger L. Martin

Much has been written and debated about the advisability of listening to customers when formulating strategy. I think framing the question as whether to listen to customers is unhelpful, which is why I am writing my 47th Playing to Win/Practitioner Insights (PTW/PI) piece on Should We Listen to Customers: You Need to Ask a Better Question. (Links for the rest of the PTW/PI series can be found here.)

Source of the Question

The question arises because of the provocative statements of two of America’s greatest all-time business leaders and game changers, Henry Ford and Steve Jobs.

Ford is reputed to have said: “If I had asked people what they wanted, they would have said faster horses.” Interestingly, there is no evidence that he ever mouthed or wrote those words, but it has been repeated so many times that people generally believe that he must have. (Parenthetically, it is like Peter Drucker’s “culture eats strategy for lunch.” Despite extensive forensic work by Drucker archivists, there is not a shred of evidence he ever said or wrote that phrase, but because people think the great business philosopher did, they assume culture must indeed eat strategy for lunch.)

And Jobs really said both “It sounds logical to ask customers what they want and then give it to them. But they rarely wind up getting what they really want that way” and “It’s really hard to design products by focus groups. A lot of times, people don’t know what they want until you show it to them.”

These pointed views from legendary company-builders on the dangers of listening to customers have caused people to wonder and speculate about the merits of doing so. And the concern has been exacerbated, I believe, by the revelation that some of the most beloved customer research techniques have been found utterly wanting. For example, the only thing that focus groups will reliably determine is what the members of the focus group think the focus group leader wants them to say. And many products or services that pass a statistically significant customer research test fail when they are brought to the very market in which the research was performed. Plus, the modern critique is: why perform customer research on a sample to predict behavior when you can just run an A/B test and go with whichever option customers respond to better, even if you have no idea why they did?

We Need a Better Question

I believe we need a better question than: should we listen to customers? And we need a better answer than: just do A/B testing.

I start from the premise that every successful product/service derives from insights about customers. Ford’s insights about the working-class Americans he wanted to serve helped him determine that he had to offer them a car in any color they wanted, as long as it was black, in order to hit the price point necessary to make the car widely affordable. And Jobs famously had powerful insights about the needs, wants, and motivations of computer and personal electronics users, not to mention animated movie watchers. And while the inventions behind Teflon, penicillin, Post-it Notes, and Viagra were, by all accounts, customer-insight-free accidents, the decision to invest in what turned out to be successful commercialization was based on powerful customer insights. Otherwise, these inventions would never have made it out of the lab.

To me, the more useful question is: What particular customer insights would equip you to make the decisions you seek to make? The answer to that question will inform the way in which you should ‘listen to customers.’ That is because you need to employ a technique for acquiring those insights that is consistent with the nature of the insight you are seeking.

What Customers Are Doing

If you seek to hone and refine what you are currently offering, then A/B test and adjust accordingly. You don’t have to understand why customers react the way they do, just that they do more of what you want. That is a form of listening, although ‘watching’ might be a more accurate description.
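
As a minimal sketch of what that ‘watching’ boils down to (all counts and names below are invented for illustration, not drawn from any real test), an A/B comparison is simply a tally of which variant customers responded to better:

```python
# A minimal, illustrative A/B comparison using made-up conversion counts.
# It tells you only which variant customers responded to better, not why.
from math import sqrt, erf

def compare_variants(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided z-test on the difference between two conversion rates."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return p_a, p_b, p_value

# Hypothetical split: 180 of 2,000 visitors bought under A, 225 of 2,000 under B.
p_a, p_b, p_value = compare_variants(180, 2000, 225, 2000)
winner = "B" if p_b > p_a else "A"
print(f"A: {p_a:.1%}, B: {p_b:.1%}, p-value: {p_value:.3f} -> ship variant {winner}")
```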

What Customers Have Done

If you are trying to understand what customers have already done, you can use quantitative sampling techniques for listening. You can ask: where have you shopped for x? Are you aware of brand y? And if so, have you purchased it in the past? However, this methodology assumes truthfulness and self-awareness, both of which are at best tenuous assumptions.

An additional limitation of this technique is that the questions must be sufficiently simplistic for the answers to be added up and assessed quantitatively. You can’t ask customers how they felt when they bought x unless you specify the (say) five categories of feeling into which they need to wedge their answer, which you can guarantee is not how they actually felt but only an approximation. The attraction is that you can add the answers up and make a statistical inference about them. The downside is that the statistically significant answers may well be stupid. The bottom line is that you can’t ask a simplistic question, which a quantitative instrument requires, and expect a profound answer.
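
To make that trade-off concrete, here is a toy illustration, with invented numbers, of what a quantitative instrument delivers: a precise estimate of the answer to a deliberately simple question.

```python
# Toy tally of a simple yes/no survey question, e.g. "Have you purchased brand y
# in the past?" The counts are hypothetical; the point is the precision/depth trade-off.
from math import sqrt

def proportion_with_margin(yes_count, sample_size, z=1.96):
    """Sample proportion plus an approximate 95% margin of error."""
    p = yes_count / sample_size
    margin = z * sqrt(p * (1 - p) / sample_size)  # normal-approximation margin
    return p, margin

p, margin = proportion_with_margin(yes_count=312, sample_size=1000)
print(f"{p:.1%} said yes, +/- {margin:.1%}")  # statistically tidy, but only as deep as the question
```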

What Customers Might Do

If you attempt to ask a sophisticated question, such as whether a customer would buy a new offering they have never seen before, there is an even bigger problem. Customers can’t give you accurate feedback because they don’t know the true answer. But when you ask, they will give you an answer and typically will express confidence in it. From his comments, it appears Jobs was particularly worried about this problem. Customers don’t know the answer that you would need from them in order to make your decision. So don’t ask them a question they can’t answer. And for sure don’t ask it in a quantitative survey.

If you need insights about customers’ likely reaction to an offering that they haven’t yet experienced, the best way is for the person who desires the customer insight to interact directly with the potential customers from whom the insights need to be extracted. That means interacting personally, without an intermediary. The quality and utility of the interpretation depend on the skills and capabilities of the interpreter. The task of gaining useful customer insights in this context is tough enough already without playing the proverbial game of ‘telephone’ through intermediaries, such as market research firms. That is why I prefer 10 in-person interactions between the customer and the business decisionmaker (e.g., the CEO) to a rigorously designed 10,000-person customer survey. The latter puts too many unhelpful filters between the decisionmaker and the customer.

The task in this situation is to build a model of customers by interacting with them, both watching them and asking them questions about what you see them do. It will be a profoundly imperfect model. But it is a model of how they think, what they care about, what annoys them, what makes them happy. All the customer savants that I have met habitually build a model of their customer which they use to make their bets. When I ask them, they can readily explain the bets they make based on their model of the customer. They learn continuously by watching how their bets play out and updating their model accordingly — what a nerd would call Bayesian updating.
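
For readers who like the nerd framing, here is a deliberately simplified sketch of that updating loop. The class name, the prior, and the bet outcomes are all hypothetical illustrations, not anything reported by the savants I describe:

```python
# Simplified Beta-Binomial sketch of revising a belief about customers as small
# bets play out. Everything here (names, prior, outcomes) is illustrative only.
class CustomerModel:
    def __init__(self, prior_hits=1.0, prior_misses=1.0):
        # Beta(1, 1) prior: genuinely unsure whether bets of this kind will land.
        self.hits = prior_hits
        self.misses = prior_misses

    def update(self, bet_landed: bool) -> None:
        """Revise the belief after watching one small bet play out."""
        if bet_landed:
            self.hits += 1
        else:
            self.misses += 1

    def expected_hit_rate(self) -> float:
        """Current best guess that the next similar bet will land with customers."""
        return self.hits / (self.hits + self.misses)

model = CustomerModel()
for outcome in (True, True, False, True):  # four hypothetical small bets
    model.update(outcome)
print(f"Expected hit rate after four bets: {model.expected_hit_rate():.0%}")
```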

The very best figure out how to make little bets in order to learn along the way. Four Seasons Hotels founder Isadore Sharp didn’t bet the entire chain on what became ‘the Four Seasons Model.’ He built a model of luxury hotel guests by speaking directly with and watching them in action, which he used to create the chain’s legendary London Park Lane hotel. Only when he was able to observe its huge success did he use it as the model for every other new Four Seasons hotel.

This also reinforces the brilliance of iterative prototyping, an aspect of the design thinking movement popularized by David Kelley and Bill Moggridge of IDEO. Iterative prototyping gives progressively clearer insights into what customers will actually do by exposing them to multiple rounds of increasingly high-resolution prototypes. This form of insight allows model improvement by way of repeated Bayesian updating.

What Customers Might Want

Yet another kind of desired insight from customers is an idea for a new product or service. Both Ford and Jobs expressed considerable pessimism about the ability to get that sort of insight from customers. Here I partially agree and partially disagree.

I agree in that developing an idea for a new product/service is your job, not theirs. Customers typically don’t have the motivation to invest the time necessary to dream up a new offering to meet their needs. Plus, they often lack the required skills, such as expertise in the technology involved or experience in manufacturing/service design. Consequently, I would never survey customers to find out what new product/service they would imagine or favor. On average, customers won’t provide interesting or valuable answers to that question.

That having been said, while you shouldn’t want the views of a statistically significant sample of customers because the vast majority will not have invested enough time and thought into the question, the outliers can be interesting. Sourcing a new idea is a bit like finding a buyer for your house. It doesn’t matter what the median potential buyer of your house thinks it is worth. What really matters is the one buyer who absolutely loves your house. Similarly, what matters is the one fabulous new-product idea. That is why I like the idea of listening to customers by way of crowdsourcing as one source of new ideas. It doesn’t matter that the vast majority of the ideas aren’t of any value. Only a tiny few ideas need to be any good for crowdsourcing to be a winning tool. That is why I like customer crowdsourcing platforms like Lego Ideas. I wouldn’t ever count on this technique for all my new product/service ideas, but I would certainly utilize it as one tool.

Practitioner Insights

When developing strategy, you should listen to customers but the techniques you use to pay attention must vary depending on what you are seeking to accomplish. Think of your task as carefully matching the insight sought to the technique used.

If you are honing and refining and don’t care why, then listen by way of A/B testing. If you are trying to understand prior behaviors, listen by way of quantitative sampling. If you are trying to predict future actions, observe customers yourself — not through intermediaries — to build a model and then figure out how to make bets to test and refine your customer model. If you are trying to come up with a new offering, supplement observing and modeling customers by crowdsourcing ideas from them.

--

Written by Roger Martin

Professor Roger Martin is a writer, strategy advisor, and, in 2017, was named the #1 management thinker in the world. He is also a former Dean of the Rotman School.
