Playing To Win

How to Prevent Data Analytics from Wrecking Your Strategy

The Three Key Principles

Source: Roger L. Martin, 2022

I seem to have created a lot of angst with my critical views on data analytics, about which I have written both earlier in this series and in my new book A New Way to Think. Because it feels that the topic warrants some further exploration, I have dedicated my 32nd Year II Playing to Win/Practitioner Insights (PTW/PI) piece to How to Prevent Data Analytics from Wrecking Your Strategy: The Three Key Principles. You can find the previous 84 PTW/PI here.

Angst & Challenge

People seem to find it very upsetting when I write about the dangerous misuse of data analytics, using the technique in ways that Aristotle, its inventor, specifically warned against. I can readily imagine why. My critique undermines their confidence in what they are doing now, and rather than affirming their efforts to achieve expertise in the technique, it implicitly questions the wisdom of that intense investment. The fundamental belief in data analytics, the widely held conviction that all decisions should be made on the basis of rigorous data analysis, renders my point of view largely heretical.

Those I upset turn quickly to criticism, asking: in the absence of data analytics, how do I make decisions? Do I just make choices randomly? Do I employ a chimp to throw darts and decide accordingly? I find the questions a bit juvenile: if I don't embrace data analytics, I must be acting stupidly. But that is the nature of discussion in the modern world!

The answer is that all humans — me included — make decisions based on data and have since the dawn of time. Our conscious mind can’t not consider whatever data is at its disposal as it contemplates decisions or responses. We are inference machines, and it is not possible for us to not build inferences from data. Some of that data is current and other data has filtered into our subconscious over the eons so that we have a spider sense about a lot of things. We don’t eat strange-colored plants in the forest because generations of our forefathers learned that they can be poisonous and have embedded that sense of danger into our subconscious — and an automatic definition of what plant colors are ‘strange.’ Similarly, we won’t stand in the middle of an atrium that has dark archways on the second floor because our forefathers didn’t want to expose themselves to attack from enemies hiding above. Instead, we huddle around the edges instinctively — in perceived safety. And our fight-or-flight mechanism is triggered by things that our subconscious declares to be dangerous.

The only important questions are: which data do you use, and with what techniques do you interpret it? On that front, I personally attempt to avoid, and counsel others to steer clear of, three typical modern errors in the selection and interpretation of data.

1) Don’t Limit the Data Sources for Inferences

Analysis-driven people simply avert their eyes from any data that is not in a form usable in the analytical framework(s) they intend to deploy. To be included in formal data analysis, a datapoint needs a quantified manifestation. It can be a Yes/No (i.e., a zero or a one), a number on a rating scale, or a specific quantity ($ of purchases, pages viewed, minutes spent, etc.). If it isn't a specific quantity, it is treated as unusable and excluded from the decision framework.

For example, when Boeing, facing the existential threat presented by the Airbus A320neo, made the competitive and economic calculation as to whether it should build a new competing jet from scratch or modify the existing Boeing 737 to make it as fuel efficient as the A320neo, it couldn't quantify the disappointment both its loyal carrier customers and trusting fliers would feel when the kludgy 737 MAX design started killing innocent people. Presumably, Boeing could estimate potential lawsuit costs, but not the complete destruction of trust. So that potential for disappointment wasn't in the analysis, even though its manifestation will seriously damage Boeing for decades.

Nor can a restaurant quantify either the intensity of the delightful surprise guests may feel or its monetary value. But when I got to know Will Guidara, who was the General Manager of Manhattan's Eleven Madison Park (EMP) when it became the #1-ranked restaurant in the world, it was clear that delightful surprise was a key guest-experience variable in which he invested, despite being unable to put a number on it. And he would say it was the feeling of utter delight in pleasant surprises that propelled a great dining experience to the level of 'unreasonable hospitality' that made EMP #1, which he writes about in his forthcoming book.

Instead of ignoring data that doesn't conform nicely, be a data omnivore. Use all types of data, from the most qualitative to the most quantitative, to power better inferences. AG Lafley tells a great story about being an omnivore in perhaps the most important decision of his pre-CEO career at P&G. He was trying to determine whether to propose a $250 million investment to compact the fluffy powdered detergent of that era so that consumers would benefit from smaller detergent boxes in their laundry rooms and retailers from using detergent shelf space more efficiently. He commissioned a big piece of classic quantitative research, and the results came back as inconclusive — not positive enough to go forward. But rather than restrict himself to only the quantitative data, he asked the market researchers to deliver a dozen banker's boxes full of the original surveys to his office so that he could spend days reading through the verbatims that respondents wrote after answering all the standard quantitatively measurable questions. That qualitative data, which most in his position would have ignored, tipped the balance in favor of him proposing the investment — which turned out to be a huge success, happily for both AG and the company.

2) Don’t Flatten Data into its Most Unidimensional Form

For data that does have a quantifiable dimension, data analytic enthusiasts will squeeze all the nuances out of the original data and reduce it to a unidimensional form — a singular numerical representation. That step is necessary for the enthusiast to manipulate the resulting set of numbers using the chosen analytical method — e.g., determining mean, slope, standard deviation, statistical significance, etc.

But enormously valuable information is lost when data is flattened down in this way. All the nuances of the verbatims would have been lost had AG not insisted on factoring them into his thinking. And think how annoying it feels when you are asked to reduce your customer experience to one number. Maybe it is just me, but it kind of creeps me out to be added in with everyone else as a single number.
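A toy sketch makes the loss concrete. The numbers below are my own made-up illustration, not drawn from any real survey: two groups of customer ratings that flatten to the identical mean, even though they tell opposite stories.

```python
# Hypothetical ratings on a 1-5 scale (illustrative numbers only).
group_a = [5, 5, 5, 5, 5, 1, 1, 1, 1, 1]  # polarized: guests love it or hate it
group_b = [3, 3, 3, 3, 3, 3, 3, 3, 3, 3]  # uniformly lukewarm

mean_a = sum(group_a) / len(group_a)
mean_b = sum(group_b) / len(group_b)

# Both collapse to the same summary number, 3.0, even though the two
# situations demand completely different responses from management.
print(mean_a, mean_b)  # 3.0 3.0
```

The single statistic cannot distinguish a polarizing offering from a mediocre one; only the un-flattened data can.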

I need to reach back in time to give Fred Kofman credit for helping me understand and conceptualize this point. The Argentinian writer, philosopher, academic, and ex-LinkedIn and Google executive and I were speaking back-to-back at a conference at then-Arthur Andersen's and now-Accenture's legendary St. Charles training center outside Chicago, Illinois. It had to be sometime in the 1990–1996 period because I looked it up and that was the period during which Fred was a Management Information Systems professor at MIT, which was his title while speaking at the conference. He was both brilliant and hilarious, describing himself as a philosopher trapped in the body of an accountant. But I remember him most for arguing that the dialect of numbers — and he was talking about this in the context of financial statements — is an extremely primitive language. If you can't understand a company in more sophisticated terms, you are left with the only option of understanding it by the numbers on its financial statements. Again, in a hilarious and memorable turn of phrase, he analogized communicating in numbers to chimps communicating with one another. Because they lack a more sophisticated and nuanced language like English (or his own mother tongue, Spanish), all chimps can do is grunt at one another. To Fred, speaking in numbers has the equivalent sophistication of chimps grunting to one another.

I realized that he was entirely right. When you reduce the data to its most unidimensional form, what you gain in statistical significance, you lose in sophistication and depth of understanding.

3) Don’t be Convinced by Data that Something is Undoable

As I have discussed before, a key danger of data analytics is that its use will convince the analyst that a thing is undoable when it is, in fact, doable. The way this happens is as follows. All data used in an analysis is, as of the time of the analysis, from the past; that is by definition, because you couldn't be analyzing data that doesn't yet exist. Because of this, data analysis cannot tell you anything definitive about the future. It can only convince you, entirely implicitly, that the future will be an extrapolation of the past. If the data shows a pattern that is trending upwards, the analysis will suggest that it will continue to trend upwards. If it is trending down, the downward trend will continue. If it is flat, the future will be flat.
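The extrapolation trap can be sketched in a few lines. The sales figures below are hypothetical, my own construction for illustration: an ordinary least-squares line fitted to past data can do nothing but project that the past trend continues.

```python
# Hypothetical past data: sales rising by a steady 10 units per year.
years = [2018, 2019, 2020, 2021, 2022]
sales = [100, 110, 120, 130, 140]

# Ordinary least-squares fit of sales against year.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(sales) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, sales))
         / sum((x - mean_x) ** 2 for x in years))
intercept = mean_y - slope * mean_x

# The "forecast" is nothing more than the past trend, extended forward.
forecast_2025 = slope * 2025 + intercept
print(forecast_2025)  # 170.0
```

The model cannot, even in principle, anticipate a discontinuity — a new competitor, a collapse in demand, or an imaginative move of your own. All it can do is extend the line.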

Because there is so much confidence in rigorous data analysis, it is viewed as a window on the truth. But anything genuinely new can never be 'proven' to be true, because no data on it yet exists. What rigorous analysis can prove is the present — for example, that customers are currently unhappy. In this way, data analysis has real utility, because to change something you need to understand how it works today, and data analytics can tell you that.

But the only way to create a future that is distinct from the past is to free yourself from the steely grip of data analytics. The more you depend on it, the more it will lead you to believe that the future will be an extrapolation of the past and that anything deviating from that extrapolation is not possible, or so says the analysis. Instead, the future is created by imagining possibilities that are divorced from 'what the data proves.'

Practitioner Insights

Free yourself from the strictures imposed on you by the modern world of data analytics. Don’t let it blind you to any part of the full spectrum of data — from the most qualitative to the most quantitative. Data on one end of that spectrum is not inherently superior to data on the other. It is all information on which inferences can be made and insights can be generated.

In addition, the inferences will be more thorough, sophisticated, and powerful to the extent that all aspects of the data are used, not just the quantifiable dimension(s). Lean into the qualitative dimensions of data. Interpreting qualitative data is a valuable skill that takes practice to develop — and now is the time to start.

Finally, never let ‘the data’ convince you that something can’t be done. Some things can and some things can’t — but in business, the determining factor is not what the quantitative data from the past ‘says.’ More likely, the determining factor will be the combination of imagination and the willpower to make it happen. And after (and only after) it does, the data analytics enthusiasts will say that the data shows that it was obvious — but only because you will have created new data by your actions.

Roger Martin

Professor Roger Martin is a writer and strategy advisor, and in 2017 was named the #1 management thinker in the world. He is also the former Dean of the Rotman School.