‘Undoing’ the teaching of analytics

Rethinking mental models of decision-making.

Analytics challenges can destroy mental models that decision-makers have used historically in decision-making.

By Peter C. Bell

The last 10 or so years have been transformational for business analytics. Corporate leaders such as Jeff Bezos at Amazon, analytical innovators in sports such as Theo Epstein (Chicago Cubs and Boston Red Sox) and leading-edge analytical firms such as IBM, Walmart, UPS and FedEx have moved analytics from “a best kept secret” to top-of-mind for many major corporations. This increased awareness means that we no longer need to explain or “sell” the importance of data and analytics to students taking our courses.

Despite this increasing awareness, the leaders of many successful organizations, perhaps a large majority (in 2013, Bain & Company found that “only 4 percent of companies were really good at analytics”), remain primarily experiential/intuitive decision-makers and have little data and lack the capabilities or time to perform much serious analytics. As a result, post-experience students arriving in MBA or EMBA programs often come from organizations without strong analytics capabilities, yet these students have often achieved apparent success as decision-makers using their “gut feel” or intuition. Experience and intuition will likely continue to be the key driver of decision-making for these students; they are unlikely to ever perform any serious analytics themselves. After teaching such students for 40 years or so, I became reasonably confident that I was delivering our analytics materials in a way that would influence them to become more analytical, and hopefully more consistently successful as decision-makers. But recently, my teaching has been disrupted.

Disruptive Technology


Analytics is often a disruptive technology. It challenges, and can sometimes destroy, the mental models that decision-makers have used historically in their decision-making. I have seen mid-career general managers who have (successfully) managed firms for some years react sharply and negatively to analytics ideas that contradict long-held beliefs that have served them well in the past. The way analytics is introduced to these students therefore becomes very important in setting expectations, not just for the analytics course but for the remainder of their MBA or EMBA program. In an engineering or mathematics school, analytics will likely be presented as a problem-solving tool, but business school analytics instructors cannot be too disruptive. We cannot denigrate the role of experience and intuition (aka “the gut”) in decision-making, since our colleagues teaching strategy, marketing, entrepreneurship and leadership are developing non-analytical management and decision-making skills.

Consequently, I frame analytics as an input to the decision-making process, not the output of it. I emphasize that analytics does not make decisions; rather, analytics helps intelligent, experienced, knowledgeable managers make better decisions. In support of this view, I use several cases where the output of the analytics is some kind of risk/return tradeoff, and the final discussion leans on the intuition and experience of the students to arrive at an appropriate tradeoff and a recommendation. In written work based on these cases, it’s rare that there is any consensus: Conservative students will recommend low-risk actions, while more aggressive risk-takers will opt for decisions that might end up really well but could be a disaster.

Beginning in 2011, I started my EMBA analytics core class with a movie night. The evening before the first class, the school provided pop and popcorn, and the class sat and watched “Moneyball” (Columbia Pictures, 2011). This was a compromise solution – my first choice was to have the students read the book (“Moneyball: The Art of Winning an Unfair Game” by Michael Lewis, W.W. Norton and Company, 2003), but this added too much to an already long pre-program reading list. The downside of the movie is that it’s a baseball movie and not all students are into baseball, although some are happy to watch Brad Pitt for a couple of hours. The book is much stronger on the analytics and not so much about baseball.

The following day, during the first analytics class, we discuss the movie and the messages it contains for managers. While this discussion tends to be quite freewheeling, we always spend time on two key scenes. In the first, Billy Beane confronts the scouts with the news that the recruiting model on which they have built their careers is not going to be used anymore. Predictably, the scouts react strongly:

(“You don’t put a team together on a computer, Billy. Baseball isn’t just numbers, it’s not science. If it was, anybody could do what we’re doing. But they can’t because they don’t know what we know. They don’t have our experience and they don’t have our intuition. …. There are intangibles that only baseball people understand. You are discounting what scouts have done for 150 years.”)
This from the head scout just before he is fired.

For me the most telling (apocryphal?) scene appears near the end, when John Henry, the owner of the Boston Red Sox, is trying to recruit Beane to run the Sox. Henry, not an analytics guy but a keen and involved observer of the development of the game, tells Beane:

“The first guy through the wall always gets bloodied. This is threatening not just their way of doing business but in their minds it’s threatening the game, but really what it’s threatening is their livelihood, threatening their jobs. Threatening the way that they do things and every time that happens … the people who are holding the reins … they go **** crazy. … Anybody who is not tearing their team down right now and rebuilding it using your model – they’re dinosaurs.”
While I resist presenting any form of summary or “answer,” I want students to recognize that analytics is an alternative way of thinking and managing that is here to stay and growing in influence and importance. However, analytics can be disruptive, and this presents a host of management challenges and issues.

“The Undoing Project”

I have now dropped “Moneyball” movie night after reading Michael Lewis’ new book, “The Undoing Project” (W.W. Norton and Company, 2016). I am an analytics guy. I think like an analytics guy, talk like an analytics guy and play golf like an analytics guy. After reading “The Undoing Project,” I have come to realize that in my analytics teaching I need to pay more attention to how the rest of the world thinks, so the content of this book has now replaced “Moneyball” as a major discussion item in my first analytics class, and ideas from this book are now appearing in many of my classes. Lewis connects “The Undoing Project” to “Moneyball”: The first two chapters update baseball analytics post-“Moneyball” and discuss the spread of analytics into basketball and other sports, but Lewis then suggests that the remaining chapters of “The Undoing Project” detail the development of the basic analytical ideas that made the 2002 Oakland A’s possible. “The Undoing Project” is not an easy read, but it is an immensely interesting one that has disrupted the way I think about teaching analytics to our post-experience students.


Lewis documents the great friendship between psychologists Amos Tversky and Nobel Laureate Daniel Kahneman (T&K), their investigations into human decision-making and the many experiments they conducted to try to understand the failure of human intuition. Reading “The Undoing Project” provides the strongest case I have seen of the importance of analytical data-driven decision-making, but the book also leads me to believe that we can improve our students’ decision-making skills if we can sensitize them to the kinds of basic errors that people make when confronted with the two main factors that make decision-making difficult: uncertainty and complexity.

Lewis reports T&K’s lifelong inquiry into how human decision-makers actually respond to uncertainty and complexity. For example, Lewis describes T&K’s experiments on people’s reactions to uncertainty. People appear to equate uncertainty with ignorance; if we are asked to estimate a value, even something we have no reason to know, we will invariably claim too great a precision. In analytical terms, our prediction intervals are far too narrow. In business decision-making, the implication is that we consistently, and perhaps severely, underestimate risk if we rely on human judgment rather than data for our probabilities. T&K concluded that humans process new information poorly; we “mistake the smallest part of a thing for the whole,” and “even people trained in statistics and probability theory failed to intuit how much more variable a small sample could be than the general population.” This might explain why business and government managers often “panic” when they receive an exception report, when the best action might simply be to wait and see whether the exception happens again.
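The small-sample point is easy to demonstrate with a short simulation of the kind one might run in class. The sketch below is illustrative only – the population (mean ~100, standard deviation ~15) and sample sizes are invented, not drawn from the article – but it shows how much more widely the means of small samples scatter than the means of large ones.

```python
import random
import statistics

random.seed(42)

# Hypothetical population for illustration: 20,000 values,
# mean ~100, standard deviation ~15.
population = [random.gauss(100, 15) for _ in range(20_000)]

def spread_of_sample_means(n, trials=1_000):
    """How widely the sample mean varies when we repeatedly draw samples of size n."""
    means = [statistics.mean(random.sample(population, n))
             for _ in range(trials)]
    return statistics.stdev(means)

small = spread_of_sample_means(5)     # theory: ~15 / sqrt(5)   (about 6.7)
large = spread_of_sample_means(500)   # theory: ~15 / sqrt(500) (about 0.67)
print(f"spread of sample means, n=5:   {small:.2f}")
print(f"spread of sample means, n=500: {large:.2f}")
```

The n=5 means wander roughly ten times as much as the n=500 means – exactly the variability that, per Lewis, even people trained in statistics fail to intuit, and one reason a single alarming exception report may deserve a wait-and-see response.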

T&K also conducted many experiments to explore the phenomenon of “anchoring”: If you ask a class to write down the last two digits of their phone numbers and then estimate the takeoff weight of a Boeing 777, there will be a positive correlation between the two sets of values. In business, a related bias occurs when a manager who has made a successful decision makes the same decision again even though the circumstances have changed. By sensitizing our students to this behavior, we can also make the point that data and analytics provide a vehicle for removing such biases from our decision-making.

In analytics, we use probabilities very naturally to describe uncertainty, but T&K conducted experiments suggesting that people understand probability very differently. When asked which is more likely – A) a natural disaster in California that kills a thousand people, or B) an earthquake in California that kills a thousand people – about a third of people will choose option B, even though option B cannot be more likely: every such earthquake is itself a natural disaster. I tried this in class and was surprised by the result (as, Lewis reports, were T&K).

Among many other topics of interest, Lewis explores expert intuition and reports the results of many experiments demonstrating that human expertise often isn’t very expert. Experts appear to look for “cues” and often form an opinion based on just one or two recognized cues; as a result, expert judgment can frequently be replicated by simple models. One lesson here is that you cannot simplify every complex decision situation by focusing on one or two key factors (“cues”); you need analytics to help you cope with the complexity.

Pedagogy & Persuasion

There are several pedagogical methods that are useful in trying to persuade our students that analytical thinking can improve their decision-making skills. For example, we sometimes have our students attack a difficult case before we provide them with the appropriate tools to conduct a reasonable analysis, thus forcing them to be more intuitive and setting up a demonstration of the weakness of their intuition. The theory behind this approach is often expressed as a need for students to “see the wall before they will buy the ladder.” For example, one can challenge students to come up with a decent solution to a simultaneous decision problem before they see Excel Solver. Reading “The Undoing Project” made me aware of just how bad we humans are at intuitive and experiential decision-making, and it has provided me with many small experiments that I can run or report in the classroom to raise students’ awareness of the pitfalls their intuition may land them in.
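A tiny two-product mix problem shows why intuition struggles with even modest simultaneous decisions. The numbers below are invented for illustration; the sketch (pure Python, no solver) simply enumerates the corner points of the feasible region – the principle that simplex-based tools such as Excel Solver exploit far more efficiently – and students guessing by trial and error rarely land on the optimum.

```python
from itertools import combinations

# Hypothetical product-mix problem (all numbers invented):
#   maximize profit 30x + 20y
#   subject to  2x + 1y <= 100   (machine hours)
#               1x + 3y <=  90   (labor hours)
#               x >= 0, y >= 0
# Each constraint is stored as (a, b, rhs) meaning a*x + b*y <= rhs.
constraints = [
    (2, 1, 100),
    (1, 3, 90),
    (-1, 0, 0),   # x >= 0, rewritten as -x <= 0
    (0, -1, 0),   # y >= 0, rewritten as -y <= 0
]

def intersect(c1, c2):
    """Point where two constraints hold with equality (None if parallel)."""
    a1, b1, r1 = c1
    a2, b2, r2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:
        return None
    return ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)

def feasible(pt):
    return all(a * pt[0] + b * pt[1] <= r + 1e-9 for a, b, r in constraints)

# A linear program's optimum always lies at a vertex of the feasible
# region, so checking every feasible vertex is enough for this tiny case.
vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) and feasible(p)]
best = max(vertices, key=lambda p: 30 * p[0] + 20 * p[1])
print(f"best mix: x={best[0]:.0f}, y={best[1]:.0f}, "
      f"profit={30 * best[0] + 20 * best[1]:.0f}")
# → best mix: x=42, y=16, profit=1580
```

Most students, asked to guess a good mix, will push one product to its limit (x=50, y=0 yields a profit of only 1500) rather than find the interior vertex (42, 16) where both constraints bind – a neat demonstration that they need the ladder once they have seen the wall.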

The experiments reported by Lewis in “The Undoing Project” make a strong case in support of a view that I have argued in the past: that “much of the benefit of analytics arises from the analytical problem-solving approach, and while the ‘advanced analytics’ is the cherry on the top, in some (perhaps many) situations, it might be quite a small cherry” (OR/MS Today, 2016). I am convinced that sensitizing our students to the kinds of systematic errors we all tend to make when confronted by complexity and uncertainty will add to the benefit of our data-driven analytical problem-solving approach and help them develop into better decision-makers. I recommend “The Undoing Project” to all analytics instructors. I think it will disrupt the way you teach analytics, too.

Peter C. Bell (pbell@ivey.ca) is a professor at the Richard Ivey School of Business, Western University, in London, Ontario, Canada.