Moneyball and the Many Ways of Knowing

There is a scene in Moneyball, the 2011 film based on Michael Lewis's nonfiction book, that captures a critical tipping point in how we understand the world around us.

Moneyball tells the story of Billy Beane, general manager of the Oakland Athletics. Beane pushed aside decades of conventional baseball wisdom in favor of analytics, found a competitive edge, and turned his low-budget baseball team into a championship contender. Relying on data analysis to make decisions might sound commonplace today, but the scene below shows just how revolutionary Beane’s approach was:

How ridiculous does Grady, the scout airing his grievances, sound? "You don't put a team together with a computer… It's not science… You and 'Google boy'… There are intangibles that only baseball people understand!"

Grady sounds like he is from the Stone Age – I can barely tell if I am listening to an MLB scout or a character from the Flintstones. In today's age of "big data," it's mind-boggling to consider that multi-million-dollar decisions were shaped by this kind of thinking just 15 years ago.

Moneyball marked the beginning of a seismic shift in how we make decisions, one that has stretched well beyond the world of sports. The so-called Moneyball approach – relying on data and analytics over intuition and experience – has transformed criminal justice, healthcare, music, government, and business. You name an industry, there's a "Moneyball" for that. I can't even pick a taco stand without crunching the numbers on Yelp. We define truth not just by what we see – we define it by what we can analyze and support with statistics. Moneyball is a symbol for the age of data, and a fundamental shift in how humans "know" what they know.

But what if we have gone too far?

I recently came across an editorial from 1990 titled Many Ways of Knowing, by Ann Hartman. Hartman, a leading practitioner, educator, and researcher in the field of social work, explains how no one method of research, of learning, can capture all there is to know:

We need large-scale studies in which variables can be reduced to measurable units and the results translated into the language of statistical significance. We need in-depth "thick descriptions," grounded in context, of a single case, a single instance, or even a brief exchange. For example, large-scale studies of trends in marriage today furnish helpful information about a rapidly changing social institution. But getting inside one marriage, as in "Who's Afraid of Virginia Woolf?," richly displays the complexities of one marriage, leading us to new insights about the pain, the joys, the expectations, the disappointments, the intimacy, and the ultimate aloneness in relationships. Both the scientific and the artistic methods provide us with ways of knowing… There are indeed many ways of knowing and many kinds of knowers…

Some seekers of truth may take a path that demands distance and objectivity, whereas others rely on personal and empathic knowledge. Some will find the validation of their findings through statistical analysis and probability tests. Others will find it through the intensity and authenticity of “being there” … We must not turn our backs on any opportunities to enhance our knowledge… No one way of knowing can explore this vast and varied territory.

The Moneyball revolution was a reaction to an over-reliance on intuition and first-hand experience at the expense of objective analysis and data. Billy Beane carefully reviewed all the information at his disposal, saw this imbalance, and exploited it. The world has followed suit in emphasizing data and analytics at the expense of more traditionally qualitative evidence. We need to be careful, however, not to abandon intuition and first-hand experience altogether. The data is not always right – just ask the Atlanta Falcons (too soon?) or Hillary Clinton (yep, definitely too soon).

The best outcomes are often reached by finding that difficult balance between data and experience, analytics and instinct. Top venture capitalists like hideous-cowboy-shirt-wearing Chris Sacca don’t pick investments just based on balance sheets and financial statements; they need to meet the team and trust the company’s founder before they make a decision. High-stakes investors like Sacca realize that the numbers don’t tell the full story, and they are not alone. A comprehensive review of academic research on intuition and decision making, published in the Academy of Management Review, stated that:

…effective managers do not have the luxury of choosing between analysis and intuition—real expertise involves the use of both types of decision making… the ability to switch between “habits of mind” and “active thinking” is the ultimate skill…

While we may be able to debate what the "ultimate skill" really is, the takeaway is clear. Analytics is an immensely valuable tool for making meaning of the world, but this tool is not a silver bullet.

This is not an attempt to discredit data or dismiss science. Throwing analytics out the window for being imperfect is, well, fucking stupid. This is a reminder to think critically and consider all the evidence available, whether that evidence is a statistic or something more intangible. The true lesson of Moneyball was not the importance of data or on-base percentage. In reality, the Moneyball approach is to value all available information, even if it contradicts the accepted truth, and be bold enough to reach your own conclusions. No one way of knowing is enough.

3 thoughts on “Moneyball and the Many Ways of Knowing”

  1. I think anyone who says we need a balance between analytics and intuition doesn't understand analytics. Sometimes I think people see analytics as a magic bullet that will give you the answer (and then blame analytics when it doesn't). But it will only give you the answer to the question you ask, and intuition is about asking the right questions – a large part of analytics. In fact, the goal of analytics should be to quantify the intuition. That way you can have empirical proof that the intuition was correct, as well as compensate for the fact that our brains are generally pretty poor at estimating just about anything.

    And sometimes there's something you can't quantify – for example, you mentioned meeting the team before investing in a company. I can't think of an easy way to quantify this without risking a ton of money on investments you would normally pass on. In those cases, you settle for your intuition – not because you're striking a balance, but because you have no other option.

    And finally, there's the issue that people generally underestimate luck. The analytical forecast of the last election by the team over at 538 had Trump at 20% to win, and betting markets were similar. A lot of people see any number below 40% as 0%, but it's not. 20% still means that if the election were held 100 times, Trump would win 20 of them. And we're living in one of those 20 universes. That doesn't mean the methods they used were perfect – the model could have been wrong. But it also could have been correct, and people are quick to dismiss an analytical method once it assigns a low probability to something that then occurs. Low-probability events happen far more often than people expect because our brains are absolutely terrible at estimating or understanding probability. We're much better at seeing patterns that don't exist.

    1. Agreed! Don't think I would argue with anything you said in your first two paragraphs. I think the biggest issue is what you hinted at in the third paragraph: people suck at thinking in terms of probability. If you predict something will happen 30% of the time and that thing happens, it does not mean your prediction was wrong – e.g., the election. Low-probability events happen. With the benefit of hindsight, we can say Derek Jeter had a ~30% chance of getting a hit in any given at-bat over his career. That "low-probability event" happened almost 3,500 times – and if he goes 3-for-4 in a particular game, that does not invalidate our forecast.

      I do think, however, that some people have fallen in love with having a dataset or statistic to quote at the cost of even considering other forms of evidence. Some things are hard to quantify, and some statistics are based on faulty assumptions or fail to completely capture a phenomenon. In those instances, taking the time to consider “softer” forms of evidence can add a lot of value to the process of making meaning of the world. It’s all about being open to all the different types of information available and taking what you can from each one.
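The probability points in this thread are easy to sanity-check with a few lines of code. This is an illustrative sketch, not anything from the post itself – the ~30% per-at-bat hit rate and the 20% win probability are just the round numbers quoted in the comments above:

```python
import random
from math import comb

def prob_hits(k, n, p=0.30):
    """Binomial probability of exactly k hits in n at-bats,
    assuming a fixed 30% chance of a hit per at-bat."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# A ~.300 hitter goes 3-for-4 or better in a given game
# surprisingly often, even though each hit is "unlikely":
p_big_game = prob_hits(3, 4) + prob_hits(4, 4)
print(f"3-for-4 or better: {p_big_game:.1%}")  # ~8.4%

# And a 20% event is not a 0% event – over many repeated
# trials it occurs roughly one time in five:
random.seed(0)
trials = 100_000
wins = sum(random.random() < 0.20 for _ in range(trials))
print(f"20% event frequency: {wins / trials:.1%}")  # close to 20%
```

So a single 3-for-4 game, like a single election outcome, says almost nothing about whether the underlying forecast was wrong.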
