Is Your Intuition Drowning In Math?

01.07.19

by Jason Raguso, Vice President, Data Sciences, Gongos, Inc.

When it comes to math and data science, we live in remarkable times. For a hundred bucks you can have your genome mapped at www.23andme.com.  Just a decade ago, this was unthinkable.  Computing power, boundless data and advanced statistical techniques have never been more accessible. The combination of these three has yielded unbelievable progress in the areas of life science, particle physics and predictive modeling.  Why then – if we are so capable of mapping our DNA, splitting atoms and detecting fraud – are executives and analysts more frustrated than ever when it comes to making great business decisions?

Despite the sophistication of information technology systems in Global 1000 corporations, executive decision-makers often characterize their organizations as “data rich and information poor.” While executives and consumers both use emotion to make choices, analytics teams employ rational, data-driven approaches to make sense of those choices. This “paradox of logic” manifests in different ways. The growing volume of information at our disposal fuels even more contradiction, and decision-making just feels harder. Time and time again, strong executive intuition says one thing while the math says another.

The promise of math advancing decision-making makes fundamental sense. We can solve incredibly complex problems using mathematics, so it’s natural to want to apply math’s explanatory ability to business.  However, in adopting the technologies to accommodate new forms of data and advanced math, we’ve unconsciously skipped over the steps required to make analysis meaningful. Somehow we’ve gotten ahead of ourselves and our intuition needs to catch up.

The Rise and Fall of the First Golden Age of Decision Support: 1970 – 2008

Information technology in the 1970s and 1980s was heavily focused on automating essential routine business functions such as billing, labor and materials management. Most technology lived in a centralized data store, accessed through “dumb terminals” that allowed for basic read/edit functionality. These green screens, though, were horribly inflexible. Enter the 1990s, when localized computing got easier. The Personal Computer enabled a department-level business executive, outside of the IT group, to “grab data and play with it.”

Throughout the next decade, a golden age of empowerment ensued. Department-level skills for analyzing data became both sophisticated and personalized. Analysts accessed cuts of data from centralized systems and developed new tools within Excel and PowerPoint. These were glorious times for enabling decisions. Competitive advantage for a business unit hinged on having a good relationship with IT and a couple of team members who were “good at Excel.” When customized to suit executive needs, data was incredibly effective.

The Millennium brought about data cubes and enterprise data warehouses as the promise of the Internet and connectivity was harnessed. Shortly thereafter, dashboards, OLAP tools and simulators all became commonplace, albeit at varying levels of adoption. And then, something happened. Suddenly everyone was second-guessing the veracity of their data. We began hearing “We need a single version of the truth” or “Which system did you pull that from?” Fifteen minutes of every meeting was spent discussing why the data was valid. Like Adam and Eve in the Garden of Eden, we woke up and realized we were naked.

For the most part, information was still moving at a glacial pace at this point. Large refreshes of data occurred in batch processes overnight or over the weekend, and IT departments could keep data versions under control to keep everyone aligned. Processes needed to be centralized and standardized. Stricter data governance initiatives were enacted, and battle lines were drawn to prevent rogue Access databases from being built. During this centralization, business executives ceded control of what information they required to the technologists. And, in the blink of an eye, the first golden age of decision support was over.

All Sorts of Data in Real-Time: 2008 to Present

As they say, the only constant is change. Meetings were no longer filled with queries about the validity of data. Instead we heard questions about new forms of data, such as “Can we add our media spend by campaign to this analysis?” and “What is the value of a Facebook Like?” We began feeling like we were missing something: a nagging sense that either the content or the latency was inadequate. Seemingly overnight, we went from waiting for data to data waiting for us.

Naturally, our reaction was to recalibrate systems to capture, assimilate and analyze new forms of data in hopes of answering these new questions. The potential of data to empower and transform our decision-making was all too compelling.

But these questions didn’t arise because data became “bigger.” The idea that data suddenly became big is erroneous. What changed was the nature of the data, along with our ability to capture structured and unstructured data in real time. The era we are currently in would more accurately be called “All Sorts of Data in Real-Time.” It is the variety of the data, its unknown value, and its zero-latency nature that confound us, not the volume of the data files. Big Data is, however, an apt metaphor for disruption, as it has thoroughly disrupted corporate decision-making.

Yet no one wants to miss the boat, so we’ve launched headlong into defining value in the era of All Sorts of Data in Real-Time. Corporations are filling their ranks with people from academically intensive, non-traditional math and computer science backgrounds. Technologists, data scientists, mathematicians and support vendors – people who naturally find comfort in all this data – are now at the helm of these initiatives.

The mid-1990s was the first time business executives were involved across the board, manipulating, analyzing and personalizing new streams of data to suit their needs. Today, decision-makers suffer differently, for three reasons. First, executives are drowning in data, and their brains aren’t able to process it any faster. Second, they are working through the IT and data science groups in the name of expertise and efficiency. Finally, executives are further removed from the data, and from letting their intuition help shape the process.

Bringing Rationality and Intuition into Balance:  2014 and Beyond

Most senior decision-makers are uncomfortable with the complexity of advanced applied mathematics. They’ve got no feel for it. Conversely, most analysts and technologists are trained to showcase math’s deterministic power. Yet that deterministic power falters when it comes to explaining random or rare occurrences. We’re relying heavily on math at a time when executives are struggling to keep up and analysts are ill-prepared to present data in a manner that speaks to business acumen.

We find ourselves at a paradox yet again. So, how do we break the cycle?

In the coming decade, a natural migration of math talent upward into organizations will address some of these issues. Senior leaders will inevitably develop a feel for the math behind the analysis. Analysts will become more exposed to storytelling. Until we close this gap, three approaches stand to elevate analytics into a definitive source of heightened intuition:

Transparency: While math is deterministic, it cannot explain the irrational or random events that fill our everyday lives. Models are inherently unstable when pushed beyond the bounds of the data used to create them. However, this won’t prevent an executive from asking a question beyond the scope of the data. An analyst, and the overall organization, would be better served by settling on “we can’t know for sure” than by banking on a misleading projection. It is important to create a culture of vulnerability as we all work through the experiments required to learn.
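To make that instability concrete, here is a minimal sketch (a hypothetical illustration added for this point, not part of the original article) in Python using numpy: a flexible model fit only to observations from a narrow range looks trustworthy inside that range, yet its projections can swing wildly just beyond it, which is exactly the kind of out-of-scope question an executive is likely to ask.

    # Hypothetical illustration: a model fit on a narrow slice of data can look
    # precise inside that range yet produce misleading projections outside it.
    import numpy as np

    rng = np.random.default_rng(seed=7)

    # "Observed" data: a roughly linear signal measured only for x in [0, 5]
    x_train = np.linspace(0, 5, 40)
    y_train = 10 + 2 * x_train + rng.normal(0, 1.5, size=x_train.size)

    # Fit a flexible model (degree-6 polynomial) to that narrow range
    model = np.poly1d(np.polyfit(x_train, y_train, deg=6))

    # Inside the data range, the fit looks trustworthy...
    print("prediction at x=4:", round(model(4.0), 1))    # near the ~18 the underlying trend implies

    # ...but a question asked just beyond the data can go badly wrong
    print("prediction at x=8:", round(model(8.0), 1))    # can drift well away from the ~26 trend
    print("prediction at x=12:", round(model(12.0), 1))  # typically diverges dramatically

The specific numbers matter less than the pattern: inside the observed range the model earns a confidence it has not earned outside it, which is why “we can’t know for sure” is often the more honest, and more useful, answer.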

Learn before Solving: Whether executives are expected to decipher math or analysts are required to present a business case, we need to give both parties permission to ask questions and iterate. Without this heuristic, iterative approach, we hinder our implicit learning processes and validation sequences. Defining a set of objectives around learning would help level-set “rules of thumb” and create consensus on cause-and-effect relationships. Trying to score consumer experience before both sides understand its definition and drivers undermines trial and error, the very process by which we inform intuition.

Multidisciplinary Approaches: “Never once do I get a deck that says, ‘This is what I would do if I were you.’” This quote from the CMO of a major insurance company captures a common dysfunction of analytics teams. Decision-makers and analysts need to be on the same team, working through challenges together to develop an appreciation for each other’s vantage points. More often than not, the analyst perceives the assignment as “solve a problem” rather than “help me make a decision.” Teams can better account for these differing points of view by agreeing on shared working guidelines up front.

Overall, these guidelines will help mitigate contradictions that compromise institutional intuition. As a result, executives and analysts will begin to develop mutual appreciation for learning, making choices, and driving confident decisions in our complex world.  Just as we evolved from green screens to empowerment, the next generation of decision-making promises alignment between the numbers and the stories they tell.

As published in Marketing Insights.