When Making Decisions, What Should You Trust: AI or Your Gut?

Stephen DeAngelis

August 31, 2020

Intuition is defined as the ability to understand something immediately, without the need for conscious reasoning. It’s also been called a gut feeling. Alyssa Newcomb notes, “Gut feelings aren’t always correct, but ask any business leader how they make decisions, and intuition almost always plays a role — even in the age of advanced analytics.”[1] Valerie van Mulukom (@herebehumans), a Research Associate in Psychology at Coventry University, believes gut feelings have been given a bad rap in the West. She explains, “Imagine the director of a big company announcing an important decision and justifying it with it being based on a gut feeling. This would be met with disbelief — surely important decisions have to be thought over carefully, deliberately and rationally? … Emotions are actually not dumb responses that always need to be ignored or even corrected by rational faculties. They are appraisals of what you have just experienced or thought of — in this sense, they are also a form of information processing. Intuition or gut feelings are also the result of a lot of processing that happens in the brain. Research suggests that the brain is a large predictive machine, constantly comparing incoming sensory information and current experiences against stored knowledge and memories of previous experiences, and predicting what will come next.”[2]


I agree that the human brain is a marvel and a wonder; however, I also believe augmenting the brain with technology makes a whole lot of sense. Bain analysts Michael C. Mankins and Lori Sherer write, “We know from extensive research that decisions matter — a lot. Companies that make better decisions, make them faster and execute them more effectively than rivals nearly always turn in better financial performance. Not surprisingly, companies that employ advanced analytics to improve decision making and execution have the results to show for it.”[3]


Augmenting human decision-making


Artificial intelligence (AI), in its many forms, has been a continual target of fear-mongering for years. Most of the concerns have been targeted at artificial general intelligence — that is, yet-to-be-developed machines capable of thinking on their own. The most common fear is that such machines may decide humans need eliminating. Fortunately, most cognitive technologies today fall short of artificial general intelligence. Nevertheless, some people still worry that letting machines make decisions on their own could prove troublesome. For example, a panel at a JPMorgan Chase conference focused on “Optimization and the Path to Innovation” concluded, “Using artificial intelligence to help inform decisions rather than to make decisions in place of people is the best way to comply with a patchwork of state laws surrounding the use of data and artificial intelligence.”[4] Cognitive computing, a form of AI, was originally created with augmentation in mind. Former IBM CEO Ginni Rometty (@GinniRometty) explains, “[When IBM coined the term cognitive computing] the idea was to help you and I make better decisions amid cognitive overload. That’s what has always led us to cognitive. If I considered the initials AI, I would have preferred augmented intelligence. It’s the idea that each of us are going to need help on all important decisions.”[5]
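The “AI informs, human decides” pattern the panel describes can be sketched in a few lines of code. This is a minimal, hypothetical illustration — the loan-screening scenario, thresholds, and all names are invented for the example and come from no specific product:

```python
# Sketch of a human-in-the-loop decision flow: the model produces a
# recommendation with a confidence and a rationale, but the final call
# always belongs to a human reviewer. All rules here are illustrative.

from dataclasses import dataclass

@dataclass
class Recommendation:
    action: str        # what the model suggests
    confidence: float  # 0.0 - 1.0
    rationale: str     # why, so the human can weigh the suggestion

def recommend(credit_score: int, debt_ratio: float) -> Recommendation:
    """Toy screening model: it suggests an action but never finalizes one."""
    if credit_score > 700 and debt_ratio < 0.35:
        return Recommendation("approve", 0.9, "strong score, low debt load")
    if credit_score < 550:
        return Recommendation("decline", 0.8, "score below policy floor")
    return Recommendation("review", 0.5, "borderline profile")

def decide(rec: Recommendation, human_choice: str) -> str:
    # The human's choice always wins; the recommendation only informs it.
    return human_choice

rec = recommend(credit_score=720, debt_ratio=0.2)
final = decide(rec, human_choice="approve")  # reviewer agrees with the model
```

The design choice is the point: the model’s output is advisory metadata attached to the case, never an action taken on its own, which is exactly the compliance posture the panel recommended.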


Falon Fatemi (@falonfatemi), founder and CEO of Node, prefers the term artificial intuition to augmented intelligence. She explains, “Artificial intuition makes intelligent decisions that are unique to each application, user, and use-case. … What we’ve seen, from a business outcomes perspective, is the system is not only able to predict accurately and find more prospects — such as those best prospects that are likely to convert very quickly — but the system has actually been able to navigate and identify new markets of opportunity that were previously untapped by those customers using traditional heuristics-based methods. And that’s the power of this technology. It essentially turns data into decisions — both the context-specific that it’s analyzing within an application, and also learning from the intuition that the end-users have as to where they’re most successful.”[6]


The editorial team at insideBIGDATA writes, “Artificial Intelligence is quickly becoming the most sought technology to create business advantage. From predictive analysis, to streamlining workflows, to transforming customer experience, AI is being used industry and department wide to ripple great impact throughout entire organizations.”[7] In order to achieve “great impact,” companies need to ensure they are supported by both data scientists and decision scientists. Chris Dowsett, the Head of Decision Science + Analytics at Instagram, explains, “The Data Scientist focuses on finding insights and relationships via statistics. The Decision Scientist is looking to find insights as they relate to the decision at-hand. Example decisions might include: Age groups to focus on, most optimal way to spend a yearly budget or figuring out a way to measure a non-traditional media mix. … A business needs to both move forward with decision making while also improve its products for the longer term.”[8]


Not every decision requires human intervention


Eric Colson (@ericcolson), Chief Algorithms Officer at Stitch Fix, writes, “Many companies have adapted to a ‘data-driven’ approach for operational decision-making. Data can improve decisions, but it requires the right processor to get the most from it. Many people assume that processor is human. The term ‘data-driven’ even implies that data is curated by — and summarized for — people to process. But to fully leverage the value contained in data, companies need to bring artificial intelligence into their workflows and, sometimes, get us humans out of the way. We need to evolve from data-driven to AI-driven workflows. Distinguishing between ‘data-driven’ and ‘AI-driven’ isn’t just semantics. Each term reflects different assets, the former focusing on data and the latter processing ability.”[9] Every decision-maker knows some decisions are rules-based and routine. These types of decisions are ideal for AI-based decision-making. Colson notes, however, “Removing humans from workflows that only involve the processing of structured data does not mean that humans are obsolete. There are many business decisions that depend on more than just structured data.” Even in those cases, augmenting human decision-making can be important. He concludes, “Moving from data-driven to AI-driven is the next phase in our evolution. Embracing AI in our workflows affords better processing of structured data and allows for humans to contribute in ways that are complementary.”
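Colson’s distinction — automate the routine, rules-based cases built on structured data, and keep humans on everything else — can be made concrete with a small routing sketch. The refund scenario, the $50 threshold, and the function name are assumptions invented purely for illustration:

```python
# Hedged sketch of Colson's split: routine, rules-based requests on
# structured data are decided automatically; anything involving
# unstructured context is escalated to a person. Rules are illustrative.

def route_refund(amount: float, has_free_text_complaint: bool) -> str:
    """Auto-approve small, structured refund requests; escalate the rest."""
    if has_free_text_complaint:
        return "human"         # unstructured data calls for human judgment
    if amount <= 50.0:
        return "auto_approve"  # routine and rules-based: let the machine decide
    return "human"             # large amounts warrant a second look

routing = route_refund(20.0, has_free_text_complaint=False)
```

Note that the humans are not removed from the workflow; the rule simply reserves their attention for the cases that genuinely need more than structured data.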


[1] Alyssa Newcomb, “Artificial Intuition Wants to Guide Business Decisions. Can It Improve on ‘Going With Your Gut’?” Fortune, 11 July 2019.
[2] Valerie van Mulukom, “Is it rational to trust your gut feelings? A neuroscientist explains,” The Conversation, 16 May 2018.
[3] Michael C. Mankins and Lori Sherer, “Creating value through advanced analytics,” Bain Brief, 11 February 2015.
[4] Dan Clark, “Allow Artificial Intelligence to Supplement Decisions, Not Make Them,” Law.com, 4 March 2020.
[5] Megan Murphy, “Ginni Rometty on the End of Programming,” Bloomberg BusinessWeek, 20 September 2017.
[6] Newcomb, op. cit.
[7] Editorial Team, “State of AI Decision-Making,” insideBIGDATA, 27 February 2020.
[8] Chris Dowsett, “Data Science vs Decision Science,” Towards Data Science, 24 January 2020.
[9] Eric Colson, “What AI-Driven Decision Making Looks Like,” Harvard Business Review, 8 July 2019.