There is a well-known idiom cautioning against throwing good money after bad. John Pavlus (@johnpavlus) writes, “Any student of economics knows this basic rule, which states that rational agents should not take irrecoverable or ‘sunk’ costs into account when making decisions about present or future investments. Nevertheless, human beings break this rule all the time, succumbing to a cognitive bias known as the ‘sunk-cost fallacy.’ If you have ever sat through a bad movie because you did not want to ‘waste’ the money you paid for the ticket or finished a PhD program you lost interest in years ago because of all the work you had already done, you have made this mistake.”[1] Pavlus notes that a study conducted by Sandeep Baliga (@BaligaSandeep), a professor at the Kellogg Graduate School of Management, and Jeffrey Ely, a professor at Northwestern University, suggests sunk-cost thinking may be “hardwired” into the human brain. Baliga told Pavlus, “We’re programmed by instinct — it’s like wanting to eat fatty food or meat whenever you see it. We have to exert an enormous amount of self-control to avoid those tendencies.” Wouldn’t it be great if there were a way to avoid bad decisions that result in throwing good money after bad? Jennifer Lerner, Thornton F. Bradshaw Professor of Public Policy, Decision Science, and Management at Harvard’s Kennedy School, is trying to find ways to help decision makers avoid common mistakes and biases and, ultimately, make better decisions.[2]
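To make the rule concrete, here is a minimal sketch of a decision that ignores sunk costs. The function and the dollar figures are invented for illustration; the point is only that money already spent never appears in the comparison.

```python
# Hypothetical illustration of the sunk-cost rule: only future costs and
# expected future benefits enter the decision; money already spent does not.

def should_continue(expected_future_benefit: float,
                    remaining_cost: float,
                    sunk_cost: float) -> bool:
    """Return True if continuing is worthwhile for a rational agent."""
    _ = sunk_cost  # deliberately ignored; that is the whole point
    return expected_future_benefit > remaining_cost

# A project has already consumed $900,000 (sunk). Finishing it would cost
# another $300,000 and is expected to return $250,000.
print(should_continue(expected_future_benefit=250_000,
                      remaining_cost=300_000,
                      sunk_cost=900_000))  # False: stop, despite the $900,000
```

The sunk-cost fallacy is precisely the temptation to let the $900,000 already spent flip that answer.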
Applying decision science
According to Lerner, sunk-cost bias is only one way people end up making poor decisions. She notes, “We’ve conducted several experiments aimed at mitigating sunk-cost mistakes and we are also working on many other worrisome tendencies. There are probably 30 different errors and biases that even the smartest people fall victim to systematically.” Recently, Lerner has been working with the United States Navy. She notes, “In the domain of national security, stakes are especially high and commanders know it. They want to set the highest possible standards. If we academics can improve the accuracy of risk perception by even 10 percent, for example, we can save lives. And in fact, we can improve the accuracy of estimates by significantly more than 10 percent.” Although business decisions seldom involve life or death, good decision-making still makes a tremendous difference.
Bain analysts Michael C. Mankins and Lori Sherer (@lorisherer) assert that if you can improve a company’s decision-making, you can dramatically improve its bottom line. They explain, “The best way to understand any company’s operations is to view them as a series of decisions.”[3] They add, “We know from extensive research that decisions matter — a lot. Companies that make better decisions, make them faster and execute them more effectively than rivals nearly always turn in better financial performance. Not surprisingly, companies that employ advanced analytics to improve decision making and execution have the results to show for it.” One tool now available to leaders to help them improve decision-making is cognitive computing. Swamini Kulkarni explains, “Cognitive computing systems are used to find solutions to complex situations where answers are uncertain or ambiguous, using computerized models that simulate the human cognition process.”[4]
Augmenting human decision-making
A few years ago, analysts from Deloitte predicted that machine intelligence, rather than artificial intelligence (AI), would be the “next chapter in the advanced analytics journey.”[5] They noted that the “algorithmic capabilities” of machine intelligence (what many are now calling cognitive technology) “allow for improvement in employee performance, workload automation, and allow for the development of cognitive agents to help with human thinking and engagement.” Around the same time Deloitte analysts were making their predictions, Shailesh Manjrekar (@shail_manjrekar), Director of Product and Solutions at Western Digital, wrote, “With the ability to understand the context behind the content, cognitive computing is taking Big Data analytics from data-driven to value-driven. As we unleash more possibilities and opportunities for data through technological advances, we are able to not only turn data into information, but also to transform it into knowledge and even wisdom.”[6]
The term “cognitive computing” was coined by IBM. Ginni Rometty (@GinniRometty), IBM’s CEO, explains that the company wanted a term more conducive to human/machine collaboration. According to Rometty, AI seeks to match human intelligence, whereas cognitive computing seeks to augment it. She explains, “[When IBM coined the term cognitive computing] the idea was to help you and I make better decisions amid cognitive overload. That’s what has always led us to cognitive. If I considered the initials AI, I would have preferred augmented intelligence. It’s the idea that each of us are going to need help on all important decisions.”[3] At Enterra Solutions®, we define cognitive computing as the combination of semantic intelligence (i.e., machine learning, natural language processing, and ontologies) and computational intelligence (i.e., advanced mathematical techniques). Our cognitive system, the Enterra Cognitive Core™, can Sense, Think, Act, and Learn®. Like Rometty, I see cognitive computing mostly in an augmenting role.
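Purely as an illustration (and not a description of the Enterra Cognitive Core’s actual implementation), the Sense, Think, Act, and Learn pattern can be sketched as a simple feedback loop. Every class name, action, and payoff value below is hypothetical.

```python
import random

class ToyCognitiveAgent:
    """Illustrative only: a minimal Sense-Think-Act-Learn feedback loop."""

    def __init__(self, actions):
        self.actions = actions
        self.scores = {a: 0.0 for a in actions}  # learned value of each action

    def sense(self, environment: dict) -> dict:
        # Gather signals from the environment (passed through unchanged here).
        return environment

    def think(self, observation: dict) -> str:
        # Mostly exploit what has worked so far; occasionally explore alternatives.
        if random.random() < 0.2:
            return random.choice(self.actions)
        return max(self.scores, key=self.scores.get)

    def act(self, decision: str, environment: dict) -> float:
        # Execute the decision; the environment returns an outcome score.
        return environment["payoffs"][decision]

    def learn(self, decision: str, outcome: float) -> None:
        # Nudge the learned value of the chosen action toward the observed outcome.
        self.scores[decision] += 0.1 * (outcome - self.scores[decision])

env = {"payoffs": {"expedite": 0.8, "wait": 0.2}}
agent = ToyCognitiveAgent(actions=list(env["payoffs"]))
for _ in range(50):                     # repeated Sense-Think-Act-Learn cycles
    obs = agent.sense(env)
    decision = agent.think(obs)
    outcome = agent.act(decision, obs)
    agent.learn(decision, outcome)
print(agent.scores)                     # "expedite" should end up scored higher
```

Even in this toy form, the loop captures the augmentation idea: the learned scores summarize past outcomes so that a recommendation rests on accumulated evidence rather than memory or gut feel alone.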
Humans benefit from augmentation because their emotional states can adversely affect decision-making. For example, Lerner has found that specific emotions, such as happiness, anger, sadness, and fear, can alter decision processes in three ways: by changing the content of thought, the depth of thought, and the implicit goals activated. What about intuition? John Bargh, a psychology professor at Yale, concludes, “It’s a bad idea to rely on it when the consequences of our decision are dangerously high.”[7] According to Bargh, “Our emotional states change what our gut tells us.” For example, he states, “Say you are angry and tell someone off and think that is the truth. The next day you may be in a very different emotional state and the truth is different.” Augmenting intuition with data and advanced analytics can moderate the effects of bias and emotion. Gerrit Kazmaier (@gerritkazmaier), Executive Vice President for Analytics & Database and Data Management at SAP, explains, “Making the right choice requires a company to understand every aspect of its business — in the past, the present, and the future — and to recognize the value of the data available to them and what it tells them about their business. Ultimately, the aim of analytics within the enterprise should therefore not simply be to report on what has been, but to enable everyone at every level of an organization to make decisions with confidence.”[8]
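One simple way to picture that augmentation (a hypothetical sketch, not a method drawn from Lerner, Bargh, or Kazmaier) is to blend an intuitive estimate with a model’s estimate, weighted by how much trust the organization places in the model.

```python
# Hypothetical sketch: blending a human (intuitive) estimate with a
# data-driven model estimate. Weighting by relative confidence is one
# simple way analytics can temper the swings that emotion introduces.

def augmented_estimate(human_estimate: float,
                       model_estimate: float,
                       model_weight: float = 0.7) -> float:
    """Weighted blend of intuition and analytics (weights are illustrative)."""
    if not 0.0 <= model_weight <= 1.0:
        raise ValueError("model_weight must be between 0 and 1")
    return model_weight * model_estimate + (1 - model_weight) * human_estimate

# An angry manager's gut says demand will collapse to 400 units; the
# forecasting model, trained on historical data, predicts 950.
print(augmented_estimate(human_estimate=400, model_estimate=950))  # 785.0
```

In practice the weight would come from the model’s measured accuracy, but the point stands: the blended figure cannot swing as far as the gut estimate alone.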
Concluding thoughts
Manjrekar concluded, “The power of data is undeniable, and companies are finding ways to generate more value from it, enabled by the combination of people and machines. … Moving forward, it is important to turn data into information, knowledge and eventually wisdom.” Kazmaier adds, “From the boardroom to the shop floor, analytics has become a tool that can be accessed by everyone. As users and as people, we bring our own unique perspective to our analysis of the data. It is exactly this combination of such powerful artificial intelligence and the inherent creativity of the people who use it that ultimately enables us to make decisions faster and with greater confidence than ever before.” The combination of decision science and technology holds great promise for improved decision-making in both the business and public sectors.
Footnotes
[1] John Pavlus, “There’s actually reason to throw good money after bad,” Quartz, 23 May 2013.
[2] Michael Blanding, “The Decision Scientist: Kennedy School Professor Jennifer Lerner is teaching the military how to harness the science of judgment and decision making,” Harvard Kennedy School, Spring 2019.
[3] Michael C. Mankins and Lori Sherer, “Creating value through advanced analytics,” Bain Brief, 11 February 2015.
[4] Swamini Kulkarni, “Cognitive Computing Is Not Hype: It Is A Must-Have For Organisations,” Compare the Cloud, 24 July 2019.
[5] Justine Brown, “Deloitte: Machine intelligence, not AI, will be the next big thing,” CIO Dive, 9 February 2017.
[6] Shailesh Manjrekar, “How Cognitive Computing and Augmented Intelligence Is Changing The Data-Driven World,” Western Digital Blog, 6 February 2017.
[7] Elizabeth Bernstein, “Does Your Gut Always Steer You Right?” The Wall Street Journal, 9 October 2017.
[8] Gerrit Kazmaier, “From Augmented Analytics to Confident Decisions,” Manufacturing Business Technology, 2 May 2019.