
Artificial Intelligence, Cognitive Computing, and Decision-Making

December 1, 2021

As President and CEO of a cognitive computing firm, I am occasionally asked to explain the differences between artificial intelligence and cognitive computing. Since IBM coined the term “cognitive computing” to describe its Watson computing system a decade ago, there have been numerous attempts, with varied success, to explain the distinction. Former IBM CEO Ginni Rometty (@GinniRometty) stated, “[When IBM coined the term cognitive computing] the idea was to help you and I make better decisions amid cognitive overload. That’s what has always led us to cognitive. If I considered the initials AI, I would have preferred augmented intelligence. It’s the idea that each of us are going to need help on all important decisions.”[1] In other words, cognitive computing began as an effort to augment human decision-making. We live in a complex world, and decision-makers need a way to deal with that complexity. Cognitive computing can help.

 

That’s why I try to steer discussions away from arguments about the differences between AI and cognitive computing to focus on decision science. Enterra Solutions® is advancing Autonomous Decision Science™ (ADS™), which combines mathematical computation with semantic reasoning and symbolic logic. The Enterra ADS™ system analyzes data, automatically generates insights, makes decisions with the same kind of subtlety of judgment as an expert, and executes those decisions at machine speed with machine reliability. Like Rometty, I believe cognitive computing systems should be thought of as “augmented intelligence” systems whose goal is to help decision-makers make the best possible decision.

 

Cognitive Computing and Decision-Making

 

Cognitive computing systems provide “best” answers rather than “right” answers. The now-defunct Cognitive Computing Consortium explained:

 

“The cognitive computing system offers a synthesis not just of information sources but of influences, contexts, and insights. To do this, systems often need to weigh conflicting evidence and suggest an answer that is ‘best’ rather than ‘right’. Cognitive computing systems make context computable. They identify and extract context features such as hour, location, task, history or profile to present an information set that is appropriate for an individual or for a dependent application engaged in a specific process at a specific time and place. They provide machine-aided serendipity by wading through massive collections of diverse information to find patterns and then apply those patterns to respond to the needs of the moment.”
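
To make the Consortium’s description a bit more concrete, here is a minimal, hypothetical sketch in Python of what “making context computable” can look like: a handful of context features (hour, location, task, history) are extracted and used to score and rank candidate information items. The feature names, weights, and data structures below are illustrative assumptions, not a description of any particular product.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical sketch: "making context computable" by scoring candidate
# information items against a few extracted context features. Feature names
# and weights are illustrative assumptions, not a real product API.

@dataclass
class Context:
    hour: int                                   # e.g., datetime.now().hour
    location: str                               # e.g., "warehouse-12"
    task: str                                   # e.g., "replenishment-planning"
    history: set = field(default_factory=set)   # titles the user has already seen

@dataclass
class Item:
    title: str
    tags: set        # descriptive tags, e.g., {"replenishment", "night-shift"}
    locations: set   # locations the item is relevant to

def context_score(item: Item, ctx: Context) -> float:
    """Weight an item by how well it matches the current context."""
    score = 0.0
    if ctx.task in item.tags:
        score += 0.5   # task relevance matters most
    if ctx.location in item.locations:
        score += 0.3   # local information ranks higher
    is_night = ctx.hour >= 20 or ctx.hour < 6
    if ("night-shift" in item.tags) == is_night:
        score += 0.1   # time-of-day fit
    if item.title not in ctx.history:
        score += 0.1   # favor information the user has not already seen
    return score

def contextual_ranking(items: list, ctx: Context) -> list:
    """Return items ordered from most to least contextually relevant."""
    return sorted(items, key=lambda it: context_score(it, ctx), reverse=True)

if __name__ == "__main__":
    ctx = Context(hour=datetime.now().hour, location="warehouse-12",
                  task="replenishment-planning", history={"Last month's recap"})
    items = [
        Item("Tonight's replenishment exceptions", {"replenishment", "night-shift"}, {"warehouse-12"}),
        Item("Last month's recap", {"replenishment"}, {"warehouse-12"}),
        Item("Company picnic announcement", {"hr"}, set()),
    ]
    for it in contextual_ranking(items, ctx):
        print(f"{context_score(it, ctx):.2f}  {it.title}")
```

In a real system the weights would be learned or tuned rather than hand-coded, but the shape of the computation is the same: score candidates against the current context, then rank them for the person (or downstream application) doing the work.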

 

Mary Ann Richardson (@CMR_ExecAdv), an independent IT analyst at technology research firm CMR Executive Advisory, insists cognitive computing has two distinguishing characteristics: interaction with humans and the generation of contextual solutions.[2] She writes, “Cognitive computing systems are thinking, reasoning and remembering systems that work with humans to provide them with helpful advice in making decisions. Its insights are intended for human consumption. AI intends to use the best algorithm to come up with the most accurate result or action. It works without human input. … [In addition,] cognitive computing can take into consideration conflicting and changing information that fits contextually into the situation at hand. Its results come from using predictive and prescriptive analytics–not pre-trained algorithms. … In the end, AI uses algorithms to solve problems to come up with a final decision; cognitive computing provides the pertinent information that will allow humans to make the final decision for themselves.”

 

According to IBM Senior Vice President John E. Kelly (@johnkellyibm), we are entering the Era of Cognitive Computing — a transition he believes is inevitable. “If we don’t make this transition,” Kelly told an audience at an IBM conference back in 2013, “the data will be too big for us to have any impact on it. I think that this era of computing is going to be about scaling human capability. The separation between human and machine is going to blur in a very fundamental way.”[3] The fact that cognitive computing systems are designed primarily to augment, rather than replace, human decision-making is the most important distinction between AI and cognitive computing. Technology writer Jun Wu calls cognitive computing platforms “smart decision support systems.”[4] She adds, “With recent breakthroughs in technology, these decision support systems simply use better data [and] better algorithms to come up with a better analysis of vast stores of information. Therefore, cognitive computing refers to: Understanding and simulating reasoning [and] understanding and simulating human behavior.” She concludes, “Using cognitive computing systems, every day, we make better human decisions at work.”

 

When to Use Cognitive Computing

 

Sue Feldman (@susanfeldman), President of Synthexis and co-founder of the Cognitive Computing Consortium, provides some useful guidelines about when cognitive technologies are appropriate and when they aren’t.[5] Those guidelines are:

 

When to use:

 

• When problems are complex (i.e., information and situations are shifting and the outcome depends on context).
• When there are diverse, changing data sources (e.g., when using structured data with unstructured data, like text or images).
• When there is no clear right answer (i.e., when evidence is complex, conflicting or ambiguous).
• When multiple ranked, confidence-scored options are needed (a minimal sketch of this idea follows these lists).
• When unpredictability makes processing intensive and difficult to automate.
• When context-dependent information is desired, based on time, user, location, or point in task.
• When exploration or work across silos is a priority.

 

When not to use:

 

• When predictable, repeatable results are required (e.g., sales reports, inventory tracking).
• When all data is structured, numeric and predictable.
• When human-machine natural language interaction is not necessary.
• When a probabilistic approach is not desirable.
• When shifting views and answers are not appropriate or are indefensible due to industry regulations.
• When existing transactional systems are adequate.
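
As a rough illustration of the “multiple ranked, confidence-scored options” case mentioned above, the following minimal Python sketch assumes a simple additive evidence model: each source contributes a weight to the option it supports, and the system returns every option with a normalized confidence score instead of a single answer. The options, sources, and weights are made up for illustration.

```python
from collections import defaultdict

# Hypothetical sketch of returning ranked, confidence-scored options rather
# than a single "right" answer. Sources and weights are illustrative
# assumptions; a real system would learn or calibrate them.

def score_options(evidence: list) -> list:
    """
    evidence: list of (option, weight) pairs, where different sources may
    support different, even conflicting, options with different strengths.
    Returns options ranked by normalized confidence.
    """
    totals = defaultdict(float)
    for option, weight in evidence:
        totals[option] += weight
    grand_total = sum(totals.values()) or 1.0
    ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
    return [(option, weight / grand_total) for option, weight in ranked]

if __name__ == "__main__":
    # Conflicting evidence about why a product's demand spiked.
    evidence = [
        ("promotion drove the spike", 0.9),   # sales data
        ("weather drove the spike", 0.6),     # external signal
        ("promotion drove the spike", 0.4),   # analyst judgment
        ("data error", 0.2),                  # data-quality check
    ]
    for option, confidence in score_options(evidence):
        print(f"{confidence:.2f}  {option}")
```

The arithmetic is deliberately trivial; the point is the output shape: ranked alternatives with confidence scores that a human can weigh before making the final decision.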

 

As Radhika Subramanian (@Radhika4Real), Co-founder and CEO of Emcien, notes, “The primary reason for implementing these kinds of solutions is efficiency, applying the processing power of computers to the kinds of complex problems that are usually left to humans. … Instead of building smarter computers to bring miraculous solutions, most of these problems are better solved by smarter application of computing.”[6] She adds, “[There has been] a dramatic increase in complexity because the increased number of variables to consider is so much higher. That kind of complexity requires new approaches, and the dynamic nature of the decisions that have to be made doesn’t permit months of research to reach an acceptable level of accuracy.”

 

Concluding Thoughts

 

Gary Fowler, CEO and Co-Founder of GSDVS.com and Yva.ai, insists, “As a next-generation solution that’s already a reality, cognitive computing stands out with a number of attributes that make it a viable investment for businesses undergoing a digital transformation and seeking more growth, engagement and productivity. … Cognitive systems are tools designed to help humans — such as employees or company leaders or even customers — make better-informed decisions.”[7] Maruti Techlabs analysts add, “Despite all the challenges and hurdles, the benefits of cognitive technology cannot be overlooked. … Cognitive technology is sure to revolutionize multiple industry segments in the years to come.”[8] In fact, a growing number of analysts believe the adoption of cognitive technologies is becoming essential for business survival. Thomas H. Davenport (@tdav), a distinguished professor at Babson College, and Deloitte LLP managing directors Jeff Loucks and David Schatsky, conclude, “Companies cannot afford to bide their time while competitors potentially move forward with the technologies. … Now is the time for organizations to define AI-powered business use cases and outcomes that can deliver measurable ROI.”[9]

 

Footnotes
[1] Megan Murphy, “Ginni Rometty on the End of Programming,” Bloomberg BusinessWeek, 20 September 2017.
[2] Mary Ann Richardson, “Cognitive Computing vs. AI: 3 Key Differences and Why They Matter,” Toolbox, 23 September 2021.
[3] Audrey Quinn, “IBM research stakes its future on cognitive computing,” ZDNet, 2 October 2013.
[4] Jun Wu, “AI and Cognitive Computing,” Digital Leaders, 6 June 2019.
[5] Amber Lee Dennis, “Cognitive Computing Demystified: The What, Why, and How,” Dataversity, 15 February 2017.
[6] Radhika Subramanian, “Artificial Intelligence and Cognitive Computing,” Data Science Central, 4 March 2016.
[7] Gary Fowler, “Understanding Cognitive Cloud Computing And Its Potential Impact On Business,” Forbes, 24 February 2021.
[8] Staff, “What are the use cases and advantages of Cognitive Computing?” Maruti Techlabs, 2020.
[9] Thomas H. Davenport, Jeff Loucks, and David Schatsky, “Early Adopters Bullish on Business Value of Cognitive,” The Wall Street Journal, 11 January 2018.
