
Cognitive Computing in the Digital Era

Stephen DeAngelis

March 02, 2021

People struggle to label, define, and describe new technologies. For example, when the car was first developed, people called it a horseless carriage. Today, we find the same phenomenon occurring with driverless cars. If the term “automobile” wasn’t already taken, it might well be the term used to describe autonomous vehicles. It should come as no surprise, therefore, that when people talk about cutting-edge computing capabilities, many of them are unhappy with terms like artificial intelligence (AI), machine learning (ML), and cognitive computing. When Alan Turing started considering thinking machines, he called them “computing machinery.” He began his seminal paper, “Computing Machinery and Intelligence,” by stating, “I propose to consider the question, ‘Can machines think?’”[1] Turing didn’t believe defining the terms “machine” and “think” would accomplish much, so he developed what became known as the Turing Test. He concluded his paper by writing, “We may hope that machines will eventually compete with men in all purely intellectual fields. … We can only see a short distance ahead, but we can see plenty there that needs to be done.” And, over the years, much has been done. Nevertheless, we still struggle with labels.


Intelligence is defined as the ability to acquire and apply knowledge and skills. If a machine can do that, then the intelligence isn’t artificial; it’s real — perhaps machine intelligence would be a better term. However, the term intelligence, in the minds of many people, involves sentience (i.e., the capacity to feel, perceive, or experience subjectively — to be self-aware). In hopes of avoiding metaphysical arguments, IBM coined the term “cognitive computing.” Yet when something is cognitive, it involves conscious intellectual activity (such as thinking, reasoning, or remembering), and no one asserts that cognitive computing systems are “conscious” in any meaningful way. Former IBM CEO Ginni Rometty (@GinniRometty) explained why IBM selected the term “cognitive computing.” She stated, “[When IBM coined the term cognitive computing] the idea was to help you and I make better decisions amid cognitive overload. That’s what has always led us to cognitive. If I considered the initials AI, I would have preferred augmented intelligence. It’s the idea that each of us are going to need help on all important decisions.”[2]

The importance of cognitive computing

Since no viable terms have been developed to replace artificial intelligence, machine learning, or cognitive computing, I suspect we will be using them for a long time. As President and CEO of a cognitive computing firm, I want to focus on the capabilities of cognitive computing systems. Sue Feldman (@susanfeldman), President of Synthexis and co-founder of the defunct Cognitive Computing Consortium, writes, “When we first attempted to define cognitive computing, we found clear differences between it and AI. We posited that for software to be considered a new type of computing — ‘cognitive,’ it must solve problems that were insoluble today. This new class of problem has no precise answers. Instead, it is open to interpretation — it is ambiguous or has no one right answer that is amenable to computation.”[3] In other words, cognitive computing systems can help decision-makers with the types of challenges they face every day.


Feldman adds, “For an application to be considered ‘cognitive,’ … we proposed that it be adaptive, interactive, iterative, stateful, and, above all, contextual. In the … years since cognitive computing appeared, however, our understanding of the missing pieces that this new kind of computing might provide has evolved. We have found that different depths of analysis are required for different purposes. This means that potential users and buyers must define what their purpose is in investing in a new technology.” Feldman also offers helpful guidelines about when cognitive technologies are appropriate and when they aren’t.[4]


When to use cognitive computing:

  • When problems are complex (i.e., information and situations are shifting and the outcome depends on context).
  • When there are diverse, changing data sources (e.g., when using structured data with unstructured data, like text or images).
  • When there is no clear right answer (i.e., when evidence is complex, conflicting or ambiguous).
  • When multiple ranked, confidence-scored options are needed (see the sketch following these lists).
  • When unpredictability makes processing intensive and difficult to automate.
  • When context-dependent information is desired, based on time, user, location, or point in task.
  • When exploration or work across silos is a priority.

When not to use cognitive computing:

  • When predictable, repeatable results are required (e.g., sales reports, inventory tracking).
  • When all data is structured, numeric and predictable.
  • When human-machine natural language interaction is not necessary.
  • When a probabilistic approach is not desirable.
  • When shifting views and answers are not appropriate or are indefensible due to industry regulations.
  • When existing transactional systems are adequate.
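
To make the “multiple ranked, confidence-scored options” bullet concrete, here is a minimal sketch in Python. The candidate answers, evidence signals, and simple averaging rule are all assumptions invented for illustration, not any vendor’s actual scoring method; the point is only the shape of the output: several plausible answers, each with a confidence, instead of a single “right” one.

```python
from dataclasses import dataclass

@dataclass
class Option:
    answer: str
    confidence: float  # 0.0-1.0: how strongly the evidence supports this answer

def rank_options(evidence_scores: dict[str, list[float]]) -> list[Option]:
    """Average each candidate's per-signal scores and return the
    candidates ranked by overall confidence, best first."""
    options = [
        Option(answer, sum(scores) / len(scores))
        for answer, scores in evidence_scores.items()
    ]
    return sorted(options, key=lambda o: o.confidence, reverse=True)

# Ambiguous evidence: three candidate explanations for a late shipment,
# each supported to a different degree by three independent signals.
evidence = {
    "supplier delay":   [0.9, 0.7, 0.8],
    "demand spike":     [0.6, 0.8, 0.5],
    "data entry error": [0.2, 0.3, 0.4],
}

for option in rank_options(evidence):
    print(f"{option.answer}: confidence {option.confidence:.2f}")
```

A decision-maker sees every plausible explanation with its supporting confidence and makes the final call; that is the ambiguity-tolerant behavior the lists above describe.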

Cognitive computing in the Digital Era

The Digital Era is characterized by the availability of massive amounts of data. Yossi Sheffi (@YossiSheffi), the Elisha Gray II Professor of Engineering Systems at MIT, asserts, “The well-worn adage that a company’s most valuable asset is its people needs an update. Today, it’s not people but data that tops the asset value list for companies.”[5] Because data comes from a variety of sources and in a multitude of formats, it needs to be digitized. “How do enterprises gain the most from this valuable resource?” asks tech writer Kamalika Some. “The answer is through Data Digitalization. As the name suggests, data digitization is the process by which physical or manual data files like text, audio, images, video are converted into digital forms.”[6] She goes on to note, “Leveraging the power of cognitive computing algorithms, enterprises can synthesize raw data from various information sources, weigh multiple options to arrive at conclusive answers. To achieve this, cognitive systems encapsulate self-learning models using data mining, pattern recognition and natural language processing (NLP) algorithms.” As Feldman noted above, however, cognitive computing systems don’t provide “conclusive answers”; they provide the best answer rather than the right one.
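
To illustrate the synthesis Some describes, here is a minimal sketch, assuming an invented free-text shipment note and a matching ERP record. Production cognitive systems use trained NLP and pattern-recognition models rather than regular expressions, but the flow is the same in miniature: unstructured text in, structured fields out, merged with transactional data.

```python
import re

# Unstructured input: a free-text note as it might arrive from a scanning/OCR step.
# Both the note and the ERP record below are invented for illustration.
note = "Order #A-1042 delayed; carrier reports arrival Tuesday, qty 350 units."

# Structured input: the matching record from a transactional system.
erp_record = {"order_id": "A-1042", "expected_qty": 400}

def extract_fields(text: str) -> dict:
    """Toy pattern-recognition step: pull an order id and a quantity
    out of free text with regular expressions."""
    order = re.search(r"#([A-Z]-\d+)", text)
    qty = re.search(r"qty\s+(\d+)", text)
    return {
        "order_id": order.group(1) if order else None,
        "reported_qty": int(qty.group(1)) if qty else None,
        "delayed": "delay" in text.lower(),
    }

extracted = extract_fields(note)

# Synthesis step: combine the structured and unstructured views of the same order.
if extracted["order_id"] == erp_record["order_id"]:
    shortfall = erp_record["expected_qty"] - extracted["reported_qty"]
    print(f"Order {extracted['order_id']}: delayed={extracted['delayed']}, "
          f"expected shortfall of {shortfall} units")
```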


Enterra’s cognitive computing system — the Enterra Cognitive Core™, a system that can Sense, Think, Act, and Learn® — like all cognitive computing systems, helps business leaders make decisions when confronted with ambiguous data. SAS analysts Alison Bolen, Hui Li, and Wayne Thompson observe, “Cognitive computing brings with it a promise of genuine, human-to-machine interaction. When machines become cognitive, they can understand requests, connect data points and draw conclusions. They can reason, observe and plan.”[7] Like any technology, cognitive computing requires a business case; finding one, however, isn’t difficult, because every business needs to make decisions. In fact, Bain analysts Michael C. Mankins and Lori Sherer assert, “The best way to understand any company’s operations is to view them as a series of decisions.”[8] They add, “We know from extensive research that decisions matter — a lot. Companies that make better decisions, make them faster and execute them more effectively than rivals nearly always turn in better financial performance. Not surprisingly, companies that employ advanced analytics to improve decision making and execution have the results to show for it.”
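
Read as a feedback loop, the Sense, Think, Act, and Learn® pattern has a simple generic shape. The skeleton below is an illustrative Python sketch of that loop, not Enterra’s implementation; the toy inventory model, observations, and update rule are all assumptions made for the example.

```python
from typing import Callable

def cognitive_loop(sense: Callable[[], dict],
                   think: Callable[[dict], str],
                   act: Callable[[str], float],
                   learn: Callable[[str, float], None],
                   cycles: int = 3) -> None:
    """Generic sense-think-act-learn control loop: observe, decide,
    execute, and feed the measured outcome back into the model."""
    for _ in range(cycles):
        observation = sense()          # Sense: gather data from the environment
        decision = think(observation)  # Think: weigh options, pick the best-scored one
        outcome = act(decision)        # Act: execute and measure the result
        learn(decision, outcome)       # Learn: adjust the model from feedback

# Toy bindings so the skeleton runs end to end; all scores are invented.
model = {"restock": 0.5, "hold": 0.5}

cognitive_loop(
    sense=lambda: {"inventory": 120, "demand_trend": "rising"},
    think=lambda obs: max(model, key=model.get),
    act=lambda decision: 1.0 if decision == "restock" else 0.2,
    learn=lambda decision, outcome: model.update(
        {decision: 0.8 * model[decision] + 0.2 * outcome}
    ),
)
print(model)  # "restock" drifts upward as its outcomes score well
```

Each cycle nudges the model toward decisions whose measured outcomes scored well; that feedback is the “Learn” half of the loop.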

Concluding thoughts

Once people stop worrying about definitions and focus on capabilities, cognitive computing systems demonstrate their worth. Feldman observes, “Cognitive applications and platforms have seen a marked evolution that has given us a mixture of features that solve problems rather than insist solely on purity of design. They are purpose-built experiments, mixing cognitive, AI, machine learning, and new kinds of interfaces in order to address a specific purpose or need.” As organizations transform into digital enterprises, the need for cognitive computing capabilities will become even more apparent.


Footnotes
[1] Alan M. Turing, “Computing Machinery and Intelligence,” Mind 59 (1950): 433–460.
[2] Megan Murphy, “Ginni Rometty on the End of Programming,” Bloomberg Businessweek, 20 September 2017.
[3] Sue Feldman, “Cognitive computing and AI begin to grow together,” Information & Data Management, 26 March 2020.
[4] Amber Lee Dennis, “Cognitive Computing Demystified: The What, Why, and How,” Dataversity, 15 February 2017.
[5] Yossi Sheffi, “What is a Company’s Most Valuable Asset? Not People,” Supply Chain @ MIT, 20 December 2018.
[6] Kamalika Some, “Cognitive AI and the Power of Intelligent Data Digitalization,” Analytics Insight, 19 July 2020.
[7] Alison Bolen, Hui Li, and Wayne Thompson, “Becoming Cognitive: Understanding cognitive computing versus artificial intelligence, deep learning and machine learning,” Longitudes, 3 April 2017.
[8] Michael C. Mankins and Lori Sherer, “Creating value through advanced analytics,” Bain Brief, 11 February 2015.

