Precision in language can be extremely important. Thomas Hornigold (@physicspod), a physics student at the University of Oxford, believes it’s time we use more precise language when discussing artificial intelligence (AI). “There are,” he notes, “various definitions of artificial intelligence.”[1] He explains:
“There’s the cultural idea (from films like Ex Machina) of a machine that has human-level artificial general intelligence. But human-level intelligence or performance is also seen as an important benchmark for those that develop software that aims to mimic narrow aspects of human intelligence, for example, medical diagnostics. The latter software might be referred to as narrow AI, or weak AI. Weak it may be, but it can still disrupt society and the world of work substantially. Then there’s the philosophical idea, championed by Ray Kurzweil, Nick Bostrom, and others, of a recursively-improving superintelligent AI that eventually compares to human intelligence in the same way as we outrank bacteria. Such a scenario would clearly change the world in ways that are difficult to imagine and harder to quantify; weighty tomes are devoted to studying how to navigate the perils, pitfalls, and possibilities of this future. The ones by Bostrom and Max Tegmark epitomize this type of thinking. This, more often than not, is the scenario that Stephen Hawking and various Silicon Valley luminaries have warned about when they view AI as an existential risk.”
I agree it is important for business leaders to understand the differences among various AI technologies.
Weak, Strong, and General Artificial Intelligence
Smita Kumari, a management professional, notes, “The increasing use of artificial intelligence and related technology requires clear understanding of the technologies jargons. … Understanding the difference between AI and Cognitive computing helps to decide which one is best for your business.”[2] Vidushi Vij, a digital marketing professional at Oodles Technologies, agrees. She writes, “To know the difference between an AI-powered platform and Cognitive powered platform is important for any business.”[3] Important as knowing the difference between AI and cognitive computing may be, Vij nonetheless laments, “Cognitive Computing is little difficult to explain.” Ginni Rometty (@GinniRometty), IBM’s CEO, explains that the difference between AI and cognitive computing begins with intent: AI seeks to match human intelligence, whereas cognitive computing seeks to augment it. Rometty explains, “The idea was to help you and I make better decisions amid cognitive overload. That’s what has always led us to cognitive. If I considered the initials AI, I would have preferred augmented intelligence. It’s the idea that each of us are going to need help on all important decisions.”[4] Kumari agrees with that differentiation. “AI will solve the problem fully,” she writes, “whereas cognitive computing acts as supplement to make more informed decisions.” Cognitive computing is already a practical reality, whereas artificial general intelligence remains aspirational. As I see it, there are currently three levels of AI being developed:
- Weak AI: Wikipedia states: “Weak artificial intelligence (weak AI), also known as narrow AI, is artificial intelligence that is focused on one narrow task.” In other words, weak AI was developed to handle/manage a small and specific data set to answer a single question. Its perspective is singular, resulting in tunnel vision.
- Strong AI: Strong AI originally referred to Artificial General Intelligence (i.e., a machine with consciousness, sentience and mind), “with the ability to apply intelligence to any problem, rather than just one specific problem.” Today, however, there are cognitive systems that fall short of AGI but far surpass weak AI. These systems were developed to handle/manage large and varied data sets to answer a multitude of questions in a variety of categories. This is the category into which cognitive computing falls. Cognitive AI can deal with ambiguities whereas weak AI cannot.
- General AI: The AGI Society notes the ultimate goal of AGI is to develop “thinking machines” (i.e., “general-purpose systems with intelligence comparable to that of the human mind”).
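The narrowness of weak AI can be made concrete with a toy example. The sketch below (illustrative only; the task, vocabulary, and rules are invented) shows a system built to answer exactly one question about one small data set — the “tunnel vision” described above:

```python
# A toy "weak AI": one narrow task (flagging negative product reviews),
# one small vocabulary, and no ability to handle anything outside that task.
NEGATIVE_CUES = {"broken", "refund", "terrible", "late", "defective"}

def is_negative_review(text: str) -> bool:
    """Answer a single question: does this review look negative?"""
    words = set(text.lower().split())
    # The system "knows" only these cue words; any question other than
    # "is this review negative?" is simply outside its competence.
    return bool(words & NEGATIVE_CUES)

print(is_negative_review("The package arrived late and the item was broken"))  # True
print(is_negative_review("Great product, works as described"))                 # False
```

Ask this system anything else — even a closely related question such as “why are reviews negative?” — and it has no way to respond; that inability to handle ambiguity or adjacent questions is what separates narrow AI from the cognitive systems discussed next.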
The important thing to remember about cognitive computing is that it can function in ambiguous situations — the kind of situations in which individuals and companies often find themselves.
Strong AI (aka cognitive computing) is practical AI
Most businesses don’t require artificial general intelligence to be successful. Practical AI, which will never threaten humankind, offers most of what companies need. Jasmine Henry (@jasminehenry10) calls these cognitive technologies “applied AI,” and she insists such technologies are now a business necessity.[5] “While it may be too soon to declare that AI has become a commodity,” she writes, “research reveals that the majority of enterprises are using AI, machine learning or cognitive technologies in some capacity.” She goes on to note:
“A recent Harvard Business Review survey of 250 executives on the results of applied AI in the enterprise revealed the following commonly cited business benefits:
- 51 percent cited enhancing product features, functions and performance
- 36 percent cited optimizing internal operations
- 36 percent cited automation
- 35 percent cited superior decision-making
- 32 percent cited new products
For many, unlocking the remarkable potential of AI for customer satisfaction and productivity use cases requires operationalizing these technologies.”
I define cognitive computing as the combination of semantic intelligence (which involves Natural Language Processing, machine learning, and ontologies) and computational intelligence (advanced mathematics and analytics). Because cognitive platforms use Natural Language Processing, they can interface with users without requiring them to be programming or analytics experts. In my discussions with clients, I’ve found they need embedded analytics that perform the traditional roles of three types of experts:
- A business domain expert — the customer of the analysis who can help explain the drivers behind data anomalies and outliers.
- A statistical expert — who helps formulate the correct statistical studies; the business expert knows what they want to study, and the statistical expert knows how to structure the analysis so it will detect the desired phenomena.
- A data expert — the data expert understands where and how to pull the data from across multiple databases or data feeds.
A cognitive computing platform that leverages embedded analytics, like Enterra’s Artificial Intelligence Learning Agent™ (AILA®), lets users ask questions and receive answers in language they understand. It’s strong and it’s practical.
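The way embedded analytics can stand in for those three expert roles can be sketched in a few lines. The example below is purely illustrative — the data, function names, and routing rule are invented, and the source does not describe AILA’s actual implementation — but it shows the shape of the idea: a plain-language question is routed to a statistical routine over stored data, and the answer comes back in plain language.

```python
# Hypothetical sketch of an embedded-analytics layer mapping a plain-language
# question onto the three expert roles described above. All names, data, and
# rules here are invented for illustration.
from statistics import mean, stdev

# "Data expert" role: knowing where the data lives (toy monthly sales here).
SALES = {"Jan": 100, "Feb": 104, "Mar": 98, "Apr": 200, "May": 102}

def find_outliers(data: dict, z_threshold: float = 1.5) -> list:
    """'Statistical expert' role: flag values more than z_threshold
    sample standard deviations from the mean."""
    values = list(data.values())
    mu, sigma = mean(values), stdev(values)
    return [k for k, v in data.items() if abs(v - mu) / sigma > z_threshold]

def answer(question: str) -> str:
    """'Business expert' framing: recognize one question pattern and
    route it to the appropriate analysis."""
    if "unusual" in question.lower() or "outlier" in question.lower():
        months = find_outliers(SALES)
        return f"Unusual months: {', '.join(months) or 'none'}"
    return "Sorry, I can only answer questions about unusual sales months."

print(answer("Which months had unusual sales?"))  # Unusual months: Apr
```

A real cognitive platform replaces the keyword matching with genuine Natural Language Processing and the single z-score test with a library of statistical methods, but the division of labor — data access, statistical judgment, and business framing behind one conversational interface — is the same.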
Summary
To leverage AI capabilities, business leaders need to understand how their organizations can practically apply weak and strong AI. Henry concludes, “AI technology has become startlingly commonplace. The organizations that wind up ahead during the AI revolution are likely to be those able to compete on strategy, rather than those appearing with the shiniest technologies. Applied artificial intelligence is a reality, and executives are wise to consider how operationalizing AI technologies and delivering smarter AI-driven services could yield advantages.”
Footnotes
[1] Thomas Hornigold, “Why We Need to Fine-Tune Our Definition of Artificial Intelligence,” Singularity Hub, 20 June 2018.
[2] Smita Kumari, “AI or Cognitive Computing: Understanding the difference between the two,” House of Bots, 28 May 2018.
[3] Vidushi Vij, “Are Cognitive Computing And AI The Same Thing,” Oodles Technologies Blog, 28 May 2018.
[4] Megan Murphy, “Ginni Rometty on the End of Programming,” Bloomberg BusinessWeek, 20 September 2017.
[5] Jasmine Henry, “Applied artificial intelligence is no longer an advantage: It’s a necessity,” Mobile Business Insights, 15 February 2018.