
Defining Terms in the World of Artificial Intelligence

April 19, 2017


There is, and likely will remain, confusion about terms used in the field of artificial intelligence, terms like artificial intelligence (AI), machine learning, and cognitive computing. Having a basic understanding of what these terms mean is important because we are entering an era that will largely be defined by smart, AI-related technologies. “After decades of unfulfilled promises and hype,” writes former IBM executive Irving Wladawsky-Berger, “AI seems to be reaching a tipping point. The necessary ingredients are finally coming together: lots and lots of data, with the volume of data pouring in expected to double every three years or so; advanced machine learning algorithms that extract insights and learn from all that data; drastically lowered technology costs for collecting, storing and analyzing these oceans of information; and access to an increasing variety of data-driven, cloud-based AI applications.”[1] Randy Bean (@RandyBeanNVP), CEO and Founder of NewVantage Partners, adds, “While Big Data has been driving … disruption over the course of the past five years, AI and machine learning are seen to be rapidly emerging avenues for innovation and disruption in the decade ahead. … With a plurality of executives envisioning a decade of accelerating disruption, corporations face a challenge, and an opportunity, to respond to a dynamic and changing business world or run the risk of falling behind. Big Data and forms of AI such as machine learning will begin to merge, further accelerating the pace of change.”[2]


Artificial Intelligence is an Umbrella Term


Analysts from Nanalyze report, “The vast majority of nearly 2,000 experts polled by the Pew Research Center in 2014 said they anticipate robotics and artificial intelligence will permeate wide segments of daily life by 2025. … But just what is artificial intelligence?”[3] They answer that question with a definition proffered in a U.S. Government report called Preparing for the Future of Artificial Intelligence: “Artificial intelligence is a computerized system that exhibits behavior that is commonly thought of as requiring intelligence.” They add, “Understanding the world of AI only begins with a simple artificial intelligence definition. There’s a whole universe of terminology we need to explore in order to understand the domain.” Aleks Buczkowski (@abuczkowski) agrees there is a need for some clarity. He writes, “You hear these buzzwords almost every day but what is the actual meaning of Artificial Intelligence, Machine Learning, and Deep Learning … Are these terms related and overlapping? What’s the technology behind it?”[4] Deloitte analysts David Schatsky (@dschatsky), Craig Muraskin, and Ragu Gurumurthy assert, “The field of AI suffers from both too few and too many definitions.”[5] I hold no false hope that this article will sort things out, but below you will find some definitions that could help you better understand the field of artificial intelligence.


A Brief Glossary of Terms


Machine Learning: Buczkowski asserts, “Researchers tried many different approaches to creating AI, but today the only area that brings promising and relevant results is called Machine Learning.” That is a bit of an overstatement, but he is correct that machine learning is making great strides. Nanalyze analysts explain, “Machine learning is about how computers with artificial intelligence can improve over time using different algorithms (a set of rules or processes), as it is fed more data.” Ryan Young adds, “Machine learning is a subdivision of AI that involves machines deciphering data and learning for themselves. It’s used a lot throughout the businesses of today as [it] is very efficient when used in areas such as speech, object, and facial recognition, translation, and other tasks. Programs that use machine learning can learn to recognize patterns on their own and make predictions based on what it’s learned.”[6] Steven Norton asserts, “When people talk about artificial intelligence, they usually are referring to one of its subfields: machine learning. While AI concerns itself with making machines think like humans, machine learning has a narrower purpose: enabling computers to learn from data with minimal programming.”[7]
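
To make that idea concrete, here is a minimal sketch of a model whose predictions improve as it is fed more data. It uses the open-source scikit-learn library and its bundled handwritten-digits dataset purely for illustration; it is not a depiction of any tool mentioned above.

```python
# A minimal machine learning sketch: a classifier learns patterns from
# labeled examples, and its accuracy improves as it sees more data.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)  # images of handwritten digits
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

for n in (50, 200, 1000):  # train on progressively more examples
    model = LogisticRegression(max_iter=2000)
    model.fit(X_train[:n], y_train[:n])  # learn patterns from n examples
    print(f"trained on {n} examples -> accuracy {model.score(X_test, y_test):.2f}")
```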


Cognitive Computing: I define cognitive computing as a combination of semantic reasoning (i.e., the use of machine learning, natural language processing, and ontologies) and computational intelligence (i.e., advanced analytics). There are, however, a number of approaches that fall under the cognitive computing rubric. Most of those approaches involve machine learning, natural language processing, and advanced analytics. The analysts from Nanalyze write, “Cognitive computing is one of those terms that has fairly recently entered the AI lexicon. One of the definitions for cognitive computing … that seems to be prolific around the inter-webs sums it up thus: ‘Cognitive computing involves self-learning systems that use data mining (i.e., big data), pattern recognition (i.e., machine learning) and natural language processing to mimic the way the human brain works.’ Katherine Noyes writes in Computerworld that cognitive computing ‘deals with symbolic and conceptual information rather than just pure data or sensor streams, with the aim of making high-level decisions in complex situations.'” Jenna Hogue adds, “The types of problems [involved in cognitive computing] … tend to be much more complex and human-like than the average non-cognitive system. These problems tend to comprise multiple different variables included, shifting data elements, and an ambiguous nature.”[8]
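
The ingredients cited above, natural language processing plus self-learning pattern recognition, can be caricatured in a few lines of code. This is only a toy sketch built on scikit-learn with invented example texts and labels; it is not a description of any production cognitive system.

```python
# Toy "cognitive" pipeline: NLP turns raw text into features, and a
# self-learning classifier recognizes patterns in those features.
# The documents and labels below are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = [
    "shipment delayed by port congestion",
    "invoice paid on time, no issues",
    "carrier reports damaged goods on arrival",
    "order delivered early and intact",
]
labels = ["problem", "ok", "problem", "ok"]

pipeline = make_pipeline(TfidfVectorizer(), MultinomialNB())
pipeline.fit(texts, labels)  # learn from past, labeled examples
print(pipeline.predict(["truck broke down, goods delayed"]))  # likely ['problem']
```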


Neural Networks and Deep Learning: Some experts try to make a distinction between Neural Networks and Deep Learning; however, the terms are often used synonymously. For example, Norton writes, “The hottest field in artificial intelligence today is a branch of machine learning known as deep learning. It uses complex algorithms — essentially a set of instructions for solving a particular problem — to perform more abstract tasks such as recognizing images. A well known deep learning tool is the neural network, which roughly tries to mimic the operations of a human brain.” And Buczkowski writes, “Although Artificial Neural Networks have been around for a long time, only in the last few years the computing power and the ability to use vector processing from GPUs enabled building networks with much larger and deeper layers than it was previously possible and it brought amazing results. Although there is no clear border between the terms, that area of Machine Learning is often described as Deep Learning.” As both Norton and Buczkowski note, neural networks are a type of machine learning. Nanalyze analysts explain, “Neural networks are superficially based on how the brain works. There are different kinds of neural networks — feed forward and recurrent are a couple terms that you may encounter — but basically they consist of a set of nodes (or neurons) arranged in multiple layers with weighted interconnections between them. Each neuron combines a set of input values to produce an output value, which in turn is passed on to other neurons downstream.” Each layer refines its input and passes a more precise output to the next layer. Nanalyze analysts conclude, “Deep learning is simply a larger neural network. Deep learning networks typically use many layers — sometimes more than 100 — and often use a large number of units at each layer, to enable the recognition of extremely complex, precise patterns in data.”
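
The layered structure Nanalyze describes is easy to sketch. The toy forward pass below (plain NumPy, with random rather than learned weights) shows neurons combining weighted inputs and passing their outputs downstream; real deep learning frameworks do this at far larger scale and, crucially, learn the weights from data.

```python
# A toy feed-forward pass: each neuron combines a set of weighted input
# values and passes its output downstream to the next layer. The weights
# here are random; in practice they are learned during training.
import numpy as np

rng = np.random.default_rng(0)

def layer(inputs, weights, biases):
    # per neuron: output = nonlinearity(weighted sum of inputs + bias)
    return np.tanh(inputs @ weights + biases)

x = rng.normal(size=4)                          # 4 input values
w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)   # layer 1: 4 -> 8 neurons
w2, b2 = rng.normal(size=(8, 3)), np.zeros(3)   # layer 2: 8 -> 3 neurons

hidden = layer(x, w1, b1)       # first layer refines the raw input
output = layer(hidden, w2, b2)  # second layer produces the final values
print(output)
```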


Natural Language Processing: Natural language processing (NLP) is involved in both inputs to computers and outputs to users. As the name implies, NLP enables computers to understand and use everyday human language to find solutions to problems. Since much of the data being created today is unstructured (i.e., it comes from sources other than structured databases, such as social media posts, news articles, and videos), NLP is essential for analyzing it. And since many computer users are not data scientists, NLP can be used to explain results in terms understandable to non-technical users. The analysts at Nanalyze explain, “Natural language processing, as defined by aitopics.org, ‘enables communication between people and computers and automatic translation to enable people to interact easily with others around the world.'”
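
As a trivial illustration of treating unstructured text as analyzable data, the sketch below (plain Python, with invented sample sentences) tokenizes free-form text and counts word frequencies; real NLP systems build parsing, entity recognition, and translation on top of such basics.

```python
# A common first step in NLP: turn unstructured text into countable
# tokens. The sample "documents" are invented for illustration.
import re
from collections import Counter

documents = [
    "Shipping costs rose sharply this quarter.",
    "Analysts expect shipping volumes to rise again next quarter.",
]

tokens = []
for doc in documents:
    tokens.extend(re.findall(r"[a-z']+", doc.lower()))  # lowercase word tokens

print(Counter(tokens).most_common(3))  # e.g., [('shipping', 2), ('quarter', 2), ...]
```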


Summary


J-P De Clerck (@conversionation) writes, “Deep learning, image recognition, hypothesis generation, artificial neural networks, they’re all real and parts are used in various applications. According to IDC, cognitive computing is one of six Innovation Accelerators on top of its third platform and the company expects global spending on cognitive systems to reach nearly $31.3 billion in 2019.”[9] It might seem strange to have to declare AI and its various elements “all real,” but to many people they still sound a lot like science fiction. Bean warns business leaders that not only is AI real, it must also be part of any successful business strategy. He explains, “To compete in a disruptive decade, competitive businesses must begin the cultural transformation so that they are well positioned to adapt rapidly to sudden shifts and changing market dynamics. To borrow from the Irish playwright Samuel Beckett, highly adaptive organizations, like highly adaptive individuals, must demonstrate that they are able to fail fast and learn faster. From Big Data to artificial intelligence, big companies are bracing for a decade of disruption. Many organizations may face a stark choice — grow or die!”


Footnotes
[1] Irving Wladawsky-Berger, “The Emerging, Unpredictable Age of AI,” The Wall Street Journal, 10 February 2017.
[2] Randy Bean, “Companies Brace for Decade of Disruption From AI,” MIT Sloan Management Review, 24 January 2017.
[3] Staff, “An Artificial Intelligence Definition for Beginners,” Nanalyze, 12 November 2016.
[4] Aleks Buczkowski, “What’s the difference between Artificial Intelligence, Machine Learning and Deep Learning?” Geoawesomeness, 19 March 2017.
[5] David Schatsky, Craig Muraskin, and Ragu Gurumurthy, “Demystifying artificial intelligence,” Deloitte University Press, 4 November 2014.
[6] Ryan Young, “Artificial Intelligence, Machine Learning, and Deep Learning and How they Differ from One Another,” TrendinTech, 25 March 2017.
[7] Steven Norton, “CIO Explainer: What is Artificial Intelligence?” The Wall Street Journal, 18 July 2016.
[8] Jenna Hogue, “Cognitive Computing: The Hype, the Reality,” Dataversity, 12 January 2017.
[9] J-P De Clerck, “Artificial intelligence (AI) and cognitive computing: what, why and where,” i-scoop, August 2016.
