
Cognitive Computing from Pilot Programs to Moonshots

April 12, 2018

In recent months, the term “cognitive computing” has been used nearly as often in news stories as “artificial intelligence.” One reason cognitive computing has gained cachet is that it doesn’t stir the kind of fear associated with artificial intelligence (AI). Although cognitive computing is a subset of AI, its main purpose is solving business problems rather than the broader search for a sentient machine like the fictional HAL 9000. Aan Chauhan and Srinivas TK, executives at Cognizant Technology Solutions, assert, “As cognitive computing capabilities grow, so do opportunities for transforming organizations, from how infrastructure is provisioned and managed to how customers are engaged and delighted.”[1] The Cognitive Computing Consortium defines cognitive computing this way: “Cognitive computing addresses complex situations that are characterized by ambiguity and uncertainty; in other words it handles human kinds of problems. Cognitive computing systems often need to weigh conflicting evidence and suggest an answer that can be considered as ‘best’ rather than ‘right’.” A number of approaches fall under the cognitive computing rubric; most involve machine learning, natural language processing, and advanced analytics. My company’s entry in this field is the Enterra Enterprise Cognitive System™ (Aila™), a system that can Sense, Think, Act, and Learn®.
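To make the Consortium’s definition a little more concrete, here is a minimal, purely illustrative sketch of how a system might weigh conflicting evidence and rank candidate answers from “best” downward rather than return a single “right” one. The function name, the candidate answers, and the weights are hypothetical; they are not drawn from any Enterra or Consortium implementation.

```python
# Illustrative sketch only: a toy "best answer" scorer in the spirit of the
# Cognitive Computing Consortium's definition. All answers and weights below
# are invented for this example.
from collections import defaultdict

def best_answer(evidence):
    """Aggregate weighted, possibly conflicting evidence and rank answers.

    evidence: iterable of (candidate_answer, weight) pairs, where weight
    reflects source reliability and may be negative for contradicting evidence.
    """
    scores = defaultdict(float)
    for candidate, weight in evidence:
        scores[candidate] += weight
    # Return answers ordered from strongest to weakest support -- a "best"
    # ranking rather than a single definitive "right" answer.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    conflicting_evidence = [
        ("ship via regional hub", 0.6),   # demand forecast supports this option
        ("ship direct to store", 0.4),    # historical lead times support this one
        ("ship via regional hub", -0.2),  # recent capacity data contradicts it
    ]
    for answer, score in best_answer(conflicting_evidence):
        print(f"{answer}: {score:+.2f}")
```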

 

Cognitive Computing Myths

 

Thomas H. Davenport (@tdav), a professor of information technology and management at Babson College, and David Schatsky (@dschatsky), a managing director at Deloitte LLP, believe “a set of lingering myths” has taken root “among those whose exposure to the [cognitive] technology has been limited.”[2] Those myths include:

 

Myth 1: Cognitive is all about automation. “It is rare to find a media report about AI that doesn’t speculate about job losses. Much of the reason for that is the commonly held belief that the technology’s primary purpose is automating human work. But that’s hardly the full story — in fact, there are significant uses for AI that do not involve substituting machine labor for human labor.”

 

Myth 2: Cognitive kills jobs. “Hand in hand with the belief that AI is all about automation is the expectation that it will destroy countless jobs. While it’s impossible to know what will happen in the distant future, both the objectives and the predictions of survey respondents suggest that job loss won’t be a major outcome.”

 

Myth 3: The financial benefits are still remote. “Many people view AI as a futuristic technology dominated by a handful of tech giants making headlines with high-profile applications. They believe most companies will not be able to achieve real financial benefits anytime soon. There is some truth to this view: The tech giants are indeed at the forefront of AI R&D and have capabilities not available to everyone. On the other hand, there are ordinary companies in every industry that have deployed AI and reaped financial benefits.”

 

Myth 4: AI is overhyped and bound to disappoint. “There is no question AI is one of the hottest technology topics today, but is it overhyped?” Jennifer Zaino (@Jenz514) doesn’t think so. She writes, “Cognitive Computing increasingly will be put to work in practical, real-world applications. The industries that are adopting it are not all operating at the same maturity levels; there remain some challenges to conquer. The wheels are very much in motion to make cognitive-driven Artificial Intelligence applications a key piece of enterprise toolsets.”[3]

 

Myth 5: Cognitive technology is just for ‘moonshots’. “It’s not uncommon to hear cognitive technology projects equated with ‘moonshots’ — highly ambitious, transformational change initiatives often considered disruptive to companies and industries. Does this mean smaller projects are not worth pursuing? … Any transformational project is likely to face high risks and expense. Multiple less-ambitious projects — particularly when focused on specific business process goals — can be more likely to succeed and may also yield transformational outcomes collectively.”

As Davenport and Schatsky’s comments demonstrate, when Deloitte surveyed early adopters of cognitive technologies, the results indicated that none of these myths holds up.

 

Benefits of Using Cognitive Technology

 

The era in which we live has been given numerous names, including the Information Age and the Digital Age. Whatever you call the current period, data is its common feature, and for business that means data is a new resource that must be managed and used wisely. Stephanie Simone notes, “Cognitive computing and machine learning are going to transform knowledge management. Chatbots, cognitive search, natural language processing, and semantic technologies accelerate the ability of humans to find what they need to do their jobs.”[4] Computer science experts Serge Haziyev and Yuriy Milovanov list a few of the ways cognitive computing can be used: speech understanding; face detection; recommendations; medical diagnosis; risk assessment; and sentiment analysis.[5] They add, “As Cognitive Systems aim to handle real world problems, which are highly uncertain and may be influenced by potentially unlimited number of different factors, quality and consistency of their results highly depends on the number of factors they consider while making the decision. That brings yet another technological trade-off as the complexity of the problem grows tremendously with the number of the data sources. Aggregating and integrating the data from different data sources and processing it in a unified way is also challenging.” Despite the challenges, cognitive computing systems are better able to integrate data sources and handle more variables than previous analytic platforms.
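Haziyev and Milovanov’s point about aggregating data from different sources can be illustrated with a small sketch. The example below assumes two hypothetical feeds, a CRM export and a support-ticket export, with invented field names, and simply normalizes both into one unified record shape before any analysis runs. It is not Enterra’s or the authors’ method, just a toy illustration of the integration step they describe.

```python
# Hypothetical sketch of multi-source data integration: two feeds with
# different schemas are normalized into one unified shape. Source names,
# field names, and sample records are invented for illustration.
from datetime import datetime

def normalize_crm(record):
    # Hypothetical CRM export: {"cust_id": ..., "fb": ..., "ts": "YYYY-MM-DD"}
    return {
        "customer_id": record["cust_id"],
        "text": record["fb"],
        "observed_at": datetime.strptime(record["ts"], "%Y-%m-%d"),
        "source": "crm",
    }

def normalize_support(record):
    # Hypothetical support-ticket export with a different schema.
    return {
        "customer_id": record["customer"],
        "text": record["description"],
        "observed_at": datetime.fromtimestamp(record["created_unix"]),
        "source": "support",
    }

def unify(crm_records, support_records):
    """Merge both feeds into one list of uniform records sorted by time."""
    unified = [normalize_crm(r) for r in crm_records]
    unified += [normalize_support(r) for r in support_records]
    return sorted(unified, key=lambda r: r["observed_at"])

if __name__ == "__main__":
    crm = [{"cust_id": "C-17", "fb": "Delivery was late again.", "ts": "2018-04-10"}]
    tickets = [{"customer": "C-17", "description": "Package arrived damaged.",
                "created_unix": 1523404800}]
    for row in unify(crm, tickets):
        print(row["observed_at"].date(), row["source"], row["text"])
```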

 

Chauhan and TK suggest, “It’s smart to deploy cognitive computing applications where they will have an immediate impact and measurable return. The most mature technologies, meaning those that are going mainstream and are commercially available, are most likely to meet these criteria.” The best way to determine where that impact might be is to “find opportunities in the organization where deploying cognitive computing capabilities will quickly realize quantifiable returns.” When implementing a cognitive technology solution, they recommend taking measured steps. “Identify use cases for combinations of mature and emerging cognitive computing technologies for business transformation in the longer term. These projects will have a time-to-maturity of about one to three years. A critical learning in this phase is anticipating and planning for managing the disruption cognitive computing will cause to the business.” At Enterra Solutions®, we recommend a crawl, walk, run approach that facilitates a smooth transition process from a pilot project to full-scale operation. Even though Davenport and Schatsky note cognitive technologies aren’t just for moonshots, Chauhan and TK encourage organizations to consider taking moonshots. They explain, “Continuing advances in cognitive computing’s ability to mimic human sensory perception, thinking and judgment calls make it possible for organizations to consider completely new possibilities for their operating models, core competencies and value propositions.”

 

Summary

 

Haziyev and Milovanov conclude, “Cognitive Computing … urges digital solutions to meet human-centric requirements: act, think, and behave like a human in order to achieve maximum synergy from human-machine interaction. … We believe that soon every digital system will be measured on its cognitive abilities.” Chauhan and TK add, “Gaining experience with practical cognitive computing will enable organizations to envision additional uses for the technology and begin to design solutions incorporating its current, and future, capabilities. They can also anticipate a future in which cognitive applications will free humans from rote drudge work, giving them time and energy to unleash uniquely human creativity — a quality that even the most advanced cognitive applications are unlikely to mimic.”

 

Footnotes
[1] Aan Chauhan and Srinivas TK, “A strategy for making the best cognitive computing investments now,” Information Management, 9 January 2018.
[2] Thomas H. Davenport and David Schatsky, “5 Myths About Cognitive,” The Wall Street Journal, 13 March 2018.
[3] Jennifer Zaino, “Cognitive Computing, Artificial Intelligence Apps Have Big Future in the Enterprise,” Dataversity, 17 September 2015.
[4] Stephanie Simone, “How Cognitive Computing is Shaping Knowledge Management,” KMWorld, 15 January 2018.
[5] Serge Haziyev and Yuriy Milovanov, “Cognitive Computing: How to Transform Digital Systems to the Next Level of Intelligence,” Dataconomy, 11 January 2018.
