“It is the dawn of a new era, the cognitive era,” declared IBM CEO Ginni Rometty (@GinniRometty) at the 2016 International Consumer Electronics Show.[1] She defined the cognitive era as one in which “digital business” would be enhanced by “digital intelligence.” Like any good CEO, Rometty was pushing her company’s entry in the field — the famous Jeopardy! champion Watson. Rometty declared Watson to be “the world’s first cognitive system.” Cognitive computing, however, involves a number of analytic approaches — and the IBM Watson approach isn’t always the best one for every business situation. Dan Briody, Senior Editor of THINK Leaders, defines cognitive computing as “systems that learn at scale, reason with purpose, and interact with humans naturally.”[2] He continues:
“Cognitive computing systems aren’t programmed; they’re trained to sense, predict, infer and, in some ways, think, using artificial intelligence and machine learning algorithms that are exposed to massive data sets. These systems improve over time as they build knowledge and acquire depth in specialty areas or ‘domains.’ Current computing systems require that rules be hard-coded into a system by a human expert. However, cognitive computers program themselves; they process natural language, make sense of unstructured data and learn by experience much in the same way humans do. These systems not only bring massive parallel processing capabilities to churn through enormous volumes of often fluid data, but also use image and speech recognition as their eyes and ears, making interaction with human teams more natural. The dynamic learning inherent in these systems provides a feedback loop for machines and humans to refine insights and teach one another.”
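Briody’s distinction between systems that are programmed and systems that are trained can be made concrete with a small sketch. The snippet below is a minimal illustration in Python using scikit-learn, with invented data, and is not any vendor’s actual implementation. It hard-codes no rules; it simply refines a text classifier as batches of human-labeled feedback arrive, a toy version of the feedback loop Briody describes:

```python
# A toy "trained, not programmed" loop: no hand-coded rules, only
# incremental learning from labeled feedback (all data is made up).
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(n_features=2**12)  # text -> fixed-size vectors
model = SGDClassifier()                           # online linear classifier
classes = ["complaint", "praise"]

# Feedback arrives over time as (text, human label) pairs.
feedback_batches = [
    [("the shipment was late again", "complaint"),
     ("great service, thank you", "praise")],
    [("totally satisfied with the support team", "praise"),
     ("my order arrived damaged", "complaint")],
]

for batch in feedback_batches:
    texts, labels = zip(*batch)
    X = vectorizer.transform(texts)
    # partial_fit updates the model in place, so the system improves
    # over time instead of being reprogrammed for each new rule.
    model.partial_fit(X, list(labels), classes=classes)

print(model.predict(vectorizer.transform(["late delivery, very unhappy"])))
```

The point of the sketch is the loop itself: each pass through `partial_fit` nudges the model, so new behavior comes from new data rather than from new rules written by a human expert.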
Cognitive computing systems don’t have to include all of the capabilities discussed by Briody (for example, not every system requires image and speech recognition); but to reason about structured and unstructured data, they do require natural language processing and machine learning capabilities. As President and CEO of a cognitive computing firm, I certainly agree with Rometty that we are on the cusp of the cognitive era; but a reasonable person has to wonder whether such declarations are hype or whether we really are at the dawn of a new age of computing. Briody believes the latter. “Cognitive computing will enable new business models and change the way entire industries work,” he writes, “allowing business and government leaders to take on projects of previously insoluble size and complexity. It combines massive data sets with sophisticated analytics, natural language processing, and machine learning to help human experts synthesize findings and improve decision-making.” Improving decision making is critical. Bain analysts Michael C. Mankins and Lori Sherer (@lorisherer) explain that decision making is one of the most important aspects of any business. “The best way to understand any company’s operations,” they write, “is to view them as a series of decisions.”[3]
The term “era” isn’t one that should be used lightly. It is defined as “a long and distinct period of history with a particular feature or characteristic.” To call the coming years the cognitive era, then, is to claim that cognitive computing will be the defining characteristic of computing in the years ahead. Will cognitive computing measure up to that standard? Briody answers that question this way:
“The applications of cognitive computing to business are endless. Some experts believe that this technology represents our best — perhaps our only — chance to tackle some of the most enduring systemic issues facing our planet, from understanding climate change to identifying risk in our increasingly complex economy. … The capabilities enabled by cognitive computing will force business leaders to rethink their operating models. While some processes may be refined, others will need to be reinvented, and still others built from scratch. New skills and training will be required, such as developing the ability to design and frame appropriate challenges for cognitive systems. New ways of thinking, working and collaborating will invariably lead to cultural and organizational change, some of which may be challenging. … We are entering an era of cognitive computing, marked by the marriage of robust computing horsepower with a ‘brain-like’ interface capable of synthesizing vast amounts of data and continuous ‘learning’.”
James Kobielus (@jameskobielus), IBM’s big data evangelist, asserts, “Truly intelligent systems are mainstream now in most industries.”[4] He believes that cognitive systems will become the heart and soul of digital enterprises. “As a business asset,” he explains, “cognitive analytics — which represent the backbone of these systems — build on enterprise investments in artificial intelligence, natural language processing, machine learning, artificial neural networks, streaming analytics, unstructured data, Internet of Things, and decision automation.” Kobielus predicts that cognitive analytics will become:
- Inescapable. “Cognitive analytics are transforming life and work in the 21st century.”
- A hotbed of innovation. “Cognitive analytics are the focus of disruptive innovation in the Insight Economy.”
- The convergence point for all big data. “Cognitive analytics are ingesting new types of information. The cognitive revolution runs on fresh feeds of an ever-growing pool of disparate data, including media streams, IoT sensor data, and other nontraditional sources.”
- The hottest specialty in data science. “These professionals are combining statistical modeling, subject matter domain knowledge, and programming skills.”
- The key ingredient of personalization. “The future of cognitive applications in our lives is pervasive personalization. Cognitive systems are becoming active, context-aware personalized-interaction agents.”
As noted above, most cognitive computing systems use a combination of machine learning, mathematics, and natural language processing, but there are differences among them. My company’s entry in the cognitive computing field is the Enterra Enterprise Cognitive System™ (ECS), a system that can Sense, Think, Act, and Learn®. Its cognitive computing approach differs from the one used by IBM’s Watson. Watson is designed to respond to queries where the answer is found within a large corpus of documents. It analyzes massive amounts of data and provides a “best guess” answer (IBM calls it a “confidence-weighted response”) based on what it finds. In contrast, Enterra’s ECS is designed for queries where both semantic reasoning and advanced calculations are required to derive the answer. This matters because the most important business problems involve modeling, prediction, planning, or optimization. Both approaches have benefits, and which one should be used depends on the specific challenge being tackled.
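The contrast between the two styles of query can be caricatured in a few lines of Python. Everything below is hypothetical; the passages, confidence scores, and numbers are invented for illustration and are not the actual Watson or ECS interfaces. The first style ranks candidate answers already present in a corpus and returns the one with the highest confidence; the second derives an answer that appears in no document, here via a toy optimization over order quantities:

```python
# Two caricatured query styles; all passages and numbers below are
# invented for illustration.

# Style 1: retrieval with a confidence-weighted response. The answer
# already exists in the corpus; the job is to find and rank it.
candidates = {
    "Q4 revenue was $3.2M.": 0.91,  # candidate passage -> model confidence
    "Q4 revenue grew 12%.": 0.64,
}
best_guess = max(candidates, key=candidates.get)
print(best_guess)  # the highest-confidence passage

# Style 2: reasoning plus calculation. The answer is in no document
# and must be computed, here by a toy planning/optimization step.
def profit(qty, demand=300, price=4.0, unit_cost=2.5):
    # Revenue is capped by demand; unsold units are pure cost.
    return price * min(qty, demand) - unit_cost * qty

best_qty = max(range(0, 501, 10), key=profit)
print(best_qty, profit(best_qty))  # the quantity that maximizes profit
```

In the first style the system’s output is always a passage someone has already written; in the second, the output is a number that never existed until the system computed it. Real business problems routinely mix both.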
The bottom line is that declaring we are entering a new era — the cognitive era — is more than hype. Chetan Dube, CEO of IPsoft, states, “I believe a tectonic shift in the relationship between man and machine is imminent. As the intelligence of cognitive systems matures it will carry humans to higher planes of creative thinking. These cognitive machines are going to redefine the business landscape. In fact, I wouldn’t be surprised if within the next 10 years, you will walk down the corridor past a co-worker and not know if they are human or machine. This may sound a little farfetched to some but we are on the cusp of a second industrial revolution — one that will see huge changes to the global workforce where we combine digital labour with human labour. As cognitive agents become more intelligent and behave in the same way as humans, but with the power of superhuman scale and speed, we remove the shackles that bind humans to repetitive chores. It marks a seminal moment in our evolution.”[5] Simply put, as artificial intelligence systems continue to mature, cognitive computing will find a permanent home in digital enterprises.
Footnotes
[1] Mike Snider, “IBM’s Rometty heralds dawn of ‘cognitive era’, inks new Watson deals,” USA Today, 7 January 2016.
[2] Dan Briody, “New Vocabulary: Cognitive Computing,” THINK Leaders, October 2015.
[3] Michael C. Mankins and Lori Sherer, “Creating value through advanced analytics,” Bain Brief, 11 February 2015.
[4] James Kobielus, “Top Trends to Watch in Cognitive Analytics in 2016,” Dataversity, 4 January 2016.
[5] James Nunns, “Predictive analytics, cognitive computing & algorithms: 10 Big Data predictions for 2016,” Computer Business Review, 23 December 2015.