“Leading computer scientists and technologists say the definition of ‘computer’ is again changing,” writes Adrienne LaFrance. “We have reached a profound moment of convergence and acceleration in computing technology, one that will reverberate in the way we talk about computers, and specifically with regard to the word ‘computer,’ from now on.” James Brase, the deputy associate director for data science at Lawrence Livermore National Laboratory, told LaFrance, “It’s like the move from simple adding machines to automated computing. Because we’re making an architectural change, not just a technology change. The new kinds of capabilities — it won’t be a linear scale — this will be a major leap.” The major leap Brase is describing involves cognitive computing. LaFrance explains:
“The architectural change he’s talking about has to do with efforts to build a computer that can act — and, crucially, learn — the way a human brain does. Which means focusing on capabilities like pattern recognition and juiced-up processing power — building machines that can perceive their surroundings by using sensors, as well as distill meaning from deep oceans of data.”
Like Brase, Peter Fingar, an internationally recognized expert on business strategy, globalization and business process management, insists that cognitive computing is ushering in a new era of computing. According to Fingar, the first computing era was as much man as machine. He calls it the “Tabulating Era” and marks its beginning in 1900. The second era — the “Programming Era” — began half a century later. The next era, which he labels the “Cognitive Era,” began in 2011 when IBM’s Watson beat human champions on the television game show Jeopardy! Analysts at Allied Market Research agree that we are entering a new era and add, “[The] cognitive computing technology era is ready to dominate major sectors.” Jamshed Avari adds, “Cognitive computing is the vehicle through which companies are promising to usher in a whole new era of personal technology.”
Although the pundits cited above speak about cognitive computing as though it encompassed a single approach or technology, that impression is mistaken. Cognitive computing, a field of artificial intelligence, involves several approaches and different technologies. Watson, for example, essentially uses a brute-force approach to cognitive analytics: it analyzes massive amounts of data and provides a “best guess” answer (IBM calls it a “confidence-weighted response”) based on what it finds. That’s how Watson won Jeopardy! This brute-force approach is often called deep learning. At Enterra Solutions®, we take a different approach to cognitive computing. Our cognitive computing system, the Enterra Enterprise Cognitive System™ (ECS), is an enterprise cognitive system that can Sense, Think, Act, and Learn®. The ECS uses various techniques to overcome the challenges associated with most deep learning systems. Like deep learning systems, the ECS gets smarter over time and self-tunes by automatically detecting correct and incorrect decision patterns; but the ECS also bridges the gap between a pure mathematical technique and semantic understanding. The ECS can do the math, but it also understands and reasons about what it discovers. Marrying advanced mathematics with semantic understanding is critical; we call this “Cognitive Reasoning.” The ECS’s approach, which selects the best (and often multiple) solvers for the challenge at hand and interprets the results semantically, is a superior approach for many of the challenges to which deep learning is now being applied.
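The idea behind a “confidence-weighted response” can be illustrated with a short sketch. To be clear, this is not IBM’s actual Watson algorithm; the candidate answers, evidence scores, and the softmax normalization used here are purely illustrative assumptions about how raw evidence can be turned into a ranked “best guess” with an attached confidence:

```python
import math

def confidence_weighted_answer(evidence_scores):
    """Turn raw evidence scores for candidate answers into
    normalized confidences and return the best guess.

    evidence_scores: dict mapping candidate answer -> raw score.
    Returns (best_answer, confidence), where confidence is in (0, 1).
    """
    # A softmax turns arbitrary scores into a probability-like
    # confidence distribution over the candidates.
    total = sum(math.exp(s) for s in evidence_scores.values())
    confidences = {a: math.exp(s) / total
                   for a, s in evidence_scores.items()}
    best = max(confidences, key=confidences.get)
    return best, confidences[best]

# Hypothetical scores for a hypothetical question.
answer, confidence = confidence_weighted_answer(
    {"Toronto": 1.2, "Chicago": 3.4, "Springfield": 0.5})
```

A system built this way never claims certainty; it reports its top answer alongside a confidence value, which is what lets downstream logic decide whether the answer is trustworthy enough to act on.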
As Fingar pointed out, the field of cognitive computing is relatively new and still in its infancy. Thomas H. Davenport, a Distinguished Professor at Babson College, writes, “Most organizations that are exploring cognitive technologies — smart machines that automate aspects of decision-making processes — are just putting a toe in the water.” Rajeev Ronanki, a Deloitte consultant, told Davenport that most of the clients with which he deals are still not sure they know how to properly utilize cognitive computing. “They want to make sure that it’s not science fiction before they really commit to the technology at full scale,” he stated. “So the first project is typically a pilot that takes four to six months to develop.” Cognitive computing may sound like science fiction, but I can assure you that it’s not. Davenport agrees. He explains:
“The good news is that this technology is not science fiction, but rather something from which organizations can benefit today. Chances are good, then, that organizations will want to create architectures for cognitive technologies that support more than a single application. In fact, I believe that it won’t be long before sophisticated organizations set out to build ‘cognitive architectures’ that interface with, but are distinct from, their regular IT architectures.”
In my dealings with potential clients, I try to explain that cognitive computing solutions can enhance their legacy systems without replacing them. That’s a plus, because business executives want solutions, but not at the cost of discarding expensive IT systems. Many of the challenges facing businesses today involve big data, an area in which cognitive computing really shines. “Problems of big data analytics can overwhelm business managers and the businesses they run,” writes Jack Vaughan. “Enter cognitive computing, which is an amalgam of natural language processing, analytics, machine learning and more.” Vaughan interviewed Judith Hurwitz, a technology consultant and author, and asked her about the benefits of cognitive computing for businesses. Hurwitz told him, “The real challenge we’re facing — and this is the reason why cognitive computing is resonating — is that we’re in a world of more data, and more complex data, than ever. One of the early use cases for cognitive computing is around healthcare. It’s just astronomical the number of pages of new research, new cases, new clinical trials and new treatments that the doctor must consider. New ideas are coming from across the world fast and furious.”
As Hurwitz notes, the real breakthrough that cognitive computing technologies provide is their ability to ingest vast amounts of data (both structured and unstructured) and make sense of it, providing decision makers with actionable insights and discovering new relationships. And unlike humans, cognitive computing systems never forget. As Hurwitz told Vaughan, all of the technologies involved in cognitive computing are coming together at the right time to create a revolution in the business world. “If you look at each of the elements,” she stated, “whether it’s natural language processing, ontology, creating a corpus of data or ingesting data, applying analytics or machine learning, distributed computing, cloud — each one of those has been approached as a topic in an area of endeavor. What you start to see is that cognitive computing is all of the above. When all of the elements come together at the right place, at the right time, with the right economic balance, then transformations and revolutions happen.”
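Hurwitz’s point about ingesting unstructured data and making sense of it can be made concrete with a toy example. The function names, the stopword list, and the two sample “documents” below are my own illustrative assumptions; real cognitive systems combine natural language processing, ontologies, and machine learning at vastly greater scale, but even this crude sketch shows the ingest-then-analyze pattern she describes:

```python
from collections import Counter

def ingest(documents):
    """Build a simple term index from unstructured text.

    A stand-in for the 'creating a corpus of data' step Hurwitz
    mentions: every document is tokenized and counted.
    """
    index = Counter()
    for doc in documents:
        index.update(word.lower().strip(".,") for word in doc.split())
    return index

def top_terms(index, n=3,
              stopwords=frozenset({"the", "a", "of", "and"})):
    """Surface the most frequent meaningful terms as a crude 'insight'."""
    return [t for t, _ in index.most_common() if t not in stopwords][:n]

# Two hypothetical snippets of healthcare research text.
idx = ingest([
    "New clinical trials report improved outcomes.",
    "Clinical research outcomes improved across trials.",
])
```

The payoff of combining steps like these is that no single element (tokenizing, counting, ranking) is novel on its own; the insight emerges from chaining them together, which is exactly the convergence argument Hurwitz makes.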
Adrienne LaFrance, “What Is a ‘Computer’ Anymore?” The Atlantic, 20 July 2015.
Peter Fingar, “The Cognitive Computing Era is Here: Are You Ready?” bpm.com, 16 July 2015.
“Cognitive Computing Technology Era Is Ready To Dominate Major Sectors,” Allied Market Research, 16 June 2015.
Jamshed Avari, “Cognitive Computing Isn’t Just a Buzzword, It’s the Next Big Thing in Technology,” NDTV Gadgets, 12 May 2015.
Thomas H. Davenport, “Building Your Cognitive Technology Architecture,” The Wall Street Journal, 15 July 2015.
Jack Vaughan, “Can cognitive computing help users combine big data and analytics?,” TechTarget, August 2015.