
The Dawn of Cognitive Computing

October 28, 2014


By now, most people have heard of IBM’s Watson computing system, which famously defeated two past champions on the television show Jeopardy! Watson has since moved from competing on game shows to helping doctors diagnose cancer and helping chefs create new dishes. In academic circles, however, there remains a debate about whether Watson is truly a cognitive computing system. Douglas Hofstadter, a cognitive scientist at Indiana University and a Pulitzer Prize-winning author, claims, for example, that IBM’s Watson computer system and Apple’s Siri personal assistant aren’t “real” artificial intelligence. [“Why Watson and Siri Are Not Real AI,” Popular Mechanics, 10 February 2014] “Watson is basically a text search algorithm connected to a database just like Google search,” he explains. “It doesn’t understand what it’s reading. In fact, read is the wrong word. It’s not reading anything because it’s not comprehending anything. Watson is finding text without having a clue as to what the text means. In that sense, there’s no intelligence there. It’s clever, it’s impressive, but it’s absolutely vacuous.”


On the other hand, Miles Brundage (@Miles_Brundage) and Joanna Bryson (@j2bryson) believe that Hofstadter is picking at nits. Following the publication of his interview in Popular Mechanics, they responded, “Artificial intelligence is here now. … Although Watson includes many forms of text search, it is first and foremost a system capable of responding appropriately in real-time to new inputs. It competed against humans to ring the buzzer first, and Watson couldn’t ring the buzzer until it was confident it had constructed the right sentence. And, in fact, the humans quite often beat Watson to the buzzer even when Watson was on the right track. Watson works by choosing candidate responses, then devoting its processors to several of them at the same time, exploring archived material for further evidence of the quality of the answer. Candidates can be discarded and new ones selected. IBM is currently applying this general question-answering approach to real-world domains like health care and retail. This is very much how primate brains (like ours) work.” [“Why Watson Is Real Artificial Intelligence,” Future Tense, 14 February 2014]
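
That candidate-and-evidence loop is easy to make concrete. The sketch below is a minimal illustration of the pattern Brundage and Bryson describe, not IBM’s actual pipeline: the tiny archive, the keyword-overlap scorer, and the confidence threshold are all hypothetical stand-ins.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins: a tiny text archive and a crude keyword scorer.
ARCHIVE = [
    "toronto is a large city in canada",
    "chicago airports named for a world war ii hero include o'hare",
    "o'hare and midway are chicago airports",
]
CONFIDENCE_THRESHOLD = 0.6  # below this, don't "ring the buzzer"


def generate_candidates(clue):
    """Stand-in for Watson's many candidate-generation strategies."""
    return ["toronto", "chicago"]


def score_evidence(clue, candidate):
    """Score a candidate by how much archived text mentions it alongside
    words from the clue (a toy proxy for Watson's evidence scoring)."""
    clue_words = set(clue.lower().split())
    hits = sum(
        len(clue_words & set(doc.split()))
        for doc in ARCHIVE
        if candidate in doc
    )
    return min(1.0, hits / 5.0)


def answer(clue):
    candidates = generate_candidates(clue)
    # Explore evidence for several candidates at the same time.
    with ThreadPoolExecutor() as pool:
        scores = list(pool.map(lambda c: score_evidence(clue, c), candidates))
    best, confidence = max(zip(candidates, scores), key=lambda p: p[1])
    # Weak candidates are discarded; buzz in only when confident.
    return best if confidence >= CONFIDENCE_THRESHOLD else None


print(answer("its largest airport is named for a world war ii hero"))
```

The threshold captures the behavior Brundage and Bryson highlight: candidates are scored in parallel, weak ones are discarded, and the system “buzzes” only when its confidence clears the bar.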


Regardless of which side of the debate you come down on, Watson has convinced the public that artificial intelligence will play a significant role in a number of human activities in the decades ahead. Personally, I’m closer to the Hofstadter camp than to the Brundage/Bryson camp. I believe that cognitive computing goes beyond combining artificial intelligence, natural language processing, and machine learning; to be cognitive, a system should be able to understand context as well as content. Does that mean that Brundage and Bryson are wrong? No, they also make some valid points. Watson and most other cognitive computing systems, including the Enterra Solutions® Cognitive Reasoning Platform™, don’t possess artificial general intelligence; that is, they are not sentient in any sense of the word. That’s really Hofstadter’s main point. Nevertheless, cognitive computing systems have a role to play.


To overcome some of the objections raised by Hofstadter, Enterra adds reasoning engines connected to the world’s largest common-sense ontology to the traditional combination of artificial intelligence algorithms, natural language processing, and machine learning. The ontology provides the system with some of the understanding that Hofstadter claims is missing in systems like Watson and Siri. Without this added dimension, learning machines can return some pretty funny (and erroneous) conclusions. Eric Blattberg (@EricBlattberg) reports, for example, “When deep learning startup AlchemyAPI exposed its natural language processing system to the Internet, it determined that dogs are people because of the way folks talked about their pets. That might ring true to some dog owners, but it’s not accurate in a broader context. That hilarious determination reflects the challenges — and opportunities — inherent to machine learning.” [“Cognitive computing is smashing our conception of ‘ground truth’,” VentureBeat, 20 March 2014] Because the ontology knows that dogs aren’t people, a cognitive computing system that consults it would never reach that conclusion.
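
A minimal sketch of that kind of ontology check appears below. The toy taxonomy, class names, and disjointness axiom are illustrative placeholders, not the contents of Enterra’s knowledge base or any production ontology.

```python
# Toy stand-in for a common-sense ontology: a subclass hierarchy plus
# explicit disjointness axioms. A production ontology holds millions
# of such assertions.
SUBCLASS_OF = {"dog": "mammal", "person": "mammal", "mammal": "animal"}
DISJOINT = {frozenset({"dog", "person"})}  # no dog is a person


def ancestors(cls):
    """Walk up the subclass hierarchy, collecting superclasses."""
    seen = set()
    while cls in SUBCLASS_OF:
        cls = SUBCLASS_OF[cls]
        seen.add(cls)
    return seen


def consistent(class_a, class_b):
    """True unless the ontology declares the two classes disjoint."""
    for a in {class_a} | ancestors(class_a):
        for b in {class_b} | ancestors(class_b):
            if frozenset({a, b}) in DISJOINT:
                return False
    return True


# A learned hypothesis like AlchemyAPI's "dogs are people" is screened
# against the ontology before the system accepts it.
hypothesis = ("dog", "person")  # "dogs are people"
if consistent(*hypothesis):
    print("accept:", hypothesis)
else:
    print("reject: the ontology says dogs are not people")
```

The statistical learner remains free to propose whatever the text suggests; the ontology simply vetoes hypotheses that contradict what the system already knows.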


One of the real services that the Watson system has provided is making the public aware of just how useful cognitive computing systems can be. Such systems are no longer novelties whose primary purpose is to entertain the public. They are providing serious analysis for businesses. Erik Brynjolfsson (@erikbryn) and Andrew McAfee (@amcafee) wrote, “The exponential, digital, and recombinant powers of the second machine age have made it possible for humanity to create two of the most important one-time events in our history: the emergence of real, useful artificial intelligence (AI) and the connection of most of the people on the planet via a common digital network. Either of these advances alone would fundamentally change our growth prospects. When combined, they’re more important than anything since the Industrial Revolution, which forever transformed how physical work was done.” [“The Dawn of the Age of Artificial Intelligence,” The Atlantic, 14 February 2014] If you substitute “cognitive computing” for “artificial intelligence” in that statement, I am in complete agreement.


I believe that cognitive computing will take us to the next step in computer analytics, a step that could be labeled Analytics 4.0. If you are interested in reading about the first three steps in computer analytics, I suggest you read Thomas H. Davenport’s excellent article in the Harvard Business Review. [“Analytics 3.0,” December 2013] He explains that during the Analytics 1.0 era, “the great majority of business intelligence activity addressed only what had happened in the past; they offered no explanations or predictions.” The Internet, he points out, dramatically increased the amount of data being produced. “As analytics entered the 2.0 phase,” he writes, “the need for powerful new tools — and the opportunity to profit by providing them — quickly became apparent.” He believes that the era of Big Data began in the Analytics 2.0 phase. That phase, however, quickly evolved into the Analytics 3.0 phase, in which “every firm in every industry” can benefit from analytics.


I believe that the Analytics 4.0 phase will involve cognitive computing. Analytics at that point will go beyond explanations, prescriptions, and predictions to real understanding, thanks to the reasoning capabilities of the systems providing the analysis. Accenture’s latest technology vision, entitled “From Digitally Disrupted to Digital Disrupter,” calls cognitive computing “the ultimate long-term solution” for most businesses’ analytics requirements. Although cognitive computing systems may still seem like a novelty to some people, visionary business people understand that we are at the dawn of a new age of computing.
