
Cognitive Computing

January 21, 2013


Researchers at the Cognitive Computing Research Group at the University of Memphis claim that cognitive computing, like the Roman god Janus, has two faces: a science face and an engineering face. “The science face fleshes out the global workspace theory of consciousness into a full cognitive model of how minds work. The engineering face of cognitive computing explores architectural designs for software information agents and cognitive robots that promise more flexible, more human-like intelligence within their domains.” [“Cognitive Computing Research Group,” University of Memphis] Frankly, the business world is more interested in the engineering face of cognitive computing (i.e., how artificial intelligence can help companies better understand the world in which they operate); however, you really can’t have one face without the other. That’s why commercial firms as well as academic institutions are pursuing cognitive computing.


Mark Smith, CEO & Executive Vice President of Research at Ventana Research, claims that IBM’s “Watson blends existing and innovative technology into a new approach called cognitive computing.” [“IBM Watson Advances a New Category of Cognitive Computing,” Perspectives by Mark Smith, 11 December 2012] However, Roger Kay asserts, “Watson, the reigning jeopardy champ, is smart, but it’s still recognizably a computer.” He believes that cognitive computing represents “something completely different.” [“Cognitive Computing: When Computers Become Brains,” Forbes, 9 December 2011] On some level, both Smith and Kay are correct. Smith writes, “At the simplest operational level [cognitive computing] is technology for asking natural language-based questions, getting answers and support appropriate action to be taken or provide information to make more informed decisions.” Watson, he notes, “relies on massive processing power to yield probabilistic responses to user questions using sophisticated analytical algorithms.” On the other hand, Kay writes, “Cognitive computing, as the new field is called, takes computing concepts to a whole new level.” Cognitive computing goes beyond calculating probabilities to thinking. Smith continues:

“A cognitive system like Watson accesses structured and unstructured information within an associated knowledge base to return responses that are not simply data but contextualized information that can inform users’ actions and guide their decisions. This is a gigantic leap beyond human decision-making using experience based on random sources from the industry and internal sets of reports and data. This innovative new approach to computing is designed to aid humans by working with natural language – English in the case of today’s Watson.”

Smith goes on to provide a brief primer about cognitive computing. He writes:

“For those of you who are not used to the word cognitive, the foundation of cognition is the sum of all the thinking processes that contrib­ute to gaining knowledge for problem-solving. In computational systems these process­es are modeled using hardware and software; machine-based cognition thus is a step toward imbuing an arti­fi­cial system with attributes we typically consider human: the abilities to think and learn. Watson builds on a foundation of evidence from preexisting decisions and knowledge sources that it can load for reference in future inquiries. The evidence-based reasoning that Watson employs to answer question is part of the big deal in its approach.”

Smith notes that “Watson supports three types of engagement – ask, discover and decide – that take natural language questions, find facts to support a decision process, then gets probabilistic guidance on decisions.” Eventually, learning systems like Watson go beyond guessing (i.e., selecting the most probable answer) to knowing. Since such systems ingest massive amounts of data, crunch it with enormous computing power, and can do so continuously, there are good reasons for people like Shweta Dubey to call cognitive computing a disruptive technology. [“Is Cognitive Computing the Next Disruptive Technology?,” The Motley Fool, 28 December 2012] Dubey believes that Watson represents a breakthrough because IBM has figured out how to monetize its cognitive capabilities. Smith agrees that cognitive computing technologies have a bright future in business. He writes:
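The evidence-based, probabilistic answering that Smith describes can be illustrated with a toy sketch: candidate answers carry raw evidence scores aggregated from knowledge sources, and the system normalizes those scores into confidences and picks the most probable answer. The scoring scheme and data below are invented for illustration; Watson’s actual pipeline is far more sophisticated.

```python
# Toy sketch of evidence-based answer ranking (illustrative only;
# not IBM's actual method).

def rank_answers(candidates):
    """Normalize raw evidence scores into a confidence per candidate,
    sorted from most to least probable."""
    total = sum(score for _, score in candidates)
    ranked = [(answer, score / total) for answer, score in candidates]
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

# Hypothetical candidate answers with raw evidence scores.
candidates = [("Toronto", 0.8), ("Chicago", 2.4), ("Boston", 0.8)]
for answer, confidence in rank_answers(candidates):
    print(f"{answer}: {confidence:.2f}")
# Chicago: 0.60, Toronto: 0.20, Boston: 0.20
```

The point of the sketch is the shape of the output: not a single “correct” answer, but a ranked set of responses with confidences that can “inform users’ actions and guide their decisions,” as Smith puts it.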

“This goes beyond search and retrieval technology; machine learning and processing of questions using very large volumes of data, commonly referred to as big data, is the foundation on which Watson as a cognitive system operates. Most important is the continuous learning method and what I would call adaptive intelligence. While machine learning and pattern-based analytics are part of the cognitive system, the ability to process big data efficiently to provide a probabilistic set of recommendations is just the kind of innovation many industries need. … IBM has a huge opportunity to bring innovation to business through the use of Watson, and has been experimenting with a number of deployments to test its potential. … IBM has been working with organizations in healthcare and financial services, but believes Watson could be useful in just about every industry that must have what I call better situation intelligence that must accommodate current conditions and preexisting information to determine the best answer.”

Dharmendra Modha, Manager of cognitive computing at IBM Almaden Research Center (who has been called IBM’s “Brain Guy”), is one of the driving forces behind the company’s efforts to create thinking machines. “For more than half a century,” he writes, “computers have been little better than calculators with storage structures and programmable memory, a model that scientists have continually aimed to improve. Comparatively, the human brain—the world’s most sophisticated computer—can perform complex tasks rapidly and accurately using the same amount of energy as a 20 watt light bulb in a space equivalent to a 2 liter soda bottle.” [“Beyond machines,” IBM] Creating a computer that is as efficient as, and even more effective than, the human brain is the ultimate goal of cognitive computing. Modha calls cognitive computing “thought for the future.” He continues:

“Making sense of real-time input flowing in at a dizzying rate is a Herculean task for today’s computers, but would be natural for a brain-inspired system. Using advanced algorithms and silicon circuitry, cognitive computers learn through experiences, find correlations, create hypotheses, and remember—and learn from—the outcomes. For example, a cognitive computing system monitoring the world’s water supply could contain a network of sensors and actuators that constantly record and report metrics such as temperature, pressure, wave height, acoustics and ocean tide, and issue tsunami warnings based on its decision making.”
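Modha’s water-supply example can be sketched in miniature: a monitor that tracks a running baseline of wave-height readings and raises an alarm when the latest reading deviates sharply from it. The window size, threshold factor, and readings below are all assumptions for illustration; a real warning system would fuse many sensor types and learn its thresholds from outcomes, as Modha describes.

```python
# Minimal sketch of a sensor-monitoring rule in the spirit of
# Modha's tsunami example. Thresholds and data are invented.

from statistics import mean

def tsunami_alert(readings, window=5, factor=3.0):
    """Return True if the latest wave-height reading exceeds
    `factor` times the mean of the preceding `window` readings."""
    if len(readings) <= window:
        return False  # not enough history to form a baseline
    baseline = mean(readings[-window - 1:-1])
    return readings[-1] > factor * baseline

calm = [0.4, 0.5, 0.4, 0.6, 0.5, 0.5]   # meters, steady state
surge = calm + [2.9]                     # sudden spike

print(tsunami_alert(calm))   # False: within normal variation
print(tsunami_alert(surge))  # True: spike triggers the warning
```

A fixed rule like this is where today’s systems stop; the cognitive-computing claim is that the system would also remember whether the alert proved correct and adjust its own thresholds accordingly.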

Although there are dreams of global networks of things, the reality is that such networks are currently cost-prohibitive. The cost/benefit analysis of cognitive computing networks will be a different matter. Companies that don’t take advantage of cognitive computing will find themselves at a severe disadvantage in the years ahead. Modha reports that “IBM is combining principles from nanoscience, neuroscience and supercomputing as part of a multi-year cognitive computing initiative” being funded in part by the Defense Advanced Research Projects Agency (DARPA). The IBM site contains links to a number of videos that discuss the IBM project.


Will true cognitive computing ever be achieved? That remains an open question; the answer depends on whether you are talking about Artificial General Intelligence (AGI) or more limited artificial intelligence applications. No one questions that cognitive computing applications for limited purposes are going to play an important role in the business environment in the years ahead. Frank Buytendijk ponders what business intelligence and business process management may look like in the future if cognitive computing becomes ubiquitous. [“Can Computers Think?” BeyeNETWORK, 27 December 2012] He writes:

“If computers can think, even be self-aware, and if datasets can have a certain individuality, computers might as well express their opinions. Their opinions, as unique individuals, may differ from the opinions of another data source. Managers need to think for themselves again and interpret the outcome of querying different sources, forming their particular picture of reality – not based on ‘the numbers that speak for themselves’ or on fact-based analysis, but based on synthesizing multiple points of view to construct a story.”

I suspect that most people believe that cognitive computing systems will require them to think less, not more. If Buytendijk is correct, that won’t necessarily be the case. On the subject of post-modern business process management, he writes:

“It would not be possible to define and document every single process that flows through our organization. After all, every instance of every process would be unique, the result of a specific interaction between you and a customer or any other stakeholder. What is needed is an understanding that different people have different requirements, and structure those in an ontological approach. In a postmodern world, we base our conversations on a meta-understanding. We understand that everyone has a different understanding. Of course, as we do today, we can automate those interactions as well. Once we have semantic interoperability between data sets, processes, systems and computers in the form of master data management, metadata management, and communication standards based on XML (in human terms: ‘language’), systems can exchange viewpoints, negotiate, triangulate and form a common opinion. Most likely, given the multiple viewpoints, the outcome would be better than one provided by the traditional ‘single version of the truth’ approach.”
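Buytendijk’s idea of systems exchanging viewpoints over an XML “language” and triangulating a common opinion can be sketched very simply: two systems each publish their view of the same metric as a small XML document, and a third process parses both and reconciles them. The element names, sources, and averaging rule below are assumptions; real interoperability would rest on agreed communication standards and master data management, as the quote notes.

```python
# Minimal sketch of XML-based viewpoint exchange in the spirit of
# Buytendijk's post-modern BPM idea. Schema and values are invented.

import xml.etree.ElementTree as ET

# Two hypothetical systems publish differing views of the same forecast.
viewpoints = [
    '<viewpoint source="crm"><forecast units="USD">120</forecast></viewpoint>',
    '<viewpoint source="erp"><forecast units="USD">100</forecast></viewpoint>',
]

def common_opinion(documents):
    """Parse each system's XML viewpoint and triangulate a common
    opinion by averaging the forecast values."""
    values = [float(ET.fromstring(doc).find("forecast").text)
              for doc in documents]
    return sum(values) / len(values)

print(common_opinion(viewpoints))  # 110.0
```

Averaging is the crudest possible reconciliation; the interesting part of Buytendijk’s argument is that keeping both viewpoints visible, rather than forcing a premature “single version of the truth,” may yield better decisions.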

If we have entered the post-modern era, Buytendijk concludes, “One thing is clear: Before we are able to embrace postmodernism in IT, we need to seriously re-architect our systems, tools, applications and methodologies.” This shouldn’t be change we fear but change we embrace. Cognitive computing opens a new world of possibilities for making life better.
