
Innovation and Cognitive Computing

April 14, 2015


“Machine learning is coming into a golden age,” writes Mehrdad Fatourechi, CTO of BroadbandTV Corp, “and with it we’re seeing an awakening of possibilities formerly reserved for science fiction.”[1] If you are unfamiliar with machine learning, Fatourechi explains:

“Machine learning (ML) is a computer’s way of learning from examples, and it’s one of the most useful tools we have for the construction of artificial intelligence (AI). It begins with the design of an algorithm that learns from collected data, creating machines that in most cases become smarter as data volumes intensify.”
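In code terms, that “learning from examples” can be as simple as fitting a classifier to labeled data. Here is a minimal sketch using scikit-learn; the fruit measurements are invented purely for illustration:

```python
# A minimal sketch of "learning from examples": fit a classifier to
# labeled data, then predict the label of an unseen example. The fruit
# measurements below are invented purely for illustration.
from sklearn.tree import DecisionTreeClassifier

# Each example: [weight in grams, surface smoothness from 0 to 1]
examples = [[150, 0.90], [170, 0.85], [130, 0.30], [140, 0.25]]
labels = ["apple", "apple", "orange", "orange"]

model = DecisionTreeClassifier(random_state=0)
model.fit(examples, labels)           # the "learning" step

print(model.predict([[160, 0.80]]))   # -> ['apple']
```

The more labeled examples such a model sees, the better its predictions tend to get, which is exactly Fatourechi’s point about machines becoming smarter as data volumes grow.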

Fatourechi notes that significant advances have been made in machine learning, but he believes the best is yet to come. “While the depth of advancement is unknown,” he writes, “what we can say with high certainty is that development in this field in the past five years will be nothing compared to what we’re going to see in the five years to come.” In his play “The Tempest,” William Shakespeare observed, “What’s past is prologue.” Fatourechi apparently agrees with the Bard. “Based on machine learning’s current state” (i.e., the past), he offers up “four predictions of what we could see in the near future” (i.e., the prologue).

Before looking at Fatourechi’s predictions, I’d like to provide one of my own: It will be cognitive computing, more than simple machine learning, that fosters much of the innovation Fatourechi sees in the future. Cognitive computing involves artificial intelligence, natural language processing, and, in the case of the Enterra® Cognitive Reasoning Platform™, semantic reasoning. The most famous cognitive computing system, IBM’s Watson, doesn’t use semantic reasoning. Basically, Watson uses a brute-force approach to cognitive analytics: it analyzes massive amounts of data and provides a “best guess” answer (IBM calls it a “confidence-weighted response”) based on what it finds. To learn more about why I believe cognitive computing will be more important than simple machine learning, read my article entitled “Cognitive Computing: The Next Big Thing.” Since cognitive computing embraces machine learning, all of Fatourechi’s predictions apply equally well to machine learning and cognitive computing systems.
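To make the “confidence-weighted response” idea concrete, here is a toy sketch that scores candidate answers by how much their supporting evidence overlaps with the question and normalizes those scores into confidences. It is a drastic simplification for illustration only, not IBM’s actual pipeline:

```python
# A toy illustration of a "confidence-weighted response": score several
# candidate answers against their supporting evidence and return the
# best guess with a confidence weight. This is a simplification for
# illustration, not how IBM Watson actually works.
def best_guess(question_terms, candidates):
    """candidates maps each answer to a list of evidence terms."""
    scores = {
        answer: len(set(question_terms) & set(evidence)) / len(question_terms)
        for answer, evidence in candidates.items()
    }
    total = sum(scores.values()) or 1.0
    weighted = {answer: s / total for answer, s in scores.items()}  # normalize
    answer = max(weighted, key=weighted.get)
    return answer, weighted[answer]

answer, confidence = best_guess(
    ["capital", "france"],
    {"Paris": ["capital", "france", "city"], "Lyon": ["france", "city"]},
)
print(f"{answer} (confidence {confidence:.2f})")  # -> Paris (confidence 0.67)
```

The first area Fatourechi makes predictions about is image-based recognition. He writes: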

Image-Based Recognition: “The technology for image and video-based recognition is on the horizon, and with it a whole new experience for users. Thanks to deep learning, we are now at the dawn of computers recognizing images, and the people and actions within them, with high accuracy based on the image alone and with minimum reliance on external data. It’s not just new pictures that will become recognizable either, but the entire history of digitized images and video footage. This will massively change how these assets are located and shared online. For example, YouTube might soon intelligently find content related to parts of a clip you watched and liked based only on the visual content of the video itself. The resulting efficiencies in both our work and personal time will be profound.”

Deep learning is a specific type of machine learning. As the description of a college course offering on the subject states, “deep learning algorithms attempt to learn multi-level representations of data, embodying a hierarchy of factors that may explain them. Such algorithms have been demonstrated to be effective both at uncovering underlying structure in data, and have been successfully applied to a large variety of problems ranging from image classification, to natural language processing and speech recognition.” In other words, deep learning uses the brute-force approach.
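As a concrete, if shallow, illustration of those multi-level representations, the sketch below trains a small neural network with two stacked hidden layers on scikit-learn’s bundled 8×8 digit images; real deep-learning systems stack many more layers over far larger datasets:

```python
# A minimal sketch of "multi-level representations": a neural network
# whose stacked hidden layers learn successively higher-level features.
# Trained on scikit-learn's small bundled digit images for illustration.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

digits = load_digits()  # 1,797 grayscale 8x8 images of handwritten digits
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, random_state=0)

# Two hidden layers form a (shallow) hierarchy of learned features.
model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500,
                      random_state=0)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```

The next area where Fatourechi expects to see significant innovation thanks to machine learning is healthcare.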

Healthcare: “Machine learning’s ability to analyze and store massive amounts of data should provide physicians with much-needed second opinions and lead to the detection and treatment of medical ailments on a mass scale. Packaged as smart, wearable computing devices, personal health monitors that detect various conditions as they arise should become widespread in the next five years, in a similar fashion to activity trackers like Fitbit. The advancements here could significantly accelerate our human desire to protect our own longevity and create major breakthroughs for the operations of the medical industry.”

One of the first sectors in which IBM’s Watson was commercially used was healthcare. Memorial Sloan Kettering is using Watson as an Oncology Advisor. Forrester analyst Skip Snow (@SkipSnow) asserts, “While IBM and Memorial Sloan Kettering might be deploying its Watson’s Oncology Advisor trained by Memorial Sloan Kettering to a hospital network that spans four nations, we do not have the peer-reviewed research to show that the outcomes for those who use the advisor are better.”[2] Snow’s comment underscores the fact that cognitive computing is still in its infancy. Like Fatourechi, however, Snow believes that technology will make healthcare better in the future. He concludes:

“As Healthcare’s analytical and operational workflows become increasingly embedded in software, we will see an increasing automation of care delivery. That automation empowers increasingly globalized care delivery infrastructure where remote medicine is practiced globally. Fierce competition will determine which care-delivery organizations will prevail. The ones that do will create profound digital capabilities.”

The automation about which Snow writes will be driven by artificial intelligence. The next areas about which Fatourechi writes are travel and communication.

Travel & Communication: “By 2020, real-time translation technology may be fully accessible. We’ll see everything from an app on your phone that instantly translates foreign signs and texts to phone conversations that are immediately converted to a listener’s native language, without speakers even knowing the difference. As globalization booms, the language lines will soon be crossed. Business, in particular, stands to benefit enormously from the advancement here, with tech giants such as Google and Microsoft already taking the necessary steps to build such tools, making the need for a premium multilingual workforce obsolete.”

The final area addressed by Fatourechi is advertising.

Advertising: “Based on recent ML advancements, in just a few short years augmented reality technology should become the commonplace method for integrated branding. This will allow advertisers to seamlessly place products into existing content by properly identifying the depth, relative size, lighting, and shading of the product in comparison to the setting. This essentially makes any historical video property available for integration. The computer vision technology firm Mirriad has already been heralded (and won an Oscar) for its advancements in the field. Looking at online video, as companies continue to try and tap into hugely popular amateur content, this technology will revolutionize their capabilities.”

The breakthroughs in advertising that will be brought about through cognitive computing will go well beyond branding in videos. As Ben Rossi states, “We are in for a change, a different kind of change than we’ve ever experienced.”[3] In Accenture’s latest technology vision, entitled “From Digitally Disrupted to Digital Disrupter,” the consulting firm states, “What if … machines could be taught to leverage data, learn from it, and, with a little guidance, figure out what to do with it? That’s the power of machine learning — which is a major building block of the ultimate long-term solution: cognitive computing.” Imagine a “what if” technological future, and cognitive computing is likely to play an important role in it.


Footnotes
[1] Mehrdad Fatourechi, “How machine learning will fuel huge innovation over the next 5 years,” VentureBeat, 28 February 2015.
[2] Skip Snow, “Cognitive Computing Is Changing Healthcare, Slowly,” Forrester, 17 February 2015.
[3] Ben Rossi, “How artificial intelligence will make humans smarter,” Information Age, 25 November 2014.
