Embracing Artificial Intelligence

Stephen DeAngelis

August 31, 2016

“Artificial intelligence is approximating human reasoning more and more closely all the time,” writes Steven Norton (@steven_norton). “Wide-scale adoption by business may be approaching, with important implications for how people live and work.”[1] I’m a bit more positive than Norton about the widespread adoption of artificial intelligence (AI) in the business world. I don’t think there is any “may be” about it. Steven Kuyan (@stevenkuyan), the Managing Director of NYU Future Labs, and John Frankel (@john_frankel), an early stage venture capitalist at ff Venture Capital, believe we will reach a tipping point this year.[2] “2016,” they assert, “is the year [A.I. has] become a buzzword — incorporating machine learning, natural language processing, voice recognition, and data mining, to name a few technologies. Major corporations are now striving to integrate A.I. into their products.” Seth Earley (@sethearley), founder and CEO of Earley Information Science, adds, “The good news is that the early dividends from AI are already within reach of most midsize companies as they look for ways to expand their digital boundaries. In fact, the building blocks of AI can produce great results with fewer technical requirements and less time and money than many companies realize. What’s more, those that take this initial step are getting a leg up on AI’s future, since that step is going to be a prerequisite for everything that follows.”[3] Norton reports that the AI market is hot: “International Data Corp. predicts the worldwide market for cognitive software platforms and applications, which roughly defines the market for AI, to grow to $16.5 billion in 2019 from $1.6 billion in 2015 with a CAGR of 65.2%.”

 

Defining Artificial Intelligence and Cognitive Computing

 

Mohit Sharma, a Director at Mindfields, notes that a number of definitions for AI have been offered; but, he writes, “A 1955 description still works: AI is algorithm-based machine mimicking of human behaviours such as learning and problem solving without any intervention.”[4] Over the past couple of years, a branch of AI known as cognitive computing has been receiving a lot of attention. I define cognitive computing as the combination of two types of intelligence, namely: Semantic Intelligence (which involves machine learning, natural language processing, and semantic reasoning) and Computational Intelligence (which involves advanced mathematics). Bruno Michel (@BmiBruno), a scientist at IBM Research – Zurich, adds, “The term ‘cognitive computing’ refers to systems that, rather than being explicitly programmed, are built to learn from their experiences. By extracting useful information from unstructured data, these systems accelerate the information age, helping their users with a broad range of tasks, from identifying unique market opportunities to discovering new treatments for diseases to crafting creative solutions for cities, companies, and communities.”[5]

 

Learning from Data

 

The rise of artificial intelligence in the era of Big Data is no coincidence. There is a symbiotic relationship between the two. Big Data is useless until it is analyzed and AI systems need large amounts of data on which to train. Brandon Reynolds explains, “Algorithms adapt to data, developing behaviors not programmed in advance.”[6] Norton adds, “Artificial intelligence encompasses the techniques used to teach computers how to learn, reason, perceive, infer, communicate and make decisions like humans do. Its applications span technologies that can recognize images, schedule meetings and process human speech, to name just a few. … Companies need lots of clean and reliable data, as well as a clear idea of what they want the machine to be able to learn.” Earley calls this a “knowledge-based approach” and he notes that this approach “organizes data and language into highly malleable and helpful blocks of information.” He adds, “These ‘AI Lite’ systems don’t learn new tricks — unless their human minders use new code instructions to ‘teach’ them. But they can become very smart indeed about sorting and distributing their information in extremely fast ways.”
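Reynolds' point that algorithms "adapt to data, developing behaviors not programmed in advance" can be made concrete with a toy sketch. The example below is purely illustrative (a real system would use a machine learning library and far larger, messier datasets): rather than hard-coding the rule y = 3x + 2, the program recovers the slope and intercept from example pairs alone, using ordinary least squares.

```python
# A minimal sketch of "learning from data": the rule y = 3x + 2 is never
# written into the model; it is recovered from training examples.

def fit_line(points):
    """Ordinary least squares for y = a*x + b, derived from the data alone."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # learned slope
    b = (sy - a * sx) / n                          # learned intercept
    return a, b

# Training data generated by the hidden rule y = 3x + 2
data = [(x, 3 * x + 2) for x in range(10)]
slope, intercept = fit_line(data)
print(slope, intercept)  # the model "learned" 3 and 2 from the examples
```

The same principle scales up: Norton's requirement that "companies need lots of clean and reliable data, as well as a clear idea of what they want the machine to be able to learn" corresponds here to the example pairs (the data) and the choice of a line-fitting objective (what to learn).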

 

Embracing Artificial Intelligence

 

“What is clear,” writes Sharma, “is that AI will disrupt numerous business models with higher frequency than previous changes.” That fact alone should provide sufficient reason for companies to embrace artificial intelligence. If your company is not asking how AI could (or should) change its business model, it could get blindsided by an upstart competitor. Of course, AI shouldn’t be adopted just because it’s available. A sound business case needs to be made for any AI implementation. “Even in this new information age,” writes Earley, “not everything requires the razzle-dazzle of AI.” Finding such a business case, however, shouldn’t be hard. As Earley notes, “Companies and government agencies are starting to find plenty of places where knowledge-based tools can make a huge difference. These include improving data-mining operations, helping with training, and making structured, repeatable tasks and processes far more efficient and less costly. And they are finding the tools increasingly useful, of course, in dealing with online customers.” Kuyan and Frankel agree that finding a business case for AI shouldn’t be difficult. “Because A.I. has the potential to affect so many industries and in so many ways,” they write, “we think of it as a layer across multiple sectors, rather than a sector in and of itself.”

 

Conclusion

 

Norton concludes, “AI has come a long way since the 1956 Dartmouth artificial intelligence conference, which many consider the birthplace of the discipline. The field has grown in fits and starts since then, partially due to a lack of computing power. Now with the availability of cheaper and faster computing power, AI has become a more viable pursuit in the enterprise. … The use of AI is cropping up in all sorts of markets.” Earley asserts there are three fundamental elements that must be in place for successful AI implementation to take place. “Getting your money’s worth from AI,” he writes, “whether at the knowledge-based end of the spectrum or later on, with more extensive, and expensive, applications — requires three things. The effort must involve careful analysis and preparation, which takes into account each department but keeps the focus on the full enterprise. It must have a formal (and nuanced) governance structure. And someone at the most senior level of the company must sponsor it.” I agree with Sharma that we should embrace, rather than fear, artificial intelligence in the business world. Sharma draws his inspiration from an earlier forward thinker — Marie Curie, who stated: “Nothing in life is to be feared, it is only to be understood. Now is the time to understand more, so that we may fear less.”

 

Footnotes
[1] Steven Norton, “CIO Explainer: What is Artificial Intelligence?,” The Wall Street Journal, 18 July 2016.
[2] Steven Kuyan and John Frankel, “Why is now the time for artificial intelligence?,” Venture Beat, 12 August 2016.
[3] Seth Earley, “How Companies Are Benefiting from ‘Lite’ Artificial Intelligence,” Harvard Business Review, 19 July 2016.
[4] Mohit Sharma, “Why we should embrace artificial intelligence as businesses and individuals,” The Australian Financial Review Magazine, 9 August 2016.
[5] Bruno Michel, “The Future of Computing,” Project Syndicate, 30 June 2016.
[6] Brandon Reynolds, “4 Ways Artificial Intelligence Will Change Just About Everything,” Salesforce Blog, 9 August 2016.