
Cognitive Computing: Complementing the Quants

April 22, 2015


“Artificial Intelligence (AI) is an idea that has oscillated through many hype cycles over many years, as scientists and sci-fi visionaries have declared the imminent arrival of thinking machines,” writes Bradford Power. “But it seems we’re now at an actual tipping point.”[1] Power implies that one reason artificial intelligence systems haven’t been of much use to businesses up to now is that there have not been enough people with the right technical skills to go around. The next generation of artificial intelligence systems — cognitive computers — will help alleviate that situation. Power writes:

“Interestingly, for a long time, doing detailed analytics has been quite labor- and people-intensive. You need ‘quants,’ the statistically savvy mathematicians and engineers who build models that make sense of the data. As Babson professor and analytics expert Tom Davenport explained to me, humans are traditionally necessary to create a hypothesis, identify relevant variables, build and run a model, and then iterate it. Quants can typically create one or two good models per week. However, machine learning tools for quantitative data — perhaps the first line of AI — can create thousands of models a week.”
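To make Davenport’s contrast concrete, here is a minimal sketch (in Python, using scikit-learn, on an invented dataset) of how a machine can sweep through many candidate models automatically, the kind of search a quant would otherwise perform by hand, one hypothesis at a time:

```python
# A minimal sketch of the idea behind "thousands of models a week":
# rather than hand-building one model, a program sweeps a grid of
# candidate models and scores each automatically. The dataset and
# the model grid here are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

candidates = [
    (LogisticRegression(max_iter=1000), {"C": [0.01, 0.1, 1.0, 10.0]}),
    (RandomForestClassifier(random_state=0),
     {"n_estimators": [50, 100, 200], "max_depth": [3, 5, None]}),
]

best_score, best_model = 0.0, None
for estimator, grid in candidates:
    # Fits one model per parameter combination per cross-validation fold.
    search = GridSearchCV(estimator, grid, cv=5)
    search.fit(X, y)
    if search.best_score_ > best_score:
        best_score, best_model = search.best_score_, search.best_estimator_

print(best_model, best_score)
```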

Last year, Mark Gibbs (@quistuipater), a contributing editor to Network World, wrote about the growing data scientist challenge. “There aren’t enough of them,” he wrote. “And no matter how many universities can produce in the next few years there still won’t be enough.”[2] At Enterra®, we were flattered that Gibbs discussed how our Enterra Enterprise Cognitive System™ (ECS) can help mitigate this growing challenge. He wrote:

“The answer is, of course, to set a computer onto the task of analyzing and deriving insights and conclusions. Unfortunately most of the available solutions are complex to use and require that you ask just the right question in some sort of computer language. Enterra Solutions, a key competitor in the big data analytics market, has a solution that is completely different in that it can automatically mine data exhaustively and intelligently to draw conclusions based on natural language queries. Enterra Solutions can ingest huge amounts of data and using natural language processing transform it into knowledge using a generalized ontology to discover the meanings of words in context along with the implicit rules and relationships as used by humans. Then, when a question is asked in what is more or less natural language, the database of knowledge is accessed by Enterra’s Hypothesis Engine. The Hypothesis Engine is an artificial intelligence system that applies common sense and domain-specific ontologies to further structure the knowledge. Next, using Enterra’s Rules-Based Inference System it can determine an objective and find the facts to support that objective (backward chaining) as well as using facts to determine objectives (forward chaining) as determined by the knowledge found and its significance. Other engines in the system weigh results, formulate database queries, and analyze assets and all of these components pass data back and forth between themselves based on rules and inferences to derive conclusions.”
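For readers unfamiliar with the chaining terminology in Gibbs’ description, the following toy Python sketch shows the two inference styles in their simplest form. The rules and facts are invented for illustration; they are not Enterra’s actual rule base, and a production inference engine is far more sophisticated:

```python
# Toy illustration of the two inference styles mentioned above:
# forward chaining (facts -> conclusions) and backward chaining
# (goal -> supporting facts). Rules map a set of premises to a conclusion.
RULES = [
    ({"late_shipments", "single_supplier"}, "supply_risk"),
    ({"supply_risk", "low_inventory"}, "stockout_likely"),
]

def forward_chain(facts):
    """Apply rules repeatedly until no new facts can be derived."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

def backward_chain(goal, facts):
    """Work backward from a goal, checking whether its premises hold."""
    if goal in facts:
        return True
    return any(all(backward_chain(p, facts) for p in premises)
               for premises, conclusion in RULES if conclusion == goal)

facts = {"late_shipments", "single_supplier", "low_inventory"}
print(forward_chain(facts))                      # facts -> objectives
print(backward_chain("stockout_likely", facts))  # objective -> supporting facts
```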

When we talk to clients, we note that an organization facing an analytic problem typically has to assemble a team of three experts:

  • A business domain expert – the customer of the analysis, who can explain the drivers behind data anomalies and outliers.
  • A statistical expert – who knows how to formulate the correct statistical studies; the business expert knows what they want to study, but the statistician knows how to frame the data in a way that will detect the desired phenomena.
  • A data expert – who understands where and how to pull the data from across multiple databases or data feeds.

Having three experts involved dramatically lengthens the time required to analyze, tune, re-analyze, and interpret the results. Enterra’s approach empowers the business expert by automating the statistical expert’s and data expert’s knowledge and functions, so the ideation cycle can be dramatically shortened and more insights can be auto-generated. Even some of the business expert’s logic is automated to help tune and re-analyze the data. Power points out another clear advantage of utilizing a cognitive computing system — speed. “In business,” he writes, “the explosive growth of complex and time-sensitive data enables decisions that can give you a competitive advantage, but these decisions depend on analyzing at a speed, volume, and complexity that is too great for humans. AI is filling this gap as it becomes ingrained in the analytics technology infrastructure in industries like health care, financial services, and travel.” To get a sense of where cognitive systems are taking the business world, Power spoke with several experts who are following the field. He reports:

“I talked to some venture capitalists, whose profession it is to keep their eyes and minds trained on the future. Mark Gorenberg, Managing Director at Zetta Venture Partners, which is focused on investing in analytics and data startups, told me, ‘AI historically was not ingrained in the technology structure. Now we’re able to build on top of ideas and infrastructure that didn’t exist before. We’ve gone through the change of Big Data. Now we’re adding machine learning. AI is not the be-all and end-all; it’s an embedded technology. It’s like taking an application and putting a brain into it, using machine learning. It’s the use of cognitive computing as part of an application.’ Another veteran venture capitalist, Promod Haque, senior managing partner at Norwest Venture Partners, explained…, ‘if you can have machines automate the correlations and build the models, you save labor and increase speed. With tools like Watson, lots of companies can do different kinds of analytics automatically.'”

I have noted in previous articles that, to some extent, all businesses rely on data, and many analysts predict that the most successful businesses in the future will be those that transform into digital enterprises. An article in Retail Info Systems News discusses how cognitive computing systems are going to help foster this transformation. The article states:

“As new data sources contribute to the value and volume of big data, interpretations and correlations become too complex for analysts’ existing analytical technologies, and trial-and-error — or ad hoc queries — to extract results. In essence, big data is becoming a catalyst that is changing the tools required to process and analyze these large data files. Moving forward, retailers will need assistance in gaining more detailed context from incoming information. Executives need accurate answers that can help them make more informed, accurate business decisions — but results are subject to error due to users’ manual intervention or cognitive bias. Enter the value of cognitive systems, or machine learning. Using artificial intelligence, cognitive systems rely on algorithms within a computer program to observe, learn, analyze, offer suggestions, and even create new ideas. Machines — or hard-core robust computers — apply historical data to a problem by creating a model and using it to predict future behavior or trends.”[3]
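The last sentence of that quote describes the basic predictive-modeling loop: fit a model to historical data, then use it to project forward. A bare-bones Python sketch, with a made-up weekly sales series standing in for real retail data:

```python
# Fit a model to historical data and use it to project future behavior.
# The 12-week sales series below is invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

weeks = np.arange(1, 13).reshape(-1, 1)           # 12 weeks of history
sales = np.array([200, 210, 205, 220, 235, 230,   # hypothetical demand
                  245, 250, 260, 255, 270, 280])

model = LinearRegression().fit(weeks, sales)      # learn the trend
future = np.arange(13, 17).reshape(-1, 1)         # next four weeks
print(model.predict(future))                      # projected sales
```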

Cognitive computing systems are democratizing access to world-class analytics for businesses that might not otherwise have access to the technical and intellectual skills of the quants discussed by Power. To learn more about why businesses are starting to get excited about cognitive computing, read my article entitled “Cognitive Computing: The Next Big Thing.”

 

Footnotes
[1] Bradford Power, “Artificial Intelligence Is Almost Ready for Business,” Harvard Business Review, 19 March 2015.
[2] Mark Gibbs, “Not enough data scientists? Use AI instead,” Network World, 7 March 2014.
[3] Staff, “Enter the Machine: The Arrival of Next-Gen Analytics,” Retail Info Systems News, 23 February 2015.
