
Of Algorithms and Analytics

July 20, 2018


One of the most intriguing developments of the past several decades is the increasingly ubiquitous presence of algorithms in our lives. Algorithms are what help make sense of the growing oceans of data being produced by our connected devices. Half a decade ago, Leo Hickman (@LeoHickman), director and editor of Carbon Brief, wrote, “From dating websites and City trading floors, through to online retailing and internet searches (Google’s search algorithm is now a more closely guarded commercial secret than the recipe for Coca-Cola), algorithms are increasingly determining our collective futures.”[1] That statement is at least as true today as it was in 2013.

 


Dr. Panos Parpas, a lecturer at Imperial College London, told Hickman, “The current interest in [algorithms] is due to the vast amounts of data now being generated and the need to process and understand it. They are now integrated into our lives. On the one hand, they are good because they free up our time and do mundane processes on our behalf. The questions being raised about algorithms at the moment are not about algorithms per se, but about the way society is structured with regard to data use and data privacy. It’s also about how models are being used to predict the future. There is currently an awkward marriage between data and algorithms.” The “awkward marriage” often involves trying to determine which mathematical model provides the most benefit when analyzing large datasets.

 

To overcome some of this awkwardness, Enterra Solutions® leverages the Representational Learning Machine™ (RLM) created by Massive Dynamics™. The RLM can help determine what type of analysis is best suited for the data involved in a high-dimensional environment. Additionally, the RLM operates in a “glass box” fashion: whereas traditional machine learning generates predictions in “black box” fashion, the RLM lets users see through the system and understand the drivers behind any given prediction. This high-dimensional model representation (HDMR) analysis relies on global variance-based sensitivity analysis to reveal the relationships among variables. It is therefore particularly valuable for applications with a large number of input variables, because it lets practitioners focus on the variables that actually drive the outcomes of interest.
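
To make the general idea concrete, here is a minimal Python sketch of global variance-based sensitivity analysis, the technique underlying the HDMR approach described above. It is not the RLM or Enterra's implementation: the toy model and its three inputs are assumptions, and the first-order indices are estimated simply by binning each input and measuring how much of the output variance its conditional mean explains.

# A minimal NumPy sketch of global variance-based sensitivity analysis.
# The toy model and its parameters are hypothetical; the goal is only to
# illustrate the idea of ranking inputs by how much output variance each explains.
import numpy as np

rng = np.random.default_rng(42)
n_samples, n_bins = 100_000, 50

# Hypothetical model with three inputs of very different importance.
X = rng.uniform(0.0, 1.0, size=(n_samples, 3))
y = 5.0 * X[:, 0] + 1.0 * X[:, 1] ** 2 + 0.1 * rng.normal(size=n_samples)

total_var = y.var()
for i in range(X.shape[1]):
    # First-order index S_i ~= Var(E[y | X_i]) / Var(y),
    # estimated by binning X_i and averaging y within each bin.
    bins = np.quantile(X[:, i], np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.digitize(X[:, i], bins) - 1, 0, n_bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(n_bins)])
    weights = np.array([(idx == b).mean() for b in range(n_bins)])
    s_i = np.sum(weights * (cond_means - y.mean()) ** 2) / total_var
    print(f"input {i}: first-order sensitivity ~ {s_i:.2f}")

In this toy example the first input dominates, the second contributes a little, and the third is essentially noise, which is exactly the kind of ranking that lets a practitioner narrow attention to the variables that matter.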

 

Kayla Matthews (@KaylaEMatthews) notes, “It’s crucial to understand data modeling when working with big data to solidify important business decisions. Although specific circumstances vary with each attempt, there are best practices to follow that should improve outcomes and save time.”[2] She suggests six “best practices” to consider:

 

1. Define the Business Objective. Obviously, businesses analyze big data to gain insights that help them make better business decisions. Thomas H. Davenport (@tdav), a Distinguished Professor at Babson College, notes, “Many organizations want and need to integrate analytics with their production systems — for evaluating customers, suppliers, and partners in real time, and for making real-time offers to customers. This requires a good deal of work to integrate analytics into databases and legacy systems.”[3] Matthews adds, “The sheer scope of big data sometimes makes it difficult to settle on an objective for your data modeling project. However, it’s essential to do so before getting started. Otherwise, you’ll waste money or end up with information that doesn’t meet your needs.”

 

2. Pick a Data Modeling Methodology and Automate It When Possible. Matthews notes, “Data modeling makes analysis possible. … After deciding which data modeling method works best, depend on it for the duration of a project.” As noted above, part of the problem in data modeling is the complexity found in modern businesses. Deloitte Consulting analysts David Linich and Michael Puleo note, “Almost all business processes suffer from excess complexity and variability, but both are difficult to spot — and even harder to eradicate — without fact-based analytical tools.”[4]
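
As one illustration of how that choice can be automated, the following sketch (assuming scikit-learn is available, with hypothetical candidate models and synthetic data) scores a few methods under the same cross-validation protocol and commits to the winner for the rest of the project.

# A minimal scikit-learn sketch of automating the choice of a modeling method:
# score candidate models with the same cross-validation protocol, then commit
# to the winner. The candidates and the synthetic data are illustrative assumptions.
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=2000, n_features=20, noise=10.0, random_state=0)

candidates = {
    "ridge": Ridge(alpha=1.0),
    "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
}

scores = {
    name: cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    for name, model in candidates.items()
}
best = max(scores, key=scores.get)
print(scores, "-> selected:", best)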

 

3. Make Your Data Models Scalable. Matthews notes, “Just as a successful business must scale up and meet demand, your data models should, too.”
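
One common way to keep the same modeling logic workable as data volumes grow is to stream the input in chunks and maintain running aggregates, so memory use stays bounded no matter how large the file becomes. The pandas sketch below illustrates the pattern; the file name "orders.csv" and its columns are hypothetical.

# A minimal pandas sketch of one way to keep a data model scalable: stream the
# input in chunks and update running aggregates, so the same logic works whether
# the file holds thousands or millions of rows.
import pandas as pd

totals = {}
for chunk in pd.read_csv("orders.csv", chunksize=100_000):
    # Aggregate each chunk, then fold it into the running totals.
    partial = chunk.groupby("region")["revenue"].sum()
    for region, revenue in partial.items():
        totals[region] = totals.get(region, 0.0) + revenue

print(pd.Series(totals).sort_values(ascending=False))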

 

4. Consider Time As an Important Element in Your Data Model. Business now moves at computer speed, with many decisions needing to be made in real time or near-real time. Matthews notes, “Time-driven events are very useful as you tap into the power of data modeling to drive business decisions.” She notes, however, that several time horizons may be important. “By looking at data across time,” she explains, “it’s easier to determine genuine performance characteristics. Based on what you see, it may be less likely you’ll abort business plans due to hasty judgments.”
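
The pandas sketch below illustrates looking at the same data across time horizons: a synthetic daily series is summarized with weekly and quarterly rolling averages so that short-term swings can be judged against the longer-term trend before anyone acts on them.

# A minimal pandas sketch of treating time as a first-class element of the model:
# the same daily series is viewed over two horizons (weekly vs. quarterly rolling
# averages). The series itself is synthetic.
import numpy as np
import pandas as pd

dates = pd.date_range("2018-01-01", periods=365, freq="D")
rng = np.random.default_rng(7)
daily = pd.Series(100 + np.cumsum(rng.normal(0, 2, size=365)), index=dates)

view = pd.DataFrame({
    "daily": daily,
    "weekly_avg": daily.rolling(window=7).mean(),
    "quarterly_avg": daily.rolling(window=90).mean(),
})
print(view.tail())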

 

5. Avoid Misleading Data Visualizations. How information is presented to business leaders is often as important as the results of analysis. Matthews explains, “There are various ways you could present the information gleaned from data modeling and unintentionally use it to mislead people. For example, you might generate a chart that has a non-zero y-axis. If people don’t look at the left side of the graphic carefully, they may misunderstand the results and think they are overly dramatic. Using colors in certain ways or scaling your charts improperly can have the same effects.” Cognitive computing platforms with embedded analytics capabilities also offer natural language processing and can help business leaders make decisions by explaining results in terms they understand. Davenport notes, “A key assumption behind analytics in the past is that they are prepared for human decision-makers. But cognitive technologies take the next step and actually make the decision or take the recommended action. They are actually a family of technologies, including machine and deep learning, natural language processing, robotic process automation, and more. Most of these technologies have some form of analytics at their core and, to me, they have more potential for changing how we do analytics than any other technology.”
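
The matplotlib sketch below illustrates the specific pitfall Matthews mentions: the same made-up figures are charted once with a truncated y-axis, which exaggerates small differences, and once with the axis anchored at zero, which keeps the scale honest.

# A minimal matplotlib sketch of the y-axis pitfall described above.
# The revenue figures are hypothetical.
import matplotlib.pyplot as plt

quarters = ["Q1", "Q2", "Q3", "Q4"]
revenue = [102, 104, 103, 106]  # hypothetical figures

fig, (ax_truncated, ax_honest) = plt.subplots(1, 2, figsize=(8, 3))

ax_truncated.bar(quarters, revenue)
ax_truncated.set_ylim(100, 107)   # truncated axis exaggerates small differences
ax_truncated.set_title("Truncated y-axis")

ax_honest.bar(quarters, revenue)
ax_honest.set_ylim(0, 120)        # baseline at zero keeps the scale honest
ax_honest.set_title("Zero-based y-axis")

plt.tight_layout()
plt.show()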

 

6. Create Valuable Data Definitions. Matthews writes, “Worthwhile definitions make your data models easier to understand, especially when extracting the data to show it to someone who does not ordinarily work with it. Instead of just creating basic definitions, uphold a best practice and define your data in broader ways, such as why you need the data and how you’ll use it.”
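
A simple way to uphold that practice is to keep a data dictionary that records not only what each field means but also why it is needed and how it will be used. The Python sketch below shows one minimal form such definitions could take; the fields themselves are hypothetical.

# A minimal sketch of "broader" data definitions: alongside each field's meaning,
# record why it is needed and how it will be used, so the model stays legible to
# people who don't normally work with the data. The fields shown are hypothetical.
from dataclasses import dataclass

@dataclass
class FieldDefinition:
    name: str
    meaning: str
    why_needed: str
    how_used: str

DATA_DICTIONARY = [
    FieldDefinition(
        name="customer_lifetime_value",
        meaning="Projected net revenue from a customer over the relationship",
        why_needed="Prioritizes retention spending toward high-value accounts",
        how_used="Input feature for the churn model; reported quarterly to sales",
    ),
    FieldDefinition(
        name="order_lead_time_days",
        meaning="Days between order placement and delivery",
        why_needed="Tracks fulfillment performance against service targets",
        how_used="Aggregated by region in the weekly operations dashboard",
    ),
]

for field in DATA_DICTIONARY:
    print(f"{field.name}: {field.meaning} | why: {field.why_needed}")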

 

Summary

 

Danish Wadhwa, an IT professional, notes, “AI is considered a supporting tool to help humans.”[5] In fact, AI is the key link between data and analytics. It is no coincidence cognitive technologies have emerged and grown rapidly in the Big Data Era. AI, Wadhwa explains, “is able to process large amounts of data faster compared to a human brain. … Allowing computers to perform the tasks they are capable of doing faster than humans is vital for businesses, so they can channel human resources on more important tasks.” By understanding what they want from their data and selecting the right models to obtain it, companies can compete more effectively.

 

Footnotes
[1] Leo Hickman, “How algorithms rule the world,” The Guardian, 1 July 2013.
[2] Kayla Matthews, “6 data modeling best practices for better business intelligence,” Information Management, 20 September 2017.
[3] Thomas H. Davenport, “A Revolution in Analytical Technology,” LinkedIn, 6 July 2017.
[4] David Linich and Michael Puleo, “Taming Complexity With Analytics,” The Wall Street Journal, 21 December 2015.
[5] Danish Wadhwa, “How AI Is Linked To Business Analytics,” Business Computing World, 16 January 2018.
