In this post, I’ll be discussing three articles from SupplyChainBrain that all touch on why processes, technology, and people matter when discussing Big Data analytics. In the first article, Mark Kornbluth, Managing Director of Client Technology at Kroll Associates, writes, “Former U.S. Secretary of Defense Donald Rumsfeld famously addressed the absence of evidence of weapons of mass destruction in Iraq with a statement that was oddly prophetic for today’s global business: ‘There are known knowns; there are things we know we know. We also know there are unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know.'” [“Spotting Unknowns in a Sea of Data & Separating Critical Compliance Risks from the Noise,” SupplyChainBrain, 11 November 2011] Kornbluth continues:
“Fast-forward nine years and the vast majority of multinational corporations are now fighting their own battle with unknown unknowns lurking in their global supply chains. The phenomenon is the result of dueling trends: as more firms expand into high-risk emerging markets, governments have ratcheted up their enforcement of anti-bribery laws.”
I start this discussion by mentioning bribery laws because “Wal-Mart Stores Inc. faces significant legal risks after it disclosed that it is investigating its operations in Mexico for possible violations of the U.S. law that prohibits bribery in foreign countries, legal experts said.” [“Wal-Mart Faces Risk in Mexican Bribe Probe,” Wall Street Journal, 22 April 2012] It wasn’t technology or processes that placed Wal-Mart in a compromising position; it was people. Too often, when we think about analytics, we forget the people part of the equation, and that can be a mistake. Jessica Wohl reports, “Allegations that Wal-Mart Stores Inc stymied an internal investigation into extensive bribery at its Mexican subsidiary are likely to lead to years of regulatory scrutiny and could eventually cost some executives their jobs.” [“Wal-Mart probe could cost some executives their jobs,” Reuters, 23 April 2012] Kornbluth reports that preventing illegal actions by employees should be a top concern of executives because enforcement is on the rise. He writes:
“According to data tracked by the law firm Gibson Dunn & Crutcher, the number of Foreign Corrupt Practices Act (FCPA) enforcement actions increased 85 percent from 2009 to 2010, with 48 new DOJ cases and 26 new SEC actions filed. In total, companies paid a record $1.8bn in financial penalties to the DOJ and SEC in 2010, according to data from both agencies.”
Kornbluth believes that technology can help companies gather and analyze data to help them monitor employee behavior so that they can either prevent or mitigate illegal behavior. He continues:
“Multinationals have responded with an aggressive ramp-up of compliance efforts complete with data and analytics on everything from vendor background checks to regional country risk monitoring. The result? As The Wall Street Journal reported in September, ‘Companies are being inundated with data. … But many managers struggle to make sense of the numbers.'”
As I frequently point out, data that is not analyzed, or results that are not properly presented to decision makers, are simply not useful. In fact, as Kornbluth points out, they can be actively counterproductive. He claims that in his company’s work, he and his associates “have found that the data overload problem is most frequently the result of decentralized data management processes that have not been standardized across an organization.” Having begun with people, Kornbluth next turns to processes and technology to show how they can help provide a solution. He continues:
“Too often, the process of screening and monitoring international fraud risk is done manually with e-mails back and forth to third-party vendors, documents stored in a variety of locations and individuals in different business units following different processes. Thus, despite a surfeit of available data, companies are missing key red flags through simple mismanagement of resources.”
Kornbluth recommends “a four-step data management process to help multinationals spot unknowns more effectively.” Those steps are:
“1. Define a Third-Party Screening Policy: Amazingly, many multinationals are collecting terabytes of data from their global operations with no unified corporate policy on how to use that data across the organization. The first step in any risk management project of this scale is to clearly define what key criteria a company is screening for, notification rules in the case that red flags are found and specific report types that will be produced worldwide. Without this core set of guiding principles, companies are bound to quickly become slaves to their data.
“2. Build a Centralized Online Database: A typical multinational operating in high-risk emerging markets will have thousands of vendors and agents working on its behalf. Basic background checks on each of these entities would create an overwhelming data deluge without standardized processes. By hosting all third-party screening data in a secure, encrypted, centralized database, it is possible to set up rules for easy review.
“3. Standardized Reporting: The only way to accurately analyze the myriad of different red flags that crop up around the world is to use a consistent reporting structure. To be useful, third-party screening reports must report the same data in the same order globally. This includes global compliance database checks, adverse media in the local language on the company and its management, address history, corporate registry information, civil court checks, criminal court checks, bankruptcies and several others in a uniform format. An organization must agree on what they are screening across the organization and stay consistent in their approach.
“4. Annual Review: Risk profiles change over time, making it important to regularly screen third parties for any changes in their structure, management team, lines of business or regions of operation. Ideally, the review process should be initiated annually. When it comes to systematically identifying potential fraud risks before they result in enforcement actions, there is no shortage of data. The key to a successful compliance program is accessing the correct data to analyze threats and draw clear conclusions.”
I agree with Kornbluth that in any area that a company analyzes, “accessing the correct data” is critical. Unfortunately, the area that Kornbluth writes about is only one of hundreds or thousands of areas in which data must be collected and analyzed by multinational enterprises. That makes the matter of collection and integration even more important. In the second article, Tim Rey, director of advanced analytics with Dow, offered “a glimpse into that company’s use of analytics and complex mathematics to examine multiple areas of its global supply chain” to the editorial staff at SupplyChainBrain. [“The Benefits of Advanced Analytics at Dow Chemical Company,” 19 October 2011] The staff wrote:
“The application of advanced analytics requires substantial resources drawn from multiple parts of the organization. ‘It’s a balance of people, process, methods and technology,’ says Rey.”
You see the theme here: people, processes, and technology are all critical when it comes to Big Data analytics. In Rey’s case, people are not the source of trouble but the source of value. The article explains:
“Individuals must be highly trained in math, machine learning, forecasting, simulations and operations research, to name a few key areas. At Dow, many of the people who participate in advanced analytics have a background in research and development, where they were already doing mathematics and modeling for manufacturing processes. Most possess advanced degrees, says Rey. Experts in Six Sigma and master black belts can be of great help in the effort.”
Although some people claim that Big Data analytics, and the technologies that support it, are mature, most pundits believe that the field is still in its infancy — but maturing fast. The article explains:
“While the concept of advanced analytics isn’t new, it hasn’t been applied to the business side of the organization until relatively recently. A handful of universities are beginning to certify students in that area. Dow is speeding the development of the discipline by bringing in graduate students and putting them to work on the analysis of business processes. The company has already done a wide range of work in areas such as fraud detection in auditing, strategy, portfolio optimization, forecasting and modeling, purchasing cost forecasting, and the use of purchasing decisions to minimize cost. It’s essential to get access to data as quickly as possible, Rey says. ‘Waiting six to nine months for a report to be available doesn’t work.'”
One of the benefits of using cloud computing to conduct Big Data analytics is that it can provide the benefits of fast computing without having to invest valuable corporate resources in infrastructure and maintenance. Rey notes that getting external support is often a good idea. He explains:
“Commercially available software can help; companies don’t need to build optimization algorithms from scratch. In any case, says Rey, one needn’t wait for all corporate data sources to be structured before undertaking an analytical process. There’s always the risk of getting too complex in one’s calculations. ‘All models are wrong,’ notes Rey. ‘Some models are useful.’ The trick lies in striking the right balance between theory and reality. And companies must always be aware of the ‘garbage in, garbage out’ nature of data. ‘You have to be careful,’ says Rey. ‘Sometimes the quality of the data doesn’t merit the complexity of the model.'”
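Rey’s “garbage in, garbage out” caution can be made concrete with a simple quality gate: measure how complete the input records are before deciding how complex a model the data can support. The functions, field names, and 80%/95% thresholds below are illustrative assumptions, not Dow’s actual practice.

```python
# A simple data-quality gate: before fitting any model, measure how complete
# the input records are, and let that score cap model complexity.
# The 0.80 / 0.95 thresholds are illustrative assumptions only.

def completeness(records: list[dict], required: list[str]) -> float:
    """Fraction of required fields that are present and non-empty."""
    total = len(records) * len(required)
    filled = sum(1 for r in records for f in required
                 if r.get(f) not in (None, ""))
    return filled / total if total else 0.0

def recommend_model(quality: float) -> str:
    # "Sometimes the quality of the data doesn't merit the complexity
    # of the model."
    if quality >= 0.95:
        return "detailed forecast model"
    if quality >= 0.80:
        return "simple trend model"
    return "collect better data first"

records = [
    {"vendor": "A", "spend": 120_000, "region": "EMEA"},
    {"vendor": "B", "spend": None, "region": "APAC"},  # missing spend
    {"vendor": "C", "spend": 90_000, "region": ""},    # missing region
]
q = completeness(records, ["vendor", "spend", "region"])
print(round(q, 2), "->", recommend_model(q))
```

This mirrors Rey’s balance of theory and reality: the model choice follows from what the data can actually support, not from what the most sophisticated algorithm could theoretically do.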
Those are all great points. When a company implements a Sense, Think/Learn, Act™ system like the one my company, Enterra Solutions, offers, poor-quality data is eventually winnowed out as the system learns what is important. That’s the real power of a good analytical system. It can also help discover some of the unknown unknowns discussed by Kornbluth.
The final article reports some of the highlights of an interview that Greg Gorbach, vice president-collaborative manufacturing at ARC Advisory Group, conducted with Paul Boris, vice president of collaborative manufacturing at SAP. [“The Power of Real-Time Analytics,” SupplyChainBrain, 23 April 2012] The article states:
“Boris says the right technology can free information and processes to enable real-time analytics, which in turn can drive business processes and deliver performance-enhancing information directly to individuals, wherever they are working. ‘For example, you might deliver engineering information directly to the hands of the operator working on an asset, in real time, and perhaps wrap that information in 3D to provide a richer view,’ he says.”
Once again you see experts stressing the importance of processes, technology, and people as they relate to Big Data analytics. As with most things in life, describing how things should work is always easier than actually getting them to work. The article concludes:
“Creating and using real-time analytics in this way requires capabilities in several areas, Boris says. These include applications, analytics, and modeling around on demand, on premise, on device. The latter ‘speaks to having a hybrid cloud on premise as well as the necessary database,’ he says, ‘but the technology already exists.’ … Social media also is having an impact. Boris sees it as a way to share knowledge and best practices, which is becoming increasingly important as the workforce ages and a company’s knowledge base is lost through attrition. … Boris says his vision for the next generation of manufacturing and supply chain is one of different processes that can be plugged or unplugged, giving incredible agility and more sustainability with less labor required.”
One of the main takeaways from the interview was, “Having information and processes trapped in operational silos is a continuing problem that keeps many businesses from performing as well as they should.” I couldn’t agree more. Siloed information hurts all of the big three — processes, technology, and people. And they all matter.