In past posts, I have noted that the World Economic Forum has declared data to be a new asset class. All businesses, regardless of their size, generate data, but not all businesses use that asset to its full extent. Malory Davies, editor of Supply Chain Standard, reminds us, “What gets measured gets managed.” He goes on to state, “The corollary to that, of course, is that if you are to manage the right things then you have to have effective ways of measuring them.” [“Measure for measure,” Supply Chain Standard, 25 July 2011] To come full circle, we need to remember that measuring involves data. Obtaining the right data, Davies insists, is “easier said than done.” He reminds us that “there are always Donald Rumsfeld’s famous ‘unknown unknowns’ to deal with – those things that we don’t know we don’t know.” That’s where Big Data analytics can help. Programs that think and learn can help discover some of the unknowns so that they can be measured. Davies, however, asserts that companies should resist the temptation “to measure everything and then try to work out what it all means afterwards.” Such efforts, he writes, are “wasteful and inevitably will throw up large amounts of confusing data.” If you begin by analyzing the data you know you want and need, and let an intelligent system discover other things that may be of importance, you avoid being wasteful while still being wise. Davies continues:
“Certainly, there is plenty of evidence that the complexity and scope of global supply chains means measuring performance still remains a challenge for many companies. It makes sense to focus on the quality of metrics rather than quantity for effective performance measurement and improvement. … Ultimately, good metrics require people, tools and processes right across the enterprise taking into account company strategy to show meaningful performance.”
David F. Carr reports, “Advances in analytic technologies and business intelligence are allowing CIOs to go big, go fast, go deep, go cheap and go mobile with business data.” [“5 Business Analytics Tech Trends and How to Exploit Them,” CIO, 23 March 2012] He further notes, “In interviews, CIOs consistently identified five IT trends that are having an impact on how they deliver analytics: the rise of Big Data, technologies for faster processing, declining costs for IT commodities, proliferating mobile devices and social media.” He discusses each of those trends in turn beginning with Big Data. He writes:
“Big Data refers to very large data sets, particularly those not neatly organized to fit into a traditional data warehouse. Web crawler data, social media feeds and server logs, as well as data from supply chain, industrial, environmental and surveillance sensors all make corporate data more complex than it used to be. Although not every company needs techniques and technologies for handling large, unstructured data sets, Verisk Analytics CIO Perry Rotella thinks all CIOs should be looking at Big Data analytics tools. … Technology leaders should adopt the attitude that more data is better and embrace overwhelming quantities of it, says Rotella. … One of the most talked about Big Data technologies is Hadoop, an open-source distributed data processing platform originally created for tasks such as compiling web search indexes. It’s one of several so-called ‘NoSQL’ technologies (others include CouchDB and MongoDB) that have emerged to organize web-scale data in novel ways. Hadoop is capable of processing petabytes of data by assigning subsets of that data to hundreds or thousands of servers, each of which reports back its results to be collated by a master job scheduler. Hadoop can either be used to prepare data for analysis or as an analytic tool in its own right. Organizations that don’t have thousands of spare servers to play with can also purchase on-demand access to Hadoop instances from cloud vendors such as Amazon.”
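To make the division-of-labor idea concrete, the rough Python sketch below mimics the map-and-reduce pattern Carr describes: subsets of a log file go to worker processes, and a coordinating step collates their partial word counts. This is only an illustration of the pattern, not Hadoop itself, and the input file name is hypothetical; a real deployment would use Hadoop, Hadoop Streaming, or a cloud-hosted instance, as the article notes.

```python
# A minimal sketch of the map/reduce pattern (not Hadoop itself):
# subsets of the data go to worker processes, and a coordinator
# collates their partial results. The input file name is hypothetical.
from collections import Counter
from multiprocessing import Pool

def map_count(lines):
    """Worker step: count words in one subset of log lines."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def split_into_chunks(lines, n):
    """Assign roughly equal subsets of the data to n workers."""
    size = max(1, len(lines) // n)
    return [lines[i:i + size] for i in range(0, len(lines), size)]

if __name__ == "__main__":
    with open("server.log") as f:            # hypothetical input file
        lines = f.readlines()
    with Pool(processes=4) as pool:          # stand-ins for cluster nodes
        partials = pool.map(map_count, split_into_chunks(lines, 4))
    total = Counter()                        # "master" step collates the results
    for partial in partials:
        total.update(partial)
    print(total.most_common(10))
```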
The value of emerging Big Data analytical technologies is not just that they crunch mountains of data; it’s that they do it quickly. Fast analysis is the next subject discussed by Carr. He writes:
“Big Data technologies are one element of a larger trend toward faster analytics, says University of Kentucky CIO Vince Kellen. ‘What we really want is advanced analytics on a hell of a lot of data,’ Kellen says. How much data one has is less critical than how efficiently it can be analyzed, ‘because you want it fast.’ The capacity of today’s computers to process much more data in memory allows for faster results than when searching through data on disk, even if you’re crunching only gigabytes of it. Although databases have, for decades, improved performance with caching of frequently accessed data, now it’s become more practical to load entire large datasets into the memory of a server or cluster of servers, with disks used only as a backup. Because retrieving data from spinning magnetic disks is partly a mechanical process, it is orders of magnitude slower than processing in memory. Rotella says he can now ‘run analytics in seconds that would take us overnight five years ago.’ His firm does predictive analytics on large data sets, which often involves running a query, looking for patterns, and making adjustments before running the next query. Query execution time makes a big difference in how quickly an analysis progresses. … To improve analytics performance, hardware matters, too. Allan Hackney, CIO at the insurance and financial services giant John Hancock, is adding GPU chips, the same graphical processors found in gaming systems, to his arsenal. ‘The math that goes into visualizations is very similar to the math that goes into statistical analysis,’ he says, and graphics processors can perform calculations hundreds of times faster than conventional PC and server processors.”
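The in-memory point lends itself to a small illustration. The sketch below, written against Python’s standard sqlite3 module, copies a disk-resident table into an in-memory database once and then runs the kind of repeated, adjust-and-rerun queries Rotella describes against RAM; the database file, table, and column names are hypothetical.

```python
# A minimal sketch of the in-memory idea: copy a disk-resident table into
# an in-memory database once, then run the repeated, iterative queries
# typical of predictive work against RAM instead of the disk.
# The database file, table, and column names are hypothetical.
import sqlite3

disk = sqlite3.connect("claims.db")      # hypothetical operational data on disk
mem = sqlite3.connect(":memory:")        # analysis copy held entirely in RAM
disk.backup(mem)                         # one-time load; the disk copy becomes the backup
disk.close()

# Iterate: query, inspect the pattern, adjust the threshold, query again.
for threshold in (1000, 5000, 10000):
    row = mem.execute(
        "SELECT COUNT(*), AVG(amount) FROM claims WHERE amount > ?",
        (threshold,),
    ).fetchone()
    print(threshold, row)
```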
In a business environment that is moving at an increasingly fast pace, results from slow analytic processes can be like reading yesterday’s news. Companies no longer have the luxury of poring over data for long periods before making decisions. Speed matters. Fortunately, costly supercomputers are no longer necessary to achieve acceptable results for most businesses. The declining cost of technology is the next trend Carr discusses. He writes:
“Along with increases in computing capacity, analytics are benefitting from falling prices for memory and storage, along with open source software that provides an alternative to commercial products and puts competitive pressure on pricing. [John Ternent, CIO at Island One Resorts,] is an open-source evangelist. … ‘To me, open source levels the playing field,’ he says, because a mid-sized company such as Island One can use … an open-source application … for statistical analysis. … The changing economics of computing [is] altering some basic architectural choices. For example, one of the traditional reasons for building data warehouses was to bring the data together on servers with the computing horsepower to process it. When computing power was scarcer than it is today, it was important to offload analytic workloads from operational systems to avoid degrading the performance of everyday workloads. Now, that’s not always the right choice. … By factoring out all the steps of moving, reformatting and loading data into the warehouse, analytics built directly on an operational application can often provide more immediate answers.”
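As a rough illustration of the two points in that passage (open-source tools for statistical analysis, and analytics run directly against an operational system rather than a warehouse), the sketch below pulls rows straight from an operational database and fits a simple trend line with NumPy. The database, table, and column names are hypothetical.

```python
# A minimal sketch: open-source statistical analysis (NumPy) run directly
# against rows pulled from an operational database rather than a warehouse.
# The database, table, and column names are hypothetical.
import sqlite3
import numpy as np

conn = sqlite3.connect("bookings.db")    # hypothetical operational database
rows = conn.execute(
    "SELECT week, revenue FROM weekly_sales ORDER BY week"
).fetchall()
weeks = np.array([r[0] for r in rows], dtype=float)
revenue = np.array([r[1] for r in rows], dtype=float)

slope, intercept = np.polyfit(weeks, revenue, 1)   # simple linear trend
print(f"revenue is changing by roughly {slope:.2f} per week")
```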
Carr reports that, even though the cost of computing is going down, “potential savings are often erased by increased demands for capacity.” That’s why so many companies are moving analytical processes to the cloud. Lots of potential headaches and expenses can be lifted off in-house IT departments and placed on the shoulders of cloud service providers. Before doing that, however, William J. Holstein recommends that companies that are “considering adopting advanced business analytics should: Determine precisely what analytical tools the company needs; weigh the advantages of buying versus building; and, assess your ability to commit the necessary time and resources.” [“Analyze This!” Chief Executive, 7 March 2012] Returning to the trends Carr identified, the next one he discusses is mobile applications.
“Like nearly every other application, BI is going mobile. … For CIOs, addressing this trend has more to do with creating user interfaces for smartphones, tablets and touch screens than it is about sophisticated analytic capabilities. … The requirement to create native applications for each mobile platform may be fading now that the browsers in phones and tablets are more capable, says Island One’s Ternent. ‘I’m not sure I’d invest in a customized mobile device application if I can just skin a web-based application for a mobile device.'”
The final trend discussed by Carr is social media. He writes:
“With the explosion of Facebook, Twitter and other social media, more companies want to analyze the data these sites generate. New analytics applications have emerged to support statistical techniques such as natural language processing, sentiment analysis, and network analysis that aren’t part of the typical BI toolkit. Because they’re new, many social media analytics tools are available as services.”
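Sentiment analysis is the easiest of those techniques to illustrate. The sketch below scores posts with a tiny hand-made word lexicon; the word lists and sample posts are invented, and real social media analytics services use far richer models, but it shows the basic idea of turning free text into a measurable signal.

```python
# A minimal sketch of lexicon-based sentiment scoring. The word lists and
# sample posts are illustrative only; production tools use richer models.
POSITIVE = {"great", "love", "excellent", "fast", "happy", "recommend"}
NEGATIVE = {"slow", "broken", "terrible", "hate", "refund", "disappointed"}

def sentiment_score(post: str) -> int:
    """Return (#positive words - #negative words) for one post."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

posts = [
    "Love the new app, checkout is fast and support was excellent!",
    "Order arrived broken and the refund process is terrible.",
]
for post in posts:
    print(sentiment_score(post), post)
```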
Although Carr’s discussion may lead one to believe that Big Data analytics are only useful and affordable for large companies, Holstein reports that many small- and medium-sized businesses “must [also] handle an astounding amount of data.” That means that those companies “can use analytical tools just as the largest corporations can—or the hottest Web-based social media startups or the biggest intelligence agencies with three-letter names.” He continues:
“They can use those tools to eke out real competitive advantages against rivals that haven’t embraced the new capabilities. Even the most advanced tools, such as those IBM developed to such powerful effect with its Watson competitor on the Jeopardy game show, are within reach of companies with $10 million, $50 million or $100 million in annual sales.”
Holstein believes that there are “at least four stages in adopting an analytical system,” each of which focuses on a different question. In Stage 1, the question is: “What Will We Analyze?” He writes:
“It’s important to go through a considered thought process before any decisions are made about what type of systems to purchase or develop, says Paul Magnone, co-author of Drinking From the Fire Hose and a 21-year veteran of IBM. There are specialist companies and there are integrators who bring various specializations together under one roof, ‘but the step before that is to get a grasp of your business and ask the right questions,’ says Magnone. … Those questions include: What is most important to the business? What matters most to your customers?”
I have consistently pointed out in my posts that good solutions always begin with good questions. The better the question, the better the solution. Holstein notes that the variety of data sources will make a difference in the vendor or services a company eventually employs. Data integration is always a serious consideration. Stage 2 in adopting an analytical system focuses on the question: “Do We Buy or Build?” Holstein writes:
“One of the debates in the field is whether small- and mid-size enterprise CEOs should try to develop their own business analytics in cooperation with vendors or simply rely on the outsiders to install systems that essentially ‘plug in’ to what they already have. The big vendors argue that they have already built hundreds of industry-specific models and can tweak those systems for a particular SME. They can even deliver the services via the cloud, meaning the customer pays for use as he or she downloads or utilizes software and other services. That raises a corollary issue: do you want to take a big plunge on a major expense or do you want to proceed with a step-by-step implementation with a long-term partner? The reality on the ground seems to be that most small company CEOs want to have a hand in developing their analytical capabilities gradually, not in a single big-bang moment.”
One of the reasons that my company, Enterra Solutions®, builds modules is that we realize that one-size-fits-all solutions don’t normally work (especially for small- and medium-sized companies). Some tailoring is almost always required. Stage 3 in adopting an analytical system focuses on the question: “Are We Ready to Invest?” Holstein writes, “If a CEO decides to co-develop a business analytics system, odds are that he or she will need internal talent to help.” That help comes in the form of employees who know the business (i.e., employees who will use the system) and IT people who will be needed to help administer it.
Stage 4 in adopting an analytical system focuses on the question: “Do We Understand the Impact?” Holstein writes: “The reality is that reaching a certain point of sophistication with business analytics changes the way the company is run and challenges the traditional culture. … All of which explains why going down the path of business analytics can be so profound.” Holstein concludes, “Whatever complexities may exist, the payoffs from the successful implementation of an analytical system are clear.” He provides examples of the kinds of returns on investment that companies should expect if they embrace Big Data analytics. Those returns include more business and increased profits.