Big Data Analytics: From Disappointment to Delight

Stephen DeAngelis

March 15, 2021

Over the past few years, businesses have expressed both delight and disappointment with their big data analytics efforts. The lesson to be learned is that doing big data analytics correctly isn’t as easy as some people would have you believe. Nevertheless, the editorial team at insideBIGDATA believes this past year proved the value of advanced analytics. They write, “Looking back over the past year, it’s clear that for many organizations, regardless of size or industry, technology was invoked to survive the crisis. Much has been reported about the rapid migration to the cloud and the move to support remote working but according to James Don-Carolis, Managing Director of TrueCue, data, and the value which can be obtained from actionable, business intelligence, often acts as the differentiator between success and failure.”[1] Don-Carolis told the insideBIGDATA team, “Economic challenges will still make it problematic for businesses to get a full sense of what lies ahead but in order to traverse the current and post-pandemic landscape, those organizations able to make insight-driven decisions will be far more likely to prosper in the coming months and years.”

 

Don-Carolis’ assertion is backed by historical studies. Bain analysts Michael C. Mankins and Lori Sherer (@lorisherer) assert, “The best way to understand any company’s operations is to view them as a series of decisions.”[2] Mankins and Sherer add, “We know from extensive research that decisions matter — a lot. Companies that make better decisions, make them faster and execute them more effectively than rivals nearly always turn in better financial performance. Not surprisingly, companies that employ advanced analytics to improve decision making and execution have the results to show for it.” So, what’s the difference between organizations expressing delight and those experiencing disappointment with their analytics efforts?

 

Making a business case

 

Don-Carolis states, “Our opinion is before embarking on any data and analytics initiative, the range of likely benefits and business impacts should be discussed, understood and, to the extent possible, measured and quantified.” He admits that’s easier said than done. He explains, “It can be quite hard for data and analytics initiatives. This is because some of the impacts are direct, such as time saved by automating repetitive tasks, whereas others are less tangible, for example the impact of making quicker and/or better decisions. However, this should not be a deterrent to quantifying impact, because the thought process required for the quantification can itself give benefit.” One way to determine potential return on investment (ROI) is by sponsoring exploratory experiments (i.e., proof of concept projects). At Enterra Solutions®, we recommend a “crawl, walk, run” approach. This approach allows solutions to be tested on a small project and, if successful, tweaked as they scale.
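The quantification Don-Carolis recommends can be made concrete with a back-of-the-envelope calculation that combines a direct benefit (time saved by automation) with a less tangible one (the estimated value of better decisions). The figures and the flat "decision value uplift" below are illustrative assumptions, not guidance from TrueCue or Enterra:

```python
# Hypothetical ROI sketch for an analytics proof-of-concept project.
# All figures are invented for illustration.

def estimated_annual_benefit(hours_saved_per_week: float,
                             hourly_cost: float,
                             decision_value_uplift: float) -> float:
    """Combine a direct benefit (automation time savings over 52 weeks)
    with an indirect one (estimated value of quicker/better decisions)."""
    direct = hours_saved_per_week * hourly_cost * 52
    return direct + decision_value_uplift

def simple_roi(benefit: float, cost: float) -> float:
    """Return ROI as a fraction of the project cost."""
    return (benefit - cost) / cost

benefit = estimated_annual_benefit(hours_saved_per_week=10,
                                   hourly_cost=60.0,
                                   decision_value_uplift=25_000.0)
print(benefit)                          # direct savings plus the softer estimate
print(simple_roi(benefit, 40_000.0))    # positive means the pilot pays for itself
```

Even when the uplift figure is a rough guess, walking through the arithmetic forces the discussion of impacts that Don-Carolis argues is itself valuable.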

 

Boris Trofimov, a Software Architect at Sigma Software, notes, “In today’s digital world, companies embrace big data business analytics to improve decision-making, increase accountability, raise productivity, make better predictions, monitor performance, and gain a competitive advantage. However, many organizations have problems using business intelligence analytics on a strategic level.”[3] To achieve desired objectives, Trofimov asserts, both the analytics used and the system they run on must be sound. Incorrect analytics, as well as deep system or infrastructure problems, can create challenges and disappointment.

 

Overcoming big data analytics challenges

 

Trofimov identifies five big data analytics challenges facing many businesses. They are:

 

1) Business analytics solution fails to provide new or timely insights. This is one area that has been a frequent source of disappointment for organizations. Trofimov writes, “At times, it seems, the insights your new system provides are of the same level and quality as the ones you had before.” The first place you should look, he asserts, is at your data. The problem, he writes, is that your analytics may “not have enough data to generate new insights. This may either be caused by the lack of data integrations or poor data organization.” If your disappointment is not with the results, but with timeliness, Trofimov says your system may be set up wrong (i.e., it is set up for batch processing, but you need real-time analysis). He writes, “Check if your ETL (Extract, Transform, Load) is able to process data based on a more frequent schedule. In certain cases, batch-driven solutions allow schedule adjustments with a 2x times boost. Another option is to use an architecture approach called Lambda Architecture, which allows you to combine the traditional batch pipeline with a fast real-time stream.” Finally, he writes, the problem may lie with your approach (i.e., trying to apply an old approach to a new system). He explains, “It would be difficult to get new answers by asking old questions. This is mostly a business issue, and possible solutions to this problem differ a lot case-by-case. The best thing is to consult a subject matter expert who has broad experience in analytical approaches and knows your business domain.”
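The Lambda Architecture idea Trofimov mentions can be illustrated with a toy sketch: a view precomputed by the nightly batch job is merged with a real-time “speed layer” at query time, so fresh events show up immediately. The page names and counts below are invented for illustration:

```python
# Minimal sketch of the Lambda Architecture pattern: batch view + speed layer.
from collections import Counter

# Built by the (slow, thorough) nightly batch pipeline.
batch_view = Counter({"page_a": 1_000, "page_b": 750})

# Updated per event by the (fast, approximate) real-time stream.
speed_layer = Counter()

def record_event(page: str) -> None:
    """Speed layer: absorb an event the batch job hasn't processed yet."""
    speed_layer[page] += 1

def query(page: str) -> int:
    """Serving layer: combine batch and real-time views at query time."""
    return batch_view[page] + speed_layer[page]

record_event("page_a")
record_event("page_c")
print(query("page_a"))  # 1001: the batch count plus the fresh event
```

In a production system the batch and speed layers would be separate pipelines (e.g., a warehouse job and a stream processor), with the speed layer discarded each time the batch view is rebuilt; the merge-at-query-time principle is the same.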

 

2) Inaccurate analytics. Trofimov writes, “There’s nothing worse to a business than inaccurate analytics, and this issue needs to be addressed as soon as possible.” Again, Trofimov explains, the problem may be with your data. “If your system relies on data that has defects, errors, or is incomplete,” he explains, “you’ll get poor results. Data quality management and an obligatory data validation process [are essential].” If your data isn’t the problem, he says to look at system defects related to the data flow. “This happens,” he explains, “when the requirements of the system are omitted or not fully met due to human error intervention in the development, testing, or verification processes.” The crawl, walk, run approach should help discover and correct such problems.
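An obligatory validation step of the kind Trofimov describes can be as simple as rejecting records with defects before they reach the analytics pipeline. A minimal sketch, assuming hypothetical field names (`order_id`, `amount`, `order_date`):

```python
# Hedged sketch of a data validation gate; field names and rules are
# illustrative assumptions, not a specific vendor's schema.
from datetime import date

def validate(record: dict) -> list[str]:
    """Return a list of defects; an empty list means the record passes."""
    errors = []
    if not record.get("order_id"):
        errors.append("missing order_id")
    if record.get("amount") is None or record["amount"] < 0:
        errors.append("amount missing or negative")
    if record.get("order_date", date.today()) > date.today():
        errors.append("order_date in the future")
    return errors

good = {"order_id": "A-1", "amount": 19.99, "order_date": date(2021, 3, 1)}
bad = {"order_id": "", "amount": -5}
print(validate(good))  # []
print(validate(bad))   # ['missing order_id', 'amount missing or negative']
```

Records that fail would be routed to a quarantine table for review rather than silently feeding inaccurate results downstream.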

 

3) Analytics complexity. Trofimov writes, “If using data analytics becomes too complicated, you may find it difficult to extract value from your data. The complexity issue usually boils down either to the [user experience] (when it’s difficult for users to navigate the system and grasp info from its reports) or technical aspects (when the system is over-engineered).” He suggests that data visualization may help cut through the complexity. He explains that a user interface (UI)/user experience (UX) expert can “help you to create a compelling flexible user interface that is easy to navigate and work with.” If the system is overengineered, Trofimov writes, “Get your team together and define key metrics: what exactly you want to measure and analyze, what functionality is frequently used, and what is your focus. Then just get rid of all unnecessary things.”

 

4) Long system response time. As noted above, when a system can’t respond at the business speed you require, disappointment is inevitable. Trofimov recommends a couple of ways to overcome this challenge. First, make sure your data is organized properly. Trofimov explains, “Check whether your data warehouse is designed according to the use cases and scenarios you need. In case it is not, re-engineering will definitely help.” Other things to check are big data analytics infrastructure and resource utilization. He explains, “The problem can be in the system itself, meaning that it has reached its scalability limit. It also might be that your hardware infrastructure is no longer sufficient. The simplest solution here is upscaling, i.e., adding more computing resources to your system.”
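The utilization check described above can be reduced to a simple rule of thumb: upscaling makes sense when the system is both saturated and missing its response-time target. The thresholds below are illustrative assumptions, not a standard:

```python
# Illustrative upscaling heuristic; the 85% thresholds are assumptions.

def needs_upscaling(cpu_pct: float, mem_pct: float,
                    p95_response_s: float, sla_s: float) -> bool:
    """Flag the system when response time misses the target while
    resources are saturated -- the classic case for adding capacity."""
    saturated = cpu_pct > 85 or mem_pct > 85
    too_slow = p95_response_s > sla_s
    return saturated and too_slow

print(needs_upscaling(cpu_pct=92, mem_pct=70, p95_response_s=8.5, sla_s=2.0))  # True
print(needs_upscaling(cpu_pct=40, mem_pct=35, p95_response_s=8.5, sla_s=2.0))  # False
```

The second case is the telling one: when responses are slow but resources sit idle, the bottleneck is more likely the data warehouse design Trofimov mentions, and adding hardware would not help.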

 

5) Cost of maintenance. Since ROI is important to any business, when the cost of maintenance outweighs the benefits received, you’ve got a problem. Trofimov writes, “Any system requires ongoing investment in its maintenance and infrastructure. … It’s always a good idea to take a fresh look at your system and make sure you are not overpaying.” He also notes that outdated technologies may be costing you money. He explains, “New technologies that can process more data volumes in a faster and cheaper way emerge every day. Therefore, sooner or later, the technologies your analytics is based on will become outdated, require more hardware resources, and become more expensive to maintain than the modern ones. … The best solution is to move to new technologies. In the long run, they will not only make the system cheaper to maintain but also increase reliability, availability, and scalability.” Finally, he notes, “If you are still on-premise, migration to the cloud might be a good option.”

 

Concluding thoughts

 

Many business leaders are learning that cognitive computing platforms with embedded advanced analytics — like the Enterra Cognitive Core™, a system that can Sense, Think, Act, and Learn® — are helping them make better decisions — even when confronted with ambiguous data. Don-Carolis concludes, “There are many benefits to improving data analytics maturity levels, including improved forecasting, generating better actionable insights and heightening your understanding of competitors. To progress in these uncertain times, it is critical business leaders continue to leverage the investments made into data initiatives, in order to facilitate smarter decision making and bring clarity at a time of great uncertainty.” It’s time to leave disappointment behind and become delighted with your big data analytics.

 

Footnotes
[1] Editorial Team, “Data and Analytics: Delivering Clarity at a Time of Great Uncertainty,” insideBIGDATA, 13 February 2021.
[2] Michael C. Mankins and Lori Sherer, “Creating value through advanced analytics,” Bain Brief, 11 February 2015.
[3] Boris Trofimov, “5 Challenges Of Big Data Analytics in 2021,” 15 February 2021.