Some companies have fallen deeply in love with big data. Analysts from Deloitte report, “Companies modernizing their approaches to data management are beginning to develop deliberate techniques for managing, monetizing, and unlocking the competitive value of this increasingly vital enterprise asset.” To fully unlock the value in data, Deloitte analysts insist you must set it free. They explain, “Making data available and actionable to all business units, departments, and geographies can create enhanced value across the business.” There is an old adage that states: “If you love something, set it free. If it comes back, it’s yours. If not, it was never meant to be.” As the Deloitte analysts note, companies setting their data free are finding it does return in the form of value. It’s a love affair that was meant to be.
Setting Data Free
What does it mean to set data free? Surprisingly, the first step, according to Deloitte analysts, is to manage and protect it. They explain, “This strategy calls for implementing modern approaches to data architecture and governance, and navigating global regulations for privacy and protection, among other initiatives and challenges.” This really shouldn’t be too surprising. We see an example of this principle every day on our roads. We are only free to drive through cities when everybody follows the rules. At the heart of setting data free is democratization of data (i.e., making it widely available throughout an organization). Technologies (e.g., cognitive computing and cloud services) are now available to make this happen. To fully democratize data, users must have access to the data and a way to analyze it.
James Kobielus (@jameskobielus), SiliconAngle Wikibon’s lead analyst for AI, data science, and application development, predicts, “Public clouds are the future of enterprise big data analytics, and their use is creating the unified platform needed to fully gain its value.” Efi Cohen (@efic1), Chief Technology Officer and co-founder at Datorama, adds, “Thanks to all-new, cost-effective storage solutions (e.g., the cloud), today’s businesses are able to do things that they never thought would be possible.” What is it about the cloud that makes it so valuable to businesses? The simple answer is access. The cloud becomes the repository of data accessible to everyone in an organization who needs it. As noted above, Kobielus believes public rather than private clouds will ultimately be the storage choice for most businesses. He explains:
“Public clouds are becoming the preferred big data analytics platform for every customer segment. That’s because public cloud solutions are maturing more rapidly than on-premises stacks, adding richer functionality, with increasingly competitive cost of ownership. Public clouds are growing their application programming interface ecosystems and enhancing their administrative tools faster than what is emerging from the world of big data analytics solutions designed for on-premises deployments.”
Placing data in the cloud provides centralized storage and access. Thibaut Ceyrolle (@ThibautCeyrolle), a Vice President at Snowflake Computing, explains, “By channeling big data through purpose-built, cloud-based data warehousing platforms, it can be shared within and between organizations, in real-time, with greater ease and help them better respond to the market. Through cloud, organizations are able to share their data in a governed and secure manner across the enterprise, ending the segregation and latency of data insights among both internal departments and external third parties.” He calls data sharing and the cloud “a match made in heaven.”
Having accessible data is of little to no value if those accessing it don’t have the ability to analyze it. Fortunately, cognitive computing platforms have embedded advanced analytic capabilities. Ceyrolle notes, “Previously, analyzing data sets was limited to the IT or business intelligence teams that sat within an organization. But as data continues to grow and become an important ‘oil’ for enterprises, it has caused a shift in business requirements. For data-driven companies, data is now being democratized across the entire workforce.” Kobielus adds, “Innovative application providers are starting to disrupt the big data competitive landscape with AI-based solutions.” In my discussions with clients, I’ve found they need embedded analytics that perform the traditional roles of three types of experts:
- A business domain expert — the customer of the analysis who can help explain the drivers behind data anomalies and outliers.
- A statistical expert — who helps formulate the correct statistical studies; the business expert knows what they want to study, but the statistical expert knows how to frame the data so the desired phenomena can be detected.
- A data expert — the data expert understands where and how to pull the data from across multiple databases or data feeds.
Even if large datasets were manageable for human experts, having three experts involved dramatically lengthens the time required to analyze, tune, re-analyze, and interpret the results. Cognitive technologies empower the business expert by automating the statistical expert’s and data expert’s knowledge and functions, so the ideation cycle can be dramatically shortened and more insights can be auto-generated. Cognitive computing is the key that unlocks data democratization for companies. David Weldon (@DWeldon646) reports, “The adoption of self-service analytics in many industries and by government agencies is on such a brisk pace that by 2019 the analytics output of business users with self-service capabilities will surpass that of formal data scientists.” He draws that conclusion from a Gartner survey of thousands of CIOs. Regarding the survey, Carlie J. Idoine (@CarlieIdoine), a research director at Gartner, stated, “The trend of digitalization is driving demand for analytics across all areas of modern business and government. Rapid advancements in artificial intelligence, the Internet of Things, and SaaS analytics and BI platforms are making it easier and more cost-effective than ever before for non-specialists to perform effective analysis and better inform their decision making.”
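To make the idea of automating the statistical expert’s role concrete, consider a minimal sketch of what an embedded analytic might do behind the scenes: flag anomalies in a business metric so the business domain expert is only asked to explain the outliers, not to compute them. The function name, data, and thresholds below are illustrative assumptions, not part of any vendor’s platform.

```python
# Hypothetical sketch: the statistical expert's z-score logic, embedded
# so a business user never has to formulate the study themselves.
from statistics import mean, stdev

def flag_anomalies(values, z_threshold=3.0):
    """Return (index, value, z-score) for points far from the mean."""
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []  # no variation, nothing to flag
    return [
        (i, v, (v - mu) / sigma)
        for i, v in enumerate(values)
        if abs(v - mu) / sigma > z_threshold
    ]

# Daily sales with one obvious outlier — the kind of anomaly the
# business domain expert would then be asked to explain (a promotion?
# a data-entry error?).
daily_sales = [102, 98, 101, 97, 103, 99, 100, 500]
anomalies = flag_anomalies(daily_sales, z_threshold=2.0)
```

In a real cognitive platform, the data expert’s role (pulling and joining the right feeds) would also be automated upstream of a routine like this; the point of the sketch is only that the statistical framing can be packaged once and reused by non-specialists.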
Ceyrolle observes, “As data volumes increase year-on-year, we also see data sharing evolve in the process. The insatiable desire for data will result in organizations tapping into the benefits of machine learning to help sift through the mountains of information they receive. Organizations who capitalize on machine learning will also be better positioned to extrapolate the variety of data sources available and glue it together to serve as an interconnected data network.” In other words, the cloud and cognitive computing can combine to set data free. Kobielus concludes, “Enterprises are moving more rapidly out of the experimentation and proof-of-concept phases with big data analytics and are achieving higher levels of business value from their deployments.” Cohen adds, “As AI goes mainstream, the most dramatic example will be in the data analytics space. … Rather than relying on the user to query the data and make connections, AI empowers users by providing on-the-ground assistance as needed so that they can deliver on their respective KPIs and drive bottom line results.” With data accessible via the cloud and analytics accessible via cognitive technologies, it’s little wonder companies are having a love affair with big data.
 Deloitte, “If You Love Data, Set It Free,” The Wall Street Journal, 28 March 2018.
 James Kobielus, “Big data analytics: The cloud-fueled shift now under way,” IT World, 8 March 2018.
 Efi Cohen, “Three trends driving data analytics efforts,” Information Management, 26 February 2018.
 Thibaut Ceyrolle, “Data-sharing and cloud: A big data match made in heaven,” Computer Weekly, 19 March 2018.
 David Weldon, “Self-service analytics and BI outpacing the output of data scientists,” Information Management, 5 March 2018.