
Big Data Trends

April 16, 2013


Eric Lundquist, a technology analyst at Ziff Brothers Investments, penned an article about ten trends he thought emerged at the GigaOM Structure Data conference in New York in March. [“10 Big Data Trends From the GigaOM Structure Data Conference,” eWeek, 1 April 2013] He doesn’t appear to present the trends in any particular order of priority. In fact, individual “trends” often seem to be more of a philosophical point of view than a technological vector. For example, his first trend is labeled, “Start with the applications.” He writes:

“What new applications could your company create if you could meld information outside your company’s confines with your traditional internal product development process? … The types of applications that big data enables are potentially company-altering. But you need a company structure that can think in new ways and encourage innovation before you get mired in a technology evaluation. Create the new application environment and then gather the tools to make it happen.”

One of the examples used by Lundquist to make his point comes from the discipline of genealogy. Obviously, genealogical data is information that must be gathered from outside sources. Ancestry.com has managed to create an application that takes advantage of the many genealogical resources that are now available and merge them in a way that attracts customers. They have been so successful that the company was acquired last fall “by an investor group led by European private-equity firm Permira for about $1.6 billion.” [“Ancestry.com Sets $1.6 Billion Deal,” by Ryan Dezember, Wall Street Journal, 22 October 2012]


Lundquist’s second “trend” is to “think physical.” By that Lundquist means that, even in the digital age, physical things still matter the most. He talks about biometrics, manual programming, and connecting physical things through networking. The most important “action around big data,” he insists, “is unfolding in big part where the physical and digital worlds intermix.”


His third trend is labeled “go simple, but big.” During the GigaOM conference, Jack Norris, vice president of marketing at MapR Technologies, told participants, “Simple algorithms and lots of data trump complex models.” Lundquist adds, “This may be the biggest story in big data,” and continues:

“The scene completion process for Google’s Street View (which removes offensive or embarrassing images and ‘fills in’ the scene) went from using a complicated formula over about 150,000 photos to a simple formula, but with more than 1 million photos with vastly superior results, said Norris. The same process could apply to financial services, customer sentiment, weather forecasting or anywhere big data sets could be combined with a simple query process.”
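
Norris’s point lends itself to a toy illustration: keep the “formula” as simple as a nearest-neighbor lookup and let the data do the work. Everything in the sketch below (the synthetic “patches,” the corpus sizes, the distance metric) is hypothetical and is not meant to represent Google’s actual scene-completion pipeline; it simply shows the same simple rule producing better answers as the reference corpus grows.

```python
# Toy "simple algorithm + lots of data" sketch: fill a missing patch by
# finding the closest patch in a reference corpus. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(seed=0)

def best_match_distance(query, corpus):
    """Smallest Euclidean distance between the query patch and any corpus patch."""
    return np.linalg.norm(corpus - query, axis=1).min()

query = rng.random(16)  # stand-in for a small flattened image patch

for corpus_size in (1_000, 50_000, 500_000):
    corpus = rng.random((corpus_size, 16))  # stand-in for a growing photo library
    print(f"{corpus_size:>7,} reference patches -> "
          f"best-match distance {best_match_distance(query, corpus):.3f}")
```

The matching rule never changes; only the corpus grows, and the best available match keeps getting closer.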

The fourth big data trend identified by Lundquist is “the Internet of a Lot of Things.” He argues that as more and more things get connected and start contributing data, everything from software to hardware is going to have to adapt to this big data world. Other analysts call this trend the “Internet of Things” or the “Internet of Everything.” Lundquist’s description is much more accurate for the moment. The Internet of Things is more about connecting devices than people. Michael Fauscette states the Internet of Things “is really about machine to machine (M2M)/automated communication of data between connected devices, it’s not just hardware though, the software is really the key to making [the] IoT usable.” [“The Internet of Things,” Enterprise Irregulars, 27 January 2013] Fauscette concludes, “It’s exciting to see [the] IoT finally move mainstream and to see the reality of all that can actually be done to add value to businesses and to individuals. The next few years will see a massive growth in what can be done using [the] IoT as more and more things get connected.”
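
As a rough sketch of the machine-to-machine pattern Fauscette describes, consider a device that periodically posts its readings as JSON to a collector. The endpoint URL, field names, and device ID below are invented for illustration, and production deployments more often use purpose-built messaging protocols such as MQTT than plain HTTP.

```python
# Hypothetical machine-to-machine sketch: a sensor posts periodic readings
# as JSON to a collector. The URL and payload fields are illustrative only.
import json
import random
import time
import urllib.request

COLLECTOR_URL = "http://example.com/ingest"  # placeholder collection endpoint

def post_reading(device_id):
    """Send one simulated temperature reading to the collector."""
    payload = {
        "device_id": device_id,
        "timestamp": time.time(),
        "temperature_c": round(random.uniform(18.0, 24.0), 2),  # simulated sensor value
    }
    request = urllib.request.Request(
        COLLECTOR_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        print("collector replied with status", response.status)

if __name__ == "__main__":
    while True:
        post_reading("sensor-42")
        time.sleep(60)  # one reading per minute
```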


The fifth big data trend identified by Lundquist is “the emerging platform.” Todd Papaioannou, the founder of Continuuity and former big data engineer at Yahoo, told conference participants, “Hadoop is hard — let’s make no bones about it. It’s damn hard to use. It’s low-level infrastructure software, and most people out there are not used to using low-level infrastructure software.” Papaioannou believes that Hadoop is only an evolutionary step towards a “new Internet-style computing model,” one that works even better than Hadoop in the big data environment. That may be true, but no heir apparent is on the horizon.
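
Papaioannou’s “low-level” complaint is easy to picture. Even the canonical word-count job is written as two separate programs that the framework wires together; cluster configuration, job submission, and tuning all sit on top of that. The sketch below follows the Hadoop Streaming convention (read lines from stdin, write tab-separated key/value pairs to stdout) and is a minimal illustration rather than a full job definition.

```python
# wordcount_mapper.py -- Hadoop Streaming mapper: emit "word<TAB>1" for every word.
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
# wordcount_reducer.py -- Hadoop Streaming reducer: sum the counts for each word.
# Streaming sorts mapper output by key, so identical words arrive on adjacent lines.
import sys

current_word, current_count = None, 0
for line in sys.stdin:
    word, count = line.rstrip("\n").split("\t")
    if word == current_word:
        current_count += int(count)
    else:
        if current_word is not None:
            print(f"{current_word}\t{current_count}")
        current_word, current_count = word, int(count)

if current_word is not None:
    print(f"{current_word}\t{current_count}")
```

Running the job still means submitting both scripts through the Hadoop Streaming jar and pointing it at HDFS input and output paths, which is exactly the sort of low-level plumbing Papaioannou is describing.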


The sixth trend identified by Lundquist is “making the big shift.” According to Lundquist, the shift “from up-front, long-deployment, high service cost enterprise software to rapid, outside-in big data [software] running on disposable, inexpensive hardware will upend the tech industry.” He appears to think that companies that fail to make the big shift are destined for the business dustbin.


Lundquist’s seventh trend is “discerning the signal versus noise.” One of the challenges with big data is its size. Not every byte of data is valuable. Finding the gold dust in a pile of sand is not easy and probably never will be. “In my opinion,” Lundquist writes, “the number of people who might be described as data analysis artists is very small, and the executives thinking that a big data dive is all they need to reform their business are mistaken.”


Lundquist labels his next big data trend “dealing with a new model of application development.” Michael Palmer, Aetna’s head of innovation, told Lundquist, “We now live in a world of disposable apps.” Lundquist believes that this means that a company that pursues a lengthy, deliberate path to application development is likely to be overrun by companies that can find a quicker way to develop and distribute applications. He writes:

“Instead of a lengthy and expensive development process, use several companies to develop a simple app and pick the one you like the best. Once you have found the best app, go on to iterating on the next app. The model is much more like trying out apps from an Apple or Android app store than the old enterprise model. This is a big change in enterprise application development and includes knowing as much about how to meld apps through API management as the actual app creation.”
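
The “meld apps through API management” point is essentially about thin integration layers: the app itself may be disposable, but the services it stitches together are not. Here is a minimal, hypothetical sketch of that melding, with invented endpoints and response shapes:

```python
# Hypothetical "app melding" sketch: combine two services behind one call.
# Both endpoints and their response shapes are invented for illustration.
import json
import urllib.request

ORDERS_API = "https://example.com/api/orders"        # placeholder service A
INVENTORY_API = "https://example.com/api/inventory"  # placeholder service B

def fetch_json(url):
    """GET a URL and parse the body as JSON."""
    with urllib.request.urlopen(url, timeout=5) as response:
        return json.loads(response.read().decode("utf-8"))

def open_orders_with_stock():
    """Join open orders against current stock levels, one record per open order."""
    orders = fetch_json(ORDERS_API)["orders"]
    stock = {item["sku"]: item["on_hand"] for item in fetch_json(INVENTORY_API)["items"]}
    return [
        {"order_id": order["id"], "sku": order["sku"], "on_hand": stock.get(order["sku"], 0)}
        for order in orders
        if order["status"] == "open"
    ]
```

The whole “app” is a few dozen lines over stable APIs; if requirements change next quarter, it can be discarded and rewritten without touching the underlying services.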

Lundquist’s ninth trend involves “applying new rules.” According to Lundquist, these new rules include the fact that startups are likely to provide better answers to emerging challenges faster than established companies. As a result, he asserts, things are going to be turned on their heads. “Think about your applications from the outside in,” he writes, “instead of inside out.” My only caveat about “applying new rules” is to make sure that the rules have value. Before the dot-com crash early this century, pundits were talking about all sorts of new rules that would drive a new economy. The new rules didn’t look so valuable after the bubble burst.


Lundquist’s final big data trend is a recommendation to look “on the fringe.” He writes:

“Maybe it isn’t the fringe anymore, but consider how 3D printing will change your design process, how sensor-based data gathering will strain your current networks and how your employees living on mobile smartphones downloading apps from app stores may be just the people you need to build your next technology road map.”

It’s been over 15 years since Clayton Christensen first discussed The Innovator’s Dilemma. One of the dilemmas he described was that innovations that looked like fringe products could suddenly become good enough to capture major shares of the markets in which they emerged. Things change, sometimes dramatically. I agree with Lundquist that you need to keep your head on a swivel and conduct lots of “what if” exercises in search of both risks and opportunities. As I wrote at the beginning, I’m not sure that all of the items discussed by Lundquist are “trends” per se, but they do provide plenty of food for thought.
