
Analytics 2.0: Big Data, Big Testing, and Big Experiences — Part 1

March 14, 2013


In a recent Harvard Business Review article, Wes Nichols, cofounder and CEO of MarketShare, a global predictive-analytics company headquartered in Los Angeles, described why it is important that companies take a much more holistic view of advertising. [“Advertising Analytics 2.0,” March 2013] He began the article by relating how one of his company’s clients learned “that ads increasingly interact.” By that, Nichols means that in today’s interconnected and mobile world, “ads work in concert across media and sales channels.” He writes, “For instance, a TV spot can prompt a Google search that leads to a click-through on a display ad that, ultimately, ends in a sale.” Understanding those interactions and adjusting advertising spend to get the most bang for the buck is what Nichols calls “advertising analytics 2.0.” In the case of the client referred to above, Nichols writes:

“To tease apart how its ads work in concert across media and sales channels, our client recently adopted new, sophisticated data-analytics techniques. The analyses revealed, for example, that TV ate up 85% of the budget in one new-product campaign, whereas YouTube ads—a 6% slice of the budget—were nearly twice as effective at prompting online searches that led to purchases. And search ads, at 4% of the company’s total advertising budget, generated 25% of sales. Armed with those rich findings and the latest predictive analytics, the company reallocated its ad dollars, realizing a 9% lift in sales without spending a penny more on advertising.”

In today’s economy, a 9 percent lift in sales without additional ad spending is very impressive. Using newly developed technologies, Nichols reports that it is now possible to know “precisely how all the moving parts of a campaign collectively drive sales and what happens when you adjust them. Until recently, the picture was fuzzy at best.” Analytics, of course, begins with data, and lots of it. A company must be able to gather, store, integrate, and analyze mountains of data to obtain actionable insights that provide the kind of ROI described by Nichols. Unfortunately, Nichols believes that many companies have become complacent. It appears they believe that a combination of “time-honored measurement techniques, [such as] consumer surveys, focus groups, media-mix models, and last-click attribution,” provides them with “a handle on how their advertising actually affects behavior and drives revenue.” He explains why this assumption is inaccurate and why complacency is adversely affecting them:

“That approach is backward-looking: It largely treats advertising [as] touch points — in-store and online display ads, TV, radio, direct mail, and so on — as if each works in isolation. Making matters worse, different teams, agencies, and media buyers operate in silos and use different methods of measurement as they compete for the same resources. This still-common practice, what we call swim-lane measurement, explains why marketers often misattribute specific outcomes to their marketing activities and why finance tends to doubt the value of marketing. As one CFO of a Fortune 200 company told me, ‘When I add up the ROIs from each of our silos, the company appears twice as big as it actually is.'”
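The CFO’s arithmetic can be made concrete with a toy calculation (all numbers invented for illustration): when every silo claims full credit for the same sales, the summed “ROIs” overstate the business.

```python
# Toy illustration of "swim-lane" double counting (numbers invented).
# Three silos each claim full credit for the same 100 actual sales.
actual_sales = 100
silo_claims = {"TV": 100, "search": 100, "display": 100}

# Summing per-silo attributions makes the business look 3x its real size.
claimed_total = sum(silo_claims.values())
print(claimed_total)  # 300 claimed vs. 100 actual

# One naive remedy: fractional attribution whose weights sum to 1,
# so credit is shared rather than duplicated.
weights = {"TV": 0.5, "search": 0.3, "display": 0.2}
attributed = {channel: actual_sales * w for channel, w in weights.items()}
print(round(sum(attributed.values()), 2))  # 100.0
```

Even the fractional weights here are arbitrary; choosing defensible weights is precisely the attribution problem Nichols goes on to describe.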

The conundrum the CFO describes arises because each silo wants to take credit for every sale. If the actual reason a sale was made can’t be pinpointed, then every silo takes credit for it, meaning the sale gets counted more than once and the results are inflated. Nichols reports that “today’s consumers are exposed to an expanding, fragmented array of marketing touch points across media and sales channels.” Nichole Kelly, President of Social Media Explorer and SME Digital, believes that exposure to fragmented touch points means consumers enter the path to purchase at one touch point, then “leave, jump levels, come back, leave again, come back at the beginning and at some point come back and buy.” [“The Death of the Sales Funnel as We Know It,” Social Media Explorer, 19 February 2013] Is it any wonder that a new era of advanced analytics is required to make sense of this jumble? Nichols continues:

“Seismic shifts in both technology and consumer behavior during the past decade have produced a granular, virtually infinite record of every action consumers take online. Add to that the oceans of data from DVRs and digital set-top boxes, retail checkout, credit card transactions, call center logs, and myriad other sources, and you find that marketers now have access to a previously unimaginable trove of information about what consumers see and do. The opportunity is clear, but so is the challenge. As the celebrated statistician and writer Nate Silver put it, ‘Every day, three times per second, we produce the equivalent of the amount of data that the Library of Congress has in its entire print collection. Most of it is … irrelevant noise. So unless you have good techniques for filtering and processing the information, you’re going to get into trouble.’ In this new world, marketers who stick with traditional analytics 1.0 measurement approaches do so at their peril.”

Nichols notes that companies using older analytical methods suffer not only from relying on backward-looking information but also from the fact that they are increasingly competing against companies that have moved on to more advanced methods. “Many of the world’s biggest multinationals are now deploying analytics 2.0,” he writes, “a set of capabilities that can chew through terabytes of data and hundreds of variables in real time. It allows these companies to create an ultra-high-definition picture of their marketing performance, run scenarios, and change ad strategies on the fly.” In today’s marketing and supply chain worlds, one often hears how important visibility, flexibility, and adaptability are becoming. Only analytics 2.0 addresses all three. “The resulting analyses, put simply,” Nichols states, “reveal what really works. With these data-driven insights, companies can often maintain their existing budgets yet achieve improvements of 10% to 30% (sometimes more) in marketing performance.”

Nichols is rightfully proud of his company’s contributions to the field of advertising analytics. Since my company, Enterra Solutions, is also joining the field, I appreciate the numerous challenges associated with big data analytics. But I also have an appreciation for the results that can be obtained. I agree with Nichols that “powered by the integration of big data, cloud computing, and new analytical methods, analytics 2.0 provides fundamentally new insights into marketing’s effect on revenue.” Nichols asserts that analytics 2.0 involves three broad activities: attribution, optimization, and allocation. He discusses each in turn, beginning with attribution.

“To determine how your advertising activities interact to drive purchases, start by gathering data. Many companies we’ve worked with claim at first that they lack the required data in-house. That is almost always not the case. Companies are awash in data, albeit dispersed and, often, unintentionally hidden. Relevant data typically exist within sales, finance, customer service, distribution, and other functions outside marketing. Knowing what to focus on — the signal rather than the noise — is a critical part of the process. To accurately model their businesses, companies must collect data across five broad categories: market conditions, competitive activities, marketing actions, consumer response, and business outcomes.”
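The five categories Nichols lists might be organized as a simple schema for an attribution dataset. This is a hypothetical sketch; all field names and values are assumptions, not taken from the article.

```python
# Hypothetical schema for one attribution record, organized around the
# five data categories Nichols names. Field names and values are invented.
attribution_record = {
    "market_conditions":    {"region": "San Diego", "fuel_price": 4.10},
    "competitive_activity": {"competitor_price_change": -0.05},
    "marketing_actions":    {"tv_spend": 85000, "search_spend": 4000},
    "consumer_response":    {"searches": 12500, "click_throughs": 840},
    "business_outcomes":    {"units_sold": 3100, "revenue": 96100},
}

# The point of the schema: no category lives inside marketing alone,
# so assembling a record forces data collection across functions.
print(sorted(attribution_record))
```

Note that only one of the five categories (marketing actions) is data the marketing team already controls; the rest must be pulled from sales, finance, and external sources, which is why companies wrongly assume the data doesn’t exist.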

Armed with the right kind of data and analytical tools, companies can begin to understand “the impact of marketing activities across swim lanes.” Nichols indicates that good analysis can show when activity in one medium (such as television) receives an “assist” from another medium (such as an online ad). He explains:

“Recognizing an assist depends on the ability to track how consumer behavior changes in response to advertising investments and sales activities. To oversimplify a bit: An analysis could pick up a spike in consumers’ click-throughs on an online banner ad after a new TV spot goes live — and link that effect to changes in purchase patterns. This would capture the spot’s ‘assist’ to the banner ad and provide a truer picture of the TV ad’s ROI. More subtly, analytics can reveal the assist effects of ads that consumers don’t actively engage with — showing, for example, a 12% jump in search activity for a product after deployment of a banner ad that only 0.1% of consumers click on.”
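The oversimplified example Nichols gives, a spike in click-throughs after a TV spot goes live, amounts to a before-and-after lift calculation. Here is a minimal sketch with invented daily click counts; real attribution models control for many confounding factors that this comparison ignores.

```python
# Hypothetical sketch of spotting an "assist": compare banner-ad
# click-throughs before and after a TV spot goes live. Data invented.
pre_spot_clicks = [120, 115, 130, 125, 118]    # daily clicks before the spot
post_spot_clicks = [160, 172, 168, 175, 165]   # daily clicks after the spot

def mean(xs):
    return sum(xs) / len(xs)

# Relative lift in click-throughs attributable (naively) to the TV spot.
lift = (mean(post_spot_clicks) - mean(pre_spot_clicks)) / mean(pre_spot_clicks)
print(f"Assist lift: {lift:.1%}")
```

A production model would link this lift onward to changes in purchase patterns, crediting the TV spot with its share of the banner ad’s conversions, which is what produces the “truer picture” of TV ROI Nichols describes.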

It is those kinds of subtle, but important, relationships that simply can’t be discovered using traditional measurement methodologies. Concerning optimization, Nichols writes:

“Once a marketer has quantified the relative contribution of each component of its marketing activities and the influence of important exogenous factors, war gaming is the next step. It involves using predictive-analytics tools to run scenarios for business planning. Maybe you want to know what will happen to your revenue if you cut outdoor display advertising for a certain product line by 10% in San Diego — or if you shift 15% of your product-related TV ad spending to online search and display. Perhaps you need to identify the implications for your advertising if a competitor reduces prices in Tokyo or if fuel prices go up in Sydney. Working with the vast quantities of data collected and analyzed through the attribution process, you can assign an ‘elasticity’ to every business driver you’ve measured, from TV advertising to search ads to fuel prices and local temperatures. (Elasticity is the ratio of the percentage change in one variable to the percentage change in another.) Knowing the elasticities of your business drivers helps you predict how specific changes you make will influence particular outcomes.”
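Nichols defines elasticity parenthetically as the ratio of the percentage change in one variable to the percentage change in another. That definition translates directly into arithmetic; the numbers below are hypothetical.

```python
# Elasticity as Nichols defines it: the ratio of the percentage change
# in one variable to the percentage change in another. Numbers invented.

def elasticity(driver_before, driver_after, outcome_before, outcome_after):
    """Ratio of % change in the outcome to % change in the driver."""
    pct_driver = (driver_after - driver_before) / driver_before
    pct_outcome = (outcome_after - outcome_before) / outcome_before
    return pct_outcome / pct_driver

# Cutting outdoor display spend 10% (100 -> 90) while revenue falls
# 2% (500 -> 490) implies an elasticity of 0.2.
e = elasticity(100, 90, 500, 490)
print(round(e, 2))  # 0.2
```

Once an elasticity is estimated for each driver, “war gaming” a scenario is just multiplication: a planned 15% shift in a driver times its elasticity gives the predicted percentage change in the outcome.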

In the past, performing that kind of “war gaming” was often slow and costly. Using today’s technologies, “what if” analysis is much more cost-effective and can be done in minutes. Nichols’ final topic is allocation. He writes:

“Gone are the days of setting a marketing plan and letting it run its course — the so-called run-and-done approach. As technology, media companies, and media buyers continue to remove friction from the process, advertising has become easier to transact, place, measure, and expand or kill. Marketers can now readily adjust or allocate advertising in different markets on a monthly, weekly, or daily basis — and, online, even from one fraction of a second to the next. Allocation involves putting the results of your attribution and war-gaming efforts into the market, measuring outcomes, validating models (that is, running in-market experiments to confirm the findings of an analysis), and making course corrections.”
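The allocation cycle Nichols describes, deploy, measure, validate, course-correct, can be sketched as a simple loop. Everything below is a hypothetical placeholder: the response rates, the validation rule, and the fixed budget shift are all invented for illustration.

```python
# Hedged sketch of the allocation loop: deploy a plan, measure outcomes,
# validate against the model's prediction, and correct course. All
# numbers and response rates are invented placeholders.

def measure(plan):
    # Stand-in for in-market measurement: pretend each search dollar
    # returns 6x and each TV dollar returns 2x.
    return plan["search"] * 6 + plan["tv"] * 2

def course_correct(plan, step=1000):
    # Naive correction: shift a fixed slice of budget from TV to search.
    return {"tv": plan["tv"] - step, "search": plan["search"] + step}

plan = {"tv": 85000, "search": 4000}
predicted = 200000  # the model's predicted sales for the plan

for week in range(4):            # weekly reallocation cadence
    observed = measure(plan)
    if observed < predicted:     # model not validated; adjust and retry
        plan = course_correct(plan)

print(plan)
```

The point is the cadence, not the arithmetic: because digital placements can be adjusted weekly, daily, or faster, the loop runs continuously instead of once per campaign, which is what kills the “run-and-done” approach.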

Marketing technologist Scott Brinker describes this progression as moving from big data to big testing to, ultimately, a big consumer experience. [“The big data bubble in marketing — but a bigger future,” Chief Marketing Technologist, 21 January 2013] Only analytics 2.0 is capable of dealing with all three “big” activities.
