
What the Truman Show Can Teach Us about Big Data

August 31, 2015


According to a report prepared for the World Economic Forum by Bain & Company, “Personal data is becoming a new economic ‘asset class’, a valuable resource for the 21st century that will touch all aspects of society. … To unlock the full potential of personal data, a balanced ecosystem with increased trust between individuals, government and the private sector is necessary.” To generate that trust, all stakeholders need to know what is going on and how personal information is being used. If you have seen the movie “The Truman Show,” you know that Truman Burbank, the film’s lead character, reacted badly when he learned he couldn’t trust the people around him. Recalling the movie, Brad Meehan (@bradmeehan), a platform architect at VML, explains, “In the 1998 film ‘The Truman Show,’ Jim Carrey played Truman Burbank, the star of the most-watched reality television show in the world. The only problem was, Truman had no idea his every move was being watched; the cameras were all hidden.”[1] Meehan continues:

“Much like Truman, consumers now live in an online world where the content they see is orchestrated and controlled by marketers and big-data algorithms that decide which products they need, which news articles to read, and which friends they should see in their Facebook news feeds. And, like Truman, consumers are often unknowing participants.”

Like Truman, consumers often react badly when they learn that their personal information is being treated cavalierly, especially if they feel their privacy has been violated. Doc Searls, Director of Project VRM at Harvard’s Berkman Center for Internet and Society, has been relentless in his efforts to get organizations to understand that an accommodation must be reached among big data stakeholders if the greatest benefits for all parties are to be achieved. Emeka Obianwu (@UnboundID), Vice President of Channels and Alliances at UnboundID, interviewed Searls earlier this year. His first question to Searls was, “How are customer privacy concerns changing business and IT practices?”[2] Searls responded:

“On the one hand, it should be clear to every business that people are concerned about privacy. Pew says 91% agree that consumers have lost control over how personal information is collected and used. TRUSTe says 92% in the U.S. worry about their privacy online. It should also be clear that a growing number of people are doing something about it. AdBlock Plus is the most popular browser extension for Firefox. In August 2014, Pagefair reported a 27.6% ad blocking rate. Among Millennials, the rate was 41%. But these economic signals are mostly ignored. Today it is taken for granted by online marketing — advertising especially — that following people around, through browser cookies and other means, is a necessity. Fixing this isn’t easy. There are advertising systems that do their best to respect individual users. But surveillance of people in their private spaces, such as browsers and email clients, is still wrong. That little obvious harm is done or reported does not make it right. I expect, as power increases for individuals on the Net, liberties with personal privacy taken by businesses will decrease. But it will also take a while. The flywheels of business-as-usual are huge and spinning away.”

In other words, too many companies are treating consumers as though they were Truman Burbank. As in Truman’s case, however, there is a very real possibility that consumers will rebel and, when they do, business models that lose consumer trust could come crashing down like poorly secured light fixtures on the “Truman Show” set. How extensive are the databases that some companies keep on consumers? Meehan writes:

“Third-party data brokers are at the hub of this data exchange. Companies like Acxiom have aggregated more than 1,500 bits of data per person for about 190 million consumers, while Quantcast describes its offering by stating, ‘Our data set is so extensive, it’s equivalent to having coffee with every U.S. online user every hour … We predict their next move and get you there first.'”

I suspect that most consumers are uncomfortable with the notion that companies are metaphorically having coffee with them every hour of every day. So what can companies do to gain consumers’ trust? Searls told Obianwu, “The simplest thing is to make clear that personal information is not being collected, or that — if it is — it will not be sold to others and will be made available to the people from which it has been collected. Too much of marketing, and ‘big data’ collection, is a three-dimensional shell game. People don’t know what’s going on. This is why ‘transparency’ is becoming a hot buzzword. But there is still little of it, so far.” Frankly, it’s unrealistic to think that personal data is not going to be collected (or sold) for commercial purposes. Many of the online services that people enjoy for “free” are only possible because of the revenue that can be generated from such data. That is exactly why Microsoft is making Windows 10 “free” for licensed Windows users. Francesca Sales (@Fran_S_TT) explains, “As with many other digital services, ‘free’ actually comes with a price tag. ‘Microsoft made Windows 10 a free upgrade because it has the explicit goal of making money from Internet services, ads, apps and games that run on it,’ said The Wall Street Journal‘s Geoffrey Fowler — a model the company is calling customer lifetime value. In other words, you won’t be paying with hard cash, but Microsoft wants to keep making money from your data long after you’ve purchased your last Windows license or given up your PC.”[3]


Given that organizations want data and consumers want respect, the current standoff is going to persist, and calls for transparency are going to increase. Sales explains that Microsoft understands that its size and ubiquity could earn it a “big brother” reputation and that the company is working hard to rise above that perception. One step Microsoft is taking, Sales reports, is “processing and storing users’ sensitive and personally identifiable data on the back end so that it is anonymized and cannot be reassembled — if, that is, it’s even stored at all.” Obviously, Microsoft can’t offer things like personal assistant services without gathering some personal data. Respecting that data (i.e., protecting it and being transparent about its use) is the key to keeping consumers happy. Randy Bean (@RandyBeanNVP), CEO and managing partner of the consultancy NewVantage Partners, writes, “Privacy breaches, unethical hacking, and other invasions of data privacy so often lead to the establishment of guardrails and restrictions that limit our ability to experience greater convenience, enjoy more personalized consumer experiences, benefit from greater customer self-service, or learn from data that we now have access to. We don’t want to surrender our freedoms. We want the freedom to do with ‘our data’ whatever we damn well please.”[4] Meehan writes that “there are four steps a responsible marketer can take to build and retain trust with their consumers.” They are:

1. Structure your site to facilitate finding valued content quickly. Marketers should organize their content with a solid information architecture that allows visitors to find what they need efficiently. Users must feel they are progressing toward their goal with every click; if they are not, the information architecture is inefficient and the content should be restructured.


2. Let the user decide what and how much personal information to share. Marketers must offer information in a value exchange, meaning they must allow visitors to determine how much personal data they’re willing to give in exchange for the information they expect to get. The personal data in this exchange must be of equal value to the information gained, such as agreeing to share an email address in exchange for viewing an industry white paper. Once the bartering of information begins, the tailored, personalized experience can unfold bit by bit, with users giving their explicit permission at each step.


3. Be transparent about what content is personalized. Marketers need to be fully transparent and provide the ability to show the consumer exactly what content on the site is being tailored to them and where the data came from.


4. Allow the user to opt out of the value exchange at any point. If consumers don’t have the ability to opt out of the content on a brand site, they may ultimately opt out of your brand entirely. Giving users the ability to opt out puts visitors in control of the negotiation, letting them use their personal data as a currency to ‘pay for’ information.
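The permissioned exchange described in steps 2 and 4 can be sketched in code. The following is a hypothetical illustration, not any real platform's API; the class and method names are my own. The idea is simply that each piece of personal data is granted explicitly by the user and can be revoked at any point, after which it may no longer be used.

```python
# Hypothetical sketch of steps 2 and 4: a per-user consent record in
# which each piece of personal data is shared explicitly and can be
# revoked (opted out) at any point. Names are illustrative only.

class ConsentLedger:
    def __init__(self):
        self._grants = {}  # data field -> value the user agreed to share

    def grant(self, field, value):
        """Step 2: the user explicitly shares one piece of data."""
        self._grants[field] = value

    def revoke(self, field):
        """Step 4: the user opts out; the data may no longer be used."""
        self._grants.pop(field, None)

    def can_use(self, field):
        """Marketers check consent before using any field."""
        return field in self._grants


# A visitor trades an email address for a white paper, then opts out.
ledger = ConsentLedger()
ledger.grant("email", "truman@example.com")
ledger.revoke("email")
```

The design point is that consent is the gate: the marketer queries `can_use` before every use of the data, so an opt-out takes effect immediately rather than depending on downstream systems remembering to honor it.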

Meehan concluded his article with a quote from Truman, “…it feels like the whole world revolves around me somehow.” Although consumers want to feel like they are the center of the universe, they don’t want to be unwilling or unknowing dupes in the process. Searls told Obianwu, “Customers are requesting more control over how their data is used, and many businesses are responding by making preference management part of their customer experience strategy.” Let’s hope that trend continues.
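The back-end anonymization Sales describes (storing personal data so that it cannot be traced back to an individual) is commonly implemented with pseudonymization techniques such as salted hashing. The sketch below is a minimal, hypothetical illustration of that general technique, not Microsoft's actual pipeline.

```python
import hashlib
import secrets

# Hypothetical pseudonymization sketch: replace a direct identifier
# (an email address) with a salted hash before storage, so records can
# be linked to one another but not traced back to the person without
# the secret salt. Illustrative only, not any vendor's real pipeline.

SECRET_SALT = secrets.token_bytes(32)  # kept server-side, never stored with the data


def pseudonymize(email):
    """Return a stable, non-reversible token for the given identifier."""
    digest = hashlib.sha256(SECRET_SALT + email.lower().encode("utf-8"))
    return digest.hexdigest()


# The stored record contains the token, not the raw address.
record = {
    "user": pseudonymize("truman@example.com"),
    "page_viewed": "/products/sailboats",
}
```

Because the same input always yields the same token (within one salt), analytics can still count and segment users; because the salt is secret and the hash is one-way, the raw identity cannot be reassembled from the stored data.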


[1] Brad Meehan, “Don’t Be Like ‘The Truman Show.’ Let Users Control How they Share Information,” AdAge, 7 August 2015.
[2] Emeka Obianwu, “Doc Searls to Businesses: Respect Customer Data Privacy,” Business 2 Community (B2C), 11 April 2015.
[3] Francesca Sales, “Windows 10 privacy brouhaha: Overblown?,” TechTarget, 7 August 2015.
[4] Randy Bean, “Tracing Some of Big Data’s Big Paradoxes,” The Wall Street Journal, 12 May 2015.
