
Big Data and Privacy: What should People Expect?

July 13, 2016


“What does privacy really mean to you?” asks Cathy Nolan (@CatherineNolan7), an Information Analyst.[1] In the era of big data, that’s a really good question. “When we think about data privacy,” Nolan writes, “we are also including physical privacy to a certain extent, because having your personal data exposed means that you can also be targeted in physical ways — your home and business address, your physical presence at certain stores and gatherings through GPS, and your health and physical attributes through medical identity theft.” In fact, it has been argued that harboring any expectation of privacy in today’s connected world is unrealistic. What, then, should we expect from those who collect and analyze the data we generate? Manish Bahl (@mbahl), Senior Director at the Center for the Future of Work at Cognizant, believes the best we can hope for is that companies will protect the data and use it ethically. “One of the biggest threats to companies today comes not from the competition,” he explains, “but from the imperative to win and keep consumer trust. In an age when personal data is the key to honing a competitive edge, data ethics has become the new battleground for digital success. Companies that view trust as not just a privacy, security or technology issue, but also a brand-building opportunity and place consumers before near-term profits and self-interest will be best equipped to sail through trust-driven business disruption.”[2]

 

Using the word “big” when coining the term “big data” was unfortunate. It immediately conjures up George Orwell’s Big Brother, with all the creepy implications that entails. Orwell once stated, “Enlightened people seldom or never possess a sense of responsibility.” But responsibility is exactly what people expect from those who collect, analyze, and use personal data. Nolan reminds us that “the right to privacy” is a rather modern notion. She explains:

“Physical privacy is a recent occurrence in man’s history. The first people lived in caves or other shelters and shared a common living space. There was no concept of privacy and people wanted to be physically close together to share warmth, food and safety. Fast forward a few thousand years and most of the world’s populace lived in one-room shelters, often sharing their homes with their domestic animals. Except for the very wealthy, (and even Royalty had very little physical privacy) this way of life continued until the middle class in Europe started imitating the ‘upper class’ and built homes with multiple rooms where people could experience some physical privacy away from their neighbors and their families. We grew up in an age where physical privacy was the norm and highly prized to the extent that during Victorian times even seeing a lady’s ankle was considered uncouth! Now the pendulum is swinging the other way and many people are willing to give up some of their personal, including physical, privacy for the convenience of being connected to the Internet of Things (IoT). We are sharing everything.”

Sharing personal information in order to receive free services (like search engines and email services) or special offers (like those provided by loyalty programs) is a modern bargain struck between individuals and companies. Some people, however, wonder if it really is a bargain. John Leonard (@_JohnLeonard), Research Director at Computing, asks, “How much is your personal data worth?”[3] A follow-on question is: Worth to whom? Leonard explains:

“If we’re talking about name, address and date of birth the answer is not much at all. The lists containing these details sell for a few pounds per thousand, giving you a value of pennies at most. And yet companies such as Google and Facebook have become multi-billion dollar behemoths on the basis of personal data. Their value-add, of course, is taking our basic data and combining it with information picked up by tracking what we do on the internet and selling this as an aggregated dataset to advertisers. This data can be used to infer what we might do next and is incredibly valuable. It also helps Google fine tune its search engine and to offer it for free, so everyone’s a winner — at least in theory. However, this same information can also be used to manipulate what we see, not just in terms of advertisements but also news stories and other social media posts in a way that is completely opaque to the user. Our personal data is certainly not always used in our favour.”

In other words, the bargain people strike with companies that collect, analyze, and use data ceases to be a bargain when trust is broken through the unethical use of that data. With the emergence of the Internet of Things, even more data is going to be collected and analyzed. As a result, privacy and the ethical use of data are going to remain hot topics. Michael Pumper (@mike_pumper), a manager at Sogeti USA, asserts that Orwell had a point when he wrote in his book 1984, “The choice for mankind lies between freedom and happiness and for the great bulk of mankind, happiness is better.”[4] In this case, Pumper is equating privacy with freedom, and free services and benefits with happiness. I’m not certain those are diametrically opposed choices. Orwell was really talking about humankind’s desire for security and safety. Benjamin Franklin once stated, “Those who surrender freedom for security will not have, nor do they deserve, either one.” I believe Franklin chose the word “surrender” carefully. Choosing is not surrendering. We choose to obey traffic laws in order to protect lives. We choose to suffer the indignities of TSA searches to ensure safety. And we choose to provide personal data in exchange for services or benefits. What we don’t choose is to have that data abused or used unethically. Pumper understands that companies collecting and using data have a few things in common with Orwell’s Big Brother, and he insists they need to be careful about how they protect and use the data they collect. “When considering investing in these data-rich technologies,” he writes, “there are three critical components for making sure customers are as comfortable and willing to engage as possible.” They are:

 

  • Be transparent. Notify your customers if they are being tracked, and be clear about what that data is being used for.
  • Anonymize customer data. Know where to draw the line: storing generalized demographic data may be acceptable, but linking it to identifiable information is not (see the sketch after this list).
  • Offer an opt-out option. Making data sharing mandatory can make customers feel as though they’re under forced surveillance. Knowing they can opt out at any time will put minds at ease.
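
These three practices translate fairly directly into a data pipeline. The short Python sketch below is a minimal illustration, not something drawn from Pumper’s article: the record fields, the salt handling, and the pseudonymize helper are all hypothetical, but they show one way to honor an opt-out flag and to keep only generalized, non-identifiable attributes.

import hashlib
import os

# Hypothetical customer record as it might arrive from a loyalty program.
raw_record = {
    "customer_id": "C-10492",
    "name": "Jane Doe",            # direct identifier -- must not be stored
    "email": "jane@example.com",   # direct identifier -- must not be stored
    "birth_year": 1984,
    "postcode": "SW1A 1AA",
    "opted_out": False,            # honoring this flag is the opt-out promise
}

SALT = os.urandom(16)  # kept secret so hashed IDs cannot be recomputed by others

def pseudonymize(record, salt):
    """Return only generalized, non-identifiable fields, or None if opted out."""
    if record["opted_out"]:
        return None  # an opted-out record never enters the analytics store

    # A salted hash lets us count repeat visits without storing who the person is.
    hashed_id = hashlib.sha256(salt + record["customer_id"].encode()).hexdigest()
    return {
        "pseudonym": hashed_id,
        # Coarsen demographics: an age band and a postcode district,
        # not the exact birth year or the full postcode.
        "age_band": "{0}s".format((2016 - record["birth_year"]) // 10 * 10),
        "postcode_district": record["postcode"].split()[0],
    }

print(pseudonymize(raw_record, SALT))

A salted hash of a stable ID is pseudonymization rather than true anonymization, so in practice the salt would need to be guarded as carefully as the data itself; the point of the sketch is simply where the line gets drawn, with aggregate insight kept and identifiable detail left out.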

 

Bahl notes, “Trust has been elevated to a C-suite issue because consumer trust converts into bottom-line benefits. … As consumers become more educated about how a company is using their data, they want a personal, tangible and immediate benefit in return.” He concludes:

“The increasing value and quality of the data that companies gather has changed not only the way products and services are delivered, but also the way consumers make decisions. As the digital revolution unfolds, trust will become even more important because consumers will not just expect but assume businesses have put their interests before everything else. Consumers are a business’s brand ambassadors, and losing their trust will directly impact the brand and the future of the business. Data ethics has become the new purpose for businesses. Trust will increasingly be seen not as the end objective, but as a necessity for business success.”

Although privacy will remain an issue in the years ahead, the fact of the matter is that big data is only going to get bigger and expectations of privacy are going to diminish. What won’t diminish is the expectation that data will be protected and used ethically.

 

Footnotes
[1] Cathy Nolan, “Personal Privacy and the IoT: What Does it Mean to You?” Dataversity, 22 January 2016.
[2] Manish Bahl, “The business value of trust — A new digital battleground,” Enterprise Innovation, 30 May 2016.
[3] John Leonard, “What if we could demand something in return for our personal data?” Computing, 7 December 2015.
[4] Michael Pumper, “Privacy and Ethics in IoT and Consumer Analytics,” Information Management, 26 April 2016.
