Big Data Continues to Raise Privacy Concerns

September 24, 2013

“Every step in the big data pipeline is raising concerns,” write Cynthia Dwork, Principal Researcher at Microsoft Research, and Deirdre K. Mulligan, an Assistant Professor in the School of Information at the University of California, Berkeley. [“It’s Not Privacy, and It’s Not Fair,” Stanford Law Review, 3 September 2013] Those concerns include: “The privacy implications of amassing, connecting, and using personal information, the implicit and explicit biases embedded in both datasets and algorithms, and the individual and societal consequences of the resulting classifications and segmentation.” They continue:

“Although the concerns are wide ranging and complex, the discussion and proposed solutions often loop back to privacy and transparency — specifically, establishing individual control over personal information, and requiring entities to provide some transparency into personal profiles and algorithms.”

In the same issue of the Stanford Law Review, Jules Polonetsky, Co-Chair and Director of the Future of Privacy Forum, and Omer Tene, an Associate Professor at the College of Management Haim Striks School of Law, ask, “How should privacy risks be weighed against big data rewards?” [“Privacy and Big Data,” Stanford Law Review, 3 September 2013] They continue:

“Big data creates tremendous opportunity for the world economy not only in the field of national security, but also in areas ranging from marketing and credit risk analysis to medical research and urban planning. At the same time, the extraordinary benefits of big data are tempered by concerns over privacy and data protection. Privacy advocates are concerned that the advances of the data ecosystem will upend the power relationships between government, business, and individuals, and lead to racial or other profiling, discrimination, over-criminalization, and other restricted freedoms. Finding the right balance between privacy risks and big data rewards may very well be the biggest public policy challenge of our time.”

In a third article from that same issue, Joseph W. Jerome, a Legal and Policy Fellow at the Future of Privacy Forum, writes, “Big data is transforming individual privacy—and not in equal ways for all.” [“Buying and Selling Privacy,” Stanford Law Review, 3 September 2013] Eric Markowitz adds, “As the public grows more skeptical of data collection, digital privacy advocates finally find themselves in the spotlight–a position they’ve been craving for years. On the flip-side, tech companies have been dragged under their own spotlight — albeit one with a more critical hue. Now more than ever, people want to know: What, exactly, are you doing with all of their data?” [“The Data Privacy Debate Is Just Beginning,” Inc., 21 June 2013] I agree with Markowitz that the data privacy debate is just beginning, and that should worry a lot of companies.


Dwork and Mulligan believe that current efforts to protect consumer privacy “fail to address concerns with the classifications and segmentation produced by big data analysis.” They explain:

“At worst, privacy solutions can hinder efforts to identify classifications that unintentionally produce objectionable outcomes — for example, differential treatment that tracks race or gender — by limiting the availability of data about such attributes. For example, a system that determined whether to offer individuals a discount on a purchase based on a seemingly innocuous array of variables being positive (‘shops for free weights and men’s shirts’) would in fact routinely offer discounts to men but not women. To avoid unintentionally encoding such an outcome, one would need to know that men and women are arrayed differently along this set of dimensions. Protecting against this sort of discriminatory impact is advanced by data about legally protected statuses, since the ability to both build systems to avoid it and detect systems that encode it turns on statistics. … Rooting out biases and blind spots in big data depends on our ability to constrain, understand, and test the systems that use such data to shape information, experiences, and opportunities.”

Polonetsky and Tene have different concerns. They believe “the current privacy debate methodically explores the risks presented by big data, [but] it fails to untangle commensurate benefits, treating them as a hodgepodge of individual, business, and government interests.” They go on to detail the benefits of big data across a number of domains. Jeff Bertolucci reports that the software industry shares the concerns highlighted by Polonetsky and Tene. “The Software & Information Industry Association (SIIA) wants policymakers to avoid ‘broad policies’ that limit the use of digital information,” he writes. “While acknowledging the need to address privacy concerns, the software trade group says that stringent regulations in this area might slow the growth of the ‘nascent technological and economic revolution’ known as big data.” [“Don’t Let Privacy Fears Stifle Big Data, SIIA Urges,” InformationWeek, 3 June 2013] He continues:

“The SIIA argues that big data is too important to the global economy to do otherwise. It points to recent Gartner estimates that predict big data-related research will spur $34 billion in IT spending this year. Looking forward, ‘data-driven innovation,’ or DDI, will help create 4.4 million IT jobs globally by 2015, including 1.9 million in the United States, Gartner says. ‘Data collection and use is at a crossroads, and decisions by policymakers could have an enormous impact on American innovation, jobs and economic growth,’ said SIIA president Ken Wasch in a statement. Lawmakers must address privacy concerns regarding the storage and use of data, Wasch concedes, but he adds that they should do so ‘without strict regulation that stifles economic opportunity.’”

It won’t be easy to strike the balance between big data and privacy that Wasch hopes for. In the end, no one is going to be happy with any policy that is enacted. The SIIA believes that “policymakers should recognize that ‘socially acceptable norms of privacy’ are changing with technology. These changes should influence policy decisions pertaining to DDI.” There is evidence that privacy norms are changing, especially among younger generations (but, that’s the topic of another post). Nevertheless, even younger generations have concerns about privacy.


Some pundits believe that we are engaging in the privacy debate too late. They insist that privacy rights were abrogated years ago. For example, Rob Norman, Chief Digital Officer, GroupM Global, told conference participants in New Delhi that we now live in an Orwellian world. In the future, he told the audience, “Privacy will be redundant.” [“The ‘privacy’ threat is real, digital marketers be aware,” by Noor Fathima Warsia, Digital Market Asia, 30 May 2013] Warsia reports, “Norman was echoing Sun Microsystems Chief Executive Scott McNealy’s words, famously said way back in 1999, during a product launch, ‘You have zero privacy anyway, get over it.’”


Most analysts, however, don’t seem to think that it’s too late to address privacy issues surrounding big data. For example, Jerome concludes:

“If we intend for our economic and legal frameworks to shift from data collection to use, it is essential to begin the conversation about what sort of uses we want to take off the table. Certain instances of price discrimination or adverse employment decisions are an easy place to start, but we ought to also focus on how data uses will impact different social classes. Our big data economy needs to be developed such that it promotes not only a sphere of privacy, but also the rules of civility that are essential for social cohesion and broad-based equality. If the practical challenges facing average people are not considered, big data will push against efforts to promote social equality. Instead, we will be categorized and classified every which way, and only the highest value of those categories will experience the best benefits that data can provide.”

Markowitz adds, “Big Data may be hot — but it only works so long as consumers decide to cooperate and share their data freely. I’m not convinced that proposition is guaranteed in the future.” There is a lot riding on the success of big data. That’s why it remains critical that some accommodation be reached between data use and privacy.
