
Big Data and Ethical Corporate Behavior

July 23, 2013

“We can now gather, correlate, and analyze information in ways that were unthinkable in the past,” writes Timo Elliott. “However,” he continues, “with great power comes great responsibility. Analytics is a very powerful weapon, and weapons can be abused.” [“The Ethics of Big Data: Vendors Should Take A Stand,” Business Analytics, 12 June 2013] Ethically responsible data collection and analysis skyrocketed to the top of the news agenda after former Booz Allen Hamilton employee Edward Snowden leaked classified information revealing that the U.S. National Security Agency has been accessing telephone records and data from technology companies. Reactions to these revelations reaffirm that privacy remains a very sensitive issue.


John Gapper reminds us that people are sensitive about privacy regardless of whether their data is being collected and analyzed by governments or by companies. He writes, “Companies that hold rapidly expanding amounts of personal information are using new kinds of data analysis and artificial intelligence to shape products and services, and to predict what customers will want.” [“Big data has to show that it’s not like Big Brother,” Financial Times, 12 June 2013] Elliott sees nothing sinister in this. “I have been working in analytics for over twenty years,” he writes, “and have witnessed firsthand how these technologies have made the world a better place. I’ve seen thousands of examples, from every type of corporate efficiency imaginable, to improving customer satisfaction at theme parks and making better use of limited blood supplies. Yet we’ve only seen the tip of the iceberg when it comes to ‘big data analytics’.” But the potential for abuse, he explains, is very real.

“The past clearly shows that without proper controls, there can be irresistible temptations for companies and governments to combine data in ways that threaten personal liberties. Misuse of every previous data-gathering technology has eventually come to light, sometimes only decades after the fact, leading to new laws re-establishing privacy limits. Modern technology makes the potential threat much greater than in the past. Combining ‘metadata’ from online activities, mobile devices, payment systems, surveillance cameras, medical histories, and social networks can reveal every nuance of our online and offline lives.”

Frankly, the data collection genie is out of the bottle, and it can’t be put back. Patrick Tucker writes, “Modern data science is finding that nearly any type of data can be used, much like a fingerprint, to identify the person who created it: your choice of movies on Netflix, the location signals emitted by your cell phone, even your pattern of walking as recorded by a surveillance camera. In effect, the more data there is, the less any of it can be said to be private, since the richness of that data makes pinpointing people ‘algorithmically possible,’ says Princeton University computer scientist Arvind Narayanan.” [“Has Big Data Made Anonymity Impossible?,” MIT Technology Review, 7 May 2013] The bottom line: big data is out there, and it can be abused. Since big data analytics is predicted to alter the business landscape forever (and companies want to take advantage of the insights it can offer), it is essential that companies handle that data ethically and responsibly.
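To make Narayanan’s point concrete, consider how little auxiliary information such a linkage takes. The Python sketch below, using entirely hypothetical records, joins a nameless dataset to a public profile on a few shared quasi-identifiers (ZIP code, birth year, a couple of movie ratings); a single unique match is enough to re-identify the record.

```python
# A minimal sketch of the kind of linkage attack Narayanan describes:
# joining an "anonymized" dataset to public auxiliary data on shared
# quasi-identifiers. All records here are hypothetical.

anonymized = [
    {"id": "u1", "zip": "08540", "birth_year": 1975,
     "movies": {"Brazil", "Alien", "Heat"}},
    {"id": "u2", "zip": "08540", "birth_year": 1982,
     "movies": {"Up", "Cars", "Heat"}},
]

# Auxiliary information an attacker might scrape from a public profile.
public_profile = {"name": "J. Doe", "zip": "08540", "birth_year": 1982,
                  "rated_movies": {"Up", "Cars"}}

def reidentify(anon_rows, profile):
    """Return the anonymized rows consistent with the public profile."""
    return [row for row in anon_rows
            if row["zip"] == profile["zip"]
            and row["birth_year"] == profile["birth_year"]
            and profile["rated_movies"] <= row["movies"]]

matches = reidentify(anonymized, public_profile)
if len(matches) == 1:
    # A unique match re-links the "anonymous" record to a named person.
    print(f"Record {matches[0]['id']} is almost certainly {public_profile['name']}")
```

Elliott writes: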

“Analytics is, at best, a wonderful opportunity to shine light into the dark, to reveal what was previously concealed, and make it better. People and governments must be in the forefront of establishing clear, transparent guidelines that make the right tradeoffs between the public good and citizens’ rights. We should not wait for abuses to come to light before acting.”

Unfortunately, in the United States, expecting action from Washington is much like expecting to win the lottery. Chances aren’t good. That means that companies should assume responsibility for the secure and ethical handling of the data they collect and analyze. Bill Franks writes, “When it comes to deciding how your organization will develop privacy policies for big data, there are at least three distinct sets of guidelines to consider. Without consideration for all three of these areas, you will put your organization at risk.” [“Helpful or Creepy? Avoid Crossing The Line With Big Data,” International Institute for Analytics, 7 May 2013] The three sets of guidelines to which Franks alludes involve answers to three different questions. They are:


  • What is legal?
  • What is ethical?
  • What will the public find acceptable?


“In an ideal world,” Franks writes, “these three considerations would lead to the same result. In practice, however, the three are often not in sync and can, in fact, point to totally different decisions. It will be important for your organization to decide how you want to balance the results to guide your actions when the three criteria diverge.” And you thought it was going to be easy to determine the best way to handle big data. Franks points out that the largest gray area involves behavior that might not be illegal but nevertheless could be unethical. In such circumstances, he asserts, “it is important to consider what is right and ethical, not just what is legal. If you’re the first to ponder a new type of analysis, you need to think through these considerations before you even start down the path.”


Franks goes on to point out that even if a company utilizes a clearly legal and ethical analytic methodology, customers might still have strong reactions. “What the public finds acceptable,” he writes, “can often be even more stringent than what is legal and ethical.” He concludes:

“My belief is that an organization will be well served to routinely sit down and explicitly discuss the legal, ethical, and consumer perception of its analytic policies in detail. After examining the legal, ethical, and consumer perspectives, I recommend defaulting to pursuing strategies that fall within the bounds of the most restrictive of the three considerations. Given the rapid change of the legal environment and consumer acceptance of the use of their data, you can expect your decisions to be fluid and changing over time. What seems OK today may not be OK a year from now. While it may not be the most exciting process, keeping on top of your privacy policies will help avoid much bigger issues, such as legal problems and PR fiascos, down the road.”
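One way to picture Franks’ default is as a simple decision rule: score each of the three reviews independently, then act on the most restrictive verdict. The Python sketch below is a toy encoding of that rule; the verdict scale and names are hypothetical, and a real review obviously involves counsel and judgment, not enums.

```python
from enum import IntEnum

class Verdict(IntEnum):
    # Hypothetical scale; lower means more restrictive.
    PROHIBITED = 0
    RISKY = 1
    PERMITTED = 2

def policy_decision(legal: Verdict, ethical: Verdict, public: Verdict) -> Verdict:
    """Franks' default: the most restrictive of the three assessments wins."""
    return min(legal, ethical, public)

# A practice can be perfectly legal and ethical yet still alienate customers:
verdict = policy_decision(Verdict.PERMITTED, Verdict.PERMITTED, Verdict.RISKY)
print(verdict.name)  # RISKY -> proceed only with mitigations, if at all
```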

Elliott believes that “vendors of analytics software” also have a major role to play in ensuring that data is handled legally and ethically. He believes they should “help encourage data safety and transparency, and provide the technology features that make it easy for organizations to support these initiatives.” Companies are constantly trying to strengthen customer loyalty, and one of the best ways to do that is by strengthening consumers’ trust in how the company handles their personal data. Despite companies’ claims that they anonymize data, consumers now know that true anonymization may no longer be possible. David Meyer explains:

“When it comes to protecting privacy in the digital age, anonymization is a terrifically important concept. In the context of the location data collected by so many mobile apps these days, it generally refers to the decoupling of the location data from identifiers such as the user’s name or phone number. Used in this way, anonymization is supposed to allow the collection of huge amounts of information for business purposes while minimizing the risks if, for example, someone were to hack the developer’s database.

“Except, according to research published in Scientific Reports, … people’s day-to-day movement is usually so predictable that even anonymized location data can be linked to individuals with relative ease if correlated with a piece of outside information. Why? Because our movement patterns give us away.” [“Why the collision of big data and privacy will require a new realpolitik,” Gigaom, 25 March 2013]
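The Scientific Reports study Meyer refers to (de Montjoye et al., “Unique in the Crowd,” 2013) found that just four spatio-temporal points were enough to uniquely identify 95 percent of the individuals in a mobility dataset of 1.5 million people. A minimal Python sketch of that linkage logic, with hypothetical towers and traces:

```python
# Minimal sketch of the linkage attack above: count how many "anonymized"
# location traces are consistent with a few outside observations.
# Towers, hours, and traces are all hypothetical.

# Each trace is a set of (cell_tower_id, hour) points with the name stripped.
traces = {
    "anon_001": {("tower_17", 9), ("tower_04", 13), ("tower_17", 20), ("tower_22", 8)},
    "anon_002": {("tower_17", 9), ("tower_09", 13), ("tower_31", 20), ("tower_22", 8)},
    "anon_003": {("tower_05", 9), ("tower_04", 13), ("tower_17", 20), ("tower_40", 8)},
}

# Four points an observer learned elsewhere (a geotagged post, a receipt...).
known_points = {("tower_17", 9), ("tower_04", 13), ("tower_17", 20), ("tower_22", 8)}

candidates = [tid for tid, trace in traces.items() if known_points <= trace]
print(candidates)  # ['anon_001'] -> the trace is unique, hence re-identifiable
```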

Meyer writes that we need to be realistic when it comes to data collection and analysis. “We are not going to stop all this data collection,” he writes, “so we need to develop workable guidelines for protecting people.” He agrees with Elliott that vendors selling analytics solutions have a critical role to play. “Those developing data-centric products,” he writes, “have to start thinking responsibly – and so do the privacy brigade. Neither camp will entirely get its way: there will be greater regulation of data privacy, one way or another, but the masses will also not be rising up against the data barons anytime soon.” He doesn’t expect the masses to rise up because, in his view, only some kind of catastrophic event would motivate such an uprising. Meyer penned his article shortly before Snowden’s leaks were published, but those disclosures probably don’t represent the catastrophe he envisioned. He concludes, “I suspect the really useful regulation will come some way down the line, as a reactive measure. I just shudder to think what event will necessitate it.” Whatever the event, your company doesn’t want to be a part of it. That much I know for sure.
