
The Requirement for Ethics in the Age of Big Data

September 25, 2017


The Equifax data breach, which affected 143 million American citizens and hundreds of thousands of other people around the globe, focused public and government attention on companies that collect personal data. Big data is one of the defining characteristics of the Information Age. With the rise of big data, concern about big data ethics has also increased. Gry Hasselbalch (@mediamocracy) and Pernille Tranberg (@PernilleT), co-founders of DataEthics.eu, assert, “Consumers are increasingly concerned that companies are amassing enormous amounts of information about their activities — online and off — and using that data in unethical ways.”[1] They note politicians are starting to focus on the issue and to enact increasingly strong measures that pressure companies to do the right thing. “Now it’s time for businesses to do more on their own — not just to head off additional, potentially heavy-handed regulations,” Hasselbalch and Tranberg write, “but more so as a powerful new market differentiator.”

Big Data Ethics: Doing the Right Thing

Marc Benioff (@Benioff), CEO of Salesforce, asks a couple of important questions: “In this digital age, will citizens trust the institutions and service providers who maintain their data? Will social media be used to enlighten or manipulate us?”[2] Prithvijit Roy (@prithvijit), Founder and CEO of BRIDGEi2i Analytics Solutions, asks a couple of even more pointed questions: “Should there be more discussions on the moral limits of data? Should data scientists sign a code to deal sensitively with data just as doctors do?”[3] In partial answer to questions like these, Benioff responds, “We need to bring transparency into how we govern and manage this technology, and develop security models that allows us to have confidence that these systems won’t be hacked, run amuck or become tools of oppression by those who control them.” Hasselbalch and Tranberg believe doing the right thing for consumers is also good for a company. They explain, “‘Data ethics’ can give companies a competitive edge, especially when their target customers value social responsibility.” Currently, Roy points out, there are far more questions than answers in the area of big data ethics. He writes:

“Most discussions about the moral limits of data are restricted to regulations around the use of data. What data is usable, what are the privacy restrictions, how does that vary across countries, and how do companies incorporate that into their processes. Forrester’s Data Protection Heatmap by country shows a snapshot. But as data becomes more open and pervasive, and as analytics starts becoming more invisible, is the question around the ethics or morals of data also changing? … The key question: How is data generated and how is it used for different purposes? … If our data is used for a very different intent than its original intent, how can we control its potential impact? … If the potential misuse of data could lead to unintended consequences, what safeguards do we need to have in place? … Where does the solution lie?”

Data is inherently neither good nor bad; the analysis and use of data are what give it a moral dimension. Hasselbalch and Tranberg observe that consumer unease about big data ethics is “grounded in the reality of almost daily headlines about yet another retailer, bank, utility company or ISP getting hacked or using data in an unethical way.”

The Way Ahead

“As with anything that could conceivably involve the personal data of customers and other private individuals,” writes Mark Palmer (@markpalmer1022), a business tech journalist, “you have to be aware that there could be ethical concerns relating to how that data is utilized. … Just because it’s permissible to gather all this information that can provide insight into the minds of consumers does not mean your responsibility ends there. Being ethical in this case requires that you insure that data always remains protected when it’s not in use.”[4] This is not just the right thing to do for those whose data is being collected; it is also in a company’s own best interests to use data ethically. Palmer notes, “The consequences for not protecting this sensitive data can be rather severe. IBM has estimated that the breach of sensitive data costs companies $4 million per breach. A good deal of the cost of this data loss comes later from class action lawsuits from customers who had their information stolen. Hackers are looking for information like credit card numbers and personal information that can be used to steal a person’s identity. You will be held responsible in civil court for the damage done to your customers if you did not take proper steps to secure that data from hackers, cyber criminals and other digital threats.”
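In engineering terms, Palmer’s point about data remaining “protected when it’s not in use” usually means encryption at rest. The sketch below illustrates the idea in Python, assuming the third-party cryptography package is installed; the record fields, file name, and key handling are illustrative assumptions, not details from the article.

```python
# Minimal sketch of encrypting a customer record at rest, assuming the
# third-party `cryptography` package (pip install cryptography) is installed.
# The record fields and file name are hypothetical.
import json
from pathlib import Path

from cryptography.fernet import Fernet

# In a real deployment the key would live in a key-management service,
# never alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

record = {"customer_id": 42, "card_last4": "1234"}  # hypothetical record

# Encrypt before the data touches disk, so it is protected when not in use.
token = cipher.encrypt(json.dumps(record).encode("utf-8"))
Path("record.enc").write_bytes(token)

# Decrypt only at the moment of authorized use.
restored = json.loads(cipher.decrypt(Path("record.enc").read_bytes()))
assert restored == record
```

The design point is that plaintext exists only transiently in memory during authorized use; anything persisted, including backups, stays ciphertext.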

Hasselbalch and Tranberg observe, “Consumers are increasingly fighting back in a variety of ways. Some are technological, such as choosing encrypted services, private search engines, ad blockers and other ‘privacy tech.’ Others are grass roots, such as lobbying politicians for new protections and boycotting businesses that they perceive as playing fast and loose with customer data. Both types of responses create problems for businesses — but they don’t have to.” One way to ensure data collection doesn’t result in problems with consumers, they assert, is “by giving customers complete control over which data is collected and how it’s used.” That sounds simple, but Hasselbalch and Tranberg note that many companies claiming to embrace this policy don’t actually live up to it. They continue:

“To avoid that fate — and having a law nicknamed after their company — organizations should implement a privacy-by-design philosophy. To be effective, that philosophy has to be applied organization-wide because so many departments have access to customer data. So it’s no surprise that Gartner predicts that by next year, half of business ethics violations will occur due to improper use of big data.”

I suggest reading Hasselbalch and Tranberg’s article because they recommend a number of resources that can help companies implement privacy-by-design programs. Palmer concludes, “Some level of transparency needs to be provided if big data strategies are to be performed ethically. This should include a clear explanation of where the data regarding individuals is coming from whether it’s from user account information, website data extraction software or some other tool. You should also try to inform people of how their data is used. Often times, it’s best to allow people to opt in to their data being used in such a way. While not everyone reads the fine print on these agreements, not having that fine print explanation for what happens to the data after an internet user agrees to the contract would certainly be unethical.”
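Palmer’s opt-in recommendation translates naturally into a default-deny consent model: record where each person’s data came from and use it only for purposes they explicitly approved. The Python sketch below is a hypothetical illustration of that pattern; the ConsentRecord class and purpose names are my own assumptions, not a scheme from Palmer or from Hasselbalch and Tranberg.

```python
# Hypothetical sketch of default-deny, per-purpose opt-in consent.
from dataclasses import dataclass, field


@dataclass
class ConsentRecord:
    user_id: int
    source: str  # provenance, e.g., "account signup form" (transparency)
    opted_in: set[str] = field(default_factory=set)  # purposes the user approved

    def allow(self, purpose: str) -> None:
        """Record an explicit opt-in for one purpose."""
        self.opted_in.add(purpose)

    def permits(self, purpose: str) -> bool:
        """Default-deny: a use is permitted only if the user opted in."""
        return purpose in self.opted_in


consent = ConsentRecord(user_id=42, source="account signup form")
consent.allow("order_fulfillment")

assert consent.permits("order_fulfillment")
assert not consent.permits("ad_targeting")  # never approved, so denied
```

Default-deny is the code-level counterpart of the privacy-by-design philosophy quoted above: no purpose is assumed, and each must be granted explicitly.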

Summary

Roy believes the adoption of an ethical code for data scientists would help shift the conversation from nefarious uses of data to its benefits. Hasselbalch and Tranberg insist companies need to take big data ethics more seriously. They conclude, “The bottom line is that data ethics isn’t a fad. … The most visionary companies are highly aware of how data ethics fits into their general social corporate responsibility framework. They don’t just pay it lip service, either. Instead, they build their entire organizations around the data ethics.”

Footnotes
[1] Gry Hasselbalch and Pernille Tranberg, “Why organizations need a data ethics strategy—and how to create one,” Information Management, 16 August 2017.
[2] Vala Afshar, “The Power of Artificial Intelligence is to Make Better Decisions,” Huffpost, 28 January 2017.
[3] Prithvijit Roy, “Moral Limits of Data: Let’s Shift the Conversation to Good Data,” Datafloq, 15 August 2017.
[4] Mark Palmer, “How to Navigate the Ethics of Data Gathering,” Datafloq, 15 June 2017.
