
As Uses of Big Data in Business Grow, the Need for Ethics also Increases

August 25, 2016


“Big data is reshaping the world,” asserts Joe Lodewyck (@joelodeaz), dean of assessment at the University of Phoenix College of Information Systems and Technology. “Only time will tell whether this is good or bad for our society — a determination that will not be founded on technology, but rather on whether or not the people managing it are capable of harnessing it for the safety, security and good of a civil society.”[1] He continues:

“Americans and consumers across the globe continuously disseminate all types of information. Health records, personal Internet browsing habits, financial data and myriad other personal information fill a massive pool of data that can be mined for patterns and trends to positively effect changes in society. But, just as easily, this data is a threat to the personal information of individuals and society as a whole. The truth is, big data’s limitless potential only can be realized if people are capable of managing information, interpreting it correctly and acting wisely.”

I’m assuming Lodewyck includes ethical behavior when he writes about the importance of people managing information and acting wisely. If so, he is not alone in his concern about the ethical use of big data. Samantha White, senior editor of CGMA Magazine, writes, “While the internet of things has provided companies with more ways to collect increasing volumes and types of data about their customers, it also poses a significant challenge: Regulation is developing at a much slower pace than the technology, making it the responsibility of the company to decide how to harness the insights offered by data from mobile phones, travel passes, and thermostats among other devices, while living up to their core ethical values.”[2] Consumers are probably uncomfortable with the notion that a for-profit company self-regulates how it uses the data it collects about them and their activities. Unfortunately, regulations will never keep up with technological advancements. Ross Woodham, Director for Legal Affairs and Privacy at Cogeco Peer 1, explained to Jamie Carter (@jamieacarter), “We are facing a perfect storm. With new technologies being developed, data being generated by personal devices at an exponential rate, and international security a global concern, it is no wonder legislators are struggling to get to grips with new issues and moral dilemmas.”[3]


The fact that legislators are having difficulty catching up with technology does not mean companies should take their self-regulation responsibilities lightly. Regulations will eventually address technological advances and, when they do, companies that played fast and loose with their ethical responsibilities could pay a steep price for their inattention or deliberate breach of trust. Mark Cameron (@MarkRCameron), CEO of W3 Digital, writes, “Creating genuine and meaningful competitive advantage in the digital world means focusing on generating long-term trust with your customers. Companies need to clearly define what customer data they actually need, be transparent about it and, importantly, understand how they are going to use it in a way that delivers value back to the customer.”[4] Developing trust takes a long time, and that trust can be broken in an instant. White explains, “The vast quantities of data available to companies provide the opportunity to develop new strategies and target their messages, products, and services. But as collection methods, and the way such information is analysed and used, become increasingly opaque to consumers, the more their trust in companies declines.” She continues:

“Research conducted by Ipsos MORI and the Royal Statistical Society suggests that the level of trust the public have in companies to use data appropriately is lower than the public’s overall trust in businesses. Media, telecommunications, and insurance companies are particularly affected. If a company’s conduct in dealing with Big Data is perceived as less than ethical, this can adversely affect its reputation, customer relationships, and, in the long run, its revenues. Among the ethical issues thrown up by data collection, the right to privacy, which allows people to limit who has access to their personal information, is a key concern. Individuals should have meaningful control over how a corporation gathers data from them, and how it uses and shares that data, the briefing notes.”

Trust is going to be a particularly tricky notion as the personal wearable devices market grows. Cameron explains, “In the first wave of connected consumer devices — the buyer gets a shiny device and a shiny free app, but loses control over some very intimate personal data. That bargain is not spelled out clearly enough, according to some.” Robin Wilton, Technical Outreach Director for Identity and Privacy at the Internet Society, told Cameron that the better suppliers are at providing a good customer experience, the more at risk they put their consumers’ data. “The more ‘seamless’ the service,” he stated, “the less awareness the user has of what is going on … it perpetuates a situation in which the user makes privacy decisions based on an unrepresentative subset of relevant information. The imbalance of power between the user and the device/service provider persists, and there is no real notion of informed, explicit consent.” The dilemma faced by both companies and their consumers is easy to see. White offers up six questions, drawn from the IBE briefing, that companies need to consider:


1. How does the company use Big Data, and to what extent is it integrated into strategic planning? Clearly identifying the purpose for which data will be used helps to identify the critical issues that may arise. Another important question is: How does that particular use benefit the customer or wider public? For data use to benefit your organization and its stakeholders, it has to be accurate, reliable, and trustworthy. What do you do to ensure the quality and veracity of your data?


2. Does the organization send a privacy notice when personal data are collected? Is it written in clear and accessible language that allows users to give truly informed consent? For example, social media platforms do ask users to agree to terms and conditions when they register. However, research shows this does not necessarily amount to informed consent, as many users do not read through lengthy, complicated documents but simply accept them in order to open their accounts.


3. Does my organization assess the risks linked to the specific type of data my organization uses? Identifying any potential negative impact the use of data might have on particular groups of people, and what might happen if the datasets became public, is one way of increasing awareness of the damage a potential data breach would cause. In some cases, a privacy impact assessment may be advisable. The risk of misuse of the company’s information by employees should not be underestimated.


4. Does my organization have safeguards in place to mitigate these risks? Communicating the preventive measures which are in place to bolster data security is an effective way to promote trust. These might include controls on data access and harsh penalties for its misuse.


5. Do we make sure the tools for managing these risks are effective, and do we measure outcomes? Audit has a key role to play in helping companies deal with these issues.


6. Do we conduct appropriate due diligence when sharing or acquiring data from third parties? When buying information from third parties, due-diligence procedures must apply as they would to other purchases. Do the suppliers uphold similar ethical standards and guarantee the accountability and transparency of these practices?


Big data trust and privacy issues are not going to fade away. The loss of consumer trust could force legislators to restrict access to data to the point that companies lose many of the benefits big data can provide. That should be the last thing companies want to have happen. Lodewyck concludes, “Everyone has a stake in the success of big data. Health care providers use information to help make people healthier. Businesses use it to innovate and grow the economy. Government can ensure our communities operate efficiently, our privacy is protected and our financial information is secure.” In other words, all stakeholders have an interest in ensuring that big data is used ethically so that the benefits it provides can continue to be enjoyed.


[1] Joe Lodewyck, “The Power, Promise and Pitfalls of Big Data,” Information Management, 17 November 2015.
[2] Samantha White, “6 ethical questions about Big Data,” CGMA Magazine, 15 June 2016.
[3] Jamie Carter, “There’s a ‘perfect storm’ at the heart of big data,” TechRadar, 28 June 2016.
[4] Mark Cameron, “Personalised marketing: Balancing what you can do with what you should do,” CMO, 4 August 2015.
