
The Continuing Battle Over Personal Data

May 4, 2021


It’s no secret that the lifeblood coursing through the veins of today’s businesses is data. Not all data collection is controversial; however, some efforts to collect data have lit a fuse that continues to burn. The data creating the greatest stir is personal data. Journalist Scott Kirsner (@ScottKirsner) observes, “Today’s Web is populated by companies that want to hoover up your data, whether it’s a dating site that wants to know about your hobbies or a bank that wants financial information to quote you a mortgage interest rate. And once you click the ‘enter’ button to hand over that data, you have zero control over what happens next. It might be sold, stolen, or shared without your knowledge.”[1] Because personal data can be sold, stolen, or shared, governments around the globe are belatedly trying to help individuals protect their data. To date, the laws garnering the most attention have been the European Union’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). Samantha Ann Schwartz (@SamanthaSchann) writes, “Data privacy management and protocol got a facelift with the enactments of the General Data Protection Regulation and the California Consumer Privacy Act. Many companies failed to meet compliance standards in time for GDPR. And for companies that met the deadline, 67% fear they won’t be able to sustain compliance, according to Tanium.”[2] That’s a problem for both companies and consumers, especially since more national and state laws are likely to be enacted.

New Laws on the Way

Attorneys Cassandra Gaedt-Sheckter, Alexander H. Southwell, and Ryan Bergsieker report, “Californians have ushered in a law protecting individuals’ privacy unlike any other in the United States, and businesses are well-advised to evaluate its impact and prepare to comply. Proposition 24, which passed on Nov. 3rd [2020], establishes the California Privacy Rights Act (CPRA), which will take effect Jan. 1, 2023. If this seems like déjà vu, it’s because just two years ago, the California legislature passed an unprecedented privacy law, the California Consumer Privacy Act (CCPA), which the CPRA amends. The continuing shift in privacy law embodied by the CPRA is set to make a significant impact on businesses’ compliance efforts and operational risk, as well as individuals’ expectations.”[3] They add, “We expect these bold moves in California to foreshadow what will come across the country.”


The latest state to enact a consumer data privacy law is Virginia. Attorneys Kurt R. Hunt and Matthew A. Diaz report, “On March 2, 2021, Virginia Governor Ralph Northam signed the Consumer Data Protection Act (CDPA or law) into law. This makes Virginia the second state, behind California, to adopt a comprehensive consumer data privacy law. Like the California Privacy Rights Act and EU General Data Protection Regulation, the CDPA creates a number of privacy obligations for businesses and gives Virginia consumers more control over their personal data. … With the effective date of the CDPA two years away, businesses should start evaluating their current data processing activities and begin developing a compliance program for the CDPA, CPRA, and other consumer privacy laws likely to be enacted this year.”[4] Companies need to take these laws seriously since failure to comply can be costly.


According to reporter Andrea Vittorio (@alvittorio), new laws are also likely to be enacted prohibiting deceptive “opt in” requests referred to as “dark patterns.” She explains, “Dark patterns online have gotten attention mostly in academia until recently, when research started informing tech policy, according to Colin Gray, a professor at Purdue University who studies the topic. He said the phrase first emerged about a decade ago to describe digital designs that use inferences from behavioral psychology to convince consumers to act in a certain way. One example in the privacy context is choices on whether to allow cookies that track users online. Options for allowing more data collection are often presented as an easier or more visually obvious choice, while more restricted settings tend to take more clicks to reach.”[5]
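
As a concrete illustration of the asymmetry Gray describes, the TypeScript sketch below (all names and click counts are hypothetical) models a cookie banner as data and flags layouts in which declining data collection takes more effort than consenting.

```typescript
// Hypothetical sketch: modeling the "click cost" asymmetry of a cookie banner.
// The labels and numbers are illustrative, not taken from any real consent platform.

interface ConsentOption {
  label: string;
  clicksRequired: number; // interactions needed to reach this outcome
}

// Dark-pattern layout: accepting is one click, rejecting is buried in a settings flow.
const darkPatternBanner: ConsentOption[] = [
  { label: "Accept all cookies", clicksRequired: 1 },
  { label: "Reject non-essential cookies", clicksRequired: 4 }, // open settings, toggle, confirm, close
];

// Symmetric layout: both outcomes are equally easy to choose.
const symmetricBanner: ConsentOption[] = [
  { label: "Accept all cookies", clicksRequired: 1 },
  { label: "Reject non-essential cookies", clicksRequired: 1 },
];

// A banner is asymmetric if declining costs more effort than consenting.
function isAsymmetric(banner: ConsentOption[]): boolean {
  const accept = banner.find((o) => o.label.startsWith("Accept"));
  const reject = banner.find((o) => o.label.startsWith("Reject"));
  if (!accept || !reject) return false;
  return reject.clicksRequired > accept.clicksRequired;
}

console.log(isAsymmetric(darkPatternBanner)); // true
console.log(isAsymmetric(symmetricBanner));   // false
```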


Axel Voss, a German Member of the European Parliament and one of the fathers of the General Data Protection Regulation, suggests privacy laws must be constantly updated to keep pace with new technologies and how companies use them. He told the Financial Times, “We have to be aware that GDPR is not made for blockchain, facial or voice recognition, text and data mining … [or] artificial intelligence. The digital world is about innovation. We cannot stick with principles established in the 80s that do not reflect the new situation we are living in.”[6] The point is that companies must stay alert to changes in these laws that could adversely affect their bottom lines.

Concluding Thoughts

Several years ago, big data expert Kaiser Fung (@junkcharts) suggested seven principles companies should adopt for responsible data collection.[7] These principles may not make companies fully compliant with various privacy laws; however, they are a good place to start.


Principle 1. Opt Ins not Opt Outs. According to Fung, “The default should be opt-in: no data collection unless instructed by users. When the default setting is opt-in, businesses have to win over the users’ trust, and so they will have a much stronger incentive to clarify and explain the benefits of the data collection.”
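
What an opt-in default might look like in code: the minimal TypeScript sketch below uses a hypothetical analytics wrapper (not any particular vendor’s API) in which no events are recorded until the user explicitly opts in.

```typescript
// Minimal sketch of an opt-in default, using a hypothetical analytics wrapper.

type ConsentState = "opted_in" | "opted_out" | "unset";

class AnalyticsClient {
  private consent: ConsentState = "unset"; // default: treat as no permission

  recordOptIn(): void { this.consent = "opted_in"; }
  recordOptOut(): void { this.consent = "opted_out"; }

  track(event: string, payload: Record<string, unknown>): void {
    // "unset" is treated the same as "opted_out": nothing is collected by default,
    // which is what distinguishes an opt-in default from an opt-out one.
    if (this.consent !== "opted_in") return;
    console.log(`tracking ${event}`, payload); // placeholder for a real network call
  }
}

const analytics = new AnalyticsClient();
analytics.track("page_view", { path: "/pricing" }); // dropped: user never opted in
analytics.recordOptIn();
analytics.track("page_view", { path: "/pricing" }); // now recorded
```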


Principle 2. First-person not second- or third-person permission. Fung notes that companies like Facebook can learn a lot about you by inference, such as when your “friends” upload contact lists containing your personal data to Facebook. He writes, “Permission by proxy is dishonest, and should be banned. … Just as advertisers do not want to be associated with certain websites or videos, users also do not want their information disclosed to third parties with whom they don’t want to be associated.”

Principle 3. Stop misdirection. Fung writes, “I’d like to see strong regulation with heavy penalties for businesses that request permission from users for specific uses of their data but then fail to police their data analysts to curb abuses. … To prevent misdirection of data, companies should have a data governance function.”
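
One way to picture such a data governance function is a purpose check that sits between analysts and user data, releasing data only for uses the user actually permitted. The TypeScript sketch below is illustrative only; the purpose names and consent structure are assumptions.

```typescript
// Hypothetical sketch of a purpose check a data governance function might enforce.

type Purpose = "fraud_prevention" | "product_analytics" | "ad_targeting";

interface ConsentRecord {
  userId: string;
  permittedPurposes: Set<Purpose>;
}

function fetchUserData(record: ConsentRecord, requestedPurpose: Purpose): string {
  if (!record.permittedPurposes.has(requestedPurpose)) {
    // In a real program the denial would also be logged for audit.
    throw new Error(`Use of ${record.userId}'s data for "${requestedPurpose}" was never permitted`);
  }
  return `data for ${record.userId}`; // placeholder for the real lookup
}

const consent: ConsentRecord = {
  userId: "u-123",
  permittedPurposes: new Set<Purpose>(["fraud_prevention"]),
};

console.log(fetchUserData(consent, "fraud_prevention")); // allowed
// fetchUserData(consent, "ad_targeting"); // throws: the data would be misdirected
```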


Principle 4. Sunshine Policy. Fung believes consumers should be able to see who has access to their data and how it is being used. He explains, “If companies believe that the trading of private data is fundamental to their business models, then they should allow users to inspect how they collected the data, and which entities received the data. Better yet, users should be given the ability to opt out of specific transactions.”
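
A simple way to support that kind of inspection is a disclosure ledger that records every transfer of a user’s data and lets the user revoke specific sharing relationships. The TypeScript sketch below is a hypothetical illustration, not a description of any existing system.

```typescript
// Illustrative sketch of a "sunshine" disclosure ledger (hypothetical names).

interface Disclosure {
  recipient: string;        // entity that received the data
  dataCategories: string[]; // e.g., ["email", "browsing history"]
  sharedAt: Date;
  optedOut: boolean;        // user revoked this specific sharing relationship
}

const ledger = new Map<string, Disclosure[]>(); // userId -> disclosures

function recordDisclosure(userId: string, disclosure: Disclosure): void {
  const entries = ledger.get(userId) ?? [];
  entries.push(disclosure);
  ledger.set(userId, entries);
}

// What a user would see in a privacy dashboard.
function inspectDisclosures(userId: string): Disclosure[] {
  return ledger.get(userId) ?? [];
}

// Opting out of one recipient without touching the rest of the ledger.
function optOutOfRecipient(userId: string, recipient: string): void {
  for (const entry of ledger.get(userId) ?? []) {
    if (entry.recipient === recipient) entry.optedOut = true;
  }
}
```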


Principle 5. Wall off the data. Fung explains, “If companies are willing to wall off user data, and not send them to third parties, then users are more likely to share the data.” Although consumers would like companies to adopt this principle, many businesses depend on monetizing data, so walling it off is unlikely to become a widely implemented practice.

Principle 6. The right to be forgotten. According to Fung, “Companies should be required to delete user data older than say five years. Aggregate statistics older than five years should be allowed. More recent data supersede the older data, so there is negligible value in keeping the old data anyway. … The right to be forgotten reduces the number of copies of your immutable data in existence and thus reduces the chance that they get stolen.”
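
As a rough illustration of the retention rule Fung describes, the TypeScript sketch below drops raw records older than five years while leaving separately stored aggregate statistics untouched; the types and figures are invented for the example.

```typescript
// Rough sketch of a five-year retention rule, with hypothetical types and values.

interface UserRecord {
  userId: string;
  createdAt: Date;
  payload: unknown;
}

const FIVE_YEARS_MS = 5 * 365 * 24 * 60 * 60 * 1000;

// Keep only records newer than five years; older raw data is deleted.
function applyRetention(records: UserRecord[], now: Date = new Date()): UserRecord[] {
  return records.filter((r) => now.getTime() - r.createdAt.getTime() <= FIVE_YEARS_MS);
}

// Aggregate statistics computed before deletion live in a separate store and are
// unaffected by applyRetention, so they can be kept indefinitely.
const aggregateStats = { signupsIn2015: 48210 }; // example of what survives

const kept = applyRetention([
  { userId: "u-1", createdAt: new Date("2014-06-01"), payload: {} }, // deleted
  { userId: "u-2", createdAt: new Date("2023-01-15"), payload: {} }, // kept
]);
console.log(kept.length, aggregateStats);
```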


Principle 7. Stop the blackmail. Fung perceives refusal of service unless data collection is permitted as a form of blackmail. He writes, “Website operators don’t really want to ban any user so as to inflate their user counts (‘eyeballs’). This practice creates the perception of dishonesty, and is self-defeating, if the companies actually believe that the data collection benefits their users. If the business model is such that users get free service in exchange for their private data, then they should enforce strict access policies, only serving those who acknowledge the data collection.”


The privacy battle has become a never-ending war. Tension between consumers wanting more control over their data and businesses desiring more access to data will remain a prominent characteristic of the Digital Age. Even if legislation defines the scope and use of personal data collection, this tension will remain.


Footnotes
[1] Scott Kirsner, “Web inventor Tim Berners-Lee wants ‘personal empowerment’ for users, through his data startup,” Boston Globe, 29 December 2020.
[2] Samantha Ann Schwartz, “Why 67% of companies fear they can’t sustain privacy compliance,” CIO Dive, 12 February 2020.
[3] Cassandra Gaedt-Sheckter, Alexander H. Southwell, and Ryan Bergsieker, “Businesses should brace for new U.S. privacy regulations, enforcement,” PropertyCasualty360, 25 November 2020.
[4] Kurt R. Hunt and Matthew A. Diaz, “Virginia Becomes 2nd State to Adopt a Comprehensive Consumer Data Privacy Law,” The National Law Review, 8 March 2021.
[5] Andrea Vittorio, “‘Dark Patterns’ in Consumer Data Privacy Garner Policy Attention,” Bloomberg Law, 23 March 2021.
[6] Javier Espinoza, “EU must overhaul flagship data protection laws, says a ‘father’ of policy,” Financial Times, 2 March 2021.
[7] Kaiser Fung, “7 Principles of Responsible Data Collection,” Big Data, Plainly Spoken, 7 March 2018.
