
Data Privacy Day 2024

January 25, 2024


Data privacy is a big deal. It’s also a big challenge. We sit on the cusp of a new business era driven by artificial intelligence (AI); however, without data, AI doesn’t work. That’s why there is a growing tension between data availability and data protection. Data will not disappear as a result of the enactment of privacy laws and regulations; however, those laws and regulations do make the collection, storage, and analysis of data more complicated. Because both monetary and reputational losses can result from the improper handling and protection of personal data, organizations need to make data privacy a top priority.

Data Privacy Day (initially Data Protection Day) was initiated in 2006 by the Council of Europe. Two years later, Data Privacy Day was first observed in the United States. In 2009, the United States House of Representatives officially recognized National Data Privacy Day. Five years later, the United States Senate also recognized 28 January as National Data Privacy Day. Data Privacy Day is now celebrated internationally on 28 January.

 

Most promotional materials for Data Privacy Day are aimed at users rather than businesses. For example, the National Cybersecurity Alliance notes, “Your online activity creates a treasure trove of data. This data ranges from your interests and purchases to your online behaviors, and it is collected by websites, apps, devices, services, and companies all around the globe. This data can even include information about your physical self, like health data — think about how an app on your phone might count how many steps you take. You cannot control how each little piece of data about you and your family is collected. However, you still have a right to data privacy. You can help manage your data with a few repeatable behaviors. Your data is valuable and you deserve to have a say!”[1]

 

I think it’s safe to say that, over the past few years, consumers have become more aware and more concerned about how their data is collected, stored, and used. It was consumer advocacy that stirred governments into acting to protect personal data. Google, one of the biggest collectors of data, is keenly aware of this change in consumer temperament. Maria Helena Marinho, formerly Google’s Senior Research & Insights Manager, and Elizabeth Tran, Google’s Product Marketing Manager, observe, “Improving online privacy is one of the most important steps marketers can take to boost their brand preference and ensure that nearly half their customers don’t switch to another brand.”[2]

 

What Businesses Can and Must Do

 

Marinho and Tran note, “For customers, feeling in control is about more than just being in control. Privacy tools that allow people to change their cookie preferences and unsubscribe from email marketing can help keep customers in control of their data. But those tools are not enough to provide customers with the more substantial feeling of control that they need to trust a brand. Customers also want to know when and why they are sharing their information — and to understand the benefits they will receive from doing so.” They suggest customers want privacy interactions to be:

 

• Meaningful. Show people what they get in return for sharing their data.
• Memorable. Remind people what data they shared and when.
• Manageable. Provide tools for people to manage their privacy.
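
Those three principles translate naturally into engineering terms. Below is a minimal sketch — assuming a hypothetical `ConsentLedger` class and made-up field names, not any particular vendor’s API — of how a consent record might capture what a customer shared, when, and in exchange for what stated benefit, while keeping revocation as easy as opt-in.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    """One customer decision: what was shared, when, and why."""
    category: str          # e.g. "email_marketing", "analytics_cookies"
    purpose: str           # the benefit explained to the customer
    granted: bool
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


class ConsentLedger:
    """Keeps a per-customer history so choices stay memorable and manageable."""

    def __init__(self) -> None:
        self._records: dict[str, list[ConsentRecord]] = {}

    def record(self, customer_id: str, category: str, purpose: str, granted: bool) -> None:
        """Append a decision rather than overwrite it, so the history is preserved."""
        self._records.setdefault(customer_id, []).append(
            ConsentRecord(category, purpose, granted)
        )

    def current_choices(self, customer_id: str) -> dict[str, ConsentRecord]:
        """Latest decision per category -- what the customer would see on review."""
        latest: dict[str, ConsentRecord] = {}
        for rec in self._records.get(customer_id, []):
            latest[rec.category] = rec
        return latest

    def revoke(self, customer_id: str, category: str) -> None:
        """'Manageable': withdrawing consent is as easy as granting it."""
        self.record(customer_id, category, purpose="customer revoked", granted=False)


# Usage: grant, review, then revoke a single category.
ledger = ConsentLedger()
ledger.record("cust-42", "email_marketing", "10% discount on next order", granted=True)
ledger.revoke("cust-42", "email_marketing")
print(ledger.current_choices("cust-42")["email_marketing"].granted)  # False
```

Keeping the full history, rather than flipping a single flag, is what makes the “memorable” part possible: the customer (or an auditor) can always see what was agreed to, when, and in exchange for what.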

 

Journalist Danielle James reports that a survey conducted by Cisco confirms consumer concerns about data privacy. She writes, “Consumers want more transparency from companies about how personal data is used, according to Cisco’s consumer privacy survey of 2,600 adults across 12 countries, including the United States. The study found that privacy laws are mostly viewed favorably by consumers. That may be because respondents indicated they are very concerned about how personal information is used in artificial intelligence settings.”[3]

 

Futurist Mark van Rijmenam states, “As technology continues to advance at an unprecedented rate, the use of artificial intelligence has become increasingly prevalent in many areas of our lives. From generative AI that can create any content using a simple prompt to smart home devices that learn our habits and preferences, AI has the potential to revolutionize the way we interact with technology. However, as the amount of data we generate and share online grows exponentially, privacy concerns have become more pressing than ever before. Therefore, as a futurist, I think it is important to explore the topic of privacy in the age of AI. … Big Tech companies must be transparent about their data practices and ensure that the data they collect is used ethically and responsibly. They must also work to ensure that their platforms are inclusive and accessible to all rather than being controlled by a small group of powerful players.”[4] He adds, “Organizations and companies that use AI must prioritize privacy and ethical considerations in their AI systems’ design and implementation. This includes being transparent about data collection and usage, ensuring data security, regularly auditing for bias and discrimination, and designing AI systems that adhere to ethical principles.”

 

Van Rijmenam lists some of the laws and regulations that have been enacted in recent years: in the United States, the California Consumer Privacy Act (CCPA), the Consumer Online Privacy Rights Act (COPRA), and the SAFE DATA Act; in Europe, the General Data Protection Regulation (GDPR); in China, the Cybersecurity Law and a new personal information protection law; and in Australia, the Privacy Act 1988, which critics argue is outdated and needs to be updated to address emerging privacy concerns posed by AI. He adds, “Many other countries are taking different approaches to protecting their citizens’ privacy in the age of AI, and the development of privacy laws is an ongoing process with changes and updates likely to happen in the future. While the responsibility of protecting privacy falls on many parties, including governments, companies, and individuals, it is essential for consumers to take an active role in protecting their personal information. By staying informed, utilizing privacy tools and settings, and being mindful of their online activities, consumers can help safeguard their privacy in the age of AI.”

 

Concluding Thoughts

 

Half-a-dozen years ago, Steve Wilson, Vice President and Principal Analyst at Constellation Research, Inc., wrote, “Big data and AI are infamously providing corporations and governments with the means to know us ‘better than we know ourselves.’ Businesses no longer need to survey their customers to work out their product preferences, lifestyles, or even their state of health; instead, data analytics and machine learning algorithms, fueled by vast amounts of the ‘digital exhaust’ we leave behind wherever we go online, are uncovering ever deeper insights about us. Businesses get to know us now automatically, without ever asking explicit questions.”[5] Most people recognize there is a tradeoff between the personalization consumers want and the data they are willing to share. Back in 2018, Wilson wrote, “I fully acknowledge the fit among big data, artificial intelligence and data privacy standards is complex and one in which a dynamic balance must be maintained in order for new digital business models at the intersection of these emerging norms to ward off regulatory and consumer backlash. However, as communities increase their scrutiny over profits extracted from Personal Data, I am calling on digital businesses to implement Big Privacy. That is: exercise restraint with analytics and machine learning, to be transparent about their business models, to offer consumers a fair and transparent trade for data about them, and to innovate in privacy as well as data mining. In addition, businesses must not wait for data protection laws to strengthen, but should be proactive and guided by fast-moving community standards.”

 

In the Digital Age, independent consultant Alan Morrison insists people should get over the notion that they have online privacy. He explains, “[The emergence of] large language models (LLMs) and the enormous datasets being collected, curated and expanded for training those models are opening another view of personally identifiable activity. With enough data from enough sources, it can be easy enough to triangulate and determine who is who.”[6] Despite this pessimistic view about privacy, Morrison concludes, “Users, whether individuals or organizations, need to assert data ownership and control, particularly when it comes to their most sensitive personally identifiable data. They need to be proactive about their data. We’re well past the 2000s. Passivity is no longer an option. This is a pivotal time we’re living in, and a great time to become more self-reliant.” Data Privacy Day is a good time to start practicing such vigilance.
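
Morrison’s point about triangulation is easy to demonstrate with a toy example. The sketch below uses entirely fabricated records and made-up field names: a “de-identified” activity log carries only a pseudonym plus a few quasi-identifiers (ZIP code, birth year, gender), yet joining it against a second, name-bearing dataset on those same attributes is enough to put names back on the pseudonyms.

```python
# Toy illustration of re-identification by linking quasi-identifiers.
# All records are fabricated; real datasets are larger, but the join is the same idea.

# "De-identified" activity data: no names, just a pseudonym plus quasi-identifiers.
activity = [
    {"pseudonym": "u-771", "zip": "02139", "birth_year": 1984, "gender": "F",
     "queries": ["late-night pharmacies", "cardiology clinics"]},
    {"pseudonym": "u-302", "zip": "94103", "birth_year": 1991, "gender": "M",
     "queries": ["bike routes", "coffee roasters"]},
]

# A separate, nominally public dataset that does carry names.
directory = [
    {"name": "A. Rivera", "zip": "02139", "birth_year": 1984, "gender": "F"},
    {"name": "B. Chen",   "zip": "94103", "birth_year": 1991, "gender": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "gender")


def link(activity_rows, directory_rows):
    """Join the two datasets on their shared quasi-identifiers."""
    index = {
        tuple(row[k] for k in QUASI_IDENTIFIERS): row["name"]
        for row in directory_rows
    }
    matches = {}
    for row in activity_rows:
        key = tuple(row[k] for k in QUASI_IDENTIFIERS)
        if key in index:
            matches[row["pseudonym"]] = index[key]
    return matches


print(link(activity, directory))
# {'u-771': 'A. Rivera', 'u-302': 'B. Chen'} -- the pseudonyms are no longer anonymous.
```

Real-world re-identification works the same way, only with more sources and messier matching; the usual defense is to limit, generalize, or perturb quasi-identifiers before data is ever shared.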

 

Footnotes
[1] Staff, “Data Privacy Week,” National Cybersecurity Alliance, 16 November 2023.
[2] Maria Helena Marinho and Elizabeth Tran, “Customers want control over their data — and won’t hesitate to switch brands to get it,” Think with Google, February 2023.
[3] Danielle James, “Retailers are wading deeper into customer data. States are raising the alarm.” Retail Dive, 2 November 2022.
[4] Mark van Rijmenam, “Privacy in the Age of AI: Risks, Challenges and Solutions,” The Digital Speaker, 17 February 2023.
[5] Steve Wilson, “Big Privacy: The data privacy compact for the era of big data and AI,” ZDNet, 5 December 2018.
[6] Alan Morrison, “There is no privacy. Get over it.” Data Science Central, 30 March 2023.
