Top 10 Privacy Concerns for 2023
The most recent Data Privacy Week featured expert insights and highlighted the privacy issues that regulators and privacy teams are likely to be most concerned about in 2023.
Among the most pressing issues in 2023, the following will make waves:
1 – Sensitive Data
Going stricter, broader
The use of biometrics (such as fingerprints and facial recognition), health data, and geolocation for identification and security purposes is becoming more prevalent across industries, including finance, healthcare, and law enforcement.
These technologies have the potential to improve security and convenience, but also raise significant privacy concerns. In response to these concerns, governments around the world are introducing stricter regulations and laws to govern the use of biometrics, health data, and geolocation information.
We are seeing stricter measures, more precise regulations, and a growing number of bills. In 2023, then, we can expect more efforts to control the collection, storage, and use of this data, and to protect individuals’ privacy rights.
2 – Children’s Data
Get up to speed with COPPA
The Children’s Online Privacy Protection Act (COPPA) is a federal law in the United States that regulates the collection of personal information from children under the age of 13, and it’s picking up speed!
The enforcement of COPPA has been increasing in recent years, with large fines being imposed on companies that violate the law, such as the $170 million fine issued to Google by the FTC.
In addition to COPPA, California has also introduced the Age Appropriate Design Code, which is set to go into effect in 2024. This law requires companies to implement specific data protection measures when they know that their services are likely to be accessed by children under the age of 18.
Such measures include providing privacy policies that are easy for children to understand, obtaining verifiable parental consent before collecting personal information, and providing a way for parents to review and delete their child’s personal information. As a result, we can expect to see more bills introduced in 2023 aimed at protecting children’s privacy online.
These bills may include triggers such as “likely to be accessed by a child under 18” to ensure that companies implement appropriate data protection measures when children use their services. The laws will also aim to ensure that children’s personal data is not misused by the companies that collect it.
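To make the age-based triggers above concrete, here is a minimal TypeScript sketch of how a service might gate data collection on age signals. The interface, function names, and classification logic are hypothetical illustrations, not requirements drawn from COPPA or the AADC.

```typescript
// Hypothetical sketch: gating data collection on age signals.
// AgeSignal, requiresParentalConsent, etc. are invented names.

interface AgeSignal {
  selfDeclaredAge?: number;           // e.g. from an age-gate form
  serviceDirectedAtChildren: boolean; // product/content classification
}

// COPPA applies to children under 13; the California AADC uses under 18.
const COPPA_AGE = 13;
const AADC_AGE = 18;

function requiresParentalConsent(signal: AgeSignal): boolean {
  // Under COPPA, verifiable parental consent is needed before
  // collecting personal information from a child under 13.
  return (
    signal.serviceDirectedAtChildren ||
    (signal.selfDeclaredAge !== undefined && signal.selfDeclaredAge < COPPA_AGE)
  );
}

function requiresAadcProtections(signal: AgeSignal): boolean {
  // The AADC trigger is broader: services "likely to be accessed"
  // by anyone under 18 must apply heightened protections by default.
  return (
    signal.serviceDirectedAtChildren ||
    (signal.selfDeclaredAge !== undefined && signal.selfDeclaredAge < AADC_AGE)
  );
}
```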
3 – Employee Data
Employee data rights are gaining momentum in state legislation
In California, employees are granted specific rights as consumers, and other states are following suit by implementing laws that safeguard employee data in various areas such as the use of AI in the workplace and access to personal accounts.
However, it is crucial to be aware of potential unintended uses of employee data and to take the necessary measures to protect sensitive information and prevent unlawful use, such as the sale of employee data.
Companies should ensure they are in compliance with these laws and implement robust security measures to secure employee data.
Actively inform employees
In addition, employees should be informed about their rights and how their data is being collected, used, and stored by the company.
This includes transparency about what data is being collected and why, as well as the right for employees to access, correct, and delete their personal data.
4 – Data Minimization
Minimize data, maximize privacy
Data minimization is a growing concern among regulators in the US and Europe.
The FTC has taken action against companies such as CafePress and Drizly for not following data minimization practices, which could have significantly reduced the impact of the breaches they suffered (1) (2).
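One way to operationalize data minimization is an explicit allow-list of fields per processing purpose, so anything not needed for a stated purpose is dropped before storage. The following TypeScript sketch is illustrative only; the purposes and field names are invented, and a real system would enforce this at the schema and pipeline level.

```typescript
// Hypothetical sketch: drop any field not on the allow-list for a purpose.
type Purpose = "order_fulfillment" | "analytics";

const ALLOWED_FIELDS: Record<Purpose, string[]> = {
  order_fulfillment: ["name", "shipping_address", "email"],
  analytics: ["country"], // aggregate-level only; no direct identifiers
};

function minimize(record: Record<string, unknown>, purpose: Purpose) {
  return Object.fromEntries(
    Object.entries(record).filter(([field]) =>
      ALLOWED_FIELDS[purpose].includes(field)
    )
  );
}

// Example: a breach of the analytics store now exposes far less.
minimize(
  { name: "Ada", email: "ada@example.com", country: "US", ssn: "xxx-xx-xxxx" },
  "analytics"
); // => { country: "US" }
```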
Collect now, notify and choose later?
Think again.
Thirty-four state Attorneys General have agreed that the approach of gathering information now and providing notification and options later is no longer acceptable. This was the case with JUUL Labs: the maker of the popular e-cigarette brand has been accused of violating several data privacy laws, specifically through “the collection of personal information from its customers, including their names, addresses, and birth dates, without obtaining proper consent.”
Additionally, JUUL has been accused of using this data to target marketing campaigns to young people, despite the fact that the company’s products are only intended for use by adults over the age of 21.
5 – Disclosure and Dark Patterns
Is your online privacy compliance at risk due to dark patterns?
Regulators on both sides of the Atlantic are focusing on improving transparency in online disclosures and protecting consumers from “dark patterns.”
Wait a minute, what exactly are “dark patterns”?
Dark patterns are user interfaces that are designed to trick or mislead users into making certain choices or sharing personal information without their full understanding or consent.
Some examples of dark patterns include using “may” instead of “will” to make a request sound less definitive, using bulleted lists to obscure important information, and creating misleading user experiences that make it difficult for users to understand the implications of their choices.
Learn how to spot and avoid these pitfalls in your online experience
To avoid dark patterns in privacy, it is important to be aware of common practices used to manipulate individuals into sharing their personal information or agreeing to terms they may not fully understand.
Some examples of these methods include:
- Making it difficult to find or understand the opt-out option
- Pre-selecting options that favor the company’s interests over the user’s
- Using language that is difficult to understand or misleading
- Disguising the true purpose of a request for personal information
Regulators are working to ensure that these practices are not used to mislead consumers, and that consumers get the information they need to make informed decisions about their personal information.
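As a concrete illustration of the pre-selection pattern listed above, here is a small, hypothetical TypeScript sketch contrasting a dark-pattern default with a neutral one; the form field is invented for illustration.

```typescript
// Hypothetical sketch: the same consent checkbox, two different defaults.

interface ConsentCheckbox {
  label: string;
  checked: boolean; // the default state the user first sees
}

// Dark pattern: sharing is pre-selected, so inaction becomes "consent".
const darkPattern: ConsentCheckbox = {
  label: "Share my data with marketing partners",
  checked: true,
};

// Neutral default: the user must take an affirmative action to opt in.
const neutralDefault: ConsentCheckbox = {
  label: "Share my data with marketing partners",
  checked: false,
};
```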
6 – Consent
GDPR-style consent is here to stay
Current state privacy laws, as well as new proposals, are adopting GDPR-style consent, which requires specific, freely given, informed, and unambiguous consent.
This means that companies are no longer able to condition the use of their services on the collection and use of data that is not necessary for the provision of the service.
Instead, companies now have to obtain an explicit opt-in from individuals for the use of their personal data.
This represents a significant shift in the way companies should approach data privacy and will require a change in their business practices.
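Here is a minimal sketch of what recording an explicit opt-in could look like, assuming a hypothetical ConsentRecord shape; a production implementation would also need durable storage, consent withdrawal, and audit trails.

```typescript
// Hypothetical sketch: capturing specific, informed, unambiguous opt-in.

interface ConsentRecord {
  userId: string;
  purpose: string;           // consent must be specific to a purpose
  grantedAt: Date;
  method: "explicit_opt_in"; // never inferred from silence or defaults
}

const consentLog: ConsentRecord[] = [];

function recordOptIn(userId: string, purpose: string): ConsentRecord {
  const record: ConsentRecord = {
    userId,
    purpose,
    grantedAt: new Date(),
    method: "explicit_opt_in",
  };
  consentLog.push(record);
  return record;
}

function hasConsent(userId: string, purpose: string): boolean {
  // Consent for one purpose does not carry over to another.
  return consentLog.some((r) => r.userId === userId && r.purpose === purpose);
}
```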
7 – Going beyond ‘Disclosure’
Aligning with ‘consumer expectations’
Consumer expectation and purpose specification is an essential aspect of privacy regulations, and it is closely tied to the concept of disclosure. However, it goes beyond simply informing consumers about how their personal information will be used.
It also involves ensuring that secondary uses of personal information are compatible with consumer expectations. This means that companies must align their use of personal information with what consumers would reasonably expect.
Don’t misuse a secondary-use
Ireland’s Data Protection Commission has taken action in the Meta cases, where it found that personal information was being used for purposes that were not aligned with consumer expectations.
Similarly, the California Privacy Rights Act (CPRA) regulations include a detailed set of rules governing the compatibility of secondary uses of personal information.
It’s important to note that consumers have the right to know how their personal information is being used and to expect that it will be used in a way that is compatible with their expectations.
Companies should be transparent about their data collection and usage policies, and should not use personal information for any purpose that would be unexpected or unwanted.
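One way to enforce this programmatically is to tag each dataset with the purposes disclosed at collection and check every proposed use against that list. The TypeScript sketch below is a simplified illustration with invented purpose names, not a compliance mechanism prescribed by any regulation.

```typescript
// Hypothetical sketch: block secondary uses that were never disclosed.

interface DatasetMeta {
  name: string;
  disclosedPurposes: string[]; // purposes stated at collection time
}

function assertCompatibleUse(dataset: DatasetMeta, proposedUse: string): void {
  if (!dataset.disclosedPurposes.includes(proposedUse)) {
    throw new Error(
      `"${proposedUse}" was not disclosed when "${dataset.name}" was collected; ` +
      `a compatibility review (or fresh consent) is needed first.`
    );
  }
}

const signupData: DatasetMeta = {
  name: "signup_records",
  disclosedPurposes: ["account_creation", "service_emails"],
};

assertCompatibleUse(signupData, "service_emails"); // ok
// assertCompatibleUse(signupData, "ad_targeting"); // throws
```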
8 – AI
AI enforcement is on the rise
While still at the proposal stage, the EU’s AI Act is expected to set the standard for AI regulation in the EU and could have a significant impact on how AI is developed and used across the continent.
France, the Netherlands, and Norway
Task forces are being formed in France and the Netherlands:
- In France, the task force is called the “National AI Council” and it will be responsible for creating a legal framework for AI development and use.
- In the Netherlands, the “AI Coalition” will be focused on ensuring that AI is developed and used in a responsible and ethical manner.
- Both task forces are expected to work closely with industry, academia, and government to develop regulations and guidelines for the safe and responsible use of AI.
Norway has published a proposed AI transparency law. The proposed act takes a risk-based approach to the use of AI, applying different levels of requirements based on the risk associated with the specific use of the AI system in question.
New York and New Jersey
Both states introduced bills to regulate AI in the hiring process. These bills aim to address concerns about the potential for bias and discrimination in the use of AI-powered hiring tools, as well as to ensure transparency and accountability in the use of these technologies — protecting job applicants from discrimination and bias.
9 – Risk Assessments, DPIAs
Get started now, iterate later
It is important to begin conducting a Data Protection Impact Assessment (DPIA) as soon as possible, even if regulations for it are not yet in place.
While CPRA currently does not have regulations for DPIAs, Colorado has draft regulations.
It’s better to start working on a DPIA now and make adjustments as regulations evolve.
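Starting now can be as simple as keeping a structured record per processing activity and revisiting it as regulations firm up. The fields in this hypothetical TypeScript sketch are loosely modeled on common DPIA templates, not on the CPRA or Colorado drafts.

```typescript
// Hypothetical sketch: a minimal DPIA record to start with and iterate on.

interface DpiaRecord {
  activity: string;         // what processing is being assessed
  dataCategories: string[]; // e.g. contact details, biometrics
  purposes: string[];
  risks: { description: string; likelihood: "low" | "medium" | "high" }[];
  mitigations: string[];
  reviewedAt: Date;         // revisit as draft regulations evolve
}

const example: DpiaRecord = {
  activity: "Employee badge access logging",
  dataCategories: ["name", "badge_id", "entry_timestamps"],
  purposes: ["physical_security"],
  risks: [
    { description: "Location tracking beyond security needs", likelihood: "medium" },
  ],
  mitigations: ["30-day retention", "access limited to security team"],
  reviewedAt: new Date(),
};
```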
10 – Cross-Border Data Transfers
Still not off the hook
Regardless of where the Schrems litigation lands, it is important to review the EU-U.S. Data Privacy Framework Principles to ensure compliance and understand the necessary steps for certification or recertification.
Here at BigID, we’ll closely monitor these ongoing challenges as we continue to help organizations adapt to the constantly shifting data privacy landscape.
In the meantime, see BigID in action: schedule a demo to speak with our privacy experts, who will help you fulfill data privacy requirements, secure highly sensitive information, and continue to build consumer trust.
Contributing Authors:
Shiko Genossar, Sr. Product Manager
Heather Federman, Chief Privacy Officer
Tomer Elias, Director of Product Management