Privacy Regulation Roundup

Author(s): Safayat Moahamad, Carlos Rivera, Ahmad Jowhar, Fritz Jean-Louis, Andrew Sharp, Mike Brown

This Privacy Regulation Roundup summarizes the latest major global privacy regulatory developments, announcements, and changes. This report is updated monthly. For each relevant regulatory activity, you can find actionable Info-Tech analyst insights and links to useful Info-Tech research that can assist you with becoming compliant.

Beyond Anonymization: A New Privacy Frontier for AI

Type: Article

Published: March 2025

Affected Region: EU

Summary: The evolution of artificial intelligence (AI) is forcing organizations to strike a delicate balance between innovation and compliance with stringent data privacy laws, such as the EU General Data Protection Regulation (GDPR) and the EU AI Act. These regulations demand safeguards for data used in AI model training, spotlighting techniques like deidentification, pseudonymization, and anonymization.

Deidentification removes or alters personal identifiers but leaves data vulnerable to reidentification, keeping it within the scope of data protection laws. Pseudonymization, a form of deidentification, substitutes identifiers with artificial ones reversible via a key, offering security benefits, but it remains subject to regulation. By contrast, anonymization irreversibly strips identifiers, theoretically exempting data from such laws. Achieving this standard is increasingly difficult as technology advances, enabling reidentification through data linkage or external information.
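
To make the distinction concrete, here is a minimal Python sketch of pseudonymization using a keyed token plus a protected lookup table. The key handling and record fields are hypothetical; a real deployment would keep the key and mapping in a secrets vault, separate from the dataset.

```python
import hashlib
import hmac
import secrets

# Hypothetical sketch: the key and the token-to-identifier mapping are
# exactly what keep pseudonymized data reversible, and therefore still
# "personal data" in the eyes of the GDPR.
KEY = secrets.token_bytes(32)              # stored separately from the dataset
REIDENTIFICATION_MAP: dict[str, str] = {}

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed token."""
    token = hmac.new(KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]
    REIDENTIFICATION_MAP[token] = identifier  # retained for authorized reversal
    return token

record = {"email": "jane@example.com", "postcode": "75001", "diagnosis": "T1D"}
record["email"] = pseudonymize(record["email"])
# The remaining quasi-identifiers (postcode + diagnosis) could still be
# linked to external data; that linkage risk is why pseudonymized data
# stays in scope of the law, while true anonymization would have to
# eliminate it irreversibly.
```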

Enter "subjective anonymization," which is a shift from the black and white view of data as either anonymous or identifiable. This approach posits that anonymization’s effectiveness hinges on context, considering:

  • Who handles the data.
  • What additional information exists.
  • The likelihood of reidentification.

Supported by recent legal opinions, it emphasizes a risk-based framework over absolute technical standards. For AI development, this flexibility could ease data use while meeting mandates for robustness, transparency, and accountability. However, it also introduces uncertainty: regulators may push back against contextual definitions, potentially clashing with innovation. Emerging solutions promise privacy gains but carry their own risks; synthetic data can retain identifiable patterns, and Edge AI still faces unresolved training data challenges.

Analyst Perspective: Balancing AI innovation with data privacy demands a forward-thinking, adaptable approach, something I’ve seen evolve across countless global engagements with organizations. Subjective anonymization offers a lifeline: it aligns with the GDPR’s risk-based ethos and the EU AI Act’s push for accountable AI systems. However, its success hinges on organizations embedding rigorous risk assessments, continuous monitoring, and cutting-edge techniques like differential privacy into their workflows.
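
To make differential privacy concrete, here is a minimal Python sketch of the classic Laplace mechanism. The statistic and epsilon value are hypothetical, and a production system would use a vetted library rather than hand-rolled noise.

```python
import numpy as np

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Laplace mechanism: noise scaled to sensitivity/epsilon bounds how
    much any one individual's record can shift the released statistic."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# e.g. releasing how many training records mention a rare attribute:
print(dp_count(true_count=42, epsilon=0.5))  # smaller epsilon -> more noise
```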

Regulators must also meet industry halfway, formalizing context-driven standards to avoid stifling progress with rigid benchmarks. Collaboration among technologists, policymakers, and ethicists will be critical to crafting governance that protects individuals without handcuffing innovation.

Analyst: Carlos Rivera, Principal Advisory Director – Security & Privacy

More Reading:


Illinois Biometric Privacy Act: Impact on Cyber Insurance

Type: Legislation

Enacted: August 2024

Affected Region: USA

Summary: The Illinois Biometric Information Privacy Act (BIPA), enacted in 2008, was a groundbreaking piece of legislation well ahead of its time. The law sets standards for the collection and handling of biometric data. With the increasing reliance on technology that uses fingerprints, facial recognition, and other biometric identifiers, BIPA affects both individual privacy and the organizations that process biometric information. Such laws have motivated some insurance companies to add biometrics exclusion language to cyber insurance policies to mitigate the risk of class-action lawsuits.

Analyst Perspective: Dissecting the key requirements under BIPA, it boils down to transparency, consent, and data retention. First, understand that this law has extraterritorial application: although it protects Illinois residents, it can, like the GDPR, reach organizations in other legal jurisdictions.

The law allows the use of biometrics, provided there is proper notification, explicit consent, and a communicated data retention policy. For organizations processing biometric data, understanding these requirements is essential to avoid potential legal repercussions.
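
As a purely illustrative sketch (not statutory language), a compliance record for biometric collection might capture BIPA's three pillars of notice, explicit consent, and communicated retention like this; all field names are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical record of BIPA's three pillars: notice of purpose,
# explicit consent, and a communicated retention schedule.
@dataclass
class BiometricConsentRecord:
    subject_id: str
    purpose_disclosed: str        # what the biometric identifier is used for
    retention_policy_url: str     # the publicly communicated schedule
    consent_given_at: Optional[datetime] = None

    def grant_consent(self) -> None:
        self.consent_given_at = datetime.now(timezone.utc)

    @property
    def collection_permitted(self) -> bool:
        """Collect only after notice and explicit consent are on file."""
        return self.consent_given_at is not None

rec = BiometricConsentRecord("emp-1042", "timeclock fingerprint scan",
                             "https://example.com/biometric-retention")
rec.grant_consent()
print(rec.collection_permitted)  # True
```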

No matter where your organization is located, if you use biometrics as part of your operation, you may be exposed to BIPA. Considering the impact on privacy and legal compliance, this legislation is of utmost importance. Therefore, it is crucial to implement proper notice and controls to ensure compliance and mitigate risks.

Analyst: Fritz Jean-Louis, Principal Cybersecurity Advisor – Security & Privacy

More Reading:


23andMe: What Happens to Your DNA Data When It’s Sold?

Type: Article

Published: March 2025

Affected Region: USA

Summary: The genetic data collected by 23andMe falls into a gap in personal and medical data privacy regulations, and that gap is causing headaches for its customers as well as for a potential sale.

23andMe is a bankrupt personal genomics company that sequenced and analyzed the DNA of its customers. For as little as US$150, you could send the company a tube of your saliva and receive a personalized report on your ancestry. For an additional fee, 23andMe would scour your DNA record to identify whether you were at higher risk for certain disorders with a genetic component, such as diabetes, certain types of cancer, and celiac disease.

Despite capturing a great deal of public attention (the service was featured on Oprah) and a $6 billion valuation in 2021, all was not well. In October 2023, 23andMe reported a hack of seven million customer profiles, breaching profile information as well as genetic data for those users. Even before the hack, it was becoming apparent that the company’s customers only needed to use the service once and that new customers were signing up at lower-than-expected rates. Prospective customers likely second-guessed their purchase when the breach came to light.

The company filed for bankruptcy in March 2025, opening a whole new can of worms: any company that bought 23andMe would receive full access to its customers’ genetic data.

As of this writing, the company’s market capitalization is a little under US$25 million, and a search for the company’s name brings up article after article with instructions on how to delete your data from the service.

Analyst Perspective: 23andMe’s consumers might have assumed that their DNA data was covered in the United States by HIPAA, the law that sets boundaries on what information can be shared about your interactions with medical practitioners. But 23andMe isn’t subject to HIPAA, because it doesn’t meet the definition of a "Covered Entity" under the law: it isn’t a hospital system or a physician. It’s a direct-to-consumer company, and as such, there are far fewer controls to dictate what it can do with the data it collects.

Part of the value of a sale would likely be the company’s genomic data set. But without any guarantees on how their data will be handled post-sale, individuals are deleting their data and withdrawing from the service, likely further eroding the value of the company’s remaining assets.

More clarity and better guarantees about how genomic data would be handled post-sale could have protected both 23andMe and its customers through the bankruptcy process.

Analyst: Andrew Sharp, Research Director – Security, Privacy, Infrastructure, and Operations

More Reading:


OPC’s New Online Tool Helps Identify Risks of Privacy Breach

Type: Article

Published: March 2025

Affected Region: Canada

Summary: In March, the Office of the Privacy Commissioner of Canada (OPC) announced the release of a new online tool that helps organizations and privacy professionals evaluate notification requirements during data breaches involving personally identifiable information (PII). The Privacy Breach Risk Self-Assessment guides the user through a series of questions to help determine whether the breach could result in a “real risk of significant harm” (RROSH) to an individual.

Key factors identified in federal and provincial acts as relevant to determining RROSH include the sensitivity of the PII involved in the breach and the probability that the PII has been, is being, or will be misused. Until now, however, it was left to the organization and/or privacy professional involved to evaluate thresholds for these factors based on the available information about the breach.

Often, the determination of RROSH is cut and dried, but in many cases the business must make a judgment call that carries potential future impacts such as investigations, fines, and legal liability. The new tool provides added guardrails for this evaluation so that organizations can make informed decisions about whether they need to report a breach of security safeguards.
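
As a purely hypothetical illustration of how those two statutory factors interact (the OPC has not published the tool's weightings, so this is not a reimplementation of it):

```python
# Hypothetical sketch of the two RROSH factors named in the acts:
# PII sensitivity and probability of misuse. The scoring and threshold
# are illustrative assumptions, not the OPC's methodology.
LEVELS = {"low": 0, "medium": 1, "high": 2}

def rrosh_indicated(sensitivity: str, misuse_probability: str) -> bool:
    """Return True when a real risk of significant harm is indicated."""
    score = LEVELS[sensitivity] + LEVELS[misuse_probability]
    return score >= 2  # illustrative threshold only

print(rrosh_indicated("high", "medium"))  # True -> consider notification
print(rrosh_indicated("low", "low"))      # False -> document the rationale
```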

Analyst Perspective: Testing the tool revealed that the questions asked are logically structured and not entirely static. Based on which types of PII are selected as impacted in a breach, the tool asks about associated attributes. For example, if “Biometrics” is selected as the type of PII, attributes like facial image, fingerprints, voiceprints, and identifiable physical traits can then be selected.

Another great thing about the tool is that it was designed in such a manner that the user does not need to input any identifiable information about the organization being evaluated. Users of the tool are essentially anonymous and not bound or tracked by the results.

There are some aspects of the tool that potential users should be cautious of. The evaluation methodology used is not stated, so question weightings and other factors used to produce results are unclear. Additionally, during testing, some of the results produced by the tool were unexpected based on the answers provided. So, as the tool states upfront, it "[…] is not meant to replace your organization’s or government institution’s own assessment of a breach of security safeguards,” and should only be used as a guide.

Tools assist us in all facets of life to accomplish tasks effectively, so that the results meet a higher standard of quality and can be reproduced. The OPC’s addition of this tool will expand many privacy professionals’ existing set of resources so that they can more effectively evaluate RROSH, and thus better serve individuals whose PII has been entrusted to organizations.

Analyst: Mike Brown, Advisory Director – Security & Privacy

More Reading:


The Growing Influence of Privacy on Tech Contracts

Type: Article

Published: March 2025

Affected Region: All Regions

Summary: Traditionally, data breaches were seen and treated as just another contractual risk, with no specialization needed. However, as the costs and frequency of data breaches increased, more stringent privacy laws were needed, ones that would mandate extensive remediation efforts and impose additional financial and reputational consequences for noncompliance. Hence, legislation like the EU’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) was enacted.

The introduction of such comprehensive laws helped fundamentally reshape the playing field of privacy rules and regulations and heightened accountability for organizations. This has ultimately influenced the privacy provisions in contracts between technology companies and their clients. For example, clients of SaaS providers are advocating for higher data privacy liability caps and more accountability on the providers’ end. This has resulted in more indemnification clauses being included in contracts, in which the SaaS provider agrees to cover clients’ losses arising from data breaches attributable to the provider. Similar changes have been observed in contractual agreements with managed service providers (MSPs), with a focus on uncapped liability for providers found liable for a data breach.

Analyst Perspective: The number of record fines and settlements in the EU and US shows the severe financial implications of breaches and noncompliance. At a time when organizations are expected to protect consumer data, businesses are taking proactive steps. From implementing robust data governance frameworks, to conducting comprehensive audits of their current practices, companies are doing their due diligence to adhere to stricter laws.

Furthermore, with 20 US states and counting enforcing comprehensive data privacy laws, organizations should understand the nuances of the privacy laws of the states in which they operate. Consumers are more aware of and concerned about their data privacy than ever before, which has led organizations to strengthen their data privacy practices to foster customer trust and loyalty. With the rapid adoption of AI, the technology is already affecting data privacy practices, which will require partnership between technology providers and their customers to mitigate data privacy risks during contract negotiation.

Analyst: Ahmad Jowhar, Research Analyst – Security & Privacy

More Reading:


Honda Settlement Redefines Compliance Expectations

Type: Enforcement Action

Announced: March 2025

Affected Region: USA

Summary: The recent California Privacy Protection Agency (CPPA) settlement with American Honda Motor Company, which included a $632,500 fine, offers a clear wake-up call for companies. Honda was cited for failing to offer symmetrical cookie consent options, improperly sharing consumer data without adequate contracts, and designing confusing privacy request interfaces. The agency found these practices to be in violation of the California Consumer Privacy Act (CCPA).

The case sheds light on the need for cookie banners to provide equal “Accept All” and “Reject All” options, to refrain from deceptive web design, and to honor opt-out requests. Organizations must regularly audit tracking technologies, maintain cookie governance policies, and ensure transparency in their use of data. Privacy request forms should be simple, accessible, and collect only the minimal data necessary for verification, aligning with the CCPA’s data minimization principle. Additionally, organizations must keep up-to-date contracts with vendors handling personal data, embed CCPA-mandated terms, and align third-party risk management with privacy impact assessments.
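
As a hypothetical sketch of what symmetrical consent looks like in configuration terms (field names are illustrative, not taken from the settlement order):

```python
# Hypothetical banner configuration: accepting and rejecting carry equal
# visual weight and cost the same number of clicks, and no category is
# pre-selected, i.e. no dark patterns.
CONSENT_BANNER = {
    "first_layer_buttons": [
        {"label": "Accept All", "action": "grant_all", "style": "primary"},
        {"label": "Reject All", "action": "deny_all", "style": "primary"},
    ],
    "preselected_categories": [],  # opt-in only
    "clicks_to_accept": 1,
    "clicks_to_reject": 1,         # must equal clicks_to_accept
}

def is_symmetrical(banner: dict) -> bool:
    """Basic check that consent choices are presented symmetrically."""
    styles = {b["style"] for b in banner["first_layer_buttons"]}
    return (len(styles) == 1
            and banner["clicks_to_accept"] == banner["clicks_to_reject"]
            and not banner["preselected_categories"])

print(is_symmetrical(CONSENT_BANNER))  # True
```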

Regular employee training and clearly documented privacy request procedures are also essential. Perhaps most notably, the CPPA’s requirement that Honda engage UX designers signals a broader shift: regulators now expect intuitive, user-friendly privacy interfaces that support a privacy-by-design approach.

Analyst Perspective: The Honda settlement marks a turning point in privacy enforcement by emphasizing that compliance is not only about legal checkboxes or backend data practices, but also about user experience. Regulators are clearly signaling that deceptive designs or dark patterns, asymmetrical consent mechanisms, and confusing privacy request flows are no longer acceptable. The mandate for Honda to involve a UX designer formalizes privacy by design as an enforceable standard, pushing companies to ensure web interfaces are intuitive, equitable, and user-centric. Beyond interface design, issues like inadequate vendor contracts and unmonitored tracking technologies reveal how quickly privacy obligations can be undermined.

Organizations must now take a proactive, lifecycle approach to compliance by integrating audits, policy updates, employee training, and cross-functional collaboration. Companies that embed trust into their digital experiences through transparent consent practices, streamlined rights processes, and accountable data governance stand to strengthen both consumer confidence and brand integrity.

Analyst: Safayat Moahamad, Research Director – Security & Privacy

More Reading:


If you have a question or would like to receive these monthly briefings via email, submit a request here.

Visit our IT Critical Response Resource Center
Over 100 analysts waiting to take your call right now: +1 (703) 340 1171