This Privacy Regulation Roundup summarizes the latest major global privacy regulatory developments, announcements, and changes. This report is updated monthly. For each relevant regulatory activity, you can find actionable Info-Tech analyst insights and links to useful Info-Tech research that can assist you with becoming compliant. Regulatory activities are ordered by their effective date.


Doxxing of Police Officers: Unjust Justice?


Type: Development
Important Date: June 10, 2020

Summary: Following the wave of anti-racism protests that took the United States by storm in early June, the personal information of police officers nationwide has been leaked. The demonstrations, sparked by blatant examples of racism and discrimination within American law enforcement, have added fuel to a fire of unrest in the United States. An unclassified intelligence document released by the US Department of Homeland Security states that there is “medium confidence that cyber actors will possibly continue to target law enforcement officers” with doxxing. Concerns have since erupted over the potential privacy and safety implications.

Analyst Perspective: Privacy, at its core, is a balance between the protection and the availability of personal information. The current American predicament shares similarities with the situation a few months ago in Hong Kong surrounding the release of the information and identities of police during the region’s protests. In that case, Hong Kong’s Privacy Commissioner stepped in to request that social media platforms assist in tracking those involved in doxxing efforts.

The conflict in the current case lies in the fact that the information itself is not technically private, nor is it sensitive personal data; however, in the current social climate, it has the potential to implicate the individuals whose information is being publicized. As it stands, the information leaked is a combination of publicly available records and data that has likely been obtained from compromised email accounts as a result of cyber actors seeking justice. In a country such as the United States that has no overarching privacy law or regulation, the situation is further complicated. The actions of the doxxers call into question a larger moral dilemma around the targeted use of information made available for ulterior purposes. Perhaps, in these more tenuous cases, the onus of maintaining privacy of one’s personal data should shift to the individual as opposed to the nation, state, or organization in question.

Analyst: Cassandra Cooper, Senior Research Analyst – Security, Risk, and Compliance



Applications That Blur Faces of Protestors Seek to Preserve Privacy and Anonymity


Type: Development
Important Date: June 2020

Summary: The recent surge in protests around the world has pushed a privacy concern into the spotlight: videos taken during these events contain the faces of protestors, which law enforcement has purportedly been leveraging to identify them. In response, developers have begun to create and release applications that automatically blur faces in videos being shot on a device. This functionality allows protestors to continue taking and sharing videos of the events without fear of being identified. The best applications would be those that do not require videos taken to be uploaded and stored on the developer’s servers, as law enforcement could potentially obtain this type of data with a legal order.
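The obscuring step these apps perform can be sketched in a few lines. The example below is a rough illustration only, not any specific app’s implementation: it pixelates a face’s bounding box in a grayscale frame (represented as a plain 2D list of intensities), and it assumes the bounding boxes come from a separate face detector that is not shown here.

```python
def pixelate_region(frame, box, block=8):
    """Irreversibly obscure a rectangular region of a grayscale frame
    (a 2D list of ints) by replacing each block-by-block tile inside
    box = (x, y, w, h) with that tile's average intensity."""
    x, y, w, h = box
    for ty in range(y, y + h, block):
        for tx in range(x, x + w, block):
            # Clip the tile to the region and to the frame boundaries.
            rows = range(ty, min(ty + block, y + h, len(frame)))
            cols = range(tx, min(tx + block, x + w, len(frame[0])))
            pixels = [frame[r][c] for r in rows for c in cols]
            if not pixels:
                continue
            avg = sum(pixels) // len(pixels)
            for r in rows:
                for c in cols:
                    frame[r][c] = avg  # every pixel in the tile becomes the average
    return frame
```

Because each tile is collapsed to a single average value on the device itself, the original pixel data cannot be recovered from the shared video, which is the property that matters for protestor anonymity; blurring applied server-side after upload would not offer the same guarantee.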

Analyst Perspective: Sharing a video of a protest that contains unblurred faces has some interesting privacy implications – the people at the protest did not consent to being filmed, yet personally identifiable information (PII) about them has now been captured and released to the world. This could spark enough fear in those who do not want to be identified to make them stay home from protests. Applications that blur faces in videos take a large step in favor of privacy-informed application design. However, protestors will have to rely on their fellow protestors to consider the privacy of others and use these types of privacy-preserving applications.

As an organization, there are a few steps you can take to develop a privacy-aware culture and help your employees better understand the implications of privacy, not just at work but also in their own lives. First, consider incorporating content on respecting the privacy of others into the security awareness program. Additionally, if the organization captures video of its employees in any form, consider how the privacy of those in the video could be better protected. Consider the privacy implications of any new projects or operational initiatives and ensure that these considerations feed into the planning and launch process. If privacy is incorporated from the outset, the business reduces its risk of noncompliance and of damaging brand trust through the misapplication of individuals’ fundamental privacy rights.

Analyst: Ian Mulholland, Senior Research Analyst – Security, Risk, and Compliance



CCPA Further Defines the Private Right of Action


Type: Development
Announcement Date: June 8, 2020

Summary: In the six-month period between the CCPA’s enactment and its enforcement date, lawsuits claiming violations of the CCPA have begun to surface. Section 1798.150(a)(1) of the CCPA states that “[a]ny consumer whose nonencrypted and nonredacted personal information ... is subject to an unauthorized access and exfiltration, theft, or disclosure” due to a business violating “the duty to implement and maintain reasonable security procedures and practices” has legal grounds to pursue a private right of action. While there are specific limitations on the situations in which a private right of action can be pursued, the following have already spurred legal action in the name of the CCPA:

  • Data breaches
  • Opt-out provision violations
  • Unfair competition claims
  • Retroactive application
  • Definition of personal information

As additional details around the outcome of legal actions continue to take shape, it will be interesting to see how the Attorney General accounts for the significant degree of ambiguity currently present within the CCPA and, in doing so, sets a precedent for future pursuits of private right of action.

Analyst Perspective: There is always an exemption or an exception to the rule, and as the number of cases brought forth for litigation in the name of CCPA violations increases, privacy experts should expect a series of interesting scenarios to assess and analyze.

In some of the cases brought forth with respect to data breaches, plaintiffs claim a lack of effort by the defendants to prevent the “unauthorized access and exfiltration, theft, or disclosure” of personal data. In many cases, determining whether reasonable efforts were made now lies at the discretion of the Attorney General. Due to a certain vagueness in the CCPA, a high degree of situational analysis will be necessary, and these cases will set a precedent for those pursued after the July 1 enforcement date.

The takeaway? Wiggle room, or legal ambiguity in the case of the CCPA, won’t always work to the benefit of those seeking compliance. Companies in scope for the CCPA should invest significant effort in measures to protect data and to provide data subjects with transparency and with options for access and deletion, to avoid potentially detrimental legal action.

Analyst: Cassandra Cooper, Senior Research Analyst – Security, Risk, and Compliance



Bombora Pursues Legal Action Against ZoomInfo for Allegedly Breaching CCPA


Type: Development
Announcement Date: June 11, 2020

Summary: Business-to-business intent data company Bombora is pursuing legal action against ZoomInfo for allegedly breaching the CCPA, including through the collection and sale of personal information without consent. Some uncertainty exists around whether the CCPA applies, as the dispute arises in a B2B context and also rests on unfair competition laws in the state of California.

Bombora has accused ZoomInfo of using a free tool, called Community Edition, to collect and sell data without permission. Bombora claims that the tool gathers the contacts in a user’s address book without the user’s knowledge or permission.

DiscoverOrg acquired ZoomInfo last year. Prior to that, Bombora and ZoomInfo were vendor partners: a customer of both companies could use Bombora data in ZoomInfo products and vice versa. Since the two parted ways, ZoomInfo has built a tool behind the scenes to compete with Bombora.

Following the allegations, ZoomInfo issued a statement indicating that “the claims are meritless, and the lawsuit is Bombora's attempt at retaliation against ZoomInfo for ending a vendor relationship with Bombora.”

Analyst Perspective: The big question here is whether this situation constitutes a violation of the CCPA or California’s Unfair Competition Law or simply represents a former business relationship turned sour. The B2B context of ZoomInfo’s operations complicates the application of the CCPA due to Assembly Bill 1355, passed in late 2019, which grants B2B businesses a brief exemption from the CCPA for personal information collected exclusively in B2B situations. This bill does not delay the opt-out provision, but it does delay the need to provide notice, the rights to access and deletion, and transparency rights to the data subject. Additionally, the moratorium provided by Bill 1355 lasts only until 2021, at which point B2B organizations will receive the same level of scrutiny as B2C companies.

Organizations should continue their efforts to improve current privacy standards and continue developing the framework to ensure personal data collection methods are compliant with future applications of privacy regulations. Whether or not ZoomInfo’s practices are compliant by the letter of the law, the company may suffer due to increased scrutiny and loss of client trust in the brand.

Analyst: William Wong, Principal Research Advisor – Security, Risk & Compliance



Belgian DPA Issues GDPR Fines for Referral Program


Type: Regulatory Enforcement
Announcement Date: June 11, 2020

Summary: The Belgian DPA recently issued fines against Twoo, a Belgium-based social media platform, for violating GDPR consent rules with a “tell-a-friend” style referral program. The DPA ruled that these types of solicitations are unlawful, as consent comes only from the existing user, as opposed to the individual that receives the invitation to join the platform. As a result, these requests are viewed as commercial solicitations that do not fall under the lawful basis classification of legitimate interest. There is, however, a twinge of controversy in the DPA’s decision for two reasons. First, it renders these types of referral programs impracticable because, to be lawful, they would need consent from the non-user before the referral is sent. Second, the ruling overturns the previous ePrivacy Directive’s accommodation of such practices.

Analyst Perspective: Make the privacy of your clients (and potential clients) your organization’s first priority. This is the latest in a series of recent rulings that favor data subjects in situations where any ambiguity with respect to the lawful basis of data collection is present. Moreover, it shows that European DPAs intend to uphold the spirit of the GDPR, empowering data subjects to take full ownership over the dissemination of their personal information.

Analyst: Logan Rohde, Research Analyst – Security, Risk, and Compliance



European Data Protection Board Decides Against Discriminatory Cookie Walls


Type: Official Guidance
Announcement Date: May 4, 2020

Summary: Guidelines on consent under Regulation 2016/679 (WP259.01) of the GDPR have been revised in an effort to provide further clarification regarding two important questions:

  • The validity of consent provided by the data subject when interacting with “cookie walls”
  • Case study example 16, on scrolling and consent

Cookie walls prevent users from accessing a website’s content unless they accept the site’s use of cookies. The new guidance from the EDPB revises the definition of consent, noting that a data subject’s consent can only be considered lawful if the data subject is offered control and a genuine choice to accept or decline the terms of service without discrimination. Cookie walls violate this definition of consent: they do not allow specific, informed, and unambiguous consent to be freely given, because there is no real choice or control for data subjects. Under the revisions adopted May 4, 2020, bundling consent with the acceptance of terms or conditions that forces the processing of personal data is deemed highly undesirable and violates the requirement that consent be freely given. New examples are provided in the updated document to offer further clarity on this issue.

Analyst Perspective: Privacy by design is the new norm, as the handling of cookie walls that demand consent from users to access content indicates. As this revision demonstrates, organizations can no longer force consent as the price of access to content, which further solidifies consumers’ rights to protect their data. These decisions must be carefully weighed in light of other consumer privacy regulations, including the newly enforceable California Consumer Privacy Act (CCPA). California joins other states with pre-existing privacy laws, including Maine and Nevada, and more states are likely to follow. These incoming privacy regulations may take their lead from the GDPR and treat cookie walls as a violation of consumers’ right to provide freely given, specific, informed, and unambiguous consent. Organizations should resist the urge to collect more information than is needed and should not demand that users give up their freedom of choice for cookie crumbs.

Analyst: Marc Mazur, Research Specialist – Security, Risk & Compliance



Google Loses Its Appeal of €50 Million GDPR Fine


Type: Regulatory Enforcement
Announcement Date: June 12, 2020

Summary: In a ruling on June 12, 2020, the French Conseil d’État, which serves as the supreme court of administrative justice in France, upheld the €50 million fine levied last year against Google. The fine was incurred due to Google’s lack of compliance with the GDPR, specifically in failing to provide transparency in its data consent policies and in not allowing users sufficient control over how their data would be used. These compliance issues have not yet been remedied. The fine represents the largest under the GDPR to date, and with this ruling Google’s appeal against it has been rejected.

Analyst Perspective: Privacy has been making the news a lot these days, most recently around the privacy issues related to contact tracing. At least in the case of COVID-19 tracing, the underlying intent is the benefit of public health and society. In the case of Google and other major players in the technology space, extending the use of data collected to purposes not clearly identified or generating profits from secondary use of data is blatant flouting of the intent behind privacy regulations and serves as a detriment to fundamental rights of the individual.

In this age of big data, common platforms such as Google must remain responsible for upholding international best practices and expectations, while also aiming to support an individual’s rights by clearly identifying use and providing mechanisms to personalize behavior such as advertisement targeting. As with the social uprising around the world related to equality and fair treatment for BIPOC, mass protest is needed to voice mistrust of organizations that do not adequately protect user rights. Whether in the voting booth or, in the case of Google, a shift to a socially responsible platform, the voice of the individual must be heard and considered.

Analyst: Christine R. Coz, Principal Security Advisor – Security, Risk, and Compliance



Clearview AI Offers to Delete Some Faces From Its Internet-Scraped Database – Unless You’re Canadian


Type: Development
Announcement Date: June 10, 2020

Summary: Clearview AI has made concessions for people who wish to access its facial recognition database. Clearview AI’s database has been scraped from millions of websites and consists of over a billion faces taken from various social media sites. Clearview AI has said it will allow Canadians to check whether or not their faces have been collected and added to the database; however, Canadians will not be eligible to ask for their faces to be removed unless they meet certain requirements established by Clearview.

Clearview stated that the tool and database are meant to help police “identify perpetrators and victims of a crime,” but there is cause for concern around additional use cases for the tool. The main grievance with Clearview AI is that, without consent, the company has taken a great deal of personal information from millions of people with no clear recourse as to how citizens not within scope of federal privacy law can reclaim their data.

Analyst Perspective: Data privacy should be the default, not the afterthought, a principle evidently not upheld by Clearview AI. In Europe, the “right to be forgotten” allows citizens to petition for data about themselves found online to be de-indexed or not appear if the data is false or taken without consent. This is unfortunately not the case in Canada, where individuals must submit a request via email to Clearview AI to see if their image is in the company’s extensive database. And one more requirement: this request must be accompanied by a new headshot for cross-reference. This effectively ensures that if Clearview did not have you in their database before, they certainly will now, the irony of which is not only concerning but in clear contradiction of privacy by design.

Canadian citizens should not have to petition for their data, especially when it was obtained without consent. This raises the question of why no legal action has been taken against Clearview AI and why Ontario’s investigation has not proceeded further. The GDPR protects EU citizens from such actions, but in Canada, citizens are left to fend for themselves when it comes to data protection. To remain transparent and gain consumer trust, companies should adopt a privacy by design approach to any business lines that involve extensive collection of personal data, even if it is not mandated at the federal level.

Analyst: Isaac Kinsella, Research Specialist – Security, Risk & Compliance



Hospitality Sector Is Reminded That COVID-19 Tracing Registers Are to Be Used for Health Purposes


Type: Official Guidance
Announcement Date: June 2020

Summary: Privacy concerns over customer contact tracing apps were raised after an Auckland woman tweeted about an email she had received, offering her a complimentary coffee from a café she had recently visited. Believing that the café had obtained her email information from the contact tracing app, she asked the Twitter world, “Is this a customer data privacy breach?”

In response to her tweet, the New Zealand Privacy Commissioner directed her to a landing page on its website, advising that COVID-19 contact tracing registers be used for the purpose of contact tracing only, at the request of the Ministry of Health. In an interview, the café’s spokesperson explained that customers were made aware that “by submitting an email that they may receive future communications.”

Analyst Perspective: As the general public becomes more aware of what constitutes misuse of personal data, so too should organizations, even in the absence of governing privacy regulations. In the case above, it is unclear whether the incident was a breach since implicit consent was obtained. The question remains, is this really enough?

New Zealand’s Privacy Act states that “information obtained for one purpose should not be used for any other reason.” Given this stipulation, the café did not breach the terms due to the written notice that contact information may be used for future communications. However, guidance provided by the Ministry of Health states that “the guest register should only be used for the public health reasons specified,” indicating that privacy best practices were not adhered to.

To stay out of the limelight and remain transparent and trustworthy to customers, organizations must be vigilant in how they take advantage of or leverage personal data obtained for a specific purpose. This applies beyond the current COVID-19 contact tracing environment and must become a key consideration moving forward.

Analyst: Michelle Tran, Research Analyst – Security, Risk, and Compliance



Brazilian Supreme Court Decision Sets a Precedent for Data Protection as an Individual Autonomous Right


Type: Development
Announcement Date: May 7, 2020

Summary: A ruling by the Brazilian Supreme Court has set the groundwork for data protection as an autonomous right. In a ten-to-one decision, Brazil’s Supreme Court halted an executive order from the president mandating that telecom companies share the user data of over 200 million people. The request came from the Brazilian Institute of Geography and Statistics, the government agency responsible for census research in Brazil.

Due to COVID-19, the Institute was unable to conduct its normal face-to-face interviews, and the Brazilian president issued an executive order stating that, given the circumstances, it was acceptable for the interviews to be conducted over the phone. To conduct those interviews, however, the Institute wanted the subscriber data of over 200 million telecom clients to be shared with it. Data privacy concerns about this request were raised by four different political parties, and the case made it all the way to the Supreme Court amid concern over the purpose of and necessity for such a large collection of personal data.

Analyst Perspective: If the processing of personal data can pose a risk to the rights of individuals, then it should be backed by appropriate safeguards to mitigate those risks. This may involve conducting a data protection impact assessment (DPIA) or privacy impact assessment (PIA). Additional considerations must be evaluated due to the inherent risks to public liberties when processing personal data. As Justice Luis Roberto Barroso noted, “The use of personal data is inevitably an interference over the personal sphere of someone.” As a consequence, he said, data collection should be proportionate, verified against the following:

  • The purpose of the information processing is specified and legitimate.
  • The amount of data collected is limited to what is strictly necessary to the purpose for which it is being processed.
  • Security measures are adopted to avoid unauthorized access by third parties.

The balance that currently exists between privacy and the proper functioning of businesses and states is tumultuous at best, with limited oversight in many cases. A new understanding must come to fruition between technological innovations and the potentially harmful impact of the misuse of personal data.

Brazil’s push to recognize the protection of personal data as a fundamental right is a step in the right direction. Despite the pandemic, Brazil’s Supreme Court conceded that there was no real public interest to harvest the information of 200 million people to undergo the desired public policy, a decision that exemplifies a shift toward supporting the privacy interests of the nation’s citizens.

Analyst: Isaac Kinsella, Research Specialist – Security, Risk & Compliance



Japan Enacts Amendments to the Act on the Protection of Personal Information


Type: Regulation
Announcement Date: June 2020

Summary: The parliament of Japan has enacted a law to amend the Act on the Protection of Personal Information (APPI). The new law is expected to take effect in the last quarter of 2021 or the first half of 2022. The amendments enhance data subjects’ rights, expand the responsibilities and obligations of personal information controllers, tighten regulations on data use, and raise the penalties for noncompliance with the APPI. New obligations and guidance include mandatory breach reporting and additional consent requirements for the offshore transfer of personal information to other entities.

The Personal Information Protection Commission (PPC) of Japan has provided tentative translations of an overview of the amended Act and a comparative list of the previous and amended provisions of the APPI.

Analyst Perspective: The APPI is one of the earliest privacy laws, adopted in 2003 and updated in September 2015 (taking effect in May 2017) after the law proved inadequate at preventing privacy breaches. With the GDPR taking effect in 2018, the EU and Japan reached a mutual adequacy decision recognizing that the GDPR and the APPI provide equivalent levels of protection. The upcoming APPI provisions will further tighten control of personal information for residents of Japan and place more responsibility and accountability on “personal information controllers” both within and outside of Japan.

Analyst: Jimmy Tom, Research Advisor – Security, Risk, and Compliance



If you have a question or would like to receive these monthly briefings via email, submit a request here.


Search Code: 87193
Published: December 11, 2018
Last Revised: July 3, 2020
