A New Era for Japanese Data Protection: 2020 Amendments to the APPI


Authors: Takeshige Sugimoto, Akihiro Kawashima, and Tobyn Aaron of S&K Brussels LPC. The authors can be contacted at [email protected].

The recent amendments to Japan’s data protection law (the Act on the Protection of Personal Information, henceforth the ‘APPI’) contain a number of new provisions certain to alter – and for many foreign businesses, transform – the ways in which companies conduct business in or with Japan. In addition to greatly expanding data subject rights, the amendments to the APPI (the ‘2020 Amendments’) most notably:

(i) eliminate all former restrictions on the APPI’s extraterritorial application; 

(ii) considerably heighten companies’ disclosure and due diligence obligations with respect to overseas data transfers; 

(iii) introduce previously unregulated categories of personal information (each with corresponding obligations for companies), including ‘pseudonymously processed information’ and ‘personally referable information’; and 

(iv) for the first time, mandate notifications for qualifying data breaches.

The 2020 Amendments will be enforced by the Personal Information Protection Commission of Japan (the “PPC”), pursuant to forthcoming PPC guidelines alongside the amended Enforcement Rules for the Act on the Protection of Personal Information (the ‘amended PPC Rules’) and the amended Cabinet Order to Enforce the Act on the Protection of Personal Information (the ‘amended Cabinet Order’) (both published on March 24, 2021).

As the 2020 Amendments are set to enter into force on April 1, 2022, Japanese and global companies that conduct business in or with Japan have just under one year to bring their operations into compliance. To facilitate such efforts, this blog post describes those provisions of the 2020 Amendments likely to have the greatest impact on businesses, as well as current events in Japan which will affect their implementation and should inform the manner in which companies address enforcement risks and compliance priorities.

1. LINE Data Transfers to China: A Wake-Up Call for Japan

To appreciate the effect that the 2020 Amendments will have on the Japanese data protection space, one must first consider the current political and societal contexts in Japan in which the 2020 Amendments will be introduced – and enforced – beginning with a recent incident of note involving LINE Corporation. 

In March 2021, headlines across Japan shocked locals: Japan-based messaging app LINE, actively used and trusted by approximately 86 million Japanese citizens, had been transferring users’ personal information, including names, IDs and phone numbers, to a Chinese affiliate. It is neither unusual nor unlawful for Japanese tech companies to outsource certain of their operations, including personal information processing, overseas. But for Japanese nationals, the LINE matter is different for a number of important reasons, not least of which is the Japanese population’s awareness of the Chinese Government’s broad access rights to personal data managed by private-sector companies in China, pursuant to China’s National Intelligence Law.

LINE is not only the most utilized messaging application in Japan; it also occupies a special place in the country’s historical and cultural consciousness. When Japan was hit by the 2011 earthquake, use of voice networks failed and email exchanges were delayed, as citizens struggled to communicate with, and confirm the safety of, their loved ones. And so, LINE was born – a simple messaging and online calling tool to serve as a communications hotline in case of emergency. A decade on, LINE has become the major – and for many the only – means of communication in Japan – particularly in today’s socially-distanced world.

For the Japanese Government too, LINE serves a crucial role: national- and municipal-level government bodies use LINE for official communications, including of sensitive personal information such as COVID-19 health data surveys. News of LINE’s transfer of user data to China, including potential access by the Chinese Government, therefore horrified private citizens and public officials alike.

On March 31, 2021, the PPC launched an official investigation into LINE and its parent company, Z Holdings, over their management of personal information. Until such investigation is concluded, whether and to what extent LINE violated the APPI (and in particular, its provisions governing third party access and international transfers) will remain uncertain. Regardless, the impact of this matter on the Japanese data privacy space is already unfolding. In late March, a number of high-ranking Japanese politicians (including Mr. Akira Amari, Chairperson of the Rule-Making Strategy Representative Coalition of the Liberal Democratic Party of Japan) sent the PPC and other relevant Government ministries strongly-worded messages urging immediate action with respect to LINE, and more broadly, calling for a risk assessment to be conducted vis-à-vis all personal information transfers to China by companies in Japan.

Several days later, Japanese media reported that the PPC had requested members of both the KEIDANREN (the Japan Business Federation, comprised of 1,444 representative companies in Japan) and the Japan Association of New Economy (comprised of 534 member companies in Japan), to report their personal information transfer practices involving China, and to detail the privacy protection measures in place with respect to such transfers. For any APPI violations revealed, the PPC will issue a recommendation, potentially followed by an injunctive order, the latter of which carries a criminal penalty (including possible imprisonment) if not implemented.

Importantly, recent political support for stronger data protection measures extends beyond transfers to China. For instance, Mr. Amari has also reportedly called on the PPC to broadly limit permissible overseas transfers of personal information to those countries with data protection standards equivalent to the APPI (a limitation which, if implemented, would greatly surpass restrictions on transfer under both the current APPI and the 2020 Amendments).

Although the PPC has yet to respond, it is evident that both political and popular sentiment in Japan strongly favor enhanced protections for Japanese persons’ personal information. The inevitable outcome of such sentiment, which may be further amplified depending on the PPC’s forthcoming conclusions regarding the LINE matter, will be the increasingly stringent enforcement of the APPI and its 2020 Amendments, and potentially, further amendments thereto. As recent events in Japan demonstrate, this transformation has already begun to take effect. Companies conducting business in or with Japan, whether Japanese or foreign, should therefore pay close attention to the Japanese data privacy space over the course of this year.

2. Broadened Extraterritorial Reach and International Transfer Restrictions

For ‘Personal Information Handling Business Operators’ (henceforth ‘Operators’, a term used in joint reference to controllers and processors, upon which the APPI imposes the same obligations), arguably the greatest impact of the 2020 Amendments will derive from their drastic revisions to Article 75 (extraterritoriality) and Article 24 (international transfer).

To date, the APPI’s extraterritorial reach has been limited to a handful of its articles, primarily those governing purpose limitation and lawful acquisition of personal information (‘PI’) by overseas Operators. From April 2022, however, Article 75 of the amended APPI will, without exception, fully bind all private-sector overseas entities, regardless of their size, which process the PI, pseudonymously processed PI or anonymously processed PI of individuals who are in Japan, in connection with supplying goods or services thereto.

With respect to international transfers, Article 24 of the current legislation prohibits the transfer of PI to a ‘third party’ outside of Japan absent the data subject’s prior consent, unless (i) the recipient country has been white-listed by the PPC or (ii) the recipient third party upholds data protection standards equivalent to the APPI (in practice, these would generally be imposed contractually). Otherwise, international transfers may also be conducted pursuant to legal obligation or necessity (for the protection of human life, public interest or governmental cooperation, provided that for each, the data subject’s consent would be difficult to obtain). The APPI’s international transfer mechanisms generally conform to those prescribed by other global data protection regimes, loosely resembling the EU GDPR’s adequacy decisions (with respect to (i) above), and standard contractual clauses or binding corporate rules (with respect to (ii) above, although there are no PPC-provided contractual clauses, and non-binding arrangements such as the APEC CPBR System are PPC-approved).

The 2020 Amendments and amended PPC Rules do not modify the above transfer mechanisms, but they do narrow their scope in two key aspects. First, pursuant to Article 24(2) of the 2020 Amendments, transfers conducted on the basis of data subject consent will henceforth require the transferring Operator (on top of preexisting notification obligations) to inform the data subject in advance as to the name of the recipient country, and the levels of PI protection provided by both that country (assessed using an “appropriate and reasonable method”) and the recipient third party. Absent such information, data subject consent will be rendered uninformed and the transfer, invalid.

Of greater impact on the transferring Operator, however, will be the second modification (pursuant to Article 24(3) of the 2020 Amendments): in the event that an international transfer is conducted in reliance on contractually or otherwise imposed APPI data protection standards (the primary transfer mechanism on which Operators in Japan rely), such contractual safeguards alone are to be rendered insufficient. Going forward, the transferring Operator must, in addition to imposing APPI-equivalent obligations upon a recipient third party, (i) take “necessary action to ensure continuous implementation” of such obligations by the recipient; and (ii) inform the data subject, upon request, regarding the actions the Operator has taken.

With respect to (i) above, the amended PPC Rules interpret “necessary action to ensure continuous implementation” as requiring the transferring Operator to: (1) periodically check the implementation status and content of the APPI-equivalent measures by the recipient third party, and assess (by an “appropriate and reasonable method”) the existence of any foreign laws which might impact such implementation; (2) take necessary and appropriate actions to remedy any obstacles that are found; and (3) suspend all PI transfer to the third-party recipient, should its continuous implementation of the APPI-equivalent measures become difficult.
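For illustration only, the three-step duty above can be sketched as a simple decision helper. All class and field names below are hypothetical and do not appear in the amended PPC Rules; the sketch merely maps a periodic review outcome to the Operator's obligation (continue, remedy, or suspend).

```python
from dataclasses import dataclass

@dataclass
class RecipientReview:
    """One periodic review of an overseas recipient's APPI-equivalent
    measures. Field names are illustrative, not statutory terms."""
    measures_implemented: bool      # (1) equivalent measures still in place?
    obstructive_foreign_laws: bool  # (1) local laws impeding implementation?
    obstacles_remediable: bool      # (2) can identified obstacles be remedied?

def next_action(review: RecipientReview) -> str:
    """Map a review outcome to the Operator's next step under the
    amended PPC Rules' three-step continuous-implementation duty."""
    if review.measures_implemented and not review.obstructive_foreign_laws:
        return "continue"  # implementation confirmed; keep reviewing periodically
    if review.obstacles_remediable:
        return "remedy"    # (2) take necessary and appropriate remedial action
    return "suspend"       # (3) continuous implementation has become difficult
```

In practice the review itself (contract audits, questionnaires, local-law assessments) is the hard part; the point of the sketch is only that the amended PPC Rules contemplate an ongoing loop ending, where necessary, in suspension of the transfer.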

In addition, following receipt of a data subject’s request for information (pursuant to (ii) above), the amended PPC Rules specify that the transferring Operator must, without undue delay, inform the requesting data subject of each of the following:

(1)  the manner by which the APPI-equivalent measures were established by (or presumably with) the recipient third party (such as a data processing agreement or memorandum of understanding, or in the case of inter-group transfers, a privacy policy);

(2)  details of the APPI-equivalent measures implemented by the recipient third party;

(3)  the frequency and method by which the transferring Operator checked such implementation;

(4)  the name of the recipient country;

(5)  whether any foreign laws may affect the implementation of the APPI-equivalent measures, and a detailed overview of such laws;

(6)  whether any obstacles to implementation exist, and a detailed overview of such obstacles; and

(7)  the measures taken by the transferring Operator upon a finding of such obstacles.

Only if provision of the above items to the data subject is likely to ‘significantly hinder’ an Operator’s business operations may that Operator refrain from such disclosure, in whole or in part.

In practice, Operators primarily rely upon contractual safeguards and consent (in that order) to transfer PI outside of Japan. Indeed, the PPC’s list of “adequacy decisions” on which transferring Operators may alternatively rely is significantly shorter than that of the European Commission: to date, only the UK and EEA members have been deemed adequate recipients of a PI transfer from Japan. Therefore, the onerous informational and due diligence obligations incumbent upon Operators from April 2022, which affect precisely these two transfer mechanisms, are certain to impact business operations in Japan. And, given the 2020 Amendments’ unbridled extraterritoriality, this burden will be equally felt overseas. Most importantly, in the wake of the March 2021 LINE matter, compliance with the current and amended APPI, and in particular its overseas transfers restrictions, will be at the top of the PPC’s enforcement priorities.

3. Mandatory Data Breach Notifications

Beyond expanding the types of security incidents subject to the amended APPI, the 2020 Amendments, more notably, make data breach notifications mandatory (under the current legislation, breach notification is subject only to ‘best efforts’). Going forward, Operators will be required – pursuant to Article 22-2 of the 2020 Amendments and the amended PPC Rules – to promptly notify both the PPC and data subjects of the occurrence and/or potential occurrence of any data leakage, loss, damage or other similar situation which poses a ‘high’ risk to the rights and interests of data subjects (henceforth, a ‘breach’).

The types of breaches which meet this ‘high’ risk threshold, and thus trigger a notification obligation, are described by the amended PPC Rules as those which involve, or potentially involve, any of the following: (i) sensitive (‘special care-required’) PI; (ii) financial injury caused by unauthorized usage; (iii) a wrongful purpose(s) as the cause; or (iv) more than 1,000 affected data subjects. However, a notification is not required in the event that the Operator implemented ‘necessary measures’ to safeguard the rights and interests of data subjects (such as sophisticated encryption).
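The four triggers and the ‘necessary measures’ exception above can be read as a simple decision rule. The following sketch is purely illustrative; the function and parameter names are our own shorthand, not terms from the amended PPC Rules, and real breach assessments turn on facts no boolean can capture.

```python
def notification_required(
    involves_sensitive_pi: bool,      # (i) 'special care-required' PI
    risk_of_financial_injury: bool,   # (ii) unauthorized usage causing loss
    wrongful_purpose: bool,           # (iii) breach caused for a wrongful purpose
    affected_subjects: int,           # (iv) number of (potentially) affected persons
    safeguards_in_place: bool = False # e.g. sophisticated encryption of the data
) -> bool:
    """Rough sketch of the amended PPC Rules' 'high risk' notification
    triggers: any one criterion suffices, unless 'necessary measures'
    protecting data subjects were already in place."""
    if safeguards_in_place:
        return False  # 'necessary measures' exception
    return (
        involves_sensitive_pi
        or risk_of_financial_injury
        or wrongful_purpose
        or affected_subjects > 1000
    )
```

Note that the criteria are disjunctive: a breach affecting a single data subject still triggers notification if, for example, sensitive PI is involved.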

The amended PPC Rules also stipulate the required content for such notifications, although Operators are granted thirty days to provide details unknown at the time of the initial notice:

(1) overview of the breach;

(2) the types of PI affected or possibly affected by the breach;

(3) the number of data subjects affected or possibly affected by the breach;

(4) causes of the breach;

(5) existence and nature of secondary damage or risks thereof;

(6) status and nature of communications to affected data subjects;

(7) whether and how the breach has been publicized;

(8) measures implemented to prevent a recurrence; and

(9) any additional matters which may serve as a useful reference.

For those Operators ‘entrusted’ by another Operator with the processing of PI, the 2020 Amendments provide a second option: in lieu of notifying the PPC and data subjects, such “entrusted” Operators may instead alert the “entrusting” Operator as to the breach. In practice, this likely equates to the EU GDPR’s requirement for processors to notify controllers in the event of a breach (although under the 2020 Amendments, direct accountability to the PPC and data subjects is still the default, including for “entrusted” Operators).

In the event of a breach, amended Article 30(5) additionally confers upon data subjects the right to request deletion, suspension of use and suspension of transfer, of affected PI.

4. Expansion of ‘Personal Information’ Concepts and Categories

Another major modification to the APPI is the expanded scope of the types of PI covered. In addition to eliminating the APPI’s differential treatment of temporary PI (retained for up to six months), the 2020 Amendments introduce a new category of information, ‘pseudonymously processed information’, thereby bringing the Japanese data protection regime one additional step closer to the EU GDPR framework.

As currently drafted, the APPI recognizes only two major types of information: PI and anonymously processed information. Notably, the method of rendering anonymously processed information under the APPI – in contrast to the EU GDPR – need not be technically irreversible (unless such data originates in the UK or EEA and the transfer is based on the European Commission’s adequacy decision on Japan, in which case special PPC-drafted Supplementary Rules do require irreversibility); instead, the APPI endeavors to preserve anonymity by requiring Operators to implement appropriate security measures to prevent reidentification.

Pseudonymously processed information is defined by the 2020 Amendments as information relating to an individual, which cannot identify such individual unless collated with additional information. The stated intention behind the drafters’ introduction of the pseudonymization process is to enable Operators to (i) utilize pseudonymously processed information for internal purposes including business analytics, the development of computational models, etc., and/or (ii) retain rather than delete, for potential future statistical analysis usage, pseudonymously processed information derived from PI which are no longer necessary for the original purpose(s) for which they were collected.

The 2020 Amendments and amended PPC Rules model the pseudonymization process on anonymization, requiring the removal of any (i) description, (ii) unique ‘personal identification code’ (as defined in the APPI), and (iii) information relating to the processing method performed to enable the removal of (i) and (ii) above. The immediate result is the creation, by separation, of two types of information: pseudonymously processed information and ‘removed’ PI, where the latter is the ‘key’ enabling reidentification.

The removed PI are treated as PI under the 2020 Amendments, and as such are subject to all of the same requirements and restrictions, although Operators in possession of both removed PI and pseudonymously processed information are additionally obligated to provide enhanced security in order to safeguard the integrity of the pseudonymously processed information (pursuant to the amended PPC Rules and amended Article 35-2(2)).

Notably, and in divergence from the EU GDPR approach to pseudonymously processed information, the 2020 Amendments’ rules governing treatment of such information vary according to the Operator involved. With respect to pseudonymously processed information handled by an Operator in simultaneous possession of the removed (and separately handled) PI, amended Article 35-2 stipulates the following specific requirements:

(i) a prohibition of the collation of such information with other data, such as the removed PI, in a manner which could identify data subjects;

(ii) strict application of the principles of purpose limitation and necessity thereto;

(iii) a prohibition on usage of any contact information contained therein to phone, mail, email or otherwise contact data subjects;

(iv) a prohibition of any transfer thereof to third parties (excluding, amongst others, “entrusted” Operators pursuant to Article 23(5)), unless such transfer is permitted by law or regulation (alternatively, the transfer of pseudonymously processed information by data subject consent is permissible if such information are instead handled as PI);

(v) in the event of their acquisition or the intended alteration of their processing purpose, limitation of the Operator’s disclosure obligation to that of notice by publication;

(vi) non-applicability of breach notification obligations pursuant to amended Article 22-2, provided that the removed PI are not also subject to the breach; and

(vii) the elimination of data subjects’ rights regarding their pseudonymously processed information, with the exception of their Article 35 right to receive a prompt and appropriate response to their complaints (subject to the Operator’s best efforts).

In addition to the above, the APPI’s ‘general’ requirements pursuant to Articles 19-22 will apply to pseudonymously processed information handled by an Operator which simultaneously (but separately) possesses the removed PI. Such Operator will be required to:

(i) maintain accuracy of the pseudonymously processed information (for so long as their utilization remains necessary, after which their immediate deletion – alongside deletion of the removed PI – is required, subject to the Operator’s best efforts);

(ii) implement necessary and appropriate security measures to prevent leakage, loss or damage of the pseudonymously processed information; and

(iii) exercise necessary and appropriate supervision over employees and entrusted persons handling the pseudonymously processed information.  

In contrast, with respect to pseudonymously processed information handled by an Operator which does not simultaneously possess the removed PI, amended Article 35-3 prohibits such Operator from acquiring the removed PI and/or collating the pseudonymously processed information with other information in order to identify data subjects, and limits the applicable provisions of the 2020 Amendments to the following:

(i) the implementation of necessary and appropriate security measures to prevent leakage (a simplified version of Article 20);

(ii) the exercise of necessary and appropriate supervision over employees and entrusted persons handling such information (pursuant to Articles 21 and 22);

(iii) a prohibition on usage of any contact information contained in the pseudonymously processed information to phone, mail, email or otherwise contact data subjects;

(iv) a prohibition of any transfer of such information to third parties (excluding, amongst others, “entrusted” Operators pursuant to Article 23(5)), unless such transfer is permitted by law or regulation (alternatively, the transfer of pseudonymously processed information by data subject consent is permissible if such information are instead handled as PI); and

(v) the elimination of data subjects’ rights regarding their pseudonymously processed information, with the exception of their Article 35 right to receive a prompt and appropriate response to their complaints (subject to the Operator’s best efforts).

In addition to pseudonymously processed information, the 2020 Amendments, pursuant to Article 26-2, introduce an additional, fourth category of information – namely, ‘personally referable information’. This fourth category includes, for example, cookies and purchase histories – items which may not independently be linkable to a specific individual (and thus would not constitute PI) but which could, if transferred to an Operator in possession of additional, related data, become PI. To account for such qualifying transfers, the 2020 Amendments introduce a consent requirement (such as an opt-in cookie banner).

In the case of overseas transfers, the transferring Operator must additionally inform the data subject as to the data protection system and safeguards of the recipient country and third party, as well as take ‘necessary action to ensure continuous implementation’ of APPI-equivalent safeguards by such recipient third party. Unlike for PI, the data subject does not have a right to request additional details regarding the ‘necessary action’ taken by the Operator with respect to an overseas transfer of personally referable information.

5. Preparing for the 2020 Amendments: Next Steps for Japanese and Foreign Operators

Companies conducting business in or with Japan should be mindful of the demanding nature of the 2020 Amendments to the APPI, and the stringency with which the PPC will seek to enforce them – particularly in view of the dismay caused by the LINE matter and the likelihood of efforts by the PPC to avoid similar incidents in the future.

Moreover, as the European Commission finalizes its first review of its 2019 adequacy decision on Japan, the PPC’s interpretative rules and enforcement trends may further intensify, with the aim of bringing Japanese data protection legislation closer to global standards, including the EU GDPR framework. Bearing this in mind, companies – including those not currently subject to the APPI, but which provide goods and/or services to individuals in Japan – would be wise to proactively conduct necessary modifications to their internal data protection policies and mechanisms, in order to ensure operational compliance with the amended APPI by April 2022.

For those Operators involved in international transfers of PI from Japan, absence of a PPC-issued “standard contractual clauses” template renders difficult, and from a compliance standpoint uncertain, any reliance on contractually-imposed APPI-equivalent standards pursuant to amended Article 24(3). However, one potential solution for Operators preparing to rely on this transfer mechanism for overseas PI transfers (excluding to the EEA or UK) may be the European Commission’s revised Standard Contractual Clauses (‘New SCCs’), which are due to be published in early 2021. Subject to certain necessary modifications (of jurisdictional clauses and so forth), Operators may consider utilizing the New SCCs as a starting point, to bind recipient third parties to the stringent data protection standards and obligations of the 2020 Amendments.

Operators engaged in transferring PI should also be mindful of the 2020 Amendments’ onerous due diligence obligations with respect to overseas third parties. Prior to and during any cross-border engagements involving Japan-origin PI, Operators must actively ensure that their third-party recipients of such PI (including partners, vendors and subcontractors, as well as each of their respective partner, vendor and subcontractor recipients, and so forth) successfully implement, and continuously maintain, APPI-equivalent measures.

The 2020 Amendments’ enhanced disclosure obligations invite data subjects to hold Operators accountable with respect to the preventative and/or reactive measures Operators take – or fail to take – to protect their PI. Operators engaging foreign third parties should therefore consider reviewing and amplifying their due diligence of such entities, in addition to assessing the laws in each recipient country, in order to proactively identify and devise solutions to address potential obstacles to APPI adherence overseas.

The 2020 Amendments’ broadened extraterritorial application will also require non-Japanese companies to modify their internal data breach assessment and notification systems, to ensure that the PPC and data subjects in Japan are appropriately notified in the event of a qualifying breach; and to implement any necessary changes to their data subject communications platforms or data subject rights request forms, to enable data subjects in Japan to successfully exercise their amended APPI rights from April 1, 2022.

Once published, the PPC guidelines to the 2020 Amendments will further clarify (and potentially amplify) Operators’ compliance obligations with respect to each of the topics addressed in this blog post. The PPC’s findings in regard to LINE’s conduct may also have significant bearing on future APPI enforcement trends and risks. Therefore, in addition to implementing necessary measures to ensure operational compliance with the 2020 Amendments, companies processing covered PI and interested data privacy professionals should look out for these items over the next several months.   

Photo Credit: Ben Thai from Pixabay

For more Global Privacy thought leadership, see:

The right to be forgotten is not compatible with the Brazilian Constitution. Or is it?

India: Massive overhaul of digital regulation, with strict rules for take-down of illegal content and automated scanning of online content

Russia: New law requires express consent for making personal data available to the public and for any subsequent dissemination

FPF Hosted a CPDP 2021 Panel on US Privacy Law: The Beginning of a New Era


By Srivats Shankar, FPF Legal Intern

For the 14th annual Computers, Privacy and Data Protection conference, which took place between 27 and 29 January, 2021, FPF hosted a panel of experts to discuss “US Privacy Law: The Beginning of a New Era”, whose recording has just been published. The panel was moderated by Dr. Gabriela Zanfir-Fortuna, who was joined by Anupam Chander, Professor of Law at Georgetown University; Jared Bomberg, Senior Counsel for the Senate Committee on Commerce, Science and Transportation; Stacey Schesser, Office of California Attorney General; and Lydia Parnes, Partner at Wilson Sonsini’s Privacy and Cybersecurity Practice.

Broadly, the panel discussed the events that have prompted the shift towards privacy protection in the US in recent years, including the latest privacy law initiatives at the state and federal level. The discussion addressed how regulators are enforcing current laws and preparing for what’s to come, and how these developments may strengthen the Trans-Atlantic relationship in the digital age.

Professor Anupam Chander discussed the most consequential developments in US privacy law in recent years, which he identified as the passage of the California Consumer Privacy Act (CCPA) in 2018, the Supreme Court’s decision in Carpenter v. US, and the passage of the California Privacy Rights Act (CPRA) in 2020. According to Professor Chander, these developments will define the law of privacy over the next decade.

Jared Bomberg discussed developments at the federal level in the United States, including Congress’s increasing focus on comprehensive consumer privacy legislation. In the Senate, the two leading proposals are the Consumer Online Privacy Rights Act (COPRA), led by Senator Cantwell (D-WA), and the SAFE DATA Act, led by Senator Wicker (R-MS). Both bills have many cosponsors. Among these and other privacy bills, there is commonality regarding the rights of access, correction, deletion, and portability. Meanwhile, key differences include the existence of a private right of action, the extent to which a federal law would preempt state laws, and the incorporation of fiduciary responsibilities.

Stacey Schesser discussed privacy law in California, including the enactment of the CCPA and companies’ responses to the law. Following the passage of the GDPR, many companies have come to support compliance with the CCPA. California, by virtue of its large population and major economy, has required many businesses across the United States to come into compliance with the CCPA. Schesser noted that her office has seen consumer frustration with opt-out mechanisms and deletion of personal information, alongside challenges arising from companies interpreting the law in different ways. However, she noted that many companies have complied with the CCPA within the 30-day notice and cure period after being notified of a violation. The initial rollout of Attorney General regulations has attempted to clarify the scope of enforcement, especially with reference to unique problems such as dark patterns.

Lydia Parnes discussed the enforcement of privacy law in the US. She observed that the Federal Trade Commission (FTC) has been fairly aggressive in exercising its enforcement powers. Commissioner Slaughter, who became Acting Chairwoman, has promoted the usage of civil penalties in privacy rights cases. These enforcement actions have become “baseline norms” for companies to follow; they affect not just the individual company but the industry at large. Parnes noted that the FTC has limited resources and that enforcement by state agencies would be an effective way to facilitate change.

In the Q&A session, attendees raised issues of global interoperability, agency enforcement, and competition. Professor Anupam Chander emphasized the importance of the Schrems II decision and the need for the US and Europe to come to another “modus vivendi.” This could be established without a “national” privacy policy, in order to protect the information of foreign individuals whose data may be stored in the United States.

In response to a question about enforcement, Jared Bomberg emphasized that agencies like the FTC need more resources and that there is some acceptance that the FTC should continue enforcement in its existing fashion. He further noted that state Attorneys General could also supplement and collaborate on enforcement. Bomberg also stressed the need for a private right of action. Market constraints also play a role in limiting consumers’ ability to protect their rights, and the current lack of transparency around the power dynamic has created a situation in which customers do not understand what they have signed up for.

In closing, the panelists received a question on the likelihood of seeing a federal privacy law in the next two years. The consensus, as Bomberg put it, was that the odds are both “100% and 0%.” 

Watch the full recording of the panel by following this link.

The right to be forgotten is not compatible with the Brazilian Constitution. Or is it?

Brazilian Supreme Federal Court

Author: Dr. Luca Belli

Dr. Luca Belli is Professor at FGV Law School, Rio de Janeiro, where he leads the CyberBRICS Project and the Latin American edition of the Computers, Privacy and Data Protection (CPDP) conference. The opinions expressed in his articles are strictly personal. The author can be contacted at [email protected].

The Brazilian Supreme Federal Court, or “STF” in its Brazilian acronym, recently took a landmark decision concerning the right to be forgotten (RTBF), finding that it is incompatible with the Brazilian Constitution. This attracted international attention to Brazil for a topic quite distant from the sadly frequent environmental, health, and political crises.

Readers should be warned that, while reading this piece, they might experience disappointment, perhaps even frustration, then renewed interest and curiosity, and finally – and hopefully – an increased open-mindedness, coming to understand a new facet of the RTBF debate and how it is playing out at the constitutional level in Brazil.

This might happen because, although the STF relies on the “RTBF” label, the content behind that label is quite different from what one might expect after following the same debate in Europe. From a comparative law perspective, this landmark judgment tellingly shows how similar constitutional rights play out in different legal cultures and may lead to heterogeneous outcomes depending on the constitutional frameworks of reference.

How it started: insolvency seasoned with personal data

As is well known, the first global debate on what it means to be “forgotten” in the digital environment arose in Europe, thanks to Mario Costeja González, a Spaniard who, paradoxically, will never be forgotten by anyone due to his key role in the construction of the RTBF.

Costeja famously requested to deindex from Google Search information about himself that he considered to be no longer relevant. Indeed, when anyone “googled” his name, the search engine provided as top results links to articles reporting Costeja’s past insolvency as a debtor. Costeja argued that, despite having been declared insolvent, he had paid his debt to justice and society many years before, and it was therefore unfair that his name should continue to be associated ad aeternum with a mistake he made in the past.

The follow up is well known in data protection circles. The case reached the Court of Justice of the European Union (CJEU), which, in its landmark Google Spain Judgment (C-131/12), established that search engines shall be considered as data controllers and, therefore, they have an obligation to de-index information that is inappropriate, excessive, not relevant, or no longer relevant, when a data subject to whom such data refer requests it. Such an obligation was a consequence of Article 12.b of Directive 95/46 on the protection of personal data, a pre-GDPR provision that set the basis for the European conception of the RTBF, providing for the “rectification, erasure or blocking of data the processing of which does not comply with the provisions of [the] Directive, in particular because of the incomplete or inaccurate nature of the data.”

The indirect consequence of this historic decision, and the debate it generated, is that we have all come to consider the RTBF in the terms set by the CJEU. However, what is essential to emphasize is that the CJEU approach is only one possible conception and, importantly, it was possible because of the specific characteristics of the EU legal and institutional framework. We have come to think that RTBF means the establishment of a mechanism like the one resulting from the Google Spain case, but this is the result of a particular conception of the RTBF and of how this particular conception should – or could – be implemented.

The fact that the RTBF has been predominantly analyzed and discussed through European lenses does not mean that this is the only possible perspective, nor that this approach is necessarily the best. In fact, the Brazilian conception of the RTBF is remarkably different from a conceptual, constitutional, and institutional standpoint. The main concern of the Brazilian RTBF is not how a data controller might process personal data (this is the part where frustration and disappointment might arise in the reader), although the STF itself leaves the door open to such a possibility (this is the point where renewed interest and curiosity may arise).

The Brazilian conception of the right to be forgotten

Although the RTBF has acquired fundamental relevance in digital policy circles, it is important to emphasize that, until recently, Brazilian jurisprudence had mainly focused on the juridical need for “forgetting” only in the analogue sphere. Indeed, before the CJEU Google Spain decision, the Brazilian Superior Court of Justice, or “STJ” – the other Brazilian high court, which deals with the interpretation of federal law, as opposed to the previously mentioned STF, which deals with constitutional matters – had already construed the RTBF as a right not to be remembered, asserted by the individual vis-à-vis traditional media outlets.

This interpretation first emerged in the “Candelaria massacre” case, a gloomy page of Brazilian history: a multiple homicide perpetrated in 1993 in front of the Candelaria Church, a beautiful colonial Baroque building in downtown Rio de Janeiro. The gravity and the particularly striking stage of the massacre led Globo TV, a leading Brazilian broadcaster, to feature it in a TV show called Linha Direta. Importantly, the show included in its narration details about a man suspected of being one of the perpetrators of the massacre but later acquitted.

Understandably, the man filed a complaint arguing that the inclusion of his personal information in the TV show was causing him severe emotional distress, while also reviving suspicions against him for a crime of which he had been acquitted many years before. In September 2013, ruling on Special Appeal No. 1,334,097, the STJ agreed with the plaintiff, establishing the man’s “right not to be remembered against his will, specifically with regard to discrediting facts.” This is how the RTBF was born in Brazil.

Importantly for our present discussion, this interpretation was not born out of digital technology and does not turn on the delisting of specific types of information from search engine results. In Brazilian jurisprudence the RTBF has been conceived as a general right to effectively limit the publication of certain information. The man included in the Globo reportage had been acquitted many years before; hence he had a right to be “let alone,” as Warren and Brandeis would argue, and not to be remembered for something he had not even committed. The STJ therefore constructed its vision of the RTBF on the basis of Article 5.X of the Brazilian Constitution, which enshrines the fundamental rights to intimacy and preservation of image, two fundamental features of privacy. 

Hence, although they utilize the same label, the STJ and the CJEU conceptualize two remarkably different rights when they refer to the RTBF. While both conceptions aim at limiting access to specific types of personal information, the Brazilian conception differs from the EU one on at least three levels.

First, their constitutional foundations. While both conceptions are intimately intertwined with individuals’ informational self-determination, the STJ built the RTBF on the protection of privacy, honour and image, whereas the CJEU built it upon the fundamental right to data protection, which in the EU framework is a standalone fundamental right. Conspicuously, an explicit right to data protection did not exist in the Brazilian constitutional framework at the time of the Candelaria case, and only since 2020 has it been in the process of being recognized.

Secondly, and consequently, the original goal of the Brazilian conception of the RTBF was not to regulate how a controller should process personal data but rather to protect the private sphere of the individual. In this perspective, the goal of the STJ was not – and could not have been – to regulate the deindexation of specific incorrect or outdated information, but rather to regulate the deletion of “discrediting facts” so that the private life, honour and image of an individual might not be illegitimately violated.

Finally, yet extremely importantly, the fact that an institutional framework dedicated to data protection was simply absent in Brazil at the time of the decision did not allow the STJ the same leeway as the CJEU. The EU Justices enjoyed the privilege of delegating the implementation of the RTBF to search engines because such implementation would receive guidance from, and be subject to the review of, a well-consolidated system of European Data Protection Authorities. At the EU level, DPAs are expected to guarantee a harmonious and consistent interpretation and application of data protection law. In Brazil, by contrast, a DPA was only established in late 2020 and announced its first regulatory agenda only in late January 2021.

This latter point is far from trivial and, in the opinion of this author, an essential preoccupation that might have driven the subsequent RTBF conceptualization of the STJ.

The stress-test

The soundness of the Brazilian definition of the RTBF, however, was to be tested again by the STJ, in the context of another grim and unfortunate page of Brazilian history: the Aida Curi case. This case originated with the sexual assault and subsequent homicide of the young Aida Curi in Copacabana, Rio de Janeiro, on the evening of 14 July 1958. At the time, the case attracted considerable media attention, not only because of its mysterious circumstances and the young age of the victim, but also because the perpetrators tried to disguise the assault by throwing the victim’s body from the rooftop of a very tall building on Avenida Atlantica, the fancy avenue facing Copacabana beach.

Needless to say, Globo TV considered the case a perfect story for yet another Linha Direta episode. Aida Curi’s relatives, far from enjoying the TV show, sued the broadcaster for moral damages and demanded the full enjoyment of their RTBF – in the Brazilian conception, of course. According to the plaintiffs, it was not conceivable that, almost 50 years after the murder, Globo TV could publicly broadcast personal information about the victim – and her family – including the victim’s name and address, in addition to unauthorized images, thus bringing back a long-closed and extremely traumatic set of events.

The brothers of Aida Curi claimed reparation from Rede Globo, but the STJ decided that the time elapsed was enough to mitigate the effects of anguish and pain on the dignity of Aida Curi’s relatives, while arguing that it was impossible to report the events without mentioning the victim. This decision was appealed by Ms Curi’s family members, who demanded, by means of Extraordinary Appeal No. 1,010,606, that the STF recognize “their right to forget the tragedy.” It is interesting to note that the way the demand is constructed in this Appeal tellingly exemplifies the Brazilian conception of “forgetting” as erasure and prohibition of divulgation.

At this point, the STF identified in the Appeal an interest in debating the issue “with general repercussion,” a peculiar judicial procedure that the Court can utilize when it recognizes that a given case has particular relevance and transcendence for the Brazilian legal and judicial system. Indeed, the decision in a case with general repercussion does not only bind the parties but establishes a jurisprudence that must be replicated by all lower courts.

In February 2021, the STF finally deliberated on the Aida Curi case, establishing that “the idea of a right to be forgotten is incompatible with the Constitution, thus understood as the power to prevent, due to the passage of time, the disclosure of facts or data that are true and lawfully obtained and published in analogue or digital media” and that “any excesses or abuses in the exercise of freedom of expression and information must be analyzed on a case-by-case basis, based on constitutional parameters – especially those relating to the protection of honor, image, privacy and personality in general – and the explicit and specific legal provisions existing in the criminal and civil spheres.”

In other words, what the STF has deemed incompatible with the Federal Constitution is a specific interpretation of the Brazilian version of the RTBF. What is not compatible with the Constitution is to argue that the RTBF allows one to prohibit the publication of true facts, lawfully obtained. At the same time, however, the STF clearly states that it remains possible for any court of law to evaluate, on a case-by-case basis and according to constitutional parameters and existing legal provisions, whether a specific episode allows the use of the RTBF to prohibit the divulgation of information that undermines the dignity, honour, privacy, or other fundamental interests of the individual.

Hence, while explicitly prohibiting the use of the RTBF as a general right to censorship, the STF leaves room for the use of the RTBF to delist specific personal data in an EU-like fashion, while specifying that this must be done with guidance from the Constitution and the law.

What next?

Given the core differences between the Brazilian and EU conceptions of the RTBF, as highlighted above, it is understandable, in the opinion of this author, that the STF adopted a less proactive and more conservative approach. This must be considered especially in light of the very recent establishment of a data protection institutional system in Brazil.

It is understandable that the STF might have preferred to de facto delegate to the courts the interpretation of when and how the RTBF may be rightfully invoked, according to constitutional and legal parameters. First, in the Brazilian interpretation, the RTBF fundamentally insists on the protection of privacy – i.e. the private sphere of the individual – and, while the existence of data protection concerns is admitted, these are not the main ground on which the Brazilian RTBF conception relies.

Such caution is also understandable in a country and a region where the social need to remember and shed light on a recent history marked by dictatorships, well-hidden atrocities, and opacity outweighs the legitimate individual interest in prohibiting the circulation of truthful and legally obtained information. In the digital sphere, however, the RTBF quintessentially translates into an extension of informational self-determination, which the Brazilian General Data Protection Law, better known as the “LGPD” (Law No. 13,709/2018), enshrines in its Article 2 as one of the “foundations” of data protection in the country, and whose fundamental character was recently recognized by the STF itself.

In this perspective, it is useful to recall the dissenting opinion of Justice Luiz Edson Fachin in the Aida Curi case, stressing that “although it does not expressly name it, the Constitution of the Republic, in its text, contains the pillars of the right to be forgotten, as it celebrates the dignity of the human person (article 1, III), the right to privacy (article 5, X) and the right to informational self-determination – which was recognized, for example, in the decision on the precautionary measures of Direct Unconstitutionality Actions No. 6,387, 6,388, 6,389, 6,390 and 6,393, under the rapporteurship of Justice Rosa Weber (article 5, XII).”

It is the opinion of this author that the Brazilian debate on the RTBF in the digital sphere would be clearer if its dimension as a right to deindexation of search engine results were clearly regulated. It is understandable that the STF did not dare to regulate this, given its interpretation of the RTBF and the still embryonic data protection institutional framework in Brazil. However, given the increasing datafication we are currently witnessing, it would be naïve not to expect that further RTBF claims concerning the digital environment and, specifically, the way search engines process personal data will keep emerging.

The fact that the STF has left the door open to applying the RTBF in the case-by-case analysis of individual claims may reassure the reader regarding the primacy of constitutional and legal arguments in such analysis. It may also lead the reader to – very legitimately – wonder whether such a choice is de facto the most efficient and coherent way to deal with a potentially enormous number of claims, given the margin of appreciation and interpretation that each court may have.

An informed debate, able to clearly highlight the existing options and the most efficient and just ways to implement them in the Brazilian context, would be beneficial. This will likely be one of the goals of the upcoming Latin American edition of the Computers, Privacy and Data Protection conference (CPDP LatAm), which will take place in July, entirely online, and will explore the most pressing privacy and data protection issues for Latin American countries.

Photo Credit: “Brasilia – The Supreme Court” by Christoph Diewald is licensed under CC BY-NC-ND 2.0

If you have any questions about engaging with The Future of Privacy Forum on Global Privacy and Digital Policymaking contact Dr. Gabriela Zanfir-Fortuna, Senior Counsel, at [email protected].

India: Massive overhaul of digital regulation, with strict rules for take-down of illegal content and automated scanning of online content


On February 25, the Indian Government notified and published the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. These rules mirror the EU’s Digital Services Act (DSA) proposal to some extent: they propose a tiered approach based on the scale of the platform, and they touch on intermediary liability, content moderation, take-down of illegal content from online platforms, as well as internal accountability and oversight mechanisms. But they go beyond such rules by adding a Code of Ethics for digital media, similar to the code of ethics classic journalistic outlets must follow, and by proposing an “online content” labelling scheme for content that is safe for children.

The Code of Ethics applies to online news publishers, as well as intermediaries that “enable the transmission of news and current affairs”. This part of the Guidelines (the Code of Ethics) has already been challenged in the Delhi High Court by news publishers this week. 

The Guidelines have raised several types of concerns in India: their impact on freedom of expression; their impact on the right to privacy through the automated scanning of content and the imposed traceability of even end-to-end encrypted messages, so that the originator of a message can be identified; and the Government’s choice to use executive action for such profound changes. The Government, through the two Ministries involved in the process, is scheduled to testify before the Standing Committee on Information Technology of the Parliament on March 15.

New obligations for intermediaries

“Intermediaries” include “websites, apps and portals of social media networks, media sharing websites, blogs, online discussion forums, and other such functionally similar intermediaries” (as defined in rule 2(1)(m)).

Here are some of the most important rules laid out in Part II of the Guidelines, dedicated to Due Diligence by Intermediaries:

“Significant social media intermediaries” have enhanced obligations

“Significant social media intermediaries” are social media services with a number of users above a threshold to be defined and notified by the Central Government. This concept is similar to the DSA’s “Very Large Online Platform” (VLOP); however, the DSA includes clear criteria in the proposed act itself on how to identify a VLOP.

As for “significant social media intermediaries” in India, they will have additional obligations (similar to how the DSA proposal in the EU scales obligations): 

These “Guidelines” appear to have the legal effect of a statute, and they are being adopted through executive action to replace guidelines adopted in 2011 by the Government under powers conferred by the Information Technology Act 2000. The new Guidelines would enter into force immediately upon publication in the Official Gazette (there is no information yet on when publication is scheduled). The Code of Ethics would enter into force three months after publication in the Official Gazette. As mentioned above, parts of these rules have already been challenged in court.

Get smart on these issues and their impact

Check out these resources: 

Another jurisdiction to keep your eyes on: Australia

Also note that, while the European Union is starting its heavy and slow legislative machine (by appointing Rapporteurs in the European Parliament and holding first discussions on the DSA proposal in the relevant working group of the Council), another country is set to adopt digital content rules soon: Australia. The Government is currently considering an Online Safety Bill, which was open to public consultation until mid-February and which would also include a “modernised online content scheme,” creating new classes of harmful online content, as well as take-down requirements for image-based abuse, cyber abuse and harmful content online, requiring removal within 24 hours of receiving a notice from the eSafety Commissioner.

If you have any questions about engaging with The Future of Privacy Forum on Global Privacy and Digital Policymaking contact Dr. Gabriela Zanfir-Fortuna, Senior Counsel, at [email protected].

Russia: New Law Requires Express Consent for Making Personal Data Available to the Public and for Any Subsequent Dissemination

Authors: Gabriela Zanfir-Fortuna and Regina Iminova

Source: Pixabay.com, by Opsa

Amendments to the Russian general data protection law (Federal Law No. 152-FZ on Personal Data) adopted at the end of 2020 enter into force today (Monday, March 1st), with some provisions having their effective date postponed until July 1st. The changes are part of a legislative package that also amends the Criminal Code to criminalize disclosure of personal data about “protected persons” (several categories of government officials). The amendments to the data protection law introduce consent-based restrictions for any organization or individual that initially publishes personal data, as well as for those that collect and further disseminate personal data that has been made publicly available on the basis of consent, such as on social media, blogs or any other sources. 

The amendments:

The potential impact of the amendments is broad. The new law prima facie affects social media services, online publishers, streaming services, bloggers, and any other entity that might be considered to make personal data available to “an indefinite number of persons.” These entities now have to collect, and prove they have, separate consent for making personal data publicly available, as well as for further publishing or disseminating “personal data allowed by the data subject to be disseminated” (PDD, detailed below) that was originally published lawfully by other parties.

Importantly, the new provisions in the Personal Data Law dedicated to PDD do not include any specific exception for processing PDD for journalistic purposes. The only exception recognized is processing PDD “in the state and public interests defined by the legislation of the Russian Federation”. The Explanatory Note accompanying the amendments confirms that consent is the exclusive lawful ground that can justify dissemination and further processing of PDD and that the only exception to this rule is the one mentioned above, for state or public interests as defined by law. It is thus expected that the amendments might create a chilling effect on freedom of expression, especially when also taking into account the corresponding changes to the Criminal Code.

The new rules seem to be part of a broader effort in Russia to regulate information shared online and available to the public. In this context, it is noteworthy that other amendments to Law 149-FZ on Information, IT and Protection of Information, solely impacting social media services, were also passed into law in December 2020 and already entered into force on February 1st, 2021. Social networks are now required to monitor content and “restrict access immediately” for users who post information about state secrets, justification of or calls to terrorism, pornography, promotion of violence and cruelty, obscene language, manufacturing of drugs, or methods of committing suicide, as well as calls for mass riots. 

Below we provide a closer look at the amendments to the Personal Data Law that entered into force on March 1st, 2021. 

A new category of personal data is defined

The new law defines a category of “personal data allowed by the data subject to be disseminated” (PDD), the definition being added as paragraph 1.1 to Article 3 of the Law. This new category of personal data is defined as “personal data to which an unlimited number of persons have access, and which is provided by the data subject by giving specific consent for the dissemination of such data, in accordance with the conditions in the Personal Data Law” (unofficial translation). 

The old law had a dedicated provision that referred to how this type of personal data could be lawfully processed, but it was vague and offered almost no details. In particular, Article 6(10) of the Personal Data Law (the provision corresponding to Article 6 GDPR on lawful grounds for processing) provided that processing of personal data is lawful when the data subject gives access to their personal data to an unlimited number of persons. The amendments abrogate this paragraph, before introducing an entirely new article containing a detailed list of conditions for processing PDD only on the basis of consent (the new Article 10.1).

Perhaps in order to avoid misunderstanding on how the new rules for processing PDD fit with the general conditions on lawful grounds for processing personal data, a new paragraph 2 is introduced in Article 10 of the law, which details conditions for processing special categories of personal data, to clarify that processing of PDD “shall be carried out in compliance with the prohibitions and conditions provided for in Article 10.1 of this Federal Law”.

Specific, express, unambiguous and separate consent is required

Under the new law, “data operators” that process PDD must obtain specific and express consent from data subjects to process personal data, including any use or dissemination of the data. Notably, under Russian law, “data operators” covers both controllers and processors in the sense of the General Data Protection Regulation (GDPR), or businesses and service providers in the sense of the California Consumer Privacy Act (CCPA).

Specifically, under Article 10.1(1), the data operator must ensure that it obtains a separate consent dedicated to dissemination, other than the general consent for processing personal data or other type of consent. Importantly, “under no circumstances” may individuals’ silence or inaction be taken to indicate their consent to the processing of their personal data for dissemination, under Article 10.1(8).

In addition, the data subject must be provided with the possibility to select the categories of personal data which they permit for dissemination. Moreover, the data subject also must be provided with the possibility to establish “prohibitions on the transfer (except for granting access) of [PDD] by the operator to an unlimited number of persons, as well as prohibitions on processing or conditions of processing (except for access) of these personal data by an unlimited number of persons”, per Article 10.1(9). It seems that these prohibitions refer to specific categories of personal data provided by the data subject to the operator (out of a set of personal data, some categories may be authorized for dissemination, while others may be prohibited from dissemination).

If the data subject discloses personal data to an unlimited number of persons without providing the operator with the specific consent required by the new law, not only the original operator but all subsequent persons or operators that processed or further disseminated the PDD bear the burden of proof to “provide evidence of the legality of subsequent dissemination or other processing,” under Article 10.1(2). This seems to imply that they must prove consent was obtained for dissemination (a probatio diabolica in this case). According to the Explanatory Note to the amendments, the intention was indeed to shift the burden of proving the legality of processing PDD from data subjects to data operators, since the Note makes specific reference to the fact that, before the amendments, the burden of proof rested with data subjects.

If the operator does not obtain the separate consent for dissemination, but the other conditions for lawfulness of processing are met, the operator may process the personal data, but without the right to distribute or disseminate them – Article 10.1(4). 

A Consent Management Platform for PDD, managed by the Roskomnadzor

The express consent to process PDD can be given directly to the operator or through a special “information system” (which appears to be a consent management platform) run by the Roskomnadzor, according to Article 10.1(6). The provisions related to setting up this consent platform for PDD will enter into force on July 1st, 2021. The Roskomnadzor is expected to publish technical details about the functioning of the platform, and guidelines on its use, in the coming months. 

Absolute right to opt-out of dissemination of PDD

Notably, the dissemination of PDD can be halted at any time on request of the individual, regardless of whether the dissemination is lawful, according to Article 10.1(12). This type of request is akin to a withdrawal of consent. The provision sets some requirements for the content of such a request: for instance, it must include the data subject’s contact information and list the personal data whose dissemination should be terminated. Consent to the processing of the personal data concerned is terminated once the operator receives the opt-out request – Article 10.1(13).

A request to opt out of having personal data disseminated to the public when this is done unlawfully (without the data subject's specific, affirmative consent) can also be made through a court, as an alternative to submitting it directly to the data operator. In this case, the operator must terminate the transmission of, or access to, the personal data within three business days of receiving the demand, or within the timeframe set in the court decision once it has come into effect – Article 10.1(14).

A new criminal offense: The prohibition on disclosure of personal data about protected persons

Sharing personal data or information about intelligence officers and their personal property is now a criminal offense under the new rules, which amended the Criminal Code. The law obliges all operators of personal data, including government departments and mobile operators, to ensure the confidentiality of personal information concerning protected persons, their relatives, and their property. Under the new law, "protected persons" include employees of the Investigative Committee, the FSB, the Federal Protective Service, the National Guard, the Ministry of Internal Affairs, and the Ministry of Defense, as well as judges, prosecutors, investigators, law enforcement officers and their relatives. Moreover, the list of protected persons can be further detailed by the head of the relevant state body in which the specified persons work.

Previously, the law allowed for the temporary prohibition of the dissemination of personal data of protected persons only in the event of imminent danger in connection with their official duties and activities. The new amendments make it possible to take protective measures even in the absence of a threat of encroachment on their life, health or property.

What to watch next: New amendments to the general Personal Data Law are on their way in 2021

There are several developments to follow in this fast changing environment. First, at the end of January, the Russian President gave the government until August 1 to create a set of rules for foreign tech companies operating in Russia, including a requirement to open branch offices in the country.

Second, a bill (No. 992331-7) proposing new amendments to the overall framework of the Personal Data Law (No. 152-FZ) was introduced in July 2020 and was the subject of a Resolution passed in the State Duma on February 16, which opened a period for amendments to be submitted, until March 16. The bill is on the agenda for a potential vote in May. The changes would entail: expanding the possibility to obtain valid consent through other unique identifiers which are currently not accepted by the law, such as unique online IDs; changes to purpose limitation; a possible certification scheme for effective methods to erase personal data; and new competences for the Roskomnadzor to establish requirements for the deidentification of personal data and specific methods for effective deidentification.

If you have any questions on Global Privacy and Data Protection developments, contact Gabriela Zanfir-Fortuna at [email protected]

Schrems II: Article 49 GDPR derogations may not be so narrow and restrictive after all?

by Rob van Eijk and Gabriela Zanfir-Fortuna

On January 28, 2021, the German Federal Ministry of the Interior organized a conference celebrating the 40th Data Protection Day, the date on which the Council of Europe's data protection convention, known as "Convention 108", was opened for signature. One of the invited speakers and panelists was Prof. Dr. Dr. von Danwitz, the judge-rapporteur in the CJEU Schrems I Case (C‑362/14), the CJEU Schrems II Case (C‑311/18), and the CJEU Case La Quadrature du Net and Others (C-511/18). He spoke at length about the Schrems II judgment and its significance in cementing the importance of the fundamental right to personal data protection, and generously replied to specific questions about the options now available to companies to lawfully transfer personal data from the European Union to the United States. In particular, his comments on the possibility of relying on the Article 49 GDPR derogations were noteworthy, as they seemed to contradict the narrow approach taken by the European Data Protection Board in its interpretation.

The recording of the keynote by Prof. Dr. Dr. von Danwitz starts here (1h12m).
The recording of the panel intervention by von Danwitz starts here (2h17m).

Please note that all quotes used in this article are a translation from German.

The broader context: A fundamental right that still needs to be taken seriously

In his introductory remarks providing context to the Schrems II judgment of the CJEU, judge von Danwitz admitted that he had “quite a lot of sleepless nights over this case”. He added that, “however, it is not the task of a court to find the least problematic solution to a case”. 

According to the judge, the awe that the judgment created for some comes from the fact that the consequences of having the right to the protection of personal data elevated as fundamental right in the EU Charter and in the Treaty of Lisbon “are still not fully understood and recognized today”. He added that “the EU attaches great importance to the protection of these rights (respect for private life under Article 7 and protection of personal data under Article 8 of the Charter – n.), especially in comparison to the partial or rudimentary textual references in national constitutions”. 

Judge von Danwitz showed that he is fully aware of the breadth and the pervasiveness of cross-border data transfers in today’s digitalized society: “data transfers to third countries are not rare incidents. It is common practice to outsource certain data-based services to third countries. This may be economically useful and desirable for enterprises, but it should not compromise the level of protection of personal data”. 

He explained that “the necessary balance between the legitimate interests of economic operators and the promotion of international trade on the one hand, and the right to the protection of personal data on the other, is reflected in the legal requirements that an essentially equivalent level of protection of personal data must be ensured when data is transferred to third countries”.

The judge also acknowledged that some countries may not ensure at all this high level of protection of personal data and this fact “may have economic disadvantages for companies in individual cases”. He explained that “nevertheless, this is the necessary consequence of the fundamental decision taken by the European Union to ensure a high level of protection of personal data”. As he ultimately framed it, the entire discussion “is about the much more fundamental question of what is the society we want to live in and our aspiration to shape this society in line with European law and values”.

No legal void: The Court saw Article 46 safeguards and Article 49 derogations as filling the gap

First in his keynote and later on in the panel, judge von Danwitz explained that the Court decided to annul the Privacy Shield with immediate effect, rather than allowing for a grace period, because there was no legal void in its absence. He mentioned that the GDPR Article 46 safeguards and Article 49 derogations "cover the absence of an adequacy decision".

During the panel, the discussion centered on the following question: "How, for example, am I as an entrepreneur supposed to implement these requirements for data transfers after Schrems II? Which guidelines apply?"

In response to the question, von Danwitz remarked: “The question is very legitimate. The question is this, as an enterprise, do I have to transfer data to third countries for which there is no adequacy decision by the European Commission? Yes or no? That’s the fundamental question.” 

“And if this question is to be answered to the effect that [transfers] are absolutely necessary, then I need to use, e.g., standard contractual clauses. At least that’s the standard approach. (…) If standard contractual clauses are not possible, because my processor in the third country cannot comply with these clauses under the applicable national law, then, of course, there’s the question of the transfer of data by relying on [the derogations for specific situations in] Article 49 GDPR.” 

Von Danwitz mentioned that Article 49 derogations are in particular an option for intra-group transfers and that they should be more attentively explored. “(…) In my opinion, the opportunities granted by Article 49 have not been fully explored yet. I believe they are not so narrow that they restrict any kind of transfer, especially when we’re talking about transfers within one corporation or group of companies.” 

Von Danwitz indicated that the conditions from the text of the GDPR in any case must be met. For example, in the case of the derogation relying on necessity to enter a contract or for the performance of a contract, the first test is to ask “is the transfer of that data really required? Is it really necessary to fulfill the contract?” He added: “In my opinion, this gives people sufficient scope of action”.

The judge did not go into further detail, and also clarified that questions related to the scope of the Article 49 derogations might be posed to the Court in the future, saying he did not want to “preempt any judgments by making a statement now”. 

Although von Danwitz made these observations in a personal capacity, they mark a new opening in the discussion on data transfers, which some refer to as a proverbial Gordian Knot. Furthermore, the remarks are important against the background of the European Data Protection Board (EDPB) Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data. The public consultation period for the recommendations has ended and the EDPB has started processing the comments submitted by stakeholders.

You can find the recordings here: in German, English, and French. The intervention in German by Prof. Dr. Dr. von Danwitz on the exploration of the Article 49 derogations starts at this bookmark: 02h23m12s.

Photo credit: arbyreed CC BY-NC-SA 2.0.

To learn more about FPF in Europe, please visit fpf.org/about/eu.