FPF Statement on White House Executive Order to Implement the European Union-U.S. Data Privacy Framework

October 7, 2022 Statement from Future of Privacy Forum’s CEO Jules Polonetsky:

With this step, the U.S. puts in place practical surveillance limitations, oversight, and individual redress in the context of national security that are unmatched almost anywhere else in the world. With this progress, leading democracies are converging on surveillance standards. Constitutional limitations prevent a U.S. system that is identical to the European Union’s, but the Court of Justice of the EU has helped bring about U.S. reforms that will significantly protect privacy in the context of national security. Although there are important legal discussions to have about the exact nature of the judicial redress and the oversight mechanism, as well as the restrictions on bulk collection, this is a momentous achievement.

Particularly important is the reciprocity requirement for redress, which requires any country seeking to benefit from this system to implement safeguards for U.S. citizens’ data; this will help advance global standards.

Read the White House Executive Order here and the White House Fact Sheet here.

FPF’s VP for Global Privacy, Dr. Gabriela Zanfir-Fortuna, spoke about the EO at an IAPP LinkedIn Live on ‘The EU-U.S. Data Privacy Framework & Next Steps for Data Transfers’ on Friday, October 7. Watch it here.

Judge Declares Buenos Aires’ Fugitive Facial Recognition System Unconstitutional

On September 7, a trial judge declared the implementation of the Fugitive Facial Recognition System (SRFP, for its initials in Spanish) by the Government of the City of Buenos Aires unconstitutional. The decision sets an important precedent regarding the risks to privacy and intimacy in public spaces posed by public surveillance for law enforcement purposes. Remarkably, it is also one of the very few known judicial decisions in the global privacy space that clearly treats the rights to privacy, intimacy, and data protection as rights having collective relevance rather than merely individual rights. The decision revealed multiple violations of individuals’ privacy, as well as instances of abuse of authority by system operators.

The SRFP was implemented in 2019 as part of the Video Surveillance System of the capital of Argentina and was previously the subject of a government suspension order in April 2020 due to reduced system efficacy caused by pandemic-related masking. The system consisted of facial recognition software installed in selected video surveillance cameras already distributed in Buenos Aires. The Urban Surveillance Center of the Police Department was responsible for viewing and processing the images and checking them against a national database containing capture orders for fugitives of the justice system (the CONARC database). Upon finding a match, the system issued an alarm and dispatched officers to detain the alleged fugitive.
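To make the matching workflow just described concrete, below is a minimal illustrative sketch in Python of the match-and-alert logic: compare a detected face against capture orders and raise an alarm on a match. Every name, threshold, and data structure here is a hypothetical assumption for illustration only, not a detail of the actual SRFP implementation; the court’s findings, discussed below, map onto the two failure points the sketch makes explicit (stale orders and missing human validation).

```python
from dataclasses import dataclass

MATCH_THRESHOLD = 0.90  # assumed similarity cutoff; purely illustrative


@dataclass
class CaptureOrder:
    order_id: str
    dni: str               # national identity document number
    name: str
    face_embedding: list   # biometric template (linked via RENAPER data)
    is_active: bool        # expired or withdrawn orders must not trigger alarms


def similarity(a: list, b: list) -> float:
    """Cosine similarity between two face embeddings (illustrative)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0


def check_frame(face: list, conarc: list) -> "CaptureOrder | None":
    """Compare a detected face against CONARC capture orders.

    The decision documents alarms raised on stale orders and detentions
    made without validating the DNI; the checks below mark where those
    safeguards would have to sit in a pipeline like this one.
    """
    for order in conarc:
        if not order.is_active:
            continue  # stale CONARC entries triggering alarms was a documented failure
        if similarity(face, order.face_embedding) >= MATCH_THRESHOLD:
            # Alarm: officers are dispatched, but should still verify the
            # individual's DNI against the order before any detention.
            return order
    return None
```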

Following the announcement of the SRFP, many civil society organizations criticized the risks the system posed to privacy and other fundamental rights (such as freedom of association), as well as its potential for abuse given its wide scope and nature. In December 2020, the Observatorio de Derecho Informatico Argentino (ODIA), joined by other civil society organizations, filed an amparo1 lawsuit before an administrative court against the Government of the City of Buenos Aires for i) issuing Resolution 398, which created the SRFP; ii) approving Law 6.339, which incorporated the SRFP into the local public security law (Law 5.688); and iii) implementing the system without adequate control and oversight mechanisms.

The court agreed with ODIA and declared the SRFP unconstitutional, prohibiting its operation until control and oversight mechanisms required by law are put in place.

1. Privacy as a collective right, redressable through constitutional mechanisms

The first element of the decision analyzed ODIA’s standing to bring the lawsuit and whether the amparo action was the appropriate vehicle for doing so. As an initial matter, the court determined that ODIA had standing to sue because it alleged a violation of the fundamental rights to privacy, intimacy, and protection of personal data. Argentinian courts recognize three categories of procedural standing: i) individual rights; ii) rights of collective incidence in regard to collective goods; and iii) rights of collective incidence in regard to homogeneous individual interests. The court determined that the rights to privacy, intimacy, and data protection fall under the second category: rights of collective incidence in regard to collective goods. For litigation relying on such rights, a plaintiff’s identity is not relevant as long as the case relates to a collective incidence affecting the citizens of Buenos Aires. The relevant question is whether the plaintiffs are, or represent, citizens whose presence in the city makes them susceptible to a privacy violation.

The court also considered whether an amparo action was the appropriate redress for the alleged harm. The court determined that an amparo is permitted as long as the plaintiff is able to demonstrate i) an actual or imminent injury, restriction, alteration, or threat to constitutional rights; ii) manifest illegality or arbitrary action by the authority; and iii) the possibility of judicial redress within a reasonable time. In this case, because of how the SRFP was implemented and the risks it posed to fundamental rights, the court concluded that an amparo action would provide an effective and timely remedy, as opposed to the contentious-administrative procedure set forth in the Administrative and Tax Code. Additionally, as a constitutional recourse, an amparo action allowed the court to examine the constitutionality of the incorporation of the SRFP into Buenos Aires’ public security law, in light of the rights and obligations in the national Constitution and applicable international treaties, such as the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Convention 108).

2. Lack of control and oversight

The second element of the court’s decision focused on the failure of the Government to adopt safeguards to counteract the risks posed by the SRFP’s implementation. Resolution 398, which approved the implementation of the system, authorized the Ministry of Justice and Security of Buenos Aires (“the Ministry”) to issue additional regulations for its “effective implementation” and invited the Public Defender’s Office to audit the system. In March 2020, the Public Defender signed a collaboration agreement with the Ministry in the context of Resolution 398, but noted that beginning in 2019 it had documented “serious flaws” in the functioning of the SRFP that led to unlawful detentions.

The decision found several inconsistencies between the local government’s and the Public Defender’s assertions regarding unlawful detentions arising from SRFP “false positives.” When asked about the number of detentions due to “false positives,” the government claimed there had been no wrongful detentions after the implementation of the SRFP, and that any false alarm or wrongful detention arose from potentially erroneous information in the CONARC fugitive database. Contrary to those claims, the Public Defender verified that unlawful detentions of individuals due to SRFP “false positives” had occurred. These “false positive” cases were also confirmed by the National Directorate of the National Recidivism Registry of Argentina, which noted that in certain detentions officers had failed to validate the order’s information against the individual’s DNI (national identity document) or their biometrics, indicating that the police were relying on the system’s alarms even though those alarms could be triggered by inaccurate information.

The decision highlighted a pattern of unlawful detentions lasting one to three hours in which individuals were mistakenly identified as fugitives. The court noted that in several cases the SRFP correctly identified an individual but issued an alarm based on invalid or expired orders in the underlying CONARC database. In one example, a man was mistakenly intercepted at a metro station due to an alarm issued by the system; after some time, the officers noticed that the capture order provided by the SRFP registry contained a name different from the one appearing on the individual’s DNI, and it later emerged that the individual’s DNI could still be linked to the capture order within the SRFP despite a formal request for deletion. In another example, a woman was intercepted and arrested by eight police officers at a railway station in July 2019 on the basis of a CONARC capture order that had expired years earlier.

Separately, the court also documented the government’s failure to implement other legally required oversight mechanisms. The public security law of Buenos Aires required all video surveillance systems, including the SRFP, to be included in a Registry providing operational status information for each system. The law also required the Ministry to send an annual report to a Special Committee for the Monitoring of Video Surveillance Systems (Special Committee) and the Public Defender’s Office describing the technical specifications of the software used by the SRFP, any modifications, and the criteria for the installation of video surveillance cameras at certain points in the City. However, almost two years after the system’s implementation, the databases had never been registered and the Special Committee had never been established.

3. An unreliable database

Throughout the decision, the court emphasized the problematic nature of the system’s source of information. The SRFP operated through the CONARC database, which contains capture orders issued by national and local courts. However, according to the officials in charge of its operation, the CONARC database has “serious flaws” that, when the database is used for the SRFP, can lead to “false positives” resulting in unlawful detentions, several of which the court described in detail. Updates to the database are often delayed due to the overall functioning of the judicial system, and errors occur in linking a fugitive’s information with biometric data, since the latter is provided by the National Registry of Persons (RENAPER).

Ultimately, the court held that the SRFP is contrary to the principle of the presumption of innocence: almost anyone in the City could be erroneously identified as a fugitive and detained by the police. The court found that, contrary to the local government’s assertions, this risk was ongoing and widespread, and had been since the system was first implemented, as demonstrated by the Public Defender’s documentation. Additionally, the judge determined that although some flaws are rooted in the CONARC database, the SRFP could not be considered lawful per se, since its operation relied exclusively on that database. The court indicated that the “mere possibility” of adverse consequences, combined with the absence of adequate control and oversight mechanisms, demonstrated that the SRFP posed a “serious risk” of a breach of citizens’ privacy.

4. Abuse of authority findings

The decision also noted several inconsistencies in the government’s description of the system’s operation following its implementation. The Ministry argued that the SRFP was a completely automated process that left no space for discretionary or arbitrary human intervention. Under the law, the SRFP could rely only on the information provided by the CONARC database, and the public security law of Buenos Aires specifically prohibited the incorporation of data on individuals not included in that database. As a result, the number of records in the system should have matched the number of records in the CONARC fugitive database. However, after obtaining the lists of records in the CONARC database and the number of requests to RENAPER for biometric information, the court noticed that the numbers did not match.

A comparison of CONARC and RENAPER records revealed that, including periods when the SRFP was allegedly suspended, the government made 9,392,372 requests to access biometric data, far in excess of the number of active fugitives in the CONARC database, which contained at most roughly 35,000 records. These requests demonstrated that the government accessed biometric data of individuals who were not fugitives and whose information the authorities had no legitimate purpose to access. Specifically, the court verified that at least 15,459 search records in the SRFP concerned individuals who were not included in the CONARC records. This, the court concluded, indicated that the government of Buenos Aires had misused the SRFP.
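The court’s reasoning here amounts to a simple reconciliation audit: every biometric lookup should correspond to a record in the fugitive database, so any excess signals access without a legitimate purpose. The sketch below illustrates that logic; the function and field names are hypothetical assumptions, not details of the actual CONARC or RENAPER systems.

```python
def audit_lookups(renaper_requests: list, conarc_dnis: set) -> dict:
    """Flag biometric-data requests that match no fugitive record.

    renaper_requests: DNIs for which biometric data was requested (assumed shape).
    conarc_dnis: DNIs appearing in CONARC capture orders (assumed shape).
    """
    unmatched = [dni for dni in renaper_requests if dni not in conarc_dnis]
    return {
        "total_requests": len(renaper_requests),
        "registry_size": len(conarc_dnis),
        "unmatched_requests": len(unmatched),
    }


# Orders of magnitude from the decision: roughly 9.4 million requests against
# a registry of at most ~35,000 records makes the mismatch evident even
# before any individual record is inspected.
```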

The court ultimately determined that the actions of the Buenos Aires Government were contrary to Argentina’s data protection legal framework. The final factor in the court’s decision turned on the lack of accountability for high-level users of the SRFP. The court found it unreasonable that seventeen unidentifiable “admin” users had unrestricted access to the sensitive information of millions of individuals while also being free to manipulate and/or erase data without any meaningful transparency or accountability mechanisms in place. The court determined that at least 356 search records for individuals whose biometric data had been incorporated into the SRFP were manually erased, making it impossible to assess whether those searches were legally justified.

Finally, the court noted that although the SRFP relied on the processing of sensitive information, the system’s owners never performed an impact assessment.

Conclusion

The court declared the implementation of the SRFP unconstitutional. The court was clear that the unconstitutionality arose from the specifics of the SRFP’s implementation, not from the system itself; as a result, the system could potentially be put into operation again if the authorities comply with the requirements of the judicial mandate. The court specifically noted that “when the system is implemented again” it will be mandatory that: i) the Special Committee for the Monitoring of Video Surveillance Systems be established and the Public Defender be able to effectively exercise its oversight obligations; ii) the Registry of surveillance systems be created; iii) a data protection impact assessment of the system be performed; and iv) the public be consulted regarding the implementation of the SRFP. Importantly, although the court criticized the SRFP’s reliance on the CONARC database, it did not appear to prohibit the system from relying on it in the future.

Critically, in addition to preserving the SRFP writ large, the decision did not declare the law creating the SRFP and incorporating it into the public security law unconstitutional. In fact, the court did not question the law’s constitutionality under Argentina’s constitutional and conventional framework of fundamental rights and freedoms. This is a key point because the amparo action specifically enables a judge to perform that analysis. If the SRFP is implemented once again, it will be interesting to see whether the constitutionality of the law is reviewed under an amparo lawsuit and whether specific instruments protecting privacy and personal data, such as Convention 108, play a significant role in the analysis.

Finally, this decision should be seen as part of a larger, decentralized push against government use of facial recognition technologies that has grown globally in recent years. In the European Union, the European Data Protection Supervisor, the European Data Protection Board, and the European Parliament are moving toward requesting a ban on live facial recognition technologies in public spaces as part of the legislative process for the AI Act; in the U.S., a bill was recently introduced with the objective of placing “strong limits and prohibitions on law enforcement use of facial recognition technology,” limiting its use to situations in which a warrant has been obtained.

It is also important to note that this decision could be reversed on appellate review if the government decides to appeal. Nevertheless, the trial court’s decision has been celebrated in Argentina as an important precedent for the protection of personal data and privacy, and because it exposed an abuse of authority that ODIA and other organizations had alleged ever since the SRFP began to operate.

Editor: Lee Matheson


1 The amparo is recognized as a right in Article 43 of the Argentinian Constitution. It is a process or trial through which citizens can challenge the constitutionality of laws, as well as actions or omissions from authorities that affect constitutionally recognized rights and freedoms.

What Happened to the Risk-Based Approach to Data Transfers?

The following is a guest post to the FPF blog from Lokke Moerel, Professor of Global ICT Law at Tilburg University and a Dutch Cyber Security Council member. This blog is a summary of a longer academic paper which can be downloaded here.

The guest blog reflects the opinion of the author only. Guest blog posts do not necessarily reflect the views of FPF.

Introduction

In my earlier FPF guest blog on the geopolitics of trans-Atlantic data transfers, I flagged that, after Schrems II, companies increasingly find themselves in a catch-22. Frustrations are running high as companies work towards Schrems II compliance by executing measures to mitigate the risk that U.S. government entities can access their data. Yet, EU data protection authorities (DPAs) continue to block their way. The DPAs increasingly adopt an absolutist approach, whereby mitigating measures are disregarded irrespective of the actual risk to data protection after transfer, triggering a debate on what happened to the risk-based approach of the GDPR (RBA). This has come to the fore in recent decisions of the DPAs on data transfers in the context of the use of Google Analytics. The Austrian DPA kicked things off by issuing a decision in a complaint of noyb against, among others, Google (GA decision).1 In this decision, the Austrian DPA explicitly discards the applicability of the RBA as far as the data transfer provisions of the GDPR are concerned. In a Q&A concerning the use of Google Analytics, the CNIL also indicated that the RBA cannot be applied to data transfers.2

This is noteworthy because, in legal literature, it is generally assumed that the RBA is incorporated in the ‘accountability principle’ of Article 24 GDPR and that this principle applies horizontally throughout the GDPR, and therefore also to the data transfer requirements.3 In this light, it is high time for an in-depth assessment of whether, and if so to what extent, the GDPR introduced the RBA, and specifically whether the RBA also applies to the data transfer requirements of Chapter V of the GDPR.

The conclusion will indeed be that the accountability requirement of Article 24 GDPR incorporates the RBA for all obligations of the controller in the GDPR. Where the transfer rules are stated as obligations of the controller (rather than as absolute principles), the RBA of Article 24 therefore applies. Contrary to what the DPAs assume, this is not contradicted by the ECJ in Schrems II, nor by the EDPB recommendations on additional measures following the Schrems II judgment. We will, however, also see that the EDPB is trying to rewrite the GDPR by applying the accountability principle of Article 5(2) GDPR (which does not include the RBA) rather than the accountability principle of Article 24, which does. By taking this position, the EDPB pushes its own version of the accountability principle as proposed at the time of the revision of the Directive, which was, however, ultimately not adopted by the EU regulators in the GDPR.

1. Reasoning of the Austrian DPA in the GA decision

In the GA decision, the Austrian DPA rejected Google’s arguments that a RBA should be taken when assessing the impact of data transfers in the context of Google Analytics, and that the Austrian DPA applied too strict a standard by treating the mere possibility of access as relevant, rather than the actual risk of U.S. public authorities accessing the data.

Specifically, the DPA reasoned that such a RBA could not be derived from the wording of Art. 44 GDPR. See decision point D.4 (underlining by the Austrian DPA in the original decision):

“Art. 44 GDPR – General principles of data transmission

Any transfer of personal data already processed or to be processed after their transfer to a third country or an international organization shall only be allowed if the controller and the processor comply with the conditions laid down in this Chapter and with the other provisions of this Regulation, including any onward transfer of personal data from that third country or international organization to another third country or international organization. All provisions of this Chapter shall be applied in order to ensure that the level of protection of natural persons ensured by this Regulation is not undermined.”

On the contrary, it can be deduced from the wording of Art. 44 GDPR that for every data transfer to a third country (or to an international organization), it must be ensured that the level of protection guaranteed by the GDPR is not undermined.

The success of a complaint of a violation of Art. 44 GDPR, therefore, does not depend on whether a certain “minimum risk” is present or whether U.S. intelligence services have actually accessed data. According to the wording of this provision, a violation of Art. 44 GDPR already exists if personal data are transferred to a third country without an adequate level of protection.

In connection with those provisions of the GDPR where a risk-based approach is actually to be followed (“the higher the processing risk, the more measures are to be implemented”), the legislator has also explicitly and without doubt, standardized this. For example, the risk-based approach is provided for in Art. 24(1) and (2), Art. 25(1), Art. 30(5), Art. 32(1) and (2), Art. 34(1), Art. 35(1) and (3) or Art. 37(1)(b) and (c) GDPR. Since the legislator has standardized a risk-based approach in numerous places in the GDPR, but not in connection with the requirements of Art. 44 GDPR, it cannot be assumed that the legislator merely “overlooked” this; an analogous application of the risk-based approach to Art. 44 GDPR is therefore excluded.”

The Austrian DPA further rejected the arguments of Google that the RBA was confirmed by the European Court of Justice (ECJ) in the Schrems II judgment4 and by the EDPB’s Recommendations 01/2020 on measures to complement transfer tools to ensure the level of protection of personal data under EU law.5

The Austrian DPA further states:

“Unlike Chapter V – see below – Art. 5(2) in conjunction with Art. 24(1) GDPR now actually take a risk-based approach. The higher the risk associated with the data processing, the higher the standard for the evidence to be submitted in order to prove compliance with the GDPR.”

2. Questions of law to be investigated

Based on the GA decision, there are a number of questions of law to be investigated:

  1. Does the RBA apply to the accountability requirements of Article 24 only, in the sense that the standard of evidence (i.e., the required accountability measures, such as policies, training requirements, etc.) scales with the risk of the relevant processing, rather than the RBA also applying to the underlying obligations of the controller set out in other provisions of the GDPR?
  2. Is the position under 1) supported by the fact that where the EU regulator intended to implement the RBA, this is explicitly expressed in the relevant provisions only? [which seems to be the position of the Austrian DPA]
  3. If the position under 1) is not correct, and RBA in Article 24 GDPR must be considered to constitute a horizontal provision applying a RBA also to the underlying obligations of the controller, does the RBA then relate to the obligations of controllers in Chapter IV only, or to all data protection obligations of controllers, including those of Chapter V?
  4. Does Article 5(2) indeed take a RBA for the accountability principle? [which seems to be the position of the Austrian DPA]
  5. Is the position under 1) confirmed by the ECJ in the Schrems II judgment?
  6. Is the position under 1) confirmed by the EDPB Recommendations on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data (EDPB Recommendations)?6

3. Summary Conclusions

Based on an analysis of the wording of the GDPR (see Section 4), the legislative history of the GDPR (see Section 5), the Schrems II judgment (see Section 6), and the EDPB Recommendations (see Section 7), the conclusions are:

  1. The accountability requirement of Article 24 GDPR incorporates the RBA for all obligations of the controller in the GDPR; where the transfer rules are stated as obligations of the controller (rather than as absolute principles), the RBA of Article 24 therefore applies.
  2. This is not contradicted by the ECJ in Schrems II, nor by the EDPB Recommendations on additional measures following the Schrems II judgment.
  3. The EDPB is nonetheless trying to rewrite the GDPR by applying the accountability principle of Article 5(2) GDPR (which does not include the RBA) rather than the accountability principle of Article 24, which does.

4. Interpretation of Articles 5 and 24 GDPR

According to the settled case law of the ECJ, the interpretation of a provision of EU law requires that account be taken not only of its wording and the objectives it pursues but also of its legislative context and the provisions of EU law as a whole. Also, the origins of a provision of EU law may provide information relevant to its interpretation.8

Textual analysis

Article 24 is the first provision of Chapter IV (Controller and processor), Section 1 (General obligations). The language of Article 24 GDPR resembles that of Article 25 (Data protection by design and by default) and Article 32 (Security of processing). The heading of Article 24 is “Responsibility of the controller,” and the provision starts with the qualifier “taking into account the nature, scope, context, and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons, the controller shall….” It is not in dispute that this implies the RBA.

The question then is whether the RBA applies only to the standard of evidence (the accountability measures) or also to the underlying obligations of the controller under the GDPR themselves. The text of Article 24 reads that the controller must “ensure and to be able to demonstrate that processing is performed in accordance with this Regulation.” Where the controller explicitly has to ensure compliance by taking a RBA, it is difficult to see why the RBA in Article 24 would apply only to the standard of evidence (i.e., being able to demonstrate compliance) and not to the underlying controller obligations themselves. The obligation further explicitly refers to all requirements under the Regulation.

That being said, not all provisions of the GDPR are formulated as obligations of the controller. For example, the general processing principles listed in Article 5(1) are formulated not as obligations of the controller but as absolute principles. Article 5(2) subsequently provides that “the controller is responsible for, and shall be able to demonstrate compliance with paragraph 1 (‘accountability’).” Noteworthy here is that this accountability requirement is not in any manner qualified by a RBA similar to that of Article 24. This seems to mean that the RBA does not apply to the material processing principles (why otherwise include Article 5(2) at all; Article 24 GDPR would then have been sufficient).

The question then is how this applies to the data transfer rules of Chapter V. There is no indication whatsoever in the GDPR that the general obligation of the controller under Article 24 would not also apply to the obligations of controllers under Chapter V (again, Article 24 requires that controllers ensure compliance with the Regulation).

Rather, there are indications to the contrary. For example, the privacy-by-design requirements and security requirements (which also incorporate the RBA) remain applicable when transferring data (see explicitly Recital 108). In the same vein, the accountability principle will also be applicable when transferring data (provided the transfer rules are formulated as obligations of the controller rather than as absolute principles).

As the Austrian DPA notes, the general principle for transfers in Article 44 does indeed provide that “any transfer of personal data shall only take place in accordance with the conditions of this Chapter,” but (as omitted by the Austrian DPA) this general principle is explicitly made “subject to the other provisions of this Regulation.” This is logical; Chapter V on transfers cannot be considered on a standalone basis. The transfer rules aim to ensure that data receive a similar level of protection after being transferred to a third country that does not provide an adequate level of protection, not a higher level of protection. This is also expressed in the last sentence of Article 44:

“All provisions in this Chapter shall be applied in order to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined.”

Article 46 GDPR (transfers subject to appropriate safeguards) is further formulated not as an absolute principle (like the general processing principles of Article 5(1)) but as an obligation of the controller where it allows data transfers “if the controller (…) has provided appropriate safeguards and on the condition that enforceable data subject rights and effective legal remedies for data subjects are available.”

The conclusion therefore seems justified that the obligation of the controller “to provide appropriate safeguards” under Article 46 GDPR is indeed risk-based, except where Article 46(1) provides for the absolute requirements “that enforceable data subject rights and effective legal remedies for data subjects are available.”

5. Legislative history of Articles 5 and 24 GDPR

5.1 The EU Data Protection Directive

Historically, EU data protection legislation has been “rights-based”: its requirements applied irrespective of the level of risk involved and of whether actual harm was created.9 As the WP29 (the predecessor of the EDPB) put it at the time, the EU data protection legal framework provides a ‘minimum and non-negotiable level of protection for all individuals.’10 This is all the more so since the entry into force of the Treaty on the Functioning of the European Union in December 2009, which granted the right to personal data protection the status of a fundamental right of the EU (see Article 8 of the EU Charter11 and Article 16(1) TFEU12).

Noteworthy is that the protection of data transfers is not itself listed as a fundamental right. The EU transfer rules are not considered to be among the material processing principles: the transfer rules are a mechanism to ensure that those material processing principles will be observed, rather than a fundamental processing principle in themselves.13 That being said, the transfer rules are crucial in their own right to guarantee the protection provided by the EU Data Protection Directive (Directive), and therefore are a key cornerstone of the Directive.14 This distinction is continued in the GDPR, where the material processing principles are listed in Article 5(1) GDPR (and do not include data transfer requirements), while the data transfer requirements are regulated separately in Chapter V.

5.2 Legislative reform

The Directive did not include an accountability principle; it was only as part of the legislative review of the Directive that this principle was introduced. The main trigger for introducing the accountability principle was that the EC’s legislative review of the Directive showed widespread non-compliance with the Directive, in particular with the data transfer requirements, and that the enforcement tools of the DPAs were not sufficient to force compliance.15 On July 9, 2009, the EC launched a consultation on the EU data protection legal framework. As part of the consultation, the WP29 and the EDPS issued a number of opinions, which in essence advised the EC to introduce the accountability principle in the revised Directive. The WP29’s proposals developed somewhat over time, but its final position was adopted by the EC in its first proposal for a new Regulation.16

(a) WP29 Opinion on the accountability principle (July 2010)

In its Opinion on the accountability principle, the WP29 proposed the following concrete provision:

“Article X – Implementation of data protection principles
1. The controller shall implement appropriate and effective measures to ensure that the principles and obligations set out in the Directive are complied with.
2. The controller shall demonstrate compliance with paragraph 1 to the supervisory authority on its request.”

The provision refers to all principles and obligations of the revised Directive. The Opinion further reflects that the accountability measures (rather than the material principles themselves) should be scalable (see para. 53). As to the consequences of compliance with the accountability principle, the WP29 (at p. 11) stresses that “fulfilling the accountability principle does not necessarily mean that a controller is in compliance with the substantive principles […], i.e., it does not offer a legal presumption of compliance nor does it replace any of those principles.”

(b) First EC proposal for a Regulation (January 25, 2012)

The EC’s first proposal for a Regulation basically implements the proposals of the WP29. According to the Explanatory Memorandum accompanying the EU Commission’s first proposal17 dated January 25, 2012, the provisions of Article 22 of the draft took account of the debate on a “principle of accountability” and described in detail the obligation of responsibility of the controller to comply with the Regulation and to demonstrate compliance, by adopting internal policies and mechanisms for ensuring such compliance. The first draft of the EU Commission did not include a reference to the “accountability principle,” nor a reference to the scalability (RBA) of the accountability provisions.

Article 5 sub (f):
“processed under the responsibility and liability of the controller, who shall ensure and demonstrate for each processing operation the compliance with the provisions of this Regulation”
“Article 22
Responsibility of the controller
The controller shall adopt policies and implement appropriate measures to ensure and be able to demonstrate that the processing of personal data is performed in compliance with this Regulation, including the assignment of responsibilities, and the training of staff involved in the processing operations.”
Recital (60):
“Comprehensive responsibility and liability of the controller for any processing of personal data carried out by the controller or on the controller’s behalf should be established. In particular, the controller should ensure and be obliged to demonstrate the compliance of each processing operation with this Regulation.”

Note that Article 5(2) is based on Article 6(2) of the Directive, which embodied the original and narrower meaning of accountability as responsibility for compliance.

(c) Note of the Presidency to EU Council on implementation of RBA (March 1, 2013)

Further to a first examination of the EU Commission proposal, the Presidency reported to the EU Council18 that several Member States had voiced their disagreement with the level of prescriptiveness of a number of obligations in the draft Regulation. Many delegations stated that the risk inherent in certain data processing operations should be the main criterion for calibrating the data protection obligations: where the data protection risk was higher, more detailed obligations would be justified, and where it was comparably lower, the level of prescriptiveness should be reduced.19 The revised draft subsequently introduced a ‘horizontal clause’ in Article 22 to incorporate the RBA:

“Taking into account the nature, scope and purposes of the processing and the risks for the (…) rights and freedoms of data subjects, the controller shall implement appropriate measures to ensure and be able to demonstrate that the processing of personal data is performed in compliance with this Regulation (…).”20

Art. 5 sub (f) was changed into:

“processed under the responsibility (…) of the controller (…)”

This basically reverted the language to the text of its predecessor, Article 6(2) of the Directive.

(d) WP29 Statement on the role of a RBA in data protection legal frameworks (May 30, 2014)21

In reaction to these developments in the EU legislative process, the WP29 issued a Statement on the role of a RBA in data protection legal frameworks. From this Statement, it can be derived that the WP29 was well aware that the changes proposed by the European Parliament and the Council constituted a major shift: the RBA was now introduced as a core element of the accountability principle itself, impacting the underlying obligations of controllers rather than (just) the accountability measures themselves. See p. 2:

“However, the risk-based approach has gained much more attention in the discussions at the European Parliament and at the Council on the proposed General Data Protection Regulation. It has been introduced recently as a core element of the accountability principle itself (Article 22).”

The WP29 further clarified in a number of crisp statements that the RBA should (i) not apply to the key rights granted to data subjects, which apply regardless of the level of risks incurred by the processing, and (ii) that there can be different levels of accountability obligations depending on the risk posed, but that controllers should always be accountable for compliance with the data processing obligations “whatever the nature, scope, context, purposes of the processing and the risks for data subjects are.”

(e) Final text GDPR dated April 8, 2016

The EU Council ignored the WP29 Statement and adopted the final version of Article 24 GDPR.22 The EU Council, in its accompanying statement (p. 4),23 explained that it had strengthened the accountability of controllers and processors to promote a real data protection culture and introduced throughout the Regulation a risk-based approach, allowing for the modulation of the obligations imposed on controllers.

5.3 Assessment based on the legislative history of the GDPR

The inclusion of Article 5(2) seems to be based on Article 6(2) of the Directive (“It shall be for the controller to ensure that paragraph 1 is complied with”), which embodied the original, narrower meaning of accountability as responsibility for compliance. It was at the proposal of the European Parliament that the original proposal of the EC was maintained and brought more into line with accountability (‘be able to demonstrate’ rather than ‘demonstrate’), with the word ‘accountability’ added in brackets at the end.24 The Council proposed instead to concentrate on responsibility.25 The resulting compromise in Article 5(2) combined the responsibility proposed by the Council with the demonstrability and the label ‘accountability’ in brackets proposed by the Parliament.26 There are no indications in the legislative history as to why the accountability element in Article 5(2) was first included, then deleted, and then reinstated, but without the RBA. As this provision must have meaning (why otherwise reinstate it), it seems justified to conclude that the RBA does not apply to the material processing principles of Article 5.

The actual principle of accountability, as inspired by the proposals of the WP29, found its way into Article 22 (now 24). It is unclear why the EC declined to use the term ‘accountability principle’ in the text or heading of Article 22 itself. It is only in the Explanatory Memorandum (at para. 3.4.4) that it is explained that Article 22 [now 24] “takes account of the debate on a ‘principle of accountability’”. The heading further referred to the “responsibility of the controller,” which fitted better with the compliance notion of Article 5(2). It is clear that the EC, in its first draft proposal for the Regulation, included the accountability principle as advocated by the WP29, whereby the provision applied to the standard of evidence only and not also to the underlying obligations of the controller. Based on the legislative history, it is however indisputable that subsequent changes to the initial Article 22 were introduced by the Council in order to create a horizontal provision applying the RBA to all obligations of the controller, and specifically also to the data transfer obligations.

6. Assessment of Schrems II

Reviewing the ECJ judgment in Schrems II,27 the Austrian DPA is correct that the ECJ does not refer to the accountability principle or the RBA under the GDPR. The Austrian DPA’s conclusion that the ECJ therefore does not take a RBA to data transfers, however, cannot be based on this judgment. What the ECJ did in Schrems II was to raise the bar for international data transfers based on Article 46 (transfers based on appropriate safeguards) to the so-called essentially equivalent level, in reference to the general principle for transfers of Article 44 and the EU Charter of fundamental rights (see paras. 131–134). In the absence of an adequacy decision, the ECJ considers it the responsibility of the controller to make a transfer assessment before a transfer can take place on the basis of appropriate safeguards, which also includes an assessment of the laws and practices of the country or countries where the data are flowing to (see para. 126, where the ECJ explicitly refers to “the law and practices in force in the third country concerned” and requires “(…) ensuring, in practice, the effective protection of personal data transferred to the third country concerned”).28 The controller should then take measures to compensate for any lack of data protection by way of appropriate safeguards. It is important to note that the Court does not require that additional safeguards provide a 100% guarantee that access to data by third parties can never occur, but rather that they constitute “effective mechanisms that make it possible, in practice, to ensure compliance with the level of protection required by EU law…” (para. 137). Though the ECJ did not explicitly refer to the accountability principle of Article 24, this transfer assessment obligation of the controller seems in line with the RBA of the accountability principle of Article 24.

This is also confirmed by the dictum of Schrems II, which provides that the relevant aspects of the legal system of the third country need to be taken into consideration: not only the law of the relevant third country but also its practices, as also follows from para. 126 of Schrems II. For these relevant aspects, the ECJ refers to the non-limitative list of elements in Article 45(2) GDPR, which the EC needs to consider when performing an adequacy assessment of a third country. The list of Article 45(2) shows that the EC, in its assessment, needs to assess not only the law of the country but also “the effective functioning” of that law; in other words, all relevant aspects of how the legal system operates in practice.29

7. Assessment of the EDPB Recommendations

The EDPB, in the Recommendations,30 reflects the Schrems II judgment in a similar manner. The EDPB indicates that the Schrems II judgment “reminds us that the protection granted to personal data in the European Economic Area (EEA) must travel with the data wherever it goes,” that “the Court also asserts this by clarifying that the level of protection in third countries does not need to be identical to that guaranteed within the EEA but essentially equivalent,” that the “Court also upholds the validity of standard contractual clauses, as a transfer tool that may serve to ensure contractually an essentially equivalent level of protection for data transferred to third countries,” but that these “do not operate in a vacuum” and that:

“controllers or processors, acting as exporters, are responsible for verifying, on a case-by-case basis and, where appropriate, in collaboration with the importer in the third country, if the law or practice of the third country impinges on the effectiveness of the appropriate safeguards contained in the Article 46 GDPR transfer tools. In those cases, the Court still leaves open the possibility for exporters to implement supplementary measures that fill these gaps in the protection and bring it up to the level required by EU law. The Court does not specify which measures these could be. However, the Court underlines that exporters will need to identify them on a case-by-case basis. This is in line with the principle of accountability of Article 5.2 GDPR, which requires controllers to be responsible for, and be able to demonstrate compliance with the GDPR principles relating to processing of personal data.

It is noteworthy that the EDPB explicitly refers to the accountability principle of Article 5(2), but does not in any way refer to the accountability principle of Article 24. In para. 1 of the Recommendations, the EDPB explicitly considers that the accountability principle of Article 5(2) GDPR31 also applies to data transfers “since they are a form of data processing in themselves.”32 I recall (see Section 5.1 above) that Article 5(1) lists the general processing principles, but that these do not include the data transfer principles. The EDPB is correct in considering a transfer a processing; this entails that the material principles apply to transfers, but it cannot carry the conclusion that transfers are thus a material principle in themselves. This goes against the system of the GDPR, where the transfer rules have their own Chapter V. The underlying reason for the EDPB to find this ‘workaround’ is that the accountability principle of Article 5(2), as I also concluded, does not incorporate the RBA as to compliance with the material principles, whereas the accountability principle of Article 24 does incorporate the RBA for compliance with the obligations of controllers. By taking this position, the EDPB pushes its own version of the accountability principle as proposed by the WP29 at the time of the revision of the Directive, which was, however, ultimately not adopted by the EU regulator. It is noteworthy, however, that despite the reference to Article 5(2) GDPR, the final version of the Recommendations does include language (however nominal) allowing for a RBA to data transfer assessments, though the threshold seems high. A kinder interpretation is that the EDPB is confused by the fact that Article 5(2) includes the reference to “accountability,” while Article 24 does not (see Section 4 above). I do not, however, believe the EDPB is confused here; rather, it pushes the version of the accountability principle it has advocated from the start, while covering its bases by including a nominal RBA in the Recommendations themselves, in line with Schrems II. That the RBA is indeed (though somewhat nominally) included in the Recommendations can be derived from the changes the EDPB made to the initial version after consultation.

The initial consultation version of the Recommendations33 did not take a RBA to the transfer assessment. The consultation version even specifically indicated that organizations should “not rely on subjective [factors] such as the likelihood of public authorities’ access to your data in a manner not in line with EU standards” (see para. 42). Following the consultation phase, during which many stakeholders provided input that the EDPB had wrongly ignored the RBA of the GDPR, this statement was no longer included in the final version. Instead, the EDPB (somewhat nominally, and without any explicit acknowledgment) included the RBA, though the threshold for applying it is very high. This is reflected in the text by stating in a number of places that the transfer assessment should include not only the laws but also the practices in the relevant third country (see in particular para. 43),34 and most importantly by allowing controllers to proceed with a transfer without supplementary measures if they have no reason to believe that the relevant legislation will be applied in practice (see para. 43.3).

8. Conclusion

The conclusion is that the accountability requirement of Article 24 GDPR incorporates the RBA for all obligations of the controller in the GDPR. Where the transfer rules are stated as obligations of the controller (rather than as absolute principles), the RBA of Article 24 therefore applies. Contrary to what the DPAs assume, this is not contradicted by the ECJ in Schrems II, nor by the EDPB recommendations on additional measures following the Schrems II judgment. The EDPB is trying to rewrite the GDPR by applying the accountability principle of Article 5(2) GDPR (which does not include the RBA) rather than the accountability principle of Article 24, which does. By taking this position, the EDPB pushes its own version of the accountability principle as proposed at the time of the revision of the Directive, which was, however, ultimately not adopted by the EU regulators in the GDPR.


1  https://noyb.eu/sites/default/files/2022-01/E-DSB%20-%20Google%20Analytics_DE_bk_0.pdf. See for English translation: Standarderledigung Bescheid (noyb.eu)

2 The CNIL also issued a Q&A concerning the use of Google Analytics: https://www.cnil.fr/fr/cookies-et-autres-traceurs/regles/questions-reponses-sur-les-mises-en-demeure-de-la-cnil-concernant-lutilisation-de-google-analytics. The last question of the Q&A refers to the use of the RBA by controllers by taking into account the likelihood of data access requests. The CNIL indicates that the RBA cannot be applied and explains that as long as access to the transferred data is possible and the safeguards governing the issuance of requests for access to data do not guarantee a level substantially equivalent to the one guaranteed in the EU, it is necessary to take additional technical measures to make such access impossible or ineffective.

3 See, specifically on the applicability of the RBA to data transfer requirements after the Schrems II judgment: Paul Breitbarth, “A Risk-Based Approach to International Data Transfers,” EDPL, 2021, p. 547; Christopher Kuner, ‘Schrems II Re-Examined’ (VerfBlog, August 25, 2020), https://verfassungsblog.de/schrems-ii-re-examined/; and Christopher Kuner, Lee Bygrave and Christopher Docksey, The EU General Data Protection Regulation: A Commentary. Update of Selected Articles. Oxford University Press, 2021, p. 113. Other authors discuss the RBA of the GDPR, but not specifically in the context of data transfers and the ECJ judgment in the Schrems II case.

4 Case C-311/18 Data Protection Commissioner v Facebook Ireland Limited and Maximillian Schrems [2020] ECLI:EU:C:2020:559 : CURIA – Case information (europa.eu).

5 edpb_recommendations_202001vo.2.0_supplementarymeasurestransferstools_en.pdf (europa.eu).

6 Ibid.

7 See for a similar reference also para. 158.

8 ECJ judgment of December 10, 2018, Wightman and Others, C-621/18, EU:C:2018:999, paragraph 47 and the case-law cited: CURIA – Case information (europa.eu)

9 See Amann v Switzerland App No 27798/95 (ECtHR, February 16, 2000) §70: in order to determine whether a processing constitutes an interference, the fact that the data subject may ‘have been inconvenienced in any way’ is irrelevant: AMANN v. SWITZERLAND (coe.int).

10 Art. 29 WP, ‘Opinion 1/98 Platform for Privacy Preferences (P3P) and the Open Profiling Standard (OPS)’ (1998), p. 2: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/1998/wp11_en.pdf.

11 https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:12012P/TXT

12 https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:12012E/TXT:en:PDF

13 This is evidenced by the fact that in the Directive the EU transfer rules are not included in Chapter II (The General Rules on the Lawfulness of the Processing of Personal Data), but in a separate Chapter IV (Transfer of personal Data to third Countries). For a similar separation of the basic principles and the transfer rules see the Joint Proposal for a Draft of International Standards on the Protection of Privacy with regard to the processing of Personal Data (Madrid Draft Proposal for International Standards), as adopted on November 5, 2009 at The International Conference of Data Protection and Privacy Commissioners in Madrid by the participating data protection authorities, to be found at https://edps.europa.eu/sites/edp/files/publication/09-11-05_madrid_int_standards_en.pdf, where the transfer rules are included in Section 15 and the basic principles of data protection in Part II.

14 See WP 12, Working Document on Transfers of personal data to third countries: Applying Articles 25 and 26 of the EU data protection directive, July 24, 1998 (WP 12), at https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/1998/wp12_en.pdf, where the Working Party 29 lists “six content principles” of which the 6th is: “restrictions on onward transfers – further transfers of the personal data by the recipient of the original data transfer should be permitted only where the second recipient (i.e., the recipient of the onward transfer) is also subject to rules affording an adequate level of protection. The only exceptions permitted should be in line with Article 26(1) of the directive.” Since a restriction on onward transfers was at the time missing from Convention 108, the Working Party 29 considered the protection provided by the countries that had at the time ratified Convention 108 was insufficient (see WP 12, at 8). This led to adoption of a transfer rule similar to the Directive in Article 2 of the Additional Protocol to Convention 108.

15 Rand Europe, Review of the European Data Protection Directive, Technical Report dated May 2009 (Rand Report) at https://www.rand.org/pubs/corporate_pubs/CP1-2009.html. Other reviews showed similar results: see Douwe Korff, EC Study on implementation of the Data Protection Directive, Comparative study of national laws, September 2002, Human Rights Centre University of Essex, at 209, to be found at <http://papers.ssrn.com>, notes that “the powers now vested in the data protection authorities, as currently exercised, have not been able to counter continuing widespread disregard for the data protection laws in the Member States.”

16 https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2009/wp168_en.pdf, https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2010/wp173_en.pdf 

17 https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2012:0011:FIN:EN:PDF

18 https://data.consilium.europa.eu/doc/document/ST%206607%202013%20REV%201/EN/pdf.

19  See para. 5 at https://data.consilium.europa.eu/doc/document/ST%206607%202013%20REV%201/EN/pdf.

20 See p. 23 at https://data.consilium.europa.eu/doc/document/ST-8004-2013-INIT/en/pdf

21 https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp218_en.pdf

22 https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A52016AG0006%2801%29.

23 https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CONSIL:ST_5419_2016_ADD_1&from=EN.

24 See Amendment 99, https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52014AP0212&from=EN.

25 See p. 83 at https://data.consilium.europa.eu/doc/document/ST-9565-2015-INIT/en/pdf.

26 Cf. supra n. 3, p. 113.

27 Cf. supra n.4.

28 Ibid. see para. 126.

29 Cf. supra n.5.

30 Cf. supra n.5.

31 See para. 3, where the EDPB refers to the accountability principle and includes in footnote 12 again a reference to Article 5(2) GDPR only. See also para. 5, footnote 18; para. 48, footnote 58; and para. 76, footnote 77. The only reference to Article 24 can be found in footnote 22, which seems more an oversight than intentional.

32 The EDPB refers to para. 45 of Schrems II. However, in this paragraph the ECJ just indicates that a transfer is a processing (which is correct), but this is not in any way related to how Article 5(1) GDPR should be interpreted.

33 Cf. supra n.5.

34 Cf. supra n.4.

Call for Nominations: 13th Annual Privacy Papers for Policymakers

The Future of Privacy Forum (FPF) invites privacy scholars and authors with an interest in privacy issues to submit finished papers to be considered for FPF’s 13th annual Privacy Papers for Policymakers (PPPM) Award. This award provides researchers with the opportunity to inject ideas into the current policy discussion, bringing relevant privacy research to the attention of the US Congress, federal regulators, and international data protection agencies.

The award will be given to authors who have completed or published top privacy research and analytical work in the last year that is relevant to policymakers. The work should propose achievable short-term solutions or new means of analysis that could lead to real-world policy solutions.

FPF is pleased to also offer a student paper award for students of undergraduate, graduate, and professional programs. Student submissions must follow the same guidelines as the general PPPM award.

We encourage you to share this opportunity with your peers and colleagues. Learn more about the Privacy Papers for Policymakers program and view previous years’ highlights and winning papers on our website.

FPF will invite winning authors to present their work at an annual event with top policymakers and privacy leaders in spring 2023 (date TBD). FPF will also publish a printed digest of the summaries of the winning papers for distribution to policymakers in the United States and abroad.

Learn more and submit your finished paper by October 21st, 2022. Please note that the deadline for student submissions is November 4th, 2022.

The “Colorado Effect?” Status Check on Colorado’s Privacy Rulemaking

Colorado is set to formally enter a rulemaking process which may establish de facto interpretations for privacy protections across the United States. With the passage of the Colorado Privacy Act (CPA) in 2021, Colorado, along with Virginia, Utah, and Connecticut, became part of an emerging group of states adopting privacy laws that share a similar framework and many core definitions with a legislative model developed (though never enacted) in Washington State. However, while the general model of legislation seen in the CPA is similar to recently enacted state privacy laws, the CPA stands alone in providing authority to the state Attorney General to issue regulations. 

Because no other similar state law has provided for this type of interpretative authority, regulations issued by the Colorado Attorney General could have far-reaching implications for how both businesses and regulators in other jurisdictions come to interpret key state privacy rights and protections. Colorado’s pre-rulemaking process recently concluded, revealing a range of possible directions that formal rulemaking could take. Below, we assess key priorities and areas of significant divergence that have been brought into focus both through public comments from stakeholders and questions posed by the Attorney General.

The Rulemaking Process

The CPA grants broad discretionary rulemaking authority to the Colorado Attorney General to issue regulations to help implement the Act. In April 2022, Colorado Attorney General Phil Weiser released a set of pre-rulemaking considerations containing a series of questions for public comment. This document offered the first hints as to the specific topics that the Colorado Department of Law (“the Department”) is considering addressing beyond opt-out mechanisms. It includes targeted questions on the CPA’s consent requirements, restrictions on so-called “dark patterns”, standards for data protection assessments, and consumers’ right to opt-out of certain automated profiling decisions. The Department’s questionnaire received 44 comments from a range of stakeholders including business groups, non-profits, civil society organizations, and think tanks (including the Future of Privacy Forum). We provide a non-comprehensive summary of significant issues addressed across these public comments below.

1. Universal Opt-Out Mechanisms

Colorado holds the distinction of being the first state to clearly require that businesses allow consumers to exercise certain privacy rights on an automated basis through technological signals (such as browser settings or plug-ins). Notably, opt-out mechanisms are the only topic on which the CPA requires rulemaking, directing the Attorney General to establish “technical specifications” for signal mechanisms that will: (1) prohibit signal providers from unfairly disadvantaging other businesses, (2) ensure that signals represent a consumer’s freely given choice to opt out, and (3) permit covered entities to authenticate that a signal is sent by a resident of the state and represents a legitimate request to opt out. The Department’s questionnaire addressed these issues and sought additional input on how signal mechanisms should apply to data collected offline.
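For readers who want a concrete picture of what “honoring a signal” could mean in practice, here is a minimal sketch in Python. It assumes the signal arrives as the Global Privacy Control HTTP header (Sec-GPC: 1), which is only one candidate mechanism; at the time of the pre-rulemaking comments, Colorado had not designated any specific signal, and the function and setting names below are illustrative, not drawn from the statute or rules.

```python
# A minimal sketch of detecting and honoring a universal opt-out signal.
# Sec-GPC is the Global Privacy Control header; it is used here only as
# an illustration of what a designated signal could look like.

def has_opt_out_signal(headers: dict) -> bool:
    """Return True if the request carries a recognized opt-out signal."""
    return headers.get("Sec-GPC", "").strip() == "1"

def apply_opt_out(headers: dict, preferences: dict) -> dict:
    """Treat a detected signal as an opt out of targeted advertising
    and the sale of personal data, per the CPA opt-out rights."""
    if has_opt_out_signal(headers):
        preferences = {**preferences, "targeted_ads": False, "data_sale": False}
    return preferences

if __name__ == "__main__":
    prefs = {"targeted_ads": True, "data_sale": True}
    print(apply_opt_out({"Sec-GPC": "1"}, prefs))
    # {'targeted_ads': False, 'data_sale': False}
```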

Default Signal Settings: The CPA prohibits opt-out mechanisms that are a “default setting” and instead requires signals to represent a consumer’s “affirmative, freely given, and unambiguous” choice to opt out. The Department’s questionnaire sought feedback as to whether a consumer’s selection of a tool marketed for its privacy features without taking additional action would satisfy the requirement for user intent (an approach that regulators in California appear to have endorsed). This inquiry generated a broad range of responses. For example, a Wesleyan University professor asserted that the selection of “privacy-preserving products” including Firefox, Brave, and DuckDuckGo Privacy Essentials can unambiguously reflect an intent to opt out of targeted advertising and other forms of data monetization without requiring a user to take additional steps. Industry groups such as the Colorado Chamber of Commerce typically rejected this view, arguing that “any mechanism involving a default or pre-selected opt-out choice in effect would be an opt-in, rather than the opt-out required by the statute.” The Future of Privacy Forum called for a context-specific approach, arguing that while the installation of a single-purpose plug-in may reflect unambiguous consumer choice to opt out, the use of a multi-feature product such as a web browser would be unlikely to satisfy the CPA’s statutory requirements.

Opt-Out Signal Authentication: Under the CPA, opt-out mechanisms are required to allow recipient organizations to authenticate a signal’s user as a Colorado resident and to determine that the signal represents a legitimate opt-out request. Numerous commenters expressed concern that establishing strict authentication procedures could have the effect of frustrating consumer intent in exercising their privacy rights and suggested regulatory workarounds. For example, the Colorado Privacy Policy Commission suggested a standard that opt-out signal authentication must require no more than three steps to complete. Separately, several organizations including Consumer Reports and the Network Advertising Initiative (NAI) suggested that regulations could permit authenticating residency with a user’s IP address. However, the State Privacy and Security Coalition (SPSC) and TechNet raised concerns about VPNs and other technologies that can make determining location by IP address unreliable, and further posited that the CPA may raise constitutional concerns if enforcement of opt-out mechanisms extends beyond authenticated Colorado residents.

Signal Scope: A significant technical and policy challenge for the use of opt-out mechanisms is whether a signal can and should apply to data collected outside of the signal’s medium. For example, can a browser-based signal be used to exercise consumer rights over information that was previously collected at a brick-and-mortar retail store? Consumer Reports argued that while regulations should not require the collection of additional information in order to process opt-out signals, a signal should apply beyond its present interaction “if the user is authenticated to the service by an identifier that applies in other contexts.” In contrast, business groups highlighted technical limitations with opt-out signals as they presently exist. For example, the Computer and Communications Industry Association (CCIA) posited that “if only browser extensions can serve as [opt out signals], the requirement to honor [opt out signals] should only extend to browsers.”

2. Consent

The CPA requires covered entities to obtain individual consent in certain circumstances, including for the processing of sensitive personal data and for incompatible secondary uses of information. The Act requires that consent be “freely given, specific, informed, and unambiguous,” closely matching the definition in other state laws and modeled on European privacy law. The Department sought information about each of these elements of consent as well as existing consent mechanisms.

Revoking Consent: Multiple organizations pointed to the lack of an explicit right to “revoke” consent as a potential gap in the statute to cover through rulemaking. The Electronic Privacy Information Center (EPIC) and The Samuelson-Glushko Technology Law & Policy Clinic at Colorado Law (TLPC) explained that while the CPA requires that it be just as easy to withdraw consent as it is to provide it in the case of overriding a universal opt out, there is no explicit right to revoke consent for other instances of data processing in the Act. Future of Privacy Forum pointed to broader rights of revocation in the GDPR and Connecticut Data Privacy Act as potential models to follow, recommending that “forthcoming regulations follow an approach similar as Connecticut by providing that consumers may, at any time, withdraw previously provided consent.” Law firm Husch Blackwell also highlighted model rights of revocation in other privacy regimes, further noting that “although it could be argued that the right to revoke consent is implicit in the CPA, it is not clear that Colorado law supports this position based on analogizing from existing court decisions.”

Implied Consent: Industry and advocacy groups alike also weighed in on when, if at all, implied consent could meet the statutory requirements of the CPA. CCIA contended that an “affirmative act” where a consumer purposefully provides personal data should not require additional consent procedures: “For instance, a consumer who intentionally submits sensitive demographic data (such as citizenship status or religious affiliation) while completing an online form should be deemed to have consented to the collection and processing of that demographic data.” On the other hand, EPIC and Consumer Reports sought stricter standards for obtaining consent. Consumer Reports proposed mandating that any request for consent include a “dedicated prompt” that “clearly and prominently describes the processing for which the company seeks to obtain consent,” while EPIC argued that consent should not be implied when a consumer exits a pop-up window that asks for consent.

3. Dark Patterns

The Colorado Privacy Act states that a consumer’s consent is not valid if obtained through the use of “dark patterns” which are defined as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.” This language originated in the proposed DETOUR Act introduced by U.S. Senators Warner (D-VA) and Fischer (R-NE) in 2019. In the context of the CPA, the concept of dark patterns is a subset of the Act’s approach to individual consent. Nevertheless, the Department posed several specific questions on dark patterns, including whether the rules should outline specific types of dark patterns and what standards or principles could best guide design choices to avoid dark patterns.

Dark Patterns Definition and Scope: Several business groups raised concerns with the CPA’s definition of “dark patterns”, such as CTIA, which argued that the term is “vague” and leaves the door open to confusion on the part of both consumers and businesses. Numerous industry commenters encouraged the Department to avoid a prescriptive approach to the term and to instead focus on practices that amount to consumer deception or fraud, pointing to a long line of Federal Trade Commission enforcement actions in this realm. In contrast, some advocacy groups called for an expansive interpretation and application of the term “dark patterns” in order to protect consumers beyond the context of CPA’s “consent” requirements. For example, Common Sense Media recommended “prohibiting asymmetric platform design practices that limit users’ ability to change user settings, delete personal data, or delete their account.” Colorado Public Interest Research Group (CoPIRG) went a step further, recommending the development of rules that “prohibit platforms from using dark patterns in any consumer interaction.” However, it is unclear whether the Attorney General would have the statutory authority to issue expansive new restrictions on user interface designs along these lines.

4. Data Protection Assessments

Data Protection Assessments (“Assessments”) are an increasingly common requirement in privacy and data protection regimes around the globe. The CPA is no exception and requires an assessment for processing that “presents a heightened risk of harm to a consumer.” Assessments must weigh the risks and benefits of the processing activity and must be made available to the Attorney General upon request, though they are exempt from disclosure under the Colorado Open Records Act. The Department’s questions on this topic sought to clarify the circumstances under which the Department may request an assessment and the requirements that should govern an assessment’s form and content.

Parameters for Requesting Assessments: TLPC recommended treating assessments as an ongoing process, with consistent feedback and input from affected consumers, controllers, and the Department of Law. In contrast, industry groups, including NAI, CCIA, CTIA, SPSC, and the Denver Metro Chamber of Commerce, asked that the Department establish specific parameters for when they may ask for an assessment to be conducted or disclosed. For example, the Alliance for Automotive Innovation (AAI) discouraged a regular cadence for iterating upon assessments, instead proposing that controllers be required to “update them only when there is a material change in processing activities that is likely to have an impact on consumer privacy.”

Form and Content of Assessments: In general, privacy advocates sought to establish more detailed parameters for the form and content for assessments, while industry representatives such as NAI, AAI, and various Chambers of Commerce sought more flexibility. For instance, while EPIC provided a list of preferred mandatory requirements, the Colorado Chamber of Commerce suggested that the Department “publish a set of voluntary factors that the controller could consider as they undertake a data protection assessment.”

5. Profiling

The CPA creates a new right to opt out of profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer. Once again, this right is common to many emerging state privacy laws and is based on language that originated in the European Union. The Department raised numerous topics concerning profiling, including the disclosures about automated processing necessary for consumers to make informed opt out decisions, whether the rules should address specific legal or civil rights concerns or specific applications of profiling, whether there could be negative impacts of immediately implementing a request to opt out of profiling, and how the statute should apply to “partial” automated decisions.

Transparency: The application of the CPA’s transparency requirements to automated decision making systems was a significant focus for commenters. Industry comments typically sought limitations on disclosures, with the Denver Metro Chamber of Commerce arguing that “requiring granular visibility into each rapidly changing processing activity could cripple business.” CCIA further called for “explicit protections for intellectual property, trade secrets, and other legal rights of the business in question.” In contrast, EPIC called for broader disclosures about profiling activities such as “the sources and life cycle of the data processed by the system, including any brokers or other third-party sources involved in the data life cycle; and how the system has been evaluated for accuracy and fairness, including links to any audits, validation studies, or impact assessments.”

Opt Out Rights: Commenters also engaged on the range of profiling activities that should be subject to the consumer opt out right. Industry groups highlighted beneficial processing operations that could be disrupted by a broad reading of the language, including processing necessary for vehicle safety systems, fraud prevention, maintaining system integrity and security, and ad measurement and reporting. Many of these groups also called for regulations to limit the opt out right to “solely” automated decisions (that lack any human oversight), as Connecticut lawmakers have done. On this point, the Future of Privacy Forum recommended that consumer opt out rights still apply in situations where the human review of a profiling decision amounts to little more than a “rubber stamp.”

6. Miscellaneous Topics

Given the Attorney General’s broad rulemaking authority, any CPA topic is theoretically on the table for rulemaking, even if not specifically addressed in the questionnaire. Commenters sought regulatory tweaks and clarifications on many additional topics including:

Next Steps

The Attorney General has announced a goal of issuing draft regulations in the fall of 2022 (note: AG Weiser is on the ballot for Colorado’s General Election in November, the outcome of which may influence this timeline). Pursuant to the Colorado Administrative Procedure Act, publishing draft regulations will begin a formal notice-and-comment phase, which will also include at least one formal hearing. Given the importance of Colorado’s rulemaking process to the U.S. privacy landscape and the range of directions that the Attorney General could take on rulemaking (in both scope and substance), it can be expected that stakeholders will remain actively engaged in this process.

FPF Participates in FTC Event on “Commercial Surveillance and Data Security” Proposed Rulemaking

Yesterday, FPF Senior Director for U.S. Policy Stacey Gray participated in a panel discussion hosted by the Federal Trade Commission (“FTC”) regarding its Advance Notice of Proposed Rulemaking (“ANPR”) on “Commercial Surveillance and Data Security” (comments start at 1:39:00). Feedback from the public forum is intended to help inform the Commission’s decision whether to proceed in rulemaking and what form a new market-wide rule governing consumer privacy could take.

As a panelist, Stacey Gray urged the Commission to move forward with its rulemaking proposal, noting that exponential increases in the benefits and harms of data collection in our daily lives make it the right time to establish national rules on what constitutes unlawful behavior with respect to the collection and use of personal data. Highlighting potential regulatory solutions, Gray urged the Commission to codify existing case settlements requiring accurate disclosures and reasonable data security practices and to apply the Commission’s “unfairness authority” to reform business practices that result in data-driven discrimination and harmful secondary uses of personal information.

The public forum included two expert panels, one on industry perspectives and one on consumer advocate perspectives regarding the consumer data issues implicated by the rulemaking. Furthermore, presentations from the Commissioners as well as the questions posed by the panel moderators may offer further insight into how the FTC is approaching rulemaking on consumer harms in the present digital ecosystem.

Panel 1: Industry Perspectives

The first panel was moderated by Olivier Sylvain, senior advisor to FTC Chair Khan. In addition to asking about the restrictions that a new privacy rule should create, Mr. Sylvain’s questions covered existing industry best practices (including for the retention of sensitive data), ways the Commission can incentivize best practices short of rulemaking, and current market incentives to collect data. 

While the ANPR broadly defines “commercial surveillance” to include “collection, aggregation, analysis, retention, transfer, or monetization of consumer data,” industry panelists stressed that there is a wide range of uses of personal data that create different risks, depending on context. For example, Digital Content Next’s Jason Kint argued that while first-party use of data to tailor experiences is expected by consumers, secondary uses (including targeted advertising) tend to violate these expectations. The National Retail Federation’s Paul Martino agreed that there are greater risks inherent to data collection and processing by third-party businesses, which may lack incentives to develop long-term customer relationships.

In the context of best practices, panelists paid particular attention to the topic of data security. Mozilla’s Marshall Erwin described a “universally accepted” (though not universally adopted) consensus set of data security practices that includes the encryption of personal information in transit, employee access controls, and password standards. Mr. Martino further pointed to controls like multi-factor authentication, malware and antivirus software, and patching, though he stressed that there is no “one size fits all” approach to cybersecurity standards.
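To illustrate just one item on that consensus list, password standards, the following minimal sketch shows salted password hashing using the scrypt function in Python’s standard library. The parameter choices are illustrative only and should not be read as a benchmark endorsed by the panelists or the FTC.

```python
# A minimal sketch of salted password hashing with the standard
# library's scrypt KDF. Parameters (n, r, p, salt length) are
# illustrative, not a compliance benchmark.
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple:
    """Derive a scrypt digest; the random salt is stored alongside it."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

if __name__ == "__main__":
    salt, digest = hash_password("example-passphrase")
    assert verify_password("example-passphrase", salt, digest)
    assert not verify_password("wrong-guess", salt, digest)
```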

The Partnership on AI’s Rebecca Finlay encouraged the Commission to review data governance models emerging in jurisdictions outside the U.S. to evaluate the merits of different regulatory approaches. She specifically highlighted the privacy interests of children and the United Kingdom’s recent Age Appropriate Design Code, which includes transparency and data minimization standards. Mr. Erwin also highlighted the need to protect children’s privacy, while cautioning that some approaches can result in “privacy theater” with minimal tangible benefit.

Panel 2: Consumer Perspectives Panel

The second panel was moderated by Rashida Richardson, Attorney Advisor to the FTC. Ms. Richardson’s questions underscored the Commission’s focus on civil rights and on children’s and teenagers’ privacy, as well as its interest in ensuring that requirements placed on industry are in fact privacy- and security-protective. She asked for insights from the panel on the unique impacts of online tracking and data collection on members of protected classes and on children and teenagers, and the extent to which data minimization and transparency requirements are effective tools to combat the harms associated with widespread collection of personal data. Finally, she asked about the limitations of the traditional notice-and-consent model for protecting consumer privacy.

Members of the panel signaled strong support for the FTC’s efforts to establish national, clear standards regarding what constitutes unfair or deceptive data collection, storage, and use. EPIC’s Caitriona Fitzgerald spoke to the inability of many individuals to understand or protect themselves from harmful data collection online in the absence of regulatory intervention. Upturn’s Harlan Yu and the Joint Center for Political and Economic Studies’ Spencer Overton focused on marketplace harms borne by members of historically marginalized and protected groups in critical areas, such as housing, education, and voting. Citing examples of housing and employment discrimination enabled by widespread data collection, they urged the Commission to place limits on the ability of data brokers and other parties to collect and aggregate certain sensitive types of data. The German Marshall Fund of the U.S.’s Karen Kornbluh added that online data collection and aggregation, when it is deployed to interfere with elections or track members of the armed services, poses national security as well as privacy risks.

FPF’s Stacey Gray noted that, when applying the unfairness standard, the Commission should be mindful of the fact that fairness determinations “inherently involve balancing, context, and policy tradeoffs,” emphasizing that, “many secondary uses of data can and should enable academic research, support for public health, fraud detection, and perhaps, to a reasonable extent, advertising-supported content.” Mr. Overton returned to this theme, noting that data-enabled targeted messaging can be positive when it provides individuals with information that is particularly relevant to them, such as messaging about sickle cell disease aimed at African-American audiences.

Commissioners Weigh In

In opening the public forum, Chair Khan noted that digital tools can deliver “huge conveniences” but also contribute to the tracking and surveillance of individuals in entirely new ways. She further emphasized the legal tests that the Commission must satisfy if it is to proceed in rulemaking. Commissioner Slaughter spoke favorably of efforts to enact comprehensive federal privacy legislation, but emphasized that until there’s a law on the books, the Commission must make use of all its enforcement tools to investigate and address unlawful behavior. Her comments highlighted harms to adolescents who are not covered by existing children’s privacy laws as well as harms resulting from AI and advanced algorithms.

Commissioner Bedoya spoke following the panel presentations, stressing the importance of the Commission receiving a broad array of first-hand consumer accounts of unfair and deceptive practices. Picking up on points raised by FPF’s Stacey Gray on the history of “unfairness” in U.S. privacy law, Bedoya also noted that the ANPR’s broad scope reflects the sum total of historical privacy frameworks in the United States, such as the Brandeis-Warren ‘Right to Privacy’ and the Fair Information Practice Principles (FIPPs), that go beyond mere ‘notice and consent’ protections. Commissioners Wilson and Phillips, who both voted against the FTC’s ANPR, did not participate in the event.

Next Steps:

In addition to the public forum, the Commission will consider written responses to the ANPR in determining whether to proceed in a new privacy and data security rulemaking; the deadline for public comment is October 21, 2022.

The Commission’s 95-question ANPR covers a broad range of topics, seeking information on the prevalence and harms of particular industry practices (including in advanced algorithms, children’s data, and targeted advertising), potential regulatory interventions (such as data minimization, consent, and transparency), and remedies (such as first-time fining authority and “algorithmic disgorgement”).

Due to its expansive nature, the ANPR has been heralded by supporters for attempting to rein in invasive and unfair business practices, while critics have alleged the proposal exceeds the Commission’s statutory authority. The Commission could pursue a range of possible directions in crafting new privacy and security rules for U.S. businesses, and stakeholders will be closely watching for additional indications from the Commission on what will come next.

View a video and transcript of the public forum here.

New Report on Limits of “Consent” in Japan’s Data Protection Law

Introduction

Today, the Future of Privacy Forum (FPF) and Asian Business Law Institute (ABLI), as part of their ongoing joint research project: “From Consent-Centric Data Protection Frameworks to Responsible Data Practices and Privacy Accountability in Asia Pacific,” are publishing the fourteenth and final report in a series of detailed jurisdiction reports on the status of “consent” and alternatives to consent as lawful bases for processing personal data in Asia Pacific (APAC).

This report provides a detailed overview of relevant laws and regulations in Japan, including:

The findings of this report and others in the series will inform a forthcoming comparative review paper which will make detailed recommendations for legal convergence in APAC.

Japan’s Data Protection Landscape

The primary legislation in Japan governing the collection, use, and disclosure of personal information by private entities is the Act on Protection of Personal Information (APPI), which was enacted in 2003 and applies to any handling of the personal information of data subjects in Japan (termed “principals” in the APPI) by businesses that supply goods and services to persons in Japan, termed “personal information handling business operators” (PIHBOs).

A core principle of the APPI is that personal information may only be processed for a specific purpose (termed the “utilization purpose”), which must be specified as clearly as possible. Before handling personal information, a PIHBO must notify the data subject or the public at large of the utilization purpose for handling the information (unless an exception applies). A PIHBO may also handle personal information in a manner that is consistent with that purpose without having to obtain the data subject’s consent.
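The sketch below illustrates this purpose-limitation mechanic in Python. The purpose vocabulary and the code-level control are assumptions for illustration; the APPI itself imposes legal obligations and does not prescribe any particular technical mechanism.

```python
# A minimal sketch of the APPI's purpose-limitation mechanic. The
# purpose names and enforcement logic are illustrative assumptions.

NOTIFIED_PURPOSES = {"order_fulfillment", "customer_support"}

def handle_personal_information(record: dict, purpose: str) -> dict:
    """Permit handling only within a notified utilization purpose."""
    if purpose not in NOTIFIED_PURPOSES:
        # Handling outside the notified purpose generally requires fresh
        # notification or, for incompatible uses, the data subject's consent.
        raise PermissionError(f"'{purpose}' is not a notified utilization purpose")
    return {"purpose": purpose, "fields_processed": sorted(record)}

if __name__ == "__main__":
    print(handle_personal_information({"name": "...", "address": "..."},
                                      "order_fulfillment"))
```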

The APPI was substantially amended in 2015, 2020, and 2021. These amendments did not significantly impact the APPI’s notice and consent framework.

Following the 2015 amendments, the Personal Information Protection Commission (PPC) has been empowered to enforce the APPI and issue guidelines to aid compliance.

Regarding guidance, the PPC to date has issued comprehensive guidelines (in Japanese) on interpretation of the APPI, as well as more targeted guidance on specific topics in a question-and-answer format. The PPC’s guidance is complemented by other guidelines (in Japanese) on personal data protection in specific sectors (including finance, credit reporting, debt collection, medical care, insurance, and genomics) issued by sectoral regulators.

Regarding enforcement, the PPC is empowered to conduct investigations into PIHBOs’ personal data protection practices and issue non-binding recommendations to cease certain conduct or rectify non-compliance with certain of the APPI’s requirements. If a PIHBO fails to implement the recommendation without a legitimate excuse, or in cases where urgent action is required, the PPC is further empowered to issue a binding order for the PIHBO to take appropriate action. Failure to comply with a binding order from the PPC is a criminal offense punishable with imprisonment or a fine.

Role and Status of Consent as a Basis for Processing Personal Data in Japan

Consent is not required for all handling of personal information under the APPI. As discussed above, a PIHBO may collect and use personal information for a utilization purpose without obtaining the data subject’s consent. However, the PIHBO must still ensure that the handling is lawful and fair and, in most cases, notify the data subject of how his/her personal information will be handled.

That said, consent plays a number of secondary roles and may be required for certain activities concerning personal information. By default, a PIHBO must obtain the data subject’s consent before:

Consent also functions as one of several legal bases under the APPI for transferring personal information out of Japan. In this context, consent is only valid if the PIHBO first provides the data subject with certain information, including the jurisdiction to which the personal information will be transferred, details on the personal information protection system of that jurisdiction, and details of any action that the recipient will take to protect the personal information.
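As an illustration, the following minimal sketch shows one way a PIHBO’s systems might record that these disclosures were made before transfer consent is treated as valid. The class and field names are hypothetical, not statutory terms.

```python
# A minimal sketch of recording the disclosures the APPI requires before
# transfer consent is valid. Class and field names are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class TransferConsentRecord:
    data_subject_id: str
    destination_jurisdiction: str   # the jurisdiction receiving the data
    regime_summary: str             # summary of that jurisdiction's protection system
    recipient_safeguards: str       # measures the recipient will take
    consented_at: Optional[datetime] = None

    def record_consent(self) -> None:
        """Mark consent as obtained, but only after all disclosures are on file."""
        if not all([self.destination_jurisdiction, self.regime_summary,
                    self.recipient_safeguards]):
            raise ValueError("Required disclosures missing; consent would not be valid")
        self.consented_at = datetime.now(timezone.utc)
```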

Though the APPI provides a number of exceptions to consent requirements, these exceptions are generally only available where provided by another law or regulation, or where there is a need to:

The APPI also exempts certain activities, including academic research, journalism, and activities of political or religious organizations, from its requirements, including consent requirements, subject to certain obligations to secure and appropriately handle personal information.

The APPI does not define consent or specify the forms of consent that would be considered valid under the APPI. However, the PPC has issued guidelines which suggest that consent must minimally be specific and voluntary and provide examples of valid measures for obtaining consent in practice.

While express consent would qualify as valid under the APPI, there is ambiguity as to whether implied consent would qualify as valid for this purpose. Guidance from the PPC suggests that opt-in implied consent could be considered valid in appropriate cases but does not provide examples of any such cases.

However, certain sectoral guidelines, including for the medical care and debt collection sectors, do specify a number of situations in which consent can be inferred or would not be strictly required.

Read the previous reports in the series here.

FPF Welcomes Senior Fellows Covering Data Protection in Latin America and Japan

FPF welcomes two new Senior Fellows to the Global team who will provide ad hoc insight into the state of play of data protection and privacy law developments in their regions: Pablo Palazzi for Latin America, with a focus on Argentina, and Takeshige Sugimoto for Japan.

Pablo Palazzi

Pablo A. Palazzi, who will oversee developments in Argentina and Latin America, is currently a law professor at the University of San Andres in Buenos Aires, Argentina, where he is the Director of the Center for Technology and Society (CETyS).

He is also a partner of Allende & Brea, a law firm in Buenos Aires, where he practices data protection law and internet law. He previously worked as a foreign associate at Morrison & Foerster, LLP in New York. He is admitted to practice law both in Argentina and in New York State.

The challenge for Latin America is to find a proper and adequate model for regulating data privacy, one that takes the region’s particularities into account. Latin America comprises 33 countries and some 660 million people.

The region’s first-generation laws are now roughly 20 years old, and only a handful, such as those of Brazil and Ecuador, are based on the GDPR. Laws that were adequate two decades ago have not kept pace with the challenges of modern society. There is much room to enhance cooperation between DPAs in the region and to work on harmonizing the legal frameworks.

Palazzi participated actively as an external consultant in the European Commission’s adequacy assessments of Uruguay and Argentina; in Argentina, he was also actively involved in drafting a data protection bill based on the GDPR in 2017-2018. He was a consultant for the “Red Iberoamericana de Protección de Datos,” drafting SCCs for Latin America, and is doing similar work on SCCs under the Council of Europe’s modernized Convention 108.

He was involved in drafting the regulations implementing the national data protection act, the data protection law of the City of Buenos Aires (Law 1,845), and the 2008 Computer Crimes Act. He was a member of the Advisory Committee of the Cybercrime Program of the Ministry of Justice of Argentina, helping to incorporate the Budapest Convention into domestic law.

Palazzi has written several books on data protection matters in Spanish, including: “International transfer of personal data to Latin America” (Ad Hoc, 2003, LL.M thesis with prologue by Prof. Joel Reidenberg), “Credit Reporting Law” (Astrea, 2007), “Computer Crimes” (Abeledo, 2014), and “Delitos contra la intimidad informática” (CDYT, 2019). He also edited a two-volume book with several authors to celebrate the 20th anniversary of the data protection law of Argentina (“Protección de Datos: Doctrina y Jurisprudencia,” CDYT, 2021). In Europe, Palazzi coordinated the book “Challenges of privacy and data protection law – Perspectives of European and U.S. law” (Larcier, 2008), edited with Prof. Yves Poullet and María Verónica Pérez Asinari.

He is a member of the editorial board of International Data Privacy Law (Oxford University Press), a founding member of the Latin American Data Protection Law Review (an annual law review on data protection, 2012-2018), and a member of the International Association of Privacy Professionals (IAPP), where he was the KnowledgeNet chair for the Buenos Aires chapter. Palazzi also collaborated in drafting the Model Data Processing Agreement at the IAPP’s Privacy Bar Section. In 2022, Palazzi was awarded the Vanguard Award by the IAPP for his work in Latin America. He has been a frequent speaker at the CPDP conferences in Brussels and LatAm, at PLI seminars in New York, the IAPP summit in Washington, DC, and the Privacy Laws & Business conference at Cambridge University.

Palazzi obtained his law degree at the School of Law of Universidad Católica. In May 2000, he received an LL.M. from Fordham Law School, where he also worked as a research assistant for Prof. Joel Reidenberg. He wrote his LL.M. thesis on international transfers of personal data and the adequacy of Latin American countries.

Takeshige Sugimoto

Takeshige (“Take”) Sugimoto, who will oversee developments in Japan, is the Managing Director and Partner of S&K Brussels LPC, a Japanese boutique law firm specializing in data protection, privacy laws, and AI regulations in the US, EU, UK, China, and Japan. He is qualified to practice law in Japan and New York State and is a member of the Brussels Bar Association (B-List). He also serves as the Director of the Japan DPO Association, which he co-founded.

Japan’s Act on the Protection of Personal Information (APPI) is as vigorous as the GDPR in protecting individuals’ rights to personal data. The APPI establishes two sets of rules: one for the private sector, which stipulates obligations and penalties for personal information handling business operators, and another for the public sector, which stipulates obligations and penalties for administrative organizations and incorporated administrative agencies.

Starting in April 2023, common rules will also apply to local governments’ personal information protection systems. This will position the APPI as the single comprehensive data protection law applicable to the private and public sectors, including local governments.

It will be interesting to see how Japan can continue to play an important role in discussing the emerging risks surrounding personal data protection, such as data localization and unlimited government access.

Sugimoto’s data protection practice includes establishing and reviewing clients’ global data protection compliance systems and representing and defending clients in disputes involving global data protection law issues, including negotiations with European, UK, US, Chinese, and Japanese data protection supervisory authorities. As a Japanese lawyer, he regularly advises various clients on the APPI, drawing on his parallel, ongoing practical experience with the EU General Data Protection Regulation (GDPR), UK GDPR, California Consumer Privacy Act (CCPA) / California Privacy Rights Act (CPRA), and China’s Personal Information Protection Law (PIPL).

Based in Brussels from 2013 to 2020, he practiced European data protection law as a member of major law firms’ Brussels offices, working under both EU member states’ laws implementing the 1995 Data Protection Directive and the EU GDPR. He has successfully represented numerous clients over the years in obtaining European data protection supervisory authorities’ approvals of EU Binding Corporate Rules (BCRs) for Controllers and Processors under the EU GDPR, following the European Data Protection Board (EDPB)’s opinions on the respective authorities’ draft approval decisions. He also represents clients in applications to the UK Information Commissioner’s Office (ICO) for approval of UK BCRs under the UK GDPR, and has assisted clients in preparing for the UK’s International Data Transfer Agreement, a new data transfer mechanism.

Since California adopted the CCPA in 2018, followed by the CCPA Regulations issued by the California Attorney General, Sugimoto has advised several major companies on their CCPA compliance projects. He has also assisted clients in updating their CCPA compliance mechanisms in line with the CPRA. In addition, he has been closely following legislative activity on US federal privacy bills, including the COPRA (Consumer Online Privacy Rights Act), the SAFE Data Act, and the ADPPA (American Data Privacy and Protection Act), as well as the US Federal Trade Commission (FTC)’s rulemaking efforts on privacy and data security.

Sugimoto has assisted several clients in complying with China’s data-related laws, including the PIPL, Data Security Law, and Cybersecurity Law. His ongoing work includes helping clients carry out personal information protection impact assessments under the PIPL; preparing PIPL-compliant consent forms, personal information entrustment addendums, data transfer agreements (SCCs), guidance on data protection management systems, internal security rules, privacy policies, data subject rights request manuals, and personal information breach response manuals; and handling large bilingual data mapping projects in collaboration with major Chinese law firms.

Outside of direct dealings with clients, Sugimoto has also been invited as a speaker at various data protection-related events organized by data protection supervisory authorities. In October 2021, he was invited to speak at the “Global Privacy Assembly 2021 Mexico,” where he participated as a panelist in “Panel IV: The Challenge of Compliance: The Perspective of Data Protection Officers.”

Sugimoto received an LL.B. degree from Keio University, Faculty of Law in 2004; an LL.M. degree from the University of Chicago Law School in 2012; and an MJur degree from the University of Oxford, Faculty of Law (Pembroke College) in 2013.

Subscribe to receive the FPF Monthly Briefing and follow FPF on Twitter and LinkedIn to get the latest global data protection updates.

Age-Appropriate Design Code Passes California Legislature

Update: On September 15, 2022, California Governor Gavin Newsom signed AB 2273, the California Age-Appropriate Design Code Act. The law will apply to businesses that provide online services, products, or features likely to be accessed by children and broadly requires businesses to implement their strongest privacy settings by default for young users up to the age of 18. AB 2273 will become enforceable on July 1, 2024.

This week, the California legislature passed AB 2273, the California Age-Appropriate Design Code Act (ADCA). The California ADCA is modeled after the UK’s Age Appropriate Design Code, and would apply to businesses that provide “an online service, product, or feature likely to be accessed by a child.” If enacted by Governor Gavin Newsom, the child-centered design law would be the first of its kind in the United States. 

The California ADCA would introduce significant new compliance obligations for US businesses that go beyond the requirements codified in COPPA, the longstanding federal children’s privacy law. Unlike COPPA, which defines “child” as an individual under 13 years old and applies to child-directed services, the California bill defines “child” as an individual under 18 and applies to any online service that is “likely to be accessed by a child.” For covered entities, the bill would require the implementation of new protective measures for young users, such as configuring default privacy settings to those with the highest level of privacy, and would place new limits on profiling, processing geolocation data, and the use of “dark patterns” to influence behavior.
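As a rough sketch of what “highest level of privacy by default” could look like in practice, the following Python example applies protective defaults for users known to be under 18. The specific setting names and logic are assumptions for illustration; AB 2273 does not enumerate particular settings.

```python
# A rough sketch of defaulting known minors to the most protective
# settings. Setting names and the under-18 logic are assumptions; the
# Act does not enumerate specific settings.

HIGH_PRIVACY_DEFAULTS = {
    "personalized_ads": False,
    "profiling": False,
    "precise_geolocation": False,
    "public_profile": False,
}

def default_settings(age: int) -> dict:
    """Return account defaults; users under 18 get high-privacy values."""
    if age < 18:
        return dict(HIGH_PRIVACY_DEFAULTS)
    # For adults, the Act leaves defaults to the business's ordinary choices.
    return {}

if __name__ == "__main__":
    print(default_settings(14))  # high-privacy defaults applied
```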

What’s Next?

The California Age-Appropriate Design Code would become enforceable on July 1, 2024 if enacted by Governor Newsom. The bill leaves many important questions unanswered. Covered entities may seek clarity and guidance from the California Children’s Data Protection Working Group, a new entity created by the bill. The working group would be required to submit a report to the legislature by January 1, 2024 regarding recommendations and best practices for compliance. The passage of the California ADCA reflects a growing focus on protecting children’s privacy online, and many expect other legislatures to follow California’s lead next year.

With contributions from FPF’s Keir Lamont and Bailey Sanchez.

New Report on Limits of “Consent” in Singapore’s Data Protection Law

Introduction

Today, the Future of Privacy Forum (FPF) and Asian Business Law Institute (ABLI), as part of their ongoing joint research project: “From Consent-Centric Data Protection Frameworks to Responsible Data Practices and Privacy Accountability in Asia Pacific,” are publishing the thirteenth in a series of detailed jurisdiction reports on the status of “consent” and alternatives to consent as lawful bases for processing personal data in Asia Pacific (APAC).

This report provides a detailed overview of relevant laws and regulations in Singapore, including:

The findings of this report and others in the series will inform a forthcoming comparative review paper which will make detailed recommendations for legal convergence in APAC.

Singapore’s Data Protection Landscape

Singapore’s Personal Data Protection Act 2012 (PDPA) was passed in November 2012 and significantly reviewed in 2020. Its stated purpose is to govern the collection, use, and disclosure of personal data by organizations in a manner that recognizes both individuals’ right to protection of their personal data and organizations’ need to collect, use, and disclose personal data.

The PDPA sets the baseline standard of protection for personal data in Singapore, though organizations which are subject to sector-specific laws and regulations (including in the financial services and medical sectors) must also comply with sector-specific requirements.

The PDPA also establishes a data protection authority, the Personal Data Protection Commission (PDPC), to advance the PDPA’s stated purpose of balancing protection of personal data with use of personal data for legitimate purposes. To that end, the PDPC implements policies relating to personal data protection and issues advisory documents to help organizations understand and comply with their obligations under the PDPA.

The PDPC is also empowered to enforce the PDPA by, for example, issuing binding directions or requiring payment of a financial penalty. The PDPC is active in enforcement and regularly publishes decisions, which effectively function as a body of case law on personal data protection matters, on its website.

Role and Status of Consent as a Basis for Processing Personal Data in Singapore

Consent has played a key role in the architecture of the PDPA since it was first enacted in 2012. The PDPA’s default requirement is that an organization may collect, use, or disclose personal data about an individual (i.e., a data subject) only if:

However, this default requirement has always been subject to exceptions.

Between 2012 and 2021, the PDPA provided several long lists of exceptions to the consent requirement for collection, use, and disclosure of personal data in the Second, Third, and Fourth Schedules to the PDPA, respectively.

However, major recent amendments to the PDPA, which were passed in 2020 and took effect in 2021, replaced these various exceptions with a consolidated set of provisions allowing for collection, use, and disclosure of personal data without consent.

These provisions are set out in the First Schedule to the PDPA, under the following headings:

Additionally, the amended Second Schedule to the PDPA also provides further exceptions to consent for public interest and research purposes.

The new First and Second Schedules retain many of the old exceptions to consent. However, a notable introduction was the “legitimate interests” section, which – though borrowing a term from European data protection law – was unique when compared with other major data protection laws internationally. The amended PDPA distinguishes between two categories of legitimate interests:

Another notable introduction is the “business improvement purposes” section. This provision allows an organization to share data with a related organization for the following purposes, subject to fulfillment of certain conditions, including, among others, necessity, reasonableness of purpose, and commitment to implement appropriate safeguards:

The 2020 amendments also expanded the situations in which an individual would be deemed by law to have consented to collection, use, or disclosure of his/her personal data. The amendments added new provisions for deemed consent by contractual necessity and deemed consent by notification. The new provision on deemed consent by notification relies on a similar risk assessment to the legitimate interests provision. Specifically, to rely on this provision, an organization must:

Independently of the requirement to obtain consent, the PDPA also restricts collection, use, and disclosure of personal data to purposes that a reasonable person would consider appropriate in the circumstances. An organization that wishes to collect, use, or disclose personal data must also notify the individual of this purpose, unless an exception applies.

Finally, regulations to the PDPA also establish informed consent as a legal basis for transferring personal data out of Singapore.

By default, the PDPA requires that organizations which seek to transfer personal data out of Singapore must either provide the data with, or apply to the data, a standard of protection that is comparable to that under the PDPA. However, organizations are taken to have satisfied this requirement if they obtain the individual’s consent for the transfer of the individual’s personal data out of Singapore after giving the individual a “reasonable summary in writing of the extent to which the transferred personal data will be protected to a comparable standard.”

Read the previous reports in the series here.

Blog Cover Image by Aditya Chinchure on Unsplash