What Happened to the Risk-Based Approach to Data Transfers?

The following is a guest post to the FPF blog from Lokke Moerel, Professor of Global ICT Law at Tilburg University and a Dutch Cyber Security Council member. This blog is a summary of a longer academic paper which can be downloaded here.

The guest blog reflects the opinion of the author only. Guest blog posts do not necessarily reflect the views of FPF.

Introduction

In my earlier FPF guest blog on the geopolitics of trans-Atlantic data transfers, I flagged that, since Schrems II, companies increasingly find themselves in a catch-22. Frustrations are running high as companies work towards Schrems II compliance by executing measures to mitigate the risk that US government entities can access their data. Yet, EU data protection authorities (DPAs) continue to block their way. The DPAs increasingly adopt an absolutist approach, whereby mitigating measures are disregarded irrespective of the actual risk for data protection after transfer, triggering a debate on what happened to the risk-based approach of the GDPR (RBA). This has come to the fore in recent decisions of the DPAs on data transfers in the context of the use of Google Analytics. The Austrian DPA kicked things off by issuing a decision in a complaint of noyb against, among others, Google (GA decision).1 In this decision, the Austrian DPA explicitly discards the applicability of the RBA as far as the data transfer provisions of the GDPR are concerned. In its Q&A concerning the use of Google Analytics, the CNIL likewise indicated that the RBA cannot be applied to data transfers.2

This is noteworthy, as, in legal literature, it is generally assumed that the RBA is incorporated in the ‘accountability principle’ of Article 24 GDPR and that this principle has a horizontal application throughout the GDPR and therefore also applies to the data transfer requirements.3 In this light, it is high time for an in-depth assessment of whether, and if so, to what extent the GDPR introduced the RBA, and specifically whether the RBA also applies to the data transfer requirements of Chapter V of the GDPR.

The conclusion will indeed be that the accountability requirement of Article 24 GDPR incorporates the RBA for all obligations of the controller in the GDPR. Where the transfer rules are stated as obligations of the controller (rather than as absolute principles), the RBA of Article 24 therefore applies. Contrary to what the DPAs assume, this is contradicted neither by the ECJ in Schrems II nor by the EDPB recommendations on additional measures following the Schrems II judgment. We will, however, also see that the EDPB is trying to rewrite the GDPR by applying the accountability principle of Article 5(2) GDPR (which does not include the RBA) rather than the accountability principle of Article 24, which does. By taking this position, the EDPB pushes its own version of the accountability principle as proposed by its predecessor, the WP29, at the time of the revision of the Directive, which was, however, ultimately not adopted by the EU regulator in the GDPR.

1. Reasoning of the Austrian DPA in the GA decision

In the GA decision, the Austrian DPA rejected Google’s arguments that a RBA should be taken when assessing the impact of the data transfers in the context of Google Analytics, and that the Austrian DPA applied too strict a standard by considering the mere possibility of access relevant, rather than the actual risk of U.S. public authorities accessing the data.

Specifically, the DPA reasoned that such a RBA could not be derived from the wording of Art. 44 GDPR. See point D.4 of the decision (underlining by the Austrian DPA in the original decision):

“Art. 44 GDPR – General principles of data transmission

Any transfer of personal data already processed or to be processed after their transfer to a third country or an international organization shall only be allowed if the controller and the processor comply with the conditions laid down in this Chapter and with the other provisions of this Regulation, including any onward transfer of personal data from that third country or international organization to another third country or international organization. All provisions of this Chapter shall be applied in order to ensure that the level of protection of natural persons ensured by this Regulation is not undermined.”

On the contrary, it can be deduced from the wording of Art. 44 GDPR that for every data transfer to a third country (or to an international organization), it must be ensured that the level of protection guaranteed by the GDPR is not undermined.

The success of a complaint of a violation of Art. 44 GDPR, therefore, does not depend on whether a certain “minimum risk” is present or whether U.S. intelligence services have actually accessed data. According to the wording of this provision, a violation of Art. 44 GDPR already exists if personal data are transferred to a third country without an adequate level of protection.

In connection with those provisions of the GDPR where a risk-based approach is actually to be followed (“the higher the processing risk, the more measures are to be implemented”), the legislator has also explicitly and without doubt, standardized this. For example, the risk-based approach is provided for in Art. 24(1) and (2), Art. 25(1), Art. 30(5), Art. 32(1) and (2), Art. 34(1), Art. 35(1) and (3) or Art. 37(1)(b) and (c) GDPR. Since the legislator has standardized a risk-based approach in numerous places in the GDPR, but not in connection with the requirements of Art. 44 GDPR, it cannot be assumed that the legislator merely “overlooked” this; an analogous application of the risk-based approach to Art. 44 GDPR is therefore excluded.”

The Austrian DPA further rejected Google’s arguments that the RBA was confirmed by the European Court of Justice (ECJ) in the Schrems II judgment4 and by the EDPB’s Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data.5

The Austrian DPA further states:

“Unlike Chapter V – see below – Art. 5(2) in conjunction with Art. 24(1) GDPR now actually take a risk-based approach. The higher the risk associated with the data processing, the higher the standard for the evidence to be submitted in order to prove compliance with the GDPR.”

2. Questions of law to be investigated

Based on the GA decision, there are a number of questions of law to be investigated:

  1. Does the RBA apply to the accountability requirements of Article 24 only, in the sense that the standard of evidence (i.e., the required accountability measures, such as policies, training requirements, etc.) scales with the risk of the relevant processing, rather than the RBA also applying to the underlying obligations of the controller set out in other provisions of the GDPR?
  2. Is the position under 1) supported by the fact that, where the EU regulator intended to implement the RBA, it expressed this explicitly in the relevant provisions only? [which seems to be the position of the Austrian DPA]
  3. If the position under 1) is not correct, and RBA in Article 24 GDPR must be considered to constitute a horizontal provision applying a RBA also to the underlying obligations of the controller, does the RBA then relate to the obligations of controllers in Chapter IV only, or to all data protection obligations of controllers, including those of Chapter V?
  4. Does Article 5(2) indeed take a RBA for the accountability principle? [which seems to be the position of the Austrian DPA]
  5. Is the position under 1) confirmed by the ECJ in the Schrems II judgment?
  6. Is the position under 1) confirmed by the EDPB Recommendations on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data (EDPB Recommendations)?6

3. Summary Conclusions

Based on an analysis of the wording of the GDPR (see Section 4), the legislative history of the GDPR (see Section 5), the Schrems II judgment (see Section 6), and the EDPB Recommendations (see Section 7), the conclusions are as set out in Section 8 below.

4. Interpretation of Articles 5 and 24 GDPR

According to the settled case law of the ECJ, the interpretation of a provision of EU law requires that account be taken not only of its wording and the objectives it pursues but also of its legislative context and the provisions of EU law as a whole. Also, the origins of a provision of EU law may provide information relevant to its interpretation.8

Textual analysis

Article 24 is the first provision of Chapter IV (Controller and processor), Section 1 (General obligations). The language of Article 24 GDPR resembles that of Article 25 (Data protection by design and by default) and Article 32 (Security of processing). The heading of Article 24 is “Responsibility of the controller,” and the provision starts with the qualifier “taking into account the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons, the controller shall….” It is not in dispute that this qualifier implies the RBA.

The question then is whether the RBA applies to the standard of evidence (the accountability measures) only, or also to the underlying obligations of the controller under the GDPR themselves. The text of Article 24 reads that the controller must “ensure and to be able to demonstrate that processing is performed in accordance with this Regulation.” Where the controller explicitly has to ensure compliance by taking a RBA, it is difficult to see why the RBA in Article 24 would apply only to the standard of evidence (i.e., being able to demonstrate compliance) and not also to the underlying controller obligations themselves. The obligation further explicitly refers to all requirements under the Regulation.

That being said, not all provisions of the GDPR are formulated as obligations of the controller. For example, the general processing principles listed in Article 5(1) are not formulated as obligations of the controller but as absolute principles. Article 5(2) subsequently provides that “the controller shall be responsible for, and be able to demonstrate compliance with, paragraph 1 (‘accountability’).” Noteworthy here is that this accountability requirement is not qualified in any manner by a RBA similar to that of Article 24. This seems to mean that the RBA does not apply to the material processing principles (why otherwise include Article 5(2) in the first place; Article 24 GDPR alone would have been sufficient).

The question then is, how does this apply to the data transfer rules of Chapter V? There is no indication whatsoever in the GDPR that the general obligation of the controller of Article 24 would not also apply to obligations of controllers under Chapter V (again Article 24 requires that controllers ensure compliance with the Regulation).

Rather, there are indications to the contrary. For example, the privacy-by-design requirements and security requirements (which also incorporate the RBA) remain applicable when transferring data (see explicitly Recital 108). In the same vein, the accountability principle will also be applicable when transferring data (provided the transfer rules are formulated as obligations of the controller rather than as absolute principles).

As the Austrian DPA notes, the general principle for transfers in Article 44 does indeed provide that “any transfer of personal data shall only take place in accordance with the conditions of this Chapter,” but (a point the Austrian DPA omits) this general principle is explicitly made “subject to the other provisions of this Regulation.” This is logical; Chapter V on transfers cannot be considered on a standalone basis. The transfer rules aim to ensure that data receive a similar level of protection after being transferred to a third country that does not provide an adequate level of protection, not a higher level of protection. This is also expressed in the last sentence of Article 44:

“All provisions in this Chapter shall be applied in order to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined.”

Article 46 GDPR (transfers subject to appropriate safeguards) is further formulated not as an absolute principle (like the general processing principles of Article 5(1)) but as an obligation of the controller, in that it allows data transfers “if the controller (…) has provided appropriate safeguards and on the condition that enforceable data subject rights and effective legal remedies for data subjects are available.”

The conclusion seems justified that the obligation of the controller “to provide appropriate safeguards” under Article 46 GDPR is indeed risk-based, except where Article 46(1) provides for the absolute requirement “that enforceable data subject rights and effective legal remedies for data subjects are available.”

5. Legislative history of Articles 5 and 24 GDPR

5.1 The EU Data Protection Directive

Historically, EU data protection legislation has been “rights-based,” and its requirements were to be applied irrespective of the level of risk involved and whether actual harm was created.9 As the WP29 (the predecessor of the EDPB) put it at the time, the EU data protection legal framework provides for a ‘minimum and non-negotiable level of protection for all individuals.’10 This is all the more so since the entry into force of the Treaty of Lisbon in December 2009, which granted the right to personal data protection the status of a fundamental right of the EU (see Article 8 of the EU Charter11 and Article 16(1) of the TFEU12).

Noteworthy is that the protection of data transfers is not among the rights listed as fundamental. The EU transfer rules are not considered to be one of the material processing principles: the transfer rules are a mechanism to ensure that these material processing principles will be observed, rather than a material processing principle in themselves.13 This being said, the transfer rules are crucial in their own right to guarantee the protection provided by the EU Data Protection Directive (Directive) and are therefore a key cornerstone of the Directive.14 This distinction is continued in the GDPR, where the material processing principles are listed in Article 5(1) GDPR (and do not include the data transfer requirements), and the data transfer requirements are regulated separately in Chapter V.

5.2 Legislative reform

The Directive did not include an accountability principle; it was only as part of the legislative review of the Directive that this principle was introduced. The main trigger for introducing it was that the legislative review of the Directive by the EC showed a widespread lack of compliance with the Directive, in particular also with the data transfer requirements, and that the enforcement tools of the DPAs were not sufficient to force compliance.15 On July 9, 2009, the EC launched a consultation on the EU data protection legal framework. As part of the consultation, the WP29 and the EDPS issued a number of opinions, which in essence advised the EC to introduce the accountability principle in the revised Directive. The proposals of the WP29 developed somewhat over time, but its final position was adopted by the EC in its first proposal for a new Regulation.16

(a) WP29 Opinion on the accountability principle (July 2010)

In its Opinion on the accountability principle, the WP29 proposed the following concrete provision:

“Article X – Implementation of data protection principles
1. The controller shall implement appropriate and effective measures to ensure that the principles and obligations set out in the Directive are complied with.
2. The controller shall demonstrate compliance with paragraph 1 to the supervisory authority on its request.”

The provision refers to all principles and obligations of the revised Directive. The Opinion further reflects that the accountability measures (rather than the material principles themselves) should be scalable (see para. 53). As to the consequences of compliance with the accountability principle, the WP29 (at p. 11) stresses that “fulfilling the accountability principle does not necessarily mean that a controller is in compliance with the substantive principles […], i.e., it does not offer a legal presumption of compliance nor does it replace any of those principles.”

(b) First EC proposal for a Regulation (January 25, 2012)

The EC’s first proposal for a Regulation basically implements the proposals of the WP29. According to the Explanatory Memorandum accompanying the EU Commission’s first proposal17 dated January 25, 2012, the provisions of Article 22 of the draft took account of the debate on a “principle of accountability” and described in detail the obligation of the controller to comply with the Regulation and to demonstrate compliance, by adopting internal policies and mechanisms for ensuring such compliance. The EU Commission’s first draft did not include a reference to the “accountability principle” and did not include a reference to scalability (RBA) of the accountability provisions.

Article 5 sub (f):

“processed under the responsibility and liability of the controller, who shall ensure and demonstrate for each processing operation the compliance with the provisions of this Regulation”

Article 22 – Responsibility of the controller:

“The controller shall adopt policies and implement appropriate measures to ensure and be able to demonstrate that the processing of personal data is performed in compliance with this Regulation, including the assignment of responsibilities, and the training of staff involved in the processing operations.”

Recital (60):

“Comprehensive responsibility and liability of the controller for any processing of personal data carried out by the controller or on the controller’s behalf should be established. In particular, the controller should ensure and be obliged to demonstrate the compliance of each processing operation with this Regulation.”

Note that Article 5 sub (f) of the draft, the predecessor of Article 5(2) GDPR, is based on Article 6(2) of the Directive, which embodied the original and narrower meaning of accountability as responsibility for compliance.

(c) Note of the Presidency to EU Council on implementation of RBA (March 1, 2013)

Further to a first examination of the EU Commission proposal, the Presidency reported to the EU Council18 that several Member States had voiced their disagreement with the level of prescriptiveness of a number of obligations in the draft Regulation. Many delegations stated that the risk inherent in certain data processing operations should be the main criterion for calibrating the data protection obligations. Where the data protection risk was higher, more detailed obligations would be justified; where it was comparably lower, the level of prescriptiveness should be reduced.19 The revised draft subsequently included a ‘horizontal clause’ in Article 22 to incorporate the RBA:

“Taking into account the nature, scope and purposes of the processing and the risks for the (…) rights and freedoms of data subjects, the controller shall implement appropriate measures to ensure and be able to demonstrate that the processing of personal data is performed in compliance with this Regulation (…).”20

Art. 5 sub (f) was changed into:

“processed under the responsibility (…) of the controller (…)”

thereby basically reverting the language to the text of its predecessor, Article 6(2) of the Directive.

(d) WP29 Statement on the role of a RBA in data protection legal frameworks (May 30, 2014)21

In reaction to these developments in the EU legislative process, the WP29 issued a Statement on the role of a RBA in data protection legal frameworks. From this Statement, it can be derived that the WP29 was well aware that the changes proposed by the European Parliament and the Council constituted a major change: the RBA was now introduced as a core element of the accountability principle itself, also impacting the underlying obligations of controllers rather than (just) the accountability measures themselves, see p. 2:

“However, the risk-based approach has gained much more attention in the discussions at the European Parliament and at the Council on the proposed General Data Protection Regulation. It has been introduced recently as a core element of the accountability principle itself (Article 22).”

The WP29 further clarified in a number of crisp statements that the RBA should (i) not apply to the key rights granted to data subjects, which apply regardless of the level of risks incurred by the processing, and (ii) that there can be different levels of accountability obligations depending on the risk posed, but that controllers should always be accountable for compliance with the data processing obligations “whatever the nature, scope, context, purposes of the processing and the risks for data subjects are.”

(e) Final text of the GDPR as adopted by the EU Council (April 8, 2016)

The EU Council ignored the WP29 Statement and adopted the final version of Article 24 GDPR.22 The EU Council, in its accompanying statement (p. 4),23 explained that it had strengthened the accountability of controllers and processors to promote a real data protection culture and introduced throughout the Regulation a risk-based approach, allowing for the modulation of the obligations imposed on controllers.

5.3 Assessment based on the legislative history of the GDPR

Inclusion of Article 5(2) seems to be based on Article 6(2) of the Directive (“It shall be for the controller to ensure that paragraph 1 is complied with”), which embodied the original and narrower meaning of accountability as responsibility for compliance. It was the European Parliament that proposed to maintain the original proposal of the EC and to bring this provision more into line with accountability (‘be able to demonstrate’ rather than ‘demonstrate’), with the addition of the word ‘accountability’ in brackets at the end.24 The Council proposed instead to concentrate on responsibility.25 The resulting compromise in Article 5(2) combined the responsibility proposed by the Council with the demonstrability and the label ‘accountability’ in brackets proposed by the Parliament.26 There are no indications in the legislative history why the accountability element in Article 5(2) was first included, then deleted, and then reinstated, but without the RBA. As this provision must have meaning (why otherwise reinstate it), it seems justified to conclude that the RBA does not apply to the material processing principles of Article 5.

The actual principle of accountability, as inspired by the proposals of the WP29, found its way into Article 22 (now 24). It is unclear why the EC declined to use the term ‘accountability principle’ in the text or heading of Article 22 itself. It is only in the Explanatory Memorandum (at para. 3.4.4) that it is explained that Article 22 [now 24] “takes account of the debate on a ‘principle of accountability’”. The heading further referred to the “responsibility of the controller,” which better fitted the compliance notion of Article 5(2). It is clear that the EC, in its first draft proposal for the Regulation, included the accountability principle as advocated by the WP29, whereby the provision applied to the standard of evidence only and not also to the underlying obligations of the controller. Based on the legislative history, it is however indisputable that the subsequent changes to the initial Article 22 were introduced by the Council in order to incorporate a horizontal provision applying the RBA to all obligations of the controller, and specifically also to the data transfer obligations.

6. Assessment of Schrems II

Reviewing the ECJ judgment in Schrems II,27 the Austrian DPA is correct that the ECJ does not refer to the accountability principle or the RBA under the GDPR. The conclusion of the Austrian DPA that the ECJ therefore does not take a RBA to data transfers cannot, however, be based on this judgment. What the ECJ did in Schrems II was raise the bar for international data transfers based on Article 46 (transfers based on appropriate safeguards) to the so-called essentially equivalent level, with reference to the general principle for transfers of Article 44 and the EU Charter of fundamental rights (see para. 131–134). In the absence of an adequacy decision, the ECJ considers it the responsibility of the controller to make a transfer assessment before a transfer can take place on the basis of appropriate safeguards, which also includes an assessment of the laws and practices of the country or countries where the data are flowing to (see para. 126, where the ECJ explicitly refers to “the law and practices in force in the third country concerned” and requires “(…) ensuring, in practice, the effective protection of personal data transferred to the third country concerned”).28 The controller should then take measures to compensate for any lack of data protection by way of appropriate safeguards. It is important to note that the Court does not require that additional safeguards provide a 100% guarantee that access to data by third parties can never occur, but rather that they constitute “effective mechanisms that make it possible, in practice, to ensure compliance with the level of protection required by EU law…” (para. 137). Though the ECJ did not explicitly refer to the accountability principle of Article 24, this transfer assessment obligation of the controller seems in line with the RBA of the accountability principle of Article 24.

This is also confirmed by the dictum of Schrems II. The dictum provides that the relevant aspects of the legal system of the third country need to be taken into consideration: not only the law of the relevant third country but also its practices, as also follows from para. 126 of Schrems II. For these relevant aspects, the ECJ refers to the non-exhaustive list of elements in Article 45(2) GDPR, which the EC needs to consider when performing an adequacy assessment of a third country. The list of Article 45(2) shows that the EC, in its assessment, not only needs to assess the law of the country but also “the effective functioning” of that law; in other words, all relevant aspects of how the legal system operates in practice.29

7. Assessment of the EDPB Recommendations

The EDPB in the Recommendations30 reflects the Schrems II judgment in a similar manner. The EDPB indicates that the Schrems II judgment “reminds us that the protection granted to personal data in the European Economic Area (EEA) must travel with the data wherever it goes,” that “the Court also asserts this by clarifying that the level of protection in third countries does not need to be identical to that guaranteed within the EEA but essentially equivalent,” that the “Court also upholds the validity of standard contractual clauses, as a transfer tool that may serve to ensure contractually an essentially equivalent level of protection for data transferred to third countries,” but that these “do not operate in a vacuum” and that:

“controllers or processors, acting as exporters, are responsible for verifying, on a case-by-case basis and, where appropriate, in collaboration with the importer in the third country, if the law or practice of the third country impinges on the effectiveness of the appropriate safeguards contained in the Article 46 GDPR transfer tools. In those cases, the Court still leaves open the possibility for exporters to implement supplementary measures that fill these gaps in the protection and bring it up to the level required by EU law. The Court does not specify which measures these could be. However, the Court underlines that exporters will need to identify them on a case-by-case basis. This is in line with the principle of accountability of Article 5.2 GDPR, which requires controllers to be responsible for, and be able to demonstrate compliance with the GDPR principles relating to processing of personal data.”

It is noteworthy that the EDPB explicitly refers to the accountability principle of Article 5(2), but does not in any way refer to the accountability principle of Article 24. In para. 1 of the Recommendations, the EDPB explicitly considers that the accountability principle of Article 5(2) GDPR31 also applies to data transfers “since they are a form of data processing in themselves.”32 I recall (see Section 5.1 above) that Article 5(1) lists the general processing principles, but that these do not include the data transfer principles. The EDPB is correct in considering a transfer a form of processing, which entails that the material principles apply to transfers; it does not follow, however, that transfers are therefore a material principle in themselves. That conclusion would go against the system of the GDPR, where the transfer rules have their own Chapter V. The underlying reason for the EDPB to resort to this workaround is that the accountability principle of Article 5(2), as I also concluded, does not incorporate the RBA as to compliance with the material principles, whereas the accountability principle of Article 24 does incorporate the RBA for compliance with the obligations of controllers. By taking this position, the EDPB pushes its own version of the accountability principle as proposed by the WP29 at the time of the revision of the Directive, which was, however, ultimately not adopted by the EU regulator. Noteworthy is, however, that despite the reference to Article 5(2) GDPR, the final version of the Recommendations does include language (however nominal) allowing for a RBA in data transfer assessments, though the threshold seems high. A kinder interpretation is that the EDPB is confused by the fact that Article 5(2) does include the reference to “accountability,” while Article 24 does not (see Section 4 above).

I do not, however, believe the EDPB is confused here; rather, it pushes the version of the accountability principle it has advocated from the start, while covering its bases by including a nominal RBA in the Recommendations themselves in line with Schrems II. That the RBA is indeed (though somewhat nominally) included in the Recommendations can be derived from the changes made by the EDPB to the initial version after consultation.

The initial consultation version of the Recommendations33 did not take a RBA to the transfer assessment. The consultation version even specifically indicated that organizations should “not rely on subjective [factors] such as the likelihood of public authorities’ access to your data in a manner not in line with EU standards” (see para. 42). Following the consultation phase, during which many stakeholders commented that the EDPB had wrongly ignored the RBA of the GDPR, this statement was no longer included in the final version. Instead, the EDPB (somewhat nominally, and without any explicit acknowledgment) included the RBA, though the threshold for applying it is very high. This is reflected in the text by stating in a number of places that the transfer assessment should include not only the laws but also the practices of the relevant third country (see in particular para. 43),34 and most importantly by allowing controllers to proceed with the transfer without supplementary measures if they have no reason to believe that the relevant legislation will be applied in practice (see para. 43.3).

8. Conclusion

The conclusion is that the accountability requirement of Article 24 GDPR incorporates the RBA for all obligations of the controller in the GDPR. Where the transfer rules are stated as obligations of the controller (rather than as absolute principles), the RBA of Article 24 therefore applies. Contrary to what the DPAs assume, this is contradicted neither by the ECJ in Schrems II nor by the EDPB recommendations on additional measures following the Schrems II judgment. The EDPB is trying to rewrite the GDPR by applying the accountability principle of Article 5(2) GDPR (which does not include the RBA) rather than the accountability principle of Article 24, which does. By taking this position, the EDPB pushes its own version of the accountability principle as proposed by its predecessor, the WP29, at the time of the revision of the Directive, which was, however, ultimately not adopted by the EU regulator in the GDPR.


1 https://noyb.eu/sites/default/files/2022-01/E-DSB%20-%20Google%20Analytics_DE_bk_0.pdf. An English translation is available via noyb.eu.

2 The CNIL also issued a Q&A concerning the use of Google Analytics: https://www.cnil.fr/fr/cookies-et-autres-traceurs/regles/questions-reponses-sur-les-mises-en-demeure-de-la-cnil-concernant-lutilisation-de-google-analytics. The last question of the Q&A addresses controllers applying the RBA by taking into account the likelihood of data access requests. The CNIL indicates that the RBA cannot be applied and explains that, as long as access to the transferred data is possible and the safeguards governing the issuance of requests for access to data do not guarantee a level substantially equivalent to the one guaranteed in the EU, it is necessary to take additional technical measures to make such access impossible or ineffective.

3 See, specifically on the applicability of the RBA to data transfer requirements after the Schrems II judgment: Paul Breitbarth, “A Risk-Based Approach to International Data Transfers,” EDPL, 2021, p. 547; Christopher Kuner, ‘Schrems II Re-Examined’ (VerfBlog, August 25, 2020), https://verfassungsblog.de/schrems-ii-re-examined/; and Christopher Kuner, Lee Bygrave and Christopher Docksey, The EU General Data Protection Regulation: A Commentary. Update of Selected Articles. Oxford University Press, 2021, p. 113. Other authors discuss the RBA of the GDPR, but not specifically in the context of data transfers and the ECJ judgment in the Schrems II case.

4 Case C-311/18 Data Protection Commissioner v Facebook Ireland Limited and Maximillian Schrems [2020] ECLI:EU:C:2020:559.

5 EDPB, Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data, version 2.0 (europa.eu).

6 Ibid.

7 See for a similar reference also para. 158.

8 ECJ judgment of December 10, 2018, Wightman and Others, C-621/18, EU:C:2018:999, paragraph 47 and the case-law cited.

9 See Amann v Switzerland App No 27798/95 (ECtHR, February 16, 2000) §70: in order to determine whether a processing constitutes an interference, the fact that the data subject may ‘have been inconvenienced in any way’ is irrelevant.

10 Art. 29 WP, ‘Opinion 1/98 Platform for Privacy Preferences (P3P) and the Open Profiling Standard (OPS)’ (1998), p. 2: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/1998/wp11_en.pdf.

11 https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:12012P/TXT

12 https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:12012E/TXT:en:PDF

13 This is evidenced by the fact that in the Directive the EU transfer rules are not included in Chapter II (The General Rules on the Lawfulness of the Processing of Personal Data), but in a separate Chapter IV (Transfer of personal Data to third Countries). For a similar separation of the basic principles and the transfer rules see the Joint Proposal for a Draft of International Standards on the Protection of Privacy with regard to the processing of Personal Data (Madrid Draft Proposal for International Standards), as adopted on November 5, 2009 at The International Conference of Data Protection and Privacy Commissioners in Madrid by the participating data protection authorities, to be found at https://edps.europa.eu/sites/edp/files/publication/09-11-05_madrid_int_standards_en.pdf, where the transfer rules are included in Section 15 and the basic principles of data protection in Part II.

14 See WP 12, Working Document on Transfers of personal data to third countries: Applying Articles 25 and 26 of the EU data protection directive, July 24, 1998 (WP 12), at https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/1998/wp12_en.pdf, where the Working Party 29 lists “six content principles” of which the 6th is: “restrictions on onward transfers – further transfers of the personal data by the recipient of the original data transfer should be permitted only where the second recipient (i.e., the recipient of the onward transfer) is also subject to rules affording an adequate level of protection. The only exceptions permitted should be in line with Article 26(1) of the directive.” Since a restriction on onward transfers was at the time missing from Convention 108, the Working Party 29 considered the protection provided by the countries that had at the time ratified Convention 108 was insufficient (see WP 12, at 8). This led to adoption of a transfer rule similar to the Directive in Article 2 of the Additional Protocol to Convention 108.

15 Rand Europe, Review of the European Data Protection Directive, Technical Report dated May 2009 (Rand Report) at https://www.rand.org/pubs/corporate_pubs/CP1-2009.html. Other reviews showed similar results: see Douwe Korff, EC Study on implementation of the Data Protection Directive, Comparative study of national laws, September 2002, Human Rights Centre University of Essex, at 209, to be found at <http://papers.ssrn.com>, notes that “the powers now vested in the data protection authorities, as currently exercised, have not been able to counter continuing widespread disregard for the data protection laws in the Member States.”

16 https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2009/wp168_en.pdf, https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2010/wp173_en.pdf 

17 https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2012:0011:FIN:EN:PDF

18 https://data.consilium.europa.eu/doc/document/ST%206607%202013%20REV%201/EN/pdf.

19  See para. 5 at https://data.consilium.europa.eu/doc/document/ST%206607%202013%20REV%201/EN/pdf.

20 See p. 23 at https://data.consilium.europa.eu/doc/document/ST-8004-2013-INIT/en/pdf

21 https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp218_en.pdf

22 https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A52016AG0006%2801%29.

23 https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CONSIL:ST_5419_2016_ADD_1&from=EN.

24 See Amendment 99, https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52014AP0212&from=EN.

25 See p. 83 at https://data.consilium.europa.eu/doc/document/ST-9565-2015-INIT/en/pdf.

26 Cf. supra n. 3, p. 113.

27 Cf. supra n.4.

28 Ibid. see para. 126.

29 Cf. supra n.5.

30 Cf. supra n.5.

31 See para. 3 where the EDPB refers to the accountability principle and includes in footnote 12 again a reference to Article 5(2) GDPR only. See also para. 5, footnote 18; para. 48, footnote 58; and para. 76, footnote 77. The only reference to Article 24 can be found in footnote 22, which seems to be an oversight rather than intentional.

32 The EDPB refers to para. 45 of Schrems II. However, in this paragraph the ECJ merely indicates that a transfer constitutes processing (which is correct); this is not in any way related to how Article 5(1) GDPR should be interpreted.

33 Cf. supra n.5.

34 Cf. supra n.4.

Call for Nominations: 13th Annual Privacy Papers for Policymakers

The Future of Privacy Forum (FPF) invites privacy scholars and authors with an interest in privacy issues to submit finished papers to be considered for FPF’s 13th annual Privacy Papers for Policymakers (PPPM) Award. This award provides researchers with the opportunity to inject ideas into the current policy discussion, bringing relevant privacy research to the attention of the US Congress, federal regulators, and international data protection agencies.

The award will be given to authors who have completed or published top privacy research and analytical work in the last year that is relevant to policymakers. The work should propose achievable short-term solutions or new means of analysis that could lead to real-world policy solutions.

FPF is pleased to also offer a student paper award for students of undergraduate, graduate, and professional programs. Student submissions must follow the same guidelines as the general PPPM award.

We encourage you to share this opportunity with your peers and colleagues. Learn more about the Privacy Papers for Policymakers program and view previous years’ highlights and winning papers on our website.

FPF will invite winning authors to present their work at an annual event with top policymakers and privacy leaders in spring 2023 (date TBD). FPF will also publish a printed digest of the summaries of the winning papers for distribution to policymakers in the United States and abroad.

Learn more and submit your finished paper by October 21st, 2022. Please note that the deadline for student submissions is November 4th, 2022.

The “Colorado Effect?” Status Check on Colorado’s Privacy Rulemaking

Colorado is set to formally enter a rulemaking process which may establish de facto interpretations for privacy protections across the United States. With the passage of the Colorado Privacy Act (CPA) in 2021, Colorado, along with Virginia, Utah, and Connecticut, became part of an emerging group of states adopting privacy laws that share a similar framework and many core definitions with a legislative model developed (though never enacted) in Washington State. However, while the general model of legislation seen in the CPA is similar to recently enacted state privacy laws, the CPA stands alone in providing authority to the state Attorney General to issue regulations. 

Because no other similar state law has provided for this type of interpretative authority, regulations issued by the Colorado Attorney General could have far-reaching implications for how both businesses and regulators in other jurisdictions come to interpret key state privacy rights and protections. Colorado’s pre-rulemaking process recently concluded, revealing a range of possible directions that formal rulemaking could take. Below, we assess key priorities and areas of significant divergence that have been brought into focus both through public comments from stakeholders and questions posed by the Attorney General.

The Rulemaking Process

The CPA grants broad discretionary rulemaking authority to the Colorado Attorney General to issue regulations to help implement the Act. In April 2022, Colorado Attorney General Phil Weiser released a set of pre-rulemaking considerations containing a series of questions for public comment. This document offered the first hints as to the specific topics that the Colorado Department of Law (“the Department”) is considering addressing beyond opt-out mechanisms. It includes targeted questions on the CPA’s consent requirements, restrictions on so-called “dark patterns”, standards for data protection assessments, and consumers’ right to opt-out of certain automated profiling decisions. The Department’s questionnaire received 44 comments from a range of stakeholders including business groups, non-profits, civil society organizations, and think tanks (including the Future of Privacy Forum). We provide a non-comprehensive summary of significant issues addressed across these public comments below.

1. Universal Opt-Out Mechanisms

Colorado holds the distinction of being the first state to clearly require that businesses allow consumers to exercise certain privacy rights on an automated basis through technological signals (such as browser settings or plug-ins). Notably, opt-out mechanisms are the only topic on which the CPA requires rulemaking, directing the Attorney General to establish “technical specifications” for signal mechanisms that will: (1) prohibit signal providers from unfairly disadvantaging other businesses, (2) ensure that signals represent a consumer’s freely given choice to opt out, and (3) permit covered entities to authenticate that a signal is sent by a resident of the state and represents a legitimate request to opt out. The Department’s questionnaire addressed these issues and sought additional input on how signal mechanisms should apply to data collected offline.

Default Signal Settings: The CPA prohibits opt-out mechanisms that are a “default setting” and instead requires signals to represent a consumer’s “affirmative, freely given, and unambiguous” choice to opt out. The Department’s questionnaire sought feedback as to whether a consumer’s selection of a tool marketed for its privacy features, without taking additional action, would satisfy the requirement for user intent (an approach that regulators in California appear to have endorsed). This inquiry generated a broad range of responses. For example, a Wesleyan University professor asserted that the selection of “privacy-preserving products” including Firefox, Brave, and DuckDuckGo Privacy Essentials can unambiguously reflect an intent to opt out of targeted advertising and other forms of data monetization without requiring a user to take additional steps. Industry groups such as the Colorado Chamber of Commerce typically rejected this view, arguing that “any mechanism involving a default or pre-selected opt-out choice in effect would be an opt-in, rather than the opt-out required by the statute.” The Future of Privacy Forum called for a context-specific approach, arguing that while the installation of a single-purpose plug-in may reflect unambiguous consumer choice to opt out, the use of a multi-feature product such as a web browser would be unlikely to satisfy the CPA’s statutory requirements.

Opt-Out Signal Authentication: Under the CPA, opt-out mechanisms are required to allow recipient organizations to authenticate a signal’s user as a Colorado resident and to determine that the signal represents a legitimate opt out request. Numerous commenters expressed concern that establishing strict authentication procedures could have the effect of frustrating consumer intent in exercising their privacy rights and suggested regulatory workarounds. For example, the Colorado Privacy Policy Commission suggested a standard that opt-out signal authentication must require no more than three steps to complete. Separately, several organizations including Consumer Reports and the Network Advertising Initiative (NAI) suggested that regulations could permit authenticating residency with a user’s IP address. However, the State Privacy and Security Coalition (SPSC) and TechNet raised concerns about VPNs and other technologies that can make determining location by IP addresses unreliable, and further posited that the CPA may raise Constitutional concerns if enforcement of opt-out mechanisms extends beyond authenticated Colorado residents.

Signal Scope: A significant technical and policy challenge for the use of opt-out mechanisms is whether a signal can and should apply to data collected outside of the signal’s medium. For example, can a browser-based signal be used to exercise consumer rights over information that was previously collected at a brick-and-mortar retail store? Consumer Reports argued that while regulations should not require the collection of additional information in order to process opt-out signals, a signal should apply beyond its present interaction “if the user is authenticated to the service by an identifier that applies in other contexts.” In contrast, business groups highlighted technical limitations with opt-out signals as they presently exist. For example, the Computer and Communications Industry Association (CCIA) posited that “if only browser extensions can serve as [opt out signals], the requirement to honor [opt out signals] should only extend to browsers.”

2. Consent

The CPA requires covered entities to obtain individual consent in certain circumstances, including for the processing of sensitive personal data and for incompatible secondary uses of information. The Act requires that consent be “freely given, specific, informed, and unambiguous,” closely matching the definition in other state laws and modeled on European privacy law. The Department sought information about each of these elements of consent as well as existing consent mechanisms.

Revoking Consent: Multiple organizations pointed to the lack of an explicit right to “revoke” consent as a potential gap in the statute to cover through rulemaking. The Electronic Privacy Information Center (EPIC) and The Samuelson-Glushko Technology Law & Policy Clinic at Colorado Law (TLPC) explained that while the CPA requires that it be just as easy to withdraw consent as it is to provide it in the case of overriding a universal opt out, there is no explicit right to revoke consent for other instances of data processing in the Act. Future of Privacy Forum pointed to broader rights of revocation in the GDPR and Connecticut Data Privacy Act as potential models to follow, recommending that “forthcoming regulations follow an approach similar as Connecticut by providing that consumers may, at any time, withdraw previously provided consent.” Law firm Husch Blackwell also highlighted model rights of revocation in other privacy regimes, further noting that “although it could be argued that the right to revoke consent is implicit in the CPA, it is not clear that Colorado law supports this position based on analogizing from existing court decisions.”

Implied Consent: Industry and advocacy groups alike also weighed in on when, if at all, implied consent could meet the statutory requirements of the CPA. CCIA contended that an “affirmative act” where a consumer purposefully provides personal data should not require additional consent procedures: “For instance, a consumer who intentionally submits sensitive demographic data (such as citizenship status or religious affiliation) while completing an online form should be deemed to have consented to the collection and processing of that demographic data.” On the other hand, EPIC and Consumer Reports sought stricter standards for obtaining consent. Consumer Reports proposed mandating that any request for consent include a “dedicated prompt” that “clearly and prominently describes the processing for which the company seeks to obtain consent,” while EPIC argued that consent should not be implied when a consumer exits a pop-up window that asks for consent.

3. Dark Patterns

The Colorado Privacy Act states that a consumer’s consent is not valid if obtained through the use of “dark patterns,” which are defined as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.” This language originated in the proposed DETOUR Act introduced by U.S. Senators Warner (D-VA) and Fischer (R-NE) in 2019. In the context of the CPA, the concept of dark patterns is a subset of the Act’s approach to individual consent. Nevertheless, the Department posed several specific questions on dark patterns, including whether the rules should outline specific types of dark patterns and what standards or principles could best guide design choices to avoid dark patterns.

Dark Patterns Definition and Scope: Several business groups raised concerns with the CPA’s definition of “dark patterns”, such as CTIA, which argued that the term is “vague” and leaves the door open to confusion on the part of both consumers and businesses. Numerous industry commenters encouraged the Department to avoid a prescriptive approach to the term and to instead focus on practices that amount to consumer deception or fraud, pointing to a long line of Federal Trade Commission enforcement actions in this realm. In contrast, some advocacy groups called for an expansive interpretation and application of the term “dark patterns” in order to protect consumers beyond the context of CPA’s “consent” requirements. For example, Common Sense Media recommended “prohibiting asymmetric platform design practices that limit users’ ability to change user settings, delete personal data, or delete their account.” Colorado Public Interest Research Group (CoPIRG) went a step further, recommending the development of rules that “prohibit platforms from using dark patterns in any consumer interaction.” However, it is unclear whether the Attorney General would have the statutory authority to issue expansive new restrictions on user interface designs along these lines.

4. Data Protection Assessments

Data Protection Assessments (“Assessments”) are an increasingly common requirement in privacy and data protection regimes around the globe. The CPA is no exception and requires an assessment for processing that “presents a heightened risk of harm to a consumer.” Assessments must weigh the risks and benefits of the processing activity and must be made available to the Attorney General upon request, though they are exempt from disclosure under the Colorado Open Records Act. The Department’s questions on this topic sought to clarify what circumstances should allow them to request an assessment and what requirements should exist for the form and content of the assessment.

Parameters for Requesting Assessments: TLPC recommended treating assessments as an ongoing process, with consistent feedback and input from affected consumers, controllers, and the Department of Law. In contrast, industry groups, including NAI, CCIA, CTIA, SPSC, and the Denver Metro Chamber of Commerce, asked that the Department establish specific parameters for when they may ask for an assessment to be conducted or disclosed. For example, the Alliance for Automotive Innovation (AAI) discouraged a regular cadence for iterating upon assessments, instead proposing that controllers be required to “update them only when there is a material change in processing activities that is likely to have an impact on consumer privacy.”

Form and Content of Assessments: In general, privacy advocates sought to establish more detailed parameters for the form and content for assessments, while industry representatives such as NAI, AAI, and various Chambers of Commerce sought more flexibility. For instance, while EPIC provided a list of preferred mandatory requirements, the Colorado Chamber of Commerce suggested that the Department “publish a set of voluntary factors that the controller could consider as they undertake a data protection assessment.”

5. Profiling

The CPA creates a new right to opt out of profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer. Once again, this right is common to many emerging state privacy laws and is based on language that originated in the European Union. The Department raised numerous topics concerning profiling, including the disclosures about automated processing necessary for consumers to make informed opt out decisions, whether the rules should address specific legal or civil rights concerns or specific applications of profiling, whether there could be negative impacts of immediately implementing a request to opt out of profiling, and how the statute should apply to “partial” automated decisions.

Transparency: The application of the CPA’s transparency requirements to automated decision making systems was a significant focus for commenters. Industry comments typically sought limitations on disclosures, with the Denver Metro Chamber of Commerce arguing that “requiring granular visibility into each rapidly changing processing activity could cripple business.” CCIA further called for “explicit protections for intellectual property, trade secrets, and other legal rights of the business in question.” In contrast, EPIC called for broader disclosures about profiling activities such as “the sources and life cycle of the data processed by the system, including any brokers or other third-party sources involved in the data life cycle; and how the system has been evaluated for accuracy and fairness, including links to any audits, validation studies, or impact assessments.”

Opt Out Rights: Commenters also engaged on the range of profiling activities that should be subject to the consumer opt out right. Industry groups highlighted beneficial processing operations that could be disrupted by a broad reading of the language, including processing necessary for vehicle safety systems, fraud prevention, maintaining system integrity and security, and ad measurement and reporting. Many of these groups also called for regulations to limit the opt out right to “solely” automated decisions (that lack any human oversight), as Connecticut lawmakers have done. On this point, the Future of Privacy Forum recommended that consumer opt out rights still apply in situations where the human review of a profiling decision amounts to little more than a “rubber stamp.”

6. Miscellaneous Topics

Given the Attorney General’s broad rulemaking authority, any CPA topic is theoretically on the table for rulemaking, even if not specifically addressed in the questionnaire. Commenters sought regulatory tweaks and clarifications on many additional topics beyond those summarized above.

Next Steps

The Attorney General has announced a goal of issuing draft regulations in the fall of 2022 (note: AG Weiser is on the ballot for Colorado’s General Election in November, the outcome of which may influence this timeline). Pursuant to the Colorado Administrative Procedure Act, publishing draft regulations will begin a formal notice-and-comment phase, which will also include at least one formal hearing. Given the importance of Colorado’s rulemaking process to the U.S. privacy landscape and the range of directions that the Attorney General could take on rulemaking (in both scope and substance), it can be expected that stakeholders will remain actively engaged in this process.

FPF Participates in FTC Event on “Commercial Surveillance and Data Security” Proposed Rulemaking

Yesterday, FPF Senior Director for U.S. Policy Stacey Gray participated in a panel discussion hosted by the Federal Trade Commission (“FTC”) regarding its Advance Notice of Proposed Rulemaking (“ANPR”) on “Commercial Surveillance and Data Security” (comments start at 1:39:00). Feedback from the public forum is intended to help inform the Commission’s decision whether to proceed in rulemaking and what form a new market-wide rule governing consumer privacy could take.

As a panelist, Stacey Gray urged the Commission to move forward with its rulemaking proposal, noting that exponential increases in the benefits and harms of data collection in our daily lives make it the right time to establish national rules on what constitutes unlawful behavior with respect to the collection and use of personal data. Highlighting potential regulatory solutions, Gray urged the Commission to codify existing case settlements requiring accurate disclosures and reasonable data security practices and to apply the Commission’s “unfairness authority” to reform business practices that result in data-driven discrimination and harmful secondary uses of personal information.

The public forum included two expert panels, one on industry perspectives and one on consumer advocate perspectives regarding the consumer data issues implicated by the rulemaking. Furthermore, presentations from the Commissioners as well as the questions posed by the panel moderators may offer further insight into how the FTC is approaching rulemaking on consumer harms in the present digital ecosystem.

Panel 1: Industry Perspectives

The first panel was moderated by Olivier Sylvain, senior advisor to FTC Chair Khan. In addition to asking about the restrictions that a new privacy rule should create, Mr. Sylvain’s questions covered existing industry best practices (including for the retention of sensitive data), ways the Commission can incentivize best practices short of rulemaking, and current market incentives to collect data. 

While the ANPR broadly defines “commercial surveillance” to include “collection, aggregation, analysis, retention, transfer, or monetization of consumer data,” industry panelists stressed that there are a wide range of uses of personal data that create different risks, depending on context. For example, Digital Content Next’s Jason Kint argued that while first-party use of data to tailor experiences is expected by consumers, secondary uses (including targeted advertising) tend to violate these expectations. The National Retail Federation’s Paul Martino agreed that there are greater risks inherent to data collection and processing by third-party businesses, which may lack incentives to develop long-term customer relationships.

In the context of best practices, panelists paid particular attention to the topic of data security. Mozilla’s Marshall Erwin described a “universally accepted” (though not universally adopted) consensus set of data security practices that includes the encryption of personal information in transit, employee access controls, and password standards. Mr. Martino further pointed to controls like multi-factor authentication, malware and antivirus software, and patching, though he stressed that there is no “one size fits all” approach to cybersecurity standards.

The Partnership on AI’s Rebecca Finlay encouraged the Commission to review data governance models emerging in jurisdictions outside the U.S. to evaluate the merits of different regulatory approaches. She specifically highlighted the privacy interests of children and the United Kingdom’s recent Age Appropriate Design Code, which includes transparency and data minimization standards. Mr. Erwin also highlighted the need to protect children’s privacy, while cautioning that some approaches can result in “privacy theater” with minimal tangible benefit.

Panel 2: Consumer Perspectives Panel

The second panel was moderated by Attorney Advisor to the FTC, Rashida Richardson. Ms. Richardson’s questions underscored the Commission’s focus on civil rights and on children’s and teenagers’ privacy, as well as its interest in ensuring that requirements placed on industry are in fact privacy- and security-protective. She asked for insights from the panel on the unique impacts of online tracking and data collection on members of protected classes and on children and teenagers, and the extent to which data minimization and transparency requirements are effective tools to combat the harms associated with widespread collection of personal data. Finally, she asked about the limitations of the traditional notice and consent model for protecting consumer privacy.

Members of the panel signaled strong support for the FTC’s efforts to establish clear national standards regarding what constitutes unfair or deceptive data collection, storage, and use. EPIC’s Caitriona Fitzgerald spoke to the inability of many individuals to understand or protect themselves from harmful data collection online in the absence of regulatory intervention. Upturn’s Harlan Yu and the Joint Center for Political and Economic Studies’ Spencer Overton focused on marketplace harms borne by members of historically marginalized and protected groups in critical areas, such as housing, education, and voting. Citing examples of housing and employment discrimination enabled by widespread data collection, they urged the Commission to place limits on the ability of data brokers and other parties to collect and aggregate certain sensitive types of data. The German Marshall Fund of the U.S.’s Karen Kornbluh added that online data collection and aggregation, when it is deployed to interfere with elections or track members of the armed services, poses national security as well as privacy risks.

FPF’s Stacey Gray noted that, when applying the unfairness standard, the Commission should be mindful of the fact that fairness determinations “inherently involve balancing, context, and policy tradeoffs,” emphasizing that, “many secondary uses of data can and should enable academic research, support for public health, fraud detection, and perhaps, to a reasonable extent, advertising-supported content.” Mr. Overton returned to this theme, noting that data-enabled targeted messaging can be positive when it provides individuals with information that is particularly relevant to them, such as messaging about sickle cell disease aimed at African-American audiences.

Commissioners Weigh In

In opening the public forum, Chair Khan noted that digital tools can deliver “huge conveniences” but also contribute to the tracking and surveillance of individuals in entirely new ways. She further emphasized the legal tests that the Commission must satisfy if it is to proceed in rulemaking. Commissioner Slaughter spoke favorably of efforts to enact comprehensive federal privacy legislation, but emphasized that until there’s a law on the books, the Commission must make use of all its enforcement tools to investigate and address unlawful behavior. Her comments highlighted harms to adolescents who are not covered by existing children’s privacy laws as well as harms resulting from AI and advanced algorithms.

Commissioner Bedoya spoke following the panel presentations, stressing the importance of the Commission receiving a broad array of first-hand consumer accounts of unfair and deceptive practices. Picking up on points raised by FPF’s Stacey Gray on the history of “unfairness” in U.S. privacy law, Bedoya also noted that the ANPR’s broad scope reflects the sum total of historical privacy frameworks in the United States, such as the Brandeis-Warren ‘Right to Privacy’ and the Fair Information Practice Principles (FIPPs), that go beyond mere ‘notice and consent’ protections. Commissioners Wilson and Phillips, who both voted against the FTC’s ANPR, did not participate in the event.

Next Steps:

In addition to the public forum, the Commission will consider written responses to the ANPR in determining whether to proceed in a new privacy and data security rulemaking; the deadline for public comment is October 21, 2022.

The Commission’s 95-question ANPR covers a broad range of topics, seeking information on the prevalence and harms of particular industry practices (including in advanced algorithms, children’s data, and targeted advertising), potential regulatory interventions (such as data minimization, consent, and transparency), and remedies (such as first-time fining authority and “algorithmic disgorgement”).

Due to its expansive nature, the ANPR has been heralded for attempting to rein in invasive and unfair business practices, while critics have alleged the proposal exceeds the Commission’s statutory authority. The Commission could pursue a range of possible directions in crafting new privacy and security rules for U.S. businesses, and stakeholders will be closely watching for additional indications from the Commission on what will come next.

View a video and transcript of the public forum here.

New Report on Limits of “Consent” in Japan’s Data Protection Law

Introduction

Today, the Future of Privacy Forum (FPF) and Asian Business Law Institute (ABLI), as part of their ongoing joint research project: “From Consent-Centric Data Protection Frameworks to Responsible Data Practices and Privacy Accountability in Asia Pacific,” are publishing the fourteenth and final report in a series of detailed jurisdiction reports on the status of “consent” and alternatives to consent as lawful bases for processing personal data in Asia Pacific (APAC).

This report provides a detailed overview of relevant laws and regulations in Japan, including:

The findings of this report and others in the series will inform a forthcoming comparative review paper which will make detailed recommendations for legal convergence in APAC.

Japan’s Data Protection Landscape

The primary legislation in Japan governing the collection, use, and disclosure of personal information by private entities is the Act on the Protection of Personal Information (APPI), which was enacted in 2003. The APPI applies to any handling of the personal information of data subjects (termed “principals” in the APPI) in Japan by businesses that supply goods and services to persons in Japan, termed “personal information handling business operators” (PIHBOs).

A core principle of the APPI is that personal information may only be processed for a specific purpose (termed the “utilization purpose”), which must be specified as clearly as possible. Before handling personal information, a PIHBO must notify the data subject or the public at large of the utilization purpose for handling the information (unless an exception applies). Once it has done so, a PIHBO may handle personal information in a manner consistent with that purpose without having to obtain the data subject’s consent.

The APPI was substantially amended in 2015, 2020, and 2021. These amendments did not significantly impact the APPI’s notice and consent framework.

Following the 2015 amendments to the APPI, the Personal Information Protection Commission (PPC) has been empowered to enforce the APPI and issue guidelines to aid compliance.

Regarding guidance, the PPC to date has issued comprehensive guidelines (in Japanese) on interpretation of the APPI, as well as more targeted guidance on specific topics in a question-and-answer format. The PPC’s guidance is complemented by other guidelines (in Japanese) on personal data protection in specific sectors (including finance, credit reporting, debt collection, medical care, insurance, and genomics) issued by sectoral regulators.

Regarding enforcement, the PPC is empowered to conduct investigations into PIHBOs’ personal data protection practices and issue non-binding recommendations to cease certain conduct or rectify non-compliance with certain of the APPI’s requirements. If a PIHBO fails to implement the recommendation without a legitimate excuse, or in cases where urgent action is required, the PPC is further empowered to issue a binding order for the PIHBO to take appropriate action. Failure to comply with a binding order from the PPC is a criminal offense punishable with imprisonment or a fine.

Role and Status of Consent as a Basis for Processing Personal Data in Japan

Consent is not required for all handling of personal information under the APPI. As discussed above, a PIHBO may collect and use personal information for a utilization purpose without obtaining the data subject’s consent. However, the PIHBO must still ensure that the handling is lawful and fair and in most cases, notify the data subject of how his/her personal information will be handled.

That said, consent plays a number of secondary roles and may be required for certain activities concerning personal information. By default, a PIHBO must obtain the data subject’s consent before:

Consent also functions as one of several legal bases under the APPI for transferring personal information out of Japan. In this context, consent is only valid if the PIHBO first provides the data subject with certain information, including the jurisdiction to which the personal information will be transferred, details on the personal information protection system of that jurisdiction, and details of any action that the recipient will take to protect the personal information.

Though the APPI provides a number of exceptions to consent requirements, these exceptions are generally only available where provided by another law or regulation, or where there is a need to:

Additionally, the APPI also exempts certain activities, including academic research, journalism, and activities of political or religious organizations, from its requirements, including consent requirements, subject to certain obligations to secure and appropriately handle personal information.

The APPI does not define consent or specify the forms of consent that would be considered valid under the APPI. However, the PPC has issued guidelines which suggest that consent must, at a minimum, be specific and voluntary, and which provide examples of valid measures for obtaining consent in practice.

While express consent would qualify as valid under the APPI, there is ambiguity as to whether implied consent would qualify as valid for this purpose. Guidance from the PPC suggests that opt-in implied consent could be considered valid in appropriate cases but does not provide examples of any such cases.

However, certain sectoral guidelines, including for the medical care and debt collection sectors, do specify a number of situations in which consent can be inferred or would not be strictly required.

Read the previous reports in the series here.

FPF Welcomes Senior Fellows Covering Data Protection in Latin America and Japan

FPF welcomes two new Senior Fellows to the Global team who will provide ad-hoc insight into the state of play of data protection and privacy law developments in their regions: Pablo Palazzi for Latin America, with a focus on Argentina, and Takeshige Sugimoto for Japan.

Pablo Palazzi


Pablo A. Palazzi, who will oversee developments in Argentina and Latin America, is currently a law professor at the University of San Andres in Buenos Aires, Argentina, where he is the Director of the Center for Technology and Society (CETyS).

He is also a partner of Allende & Brea, a law firm in Buenos Aires, where he practices data protection law and internet law. He previously worked as a foreign associate at Morrison & Foerster, LLP in New York. He is admitted to practice law both in Argentina and in New York State.

The challenge for Latin America is finding a proper and adequate model for regulating data privacy, one that considers the region’s particularities. Latin America is a region of 33 countries and 660 million people.

Currently, the region’s first-generation laws are 20 years old, and only a handful of laws, such as those in Brazil and Ecuador, are based on the GDPR. Laws that were adequate 20 years ago are no longer equipped to face the challenges of modern society. There is much room to enhance cooperation between DPAs in the region and to work on harmonizing the legal frameworks.

Palazzi participated actively as an external consultant in the European Commission’s adequacy assessments of Uruguay and Argentina. In Argentina, he was also actively involved in drafting a data protection bill based on the GDPR in 2017-2018. He was a consultant for the “Red Iberoamericana de Protección de Datos,” drafting SCCs for Latin America, and is doing similar work on SCCs under the Council of Europe’s modernized Convention 108.

He was involved in drafting the regulations implementing the national data protection act, the data protection law for the city of Buenos Aires (Law 1,845), and the Computer Crimes Act of 2008. He was a member of the Advisory Committee of the Cybercrime Program of the Ministry of Justice of Argentina, helping to incorporate the Budapest Convention into domestic law.

Palazzi has written several books on data protection matters in Spanish, including: “International transfer of personal data to Latin America” (Ad Hoc, 2003, LL.M thesis with prologue by Prof. Joel Reidenberg), “Credit Reporting Law” (Astrea, 2007), “Computer Crimes” (Abeledo, 2014), and “Delitos contra la intimidad informática” (CDYT, 2019). He also edited a two-volume book with several authors to celebrate the 20th anniversary of the data protection law of Argentina (“Protección de Datos: Doctrina y Jurisprudencia,” CDYT, 2021). In Europe, Palazzi coordinated the book “Challenges of privacy and data protection law – Perspectives of European U.S. law” (Larcier, 2008), edited with Prof. Yves Poullet and María Verónica Pérez Asinari.

He is a member of the editorial board of International Data Privacy Law (Oxford University), a founding member of the Latin American Data Protection Law Review (an annual law review on data protection, 2012-2018), and a member of the International Association of Privacy Professionals (IAPP), where he was the KnowledgeNet chair for the Buenos Aires chapter. Palazzi also collaborated in drafting the Model Data Processing Agreement at IAPP’s Privacy Bar Section. In 2022, he was awarded the Vanguard Award by the IAPP for his work in Latin America. He has been a frequent speaker at the CPDP conferences in Brussels and Latin America, at PLI seminars in New York, the IAPP summit in Washington, DC, and the Privacy Laws & Business conference at Cambridge University.

Palazzi obtained his law degree at the School of Law of Universidad Católica. In May 2000, he received an LL.M. from Fordham Law School, where he also worked as a research assistant for Prof. Joel Reidenberg. He wrote his LL.M. thesis on international transfers of personal data and the adequacy of Latin American countries.

Takeshige Sugimoto


Takeshige (“Take”) Sugimoto, who will oversee developments in Japan, is the Managing Director and Partner of S&K Brussels LPC, a Japanese boutique law firm specializing in data protection, privacy laws, and AI regulations in the US, EU, UK, China, and Japan. He is qualified to practice law in Japan and New York State and is a member of the Brussels Bar Association (B-List). He also serves as the Director of the Japan DPO Association, which he co-founded.


Japan’s Act on the Protection of Personal Information (APPI) is as vigorous as the GDPR in protecting individuals’ rights to personal data. The APPI has established two sets of rules: one for the private sector, which stipulates obligations and penalties for personal information handling business operators, and another for the public sector, which stipulates obligations and penalties for administrative organizations and incorporated administrative agencies.

Starting in April 2023, common rules will also apply to local governments’ personal information protection systems. This will position the APPI as the single comprehensive data protection law applicable to the private and public sectors, including local governments.

It will be interesting to see how Japan can continue to play an important role in discussing the emerging risks surrounding personal data protection, such as data localization and unlimited government access.

Sugimoto’s data protection practice includes establishing and reviewing clients’ global data protection compliance systems, as well as representation and defense in disputes involving global data protection law issues, including negotiations with European, UK, US, Chinese, and Japanese data protection supervisory authorities. As a Japanese lawyer, he regularly advises clients on the APPI, drawing on his parallel, ongoing practical experience with the EU General Data Protection Regulation (GDPR), UK GDPR, US California Consumer Privacy Act (CCPA) / California Privacy Rights Act (CPRA), and China’s Personal Information Protection Law (PIPL).

As a former Brussels resident between 2013 and 2020, he has practiced European data protection laws, including both EU member states’ data protection laws under the EU Data Protection Directive of 1995 and EU GDPR as a member of major law firms’ Brussels offices. He has successfully represented numerous clients over the years in obtaining European data protection supervisory authorities’ approvals of EU Binding Corporate Rules (BCRs) for Controllers and Processors under the EU GDPR, following each of the European Data Protection Board (EDPB)’s opinions on the respective authorities’ draft approval decisions for those BCRs. Furthermore, he represents clients in their applications for approval of UK BCRs under the UK GDPR to the UK Information Commissioner’s Office (ICO). He has also assisted clients in preparing for the UK’s International Data Transfer Agreement, a new data transfer mechanism. 

Since California’s adoption of the CCPA in 2018, followed by the CCPA Regulations issued by the California Attorney General, Sugimoto has advised several major companies on their CCPA compliance projects. He has also assisted clients in updating their CCPA compliance mechanisms in line with the CPRA. In addition, he has been closely following legislative activity on US federal privacy bills, including COPRA (Consumer Online Privacy Rights Act), the SAFE DATA Act, and the ADPPA (American Data Privacy and Protection Act), as well as the US Federal Trade Commission (FTC)’s rulemaking efforts on privacy and data security.

Sugimoto has assisted several clients in complying with China’s data-related laws, including the PIPL, Data Security Law, and Cybersecurity Law. His ongoing work includes helping clients carry out personal information protection impact assessments under the PIPL; preparing PIPL-compliant consent forms, personal information entrustment addendums, data transfer agreements (SCCs), guidance on data protection management systems, internal security rules, privacy policies, data subject rights request manuals, and personal information breach response manuals; and handling large bilingual data mapping projects in collaboration with major Chinese law firms.

Outside of direct dealings with clients, Sugimoto has also been invited as a speaker at various data protection-related events organized by data protection supervisory authorities. In October 2021, he was invited to speak at the “Global Privacy Assembly 2021 Mexico,” where he participated as a panelist in “Panel IV: The Challenge of Compliance: The Perspective of Data Protection Officers.”

Sugimoto received an LL.B. degree from Keio University, Faculty of Law in 2004; an LL.M. degree from the University of Chicago Law School in 2012; and an MJur degree from the University of Oxford, Faculty of Law (Pembroke College) in 2013.

Subscribe to receive the FPF Monthly Briefing and follow FPF on Twitter and LinkedIn to get the latest global data protection updates.

Age-Appropriate Design Code Passes California Legislature

Update: On Sep 15, 2022, California Governor Gavin Newsom signed AB 2273, the California Age-Appropriate Design Code Act. The law will apply to businesses that provide online services, products, or features likely to be accessed by children and broadly requires businesses to implement their strongest privacy settings by default for young users up to the age of 18. AB 2273 will become enforceable on July 1, 2024.

This week, the California legislature passed AB 2273, the California Age-Appropriate Design Code Act (ADCA). The California ADCA is modeled after the UK’s Age Appropriate Design Code, and would apply to businesses that provide “an online service, product, or feature likely to be accessed by a child.” If enacted by Governor Gavin Newsom, the child-centered design law would be the first of its kind in the United States. 

The California ADCA would introduce significant new compliance obligations for US businesses that go beyond the requirements codified in COPPA – the longstanding federal children’s privacy law. Unlike COPPA, which defines “child” as an individual under 13 years old and applies to child-directed services, the California bill defines “child” as an individual under 18 and applies to any online service that is “likely to be accessed by a child.” For covered entities, the bill would require the implementation of new protective measures for young users, such as configuring default privacy settings to the highest level of privacy, and would place new limits on profiling, the processing of geolocation data, and the use of “dark patterns” to influence behavior.

What’s Next?

The California Age-Appropriate Design Code would become enforceable on July 1, 2024 if enacted by Governor Newsom. The bill leaves many important questions unanswered. Covered entities may seek clarity and guidance from the California Children’s Data Protection Working Group, a new entity created by the bill. The working group would be required to submit a report to the legislature by January 1, 2024 regarding recommendations and best practices for compliance. The passage of the California ADCA reflects a growing focus on protecting children’s privacy online, and many expect to see other legislatures follow California’s lead next year.

With contributions from FPF’s Keir Lamont and Bailey Sanchez.

New Report on Limits of “Consent” in Singapore’s Data Protection Law

Introduction

Today, the Future of Privacy Forum (FPF) and Asian Business Law Institute (ABLI), as part of their ongoing joint research project: “From Consent-Centric Data Protection Frameworks to Responsible Data Practices and Privacy Accountability in Asia Pacific,” are publishing the thirteenth in a series of detailed jurisdiction reports on the status of “consent” and alternatives to consent as lawful bases for processing personal data in Asia Pacific (APAC).

This report provides a detailed overview of relevant laws and regulations in Singapore, including:

The findings of this report and others in the series will inform a forthcoming comparative review paper which will make detailed recommendations for legal convergence in APAC.

Singapore’s Data Protection Landscape

Singapore’s Personal Data Protection Act 2012 (PDPA) was passed in November 2012 and significantly revised in 2020. Its stated purpose is to govern the collection, use, and disclosure of personal data by organizations in a manner that recognizes not only individuals’ right to protection of their personal data but also organizations’ need to collect, use, and disclose personal data.

The PDPA sets the baseline standard of protection for personal data in Singapore, though organizations which are subject to sector-specific laws and regulations (including in the financial services and medical sectors) must also comply with sector-specific requirements.

The PDPA also establishes a data protection authority, the Personal Data Protection Commission (PDPC), to advance the PDPA’s stated purpose, balancing protection of personal data with use of personal data for legitimate purposes. To that end, the PDPC implements policies relating to personal data protection and issues advisory documents to help organizations understand and comply with their obligations under the PDPA.

The PDPC is also empowered to enforce the PDPA by, for example, issuing binding directions or requiring payment of a financial penalty. The PDPC is active in enforcement and regularly publishes decisions, which effectively function as a body of case law on personal data protection matters, on its website.

Role and Status of Consent as a Basis for Processing Personal Data in Singapore

Consent has played a key role in the architecture of the PDPA since the PDPA was first enacted in 2012. The PDPA’s default requirement is that an organization may collect, use, or disclose personal data about an individual (i.e., the data subject) only if:

However, this default requirement has always been subject to exceptions.

Between 2012 and 2021, the PDPA provided several long lists of exceptions to the consent requirement for collection, use, and disclosure of personal data in the Second, Third, and Fourth Schedules to the PDPA, respectively.

However, major recent amendments to the PDPA, which were passed in 2020 and took effect in 2021, replaced these various exceptions with a consolidated set of provisions allowing for collection, use, and disclosure of personal data without consent.

These provisions are set out in the First Schedule to the PDPA, under the following headings:

Additionally, the amended Second Schedule to the PDPA also provides further exceptions to consent for public interest and research purposes.

The new First and Second Schedules retain many of the old exceptions to consent. However, a notable introduction was the “legitimate interests” section, which – though borrowing a term from European data protection law – was unique when compared with other major data protection laws internationally. The amended PDPA distinguishes between two categories of legitimate interests:

Another notable introduction is the “business improvement purposes” section. This provision allows an organization to share data with a related organization for the following purposes, subject to fulfillment of certain conditions, including, among others, necessity, reasonableness of purpose, and commitment to implement appropriate safeguards:

The 2020 amendments also expanded the situations in which an individual would be deemed by law to have consented to collection, use, or disclosure of his/her personal data. The amendments added new provisions for deemed consent by contractual necessity and deemed consent by notification. The new provision on deemed consent by notification relies on a similar risk assessment to the legitimate interests provision. Specifically, to rely on this provision, an organization must:

Independently of the requirement to obtain consent, the PDPA also restricts collection, use, and disclosure of personal data to purposes that a reasonable person would consider appropriate in the circumstances. An organization that wishes to collect, use, or disclose personal data must also notify the individual of this purpose, unless an exception applies.

Finally, regulations to the PDPA also establish informed consent as a legal basis for transferring personal data out of Singapore.

By default, the PDPA requires that organizations which seek to transfer personal data out of Singapore must either provide the data with, or apply to the data, a standard of protection that is comparable to that under the PDPA. However, organizations are taken to have satisfied this requirement if they obtain the individual’s consent for the transfer of the individual’s personal data out of Singapore after giving the individual a “reasonable summary in writing of the extent to which the transferred personal data will be protected to a comparable standard.”

Read the previous reports in the series here.

Blog Cover Image by Aditya Chinchure on Unsplash

Looking Back to Forge Ahead: Challenges of Developing an “African Conception” of Privacy

In this post for the FPF Blog, Mercy King’ori explores the cultural and societal underpinnings of “privacy” in Africa, looking throughout history, from pre-colonial times to the modern external influences on the legislative processes that have resulted in general data protection laws across the continent. The first essential point is understanding that Africa is not a monolith: the continent is multicultural, and context differs across communities. Thus, any generalizations in this blog post should be read in this light.

Introduction

Few things depend on context as much as privacy, which hinges strongly on how people within various communities and other social organizations perceive it. While the need for privacy may be universal, the particularities of its social acceptance and articulation differ depending on cultural norms that vary among communities. Whitman succinctly captured the cultural cause of the diverse forms of privacy when he posited that “culture informs greatly the different intuitive sensibilities that cause people to feel that a practice is privacy invasive while others do not feel that way.”1

For example, in delineating the root cause of the differences in privacy expectations between Americans and Europeans, Whitman traces the emergence of the need for privacy among Europeans to the need for dignity, while for Americans, such a need emerged from the desire to express their freedom. By contrast, cultural practices in communities in other parts of the world may take a different view of privacy, such as the Japanese, who historically regarded privacy as a symbol of self-centeredness.2

In Africa, the formal understanding of privacy is still evolving. Given the influence of European and American cultures and institutions on the current world order, their conceptions of privacy have been relied on heavily to characterize the need for privacy in Africa.

This has not been without consequences for the recipient societies. In Africa, where the recognition of privacy did not emerge from the need to achieve dignity and liberty (two values that are elusive in most of Africa’s history), scholars3 who use the European or American concept of the term as an implicit frame of reference have concluded that the need for privacy was largely absent on the continent, especially when contrasted with communal concepts such as Ubuntu of South Africa, which place the community before the individual. However, as privacy discussions continue to grow in prominence in Africa, the question continues to emerge of whether an African conception of privacy can develop that takes into account cultural nuances such as strong kinship bonds.

This blog seeks to explore the cultural underpinnings and evolution of privacy in Africa by examining the historical and modern challenges to fully developing a notion of privacy that takes into account the distinctiveness of the continent’s communities, and by asking whether such a conception can exist.

To do so, it begins with a discussion and critique of the dominant notion that was strongly held regarding the existence of privacy in Africa as well as the societal and historical context under which such a notion may have emerged. This is followed by an account of the events (historical and current) that have influenced the development (or lack thereof) of an African conception of privacy. Such an examination provides two key insights:

To conclude, the blog discusses whether a conception of privacy from an African perspective is even possible at this point, and whether such a conception is needed given the previously hindered attempts to develop one.

Dominant Discourse on Privacy in Africa

For a long time, it was claimed that privacy was not valued in Africa. This dominant position can be attributed to the misperception that the communal and collectivist nature of most traditional African communities in the pre-colonial era meant that there was no privacy. This stance implied that an individual could not order their life without the consent of the community.4 In most traditional African societies, the idea of personhood based on individualism was seen as conflicting with accepted social norms, especially those involving a shared sense of interdependence. For most communities, a person’s identity depended on the community identity. Through this lens, the close kinship in most African societies appears in everything from the most mundane aspects of life to the most complex issues, such as communal land ownership.

This understanding of communal life (and the secondary place of individualism) in Africa has influenced how privacy is understood in Africa. Indeed, when individualism forms the basis of the conceptualization of privacy, it is clear why many who adopt such a framework accept as true that privacy did not meaningfully exist in Africa. Privacy within a communal way of life was not yet imagined and played a negligible role in how early discourse around privacy and data protection evolved, which solidified the dominant narrative about Africa that still shapes some perceptions today.

However, the idea that African societies lacked individualism (as one of the determinants of the need for privacy) and therefore did not meaningfully articulate a concept of privacy needs to be challenged. While the extent of the role individualism played in structuring social relations in these societies remains unclear, there is evidence that it existed in some form in pre-colonial Africa. For example, when communities grew and different interests emerged, individuals began to set themselves apart from the collective. Family members would leave their communal ways of living in pursuit of self-reliance and personal initiatives.5,6

Furthermore, the absence of information on the definitive nature of privacy in this era may also follow from how communities primarily passed down knowledge. In pre-colonial Africa, oral traditions held a major place7 as a means of disseminating information and communal customs. Traditionally, messages were passed down orally from one generation to another, often in the form of proverbs, songs, folktales and other narrations. Such messages helped people make sense of the world and were used to teach children and adults about important aspects of their culture, including privacy. For instance, the Agikuyu, an ethnic community in East Africa, have a proverb that speaks to the need to preserve privacy for matters of the home (“Cia mucii itiumaga ndira” – Home affairs must not go into the open).

However, weaknesses in record-keeping due to the dominance of oral traditions8 may have limited the availability of anecdotal evidence of privacy in traditional African communities and hampered efforts to formalize it into a more nuanced account of its evolution on the continent. Notably, such limitations in written evidence have also affected many other aspects of African society beyond privacy. One effect of this may be that certain cultural values were elevated above others, with the former being translated into laws. This could explain the absence of the right to privacy in the African Charter on Human and Peoples’ Rights, while collective rights form a unique aspect of the Charter.9 Because of this, while African social order manifested features of individuality, communal living became more discernible than individualism to outside observers, which consequently led to perceptions that privacy played little role in pre-colonial Africa. The arrival of colonialism reinforced these views, albeit in a different way, and greatly altered the indigenous development of privacy on the continent. 

Colonialism, Post-Colonialism, Independence, and Privacy

The colonial era was, and remains, an impactful period for most African communities in many ways, including privacy. The events of this period adversely affected any efforts to recognize privacy as a fundamental societal value. Colonialism began with the partition of Africa, which gave rise to the formal geographical boundaries that exist today. This gave new shape, meaning, and direction to the inherent kinship bonds within communities, which began to disintegrate under colonial strategies such as divide and rule.

There was resistance to the colonial practices that set out to tear down the communal structures of the time.10 Because the focus was on protecting the communal way of living, the aspects of individualism discussed above seem to have remained intact within many communities.11 From a privacy perspective, this was a conducive condition. However, the imperialistic circumstances of the time made it unlikely that privacy and its related value of autonomy would be asserted. The power imbalances between indigenous communities and colonial governments created an environment hostile to the development of a shared, rights-respecting notion of privacy.

Colonial administrations committed many gross violations of the dignity of community members. For example, in Kenya, the British introduced the Kipande system, a means of identification, through the Native Registration Amendment Ordinance of 1920.12 It involved an identity document that contained the personal details, fingerprints, area of residence, and employment records of the holder: categories of information that modern privacy and data protection law considers personal and sensitive personal data. The identity document was issued to male Africans who worked on settler farms in order to administer a labor registration system. Holders of a kipande were required to wear it around their necks, clearly displaying their information and their status as farm workers to colonial administrators. 

At the time of its use, this paternalistic system caused an uproar and generated much resentment, both towards the oppressive means it embodied and the larger relationship between the settlers and Africans it represented.13 The holders viewed it as a symbol of humiliation and a loss of self-identity,14 and many political associations of the time denounced it as a form of repression and control.15 Indeed, the Kipande system was a true reflection of the modern understanding of a surveillance system, one that would raise concerns under modern-day privacy principles.16 The fact that its opponents did not articulate their resentment around a conception of privacy reflects a missed opportunity for communities at the time to develop their own expectations of privacy. Intrusive identification systems still pose privacy challenges in many parts of the continent.

Fast forward to the 1960s and 1970s, when most African countries gained independence. Independence created new legal structures, such as Constitutions and Bills of Rights, that were not indigenous to African communities. Prior to 1950, colonial administrations in many colonies, especially those under British rule, did not take Bills of Rights seriously.17 This was primarily due to the official policy of rejecting a rights-based approach to constitutional ordering.18 For example, Ghana’s 1957 independence Constitution did not include a Bill of Rights. Nevertheless, an emerging international consensus on human rights began to create an atmosphere conducive to the adoption of Bills of Rights in the colonies post-independence.19 Thus, when the British government granted independence to these countries, their Constitutions were drafted with Bills of Rights.20 At the time, European states were ratifying conventions such as the European Convention on Human Rights and transposing similar principles to their colonies.21 This introduction of human rights to colonial dependencies brought with it the right to privacy, the first formal reference to privacy in most African countries.

Privacy Interlude: The Fall of Independence Constitutions, the Rise of Authoritarian Governments, and the Revival of Constitutional Arrangements

Soon after independence, the reins of leadership were handed over to the founding fathers of Africa. Under political pressure, these leaders abrogated the independence Constitutions on the grounds that Western forms of government could not flourish in Africa because they were based on alien principles.22 To be sure, the ambitions of those in power and the general geopolitical conditions of the time contributed more to the failure of the Constitutions than any intrinsic flaws in the Constitutions themselves.23 Regardless, parts of the Constitutions, such as those guaranteeing privacy rights, were eliminated. 

Later, in the 1970s and 1980s, when economic crises hit Africa, Structural Adjustment Programs (SAPs) and stabilization policies were introduced for the purpose of economic development. SAPs involved the transfer of funds to African economies, tied to the fulfillment of certain conditions.24 One of these conditions concerned the reinstatement of Bills of Rights, which in turn saw the reintroduction of the right to privacy in many Constitutions.25

An African Conception of Privacy in the Face of Globalization?

The period that followed has been crucial for privacy and data protection in Africa. As African societies became more active participants in the globalized world order, legal efforts to shape perceptions of the need for privacy have increased in frequency and importance, as seen in the growing number of privacy and data protection laws. Privacy in Africa is no longer viewed solely through the lens of individualism (which is seen as gaining prominence over collective living). 

There are two main motivating factors for the expanded need for privacy. First, privacy is crucial to protect people from human rights violations resulting from technological advances. Second, privacy has emerged as a key requirement for Africa’s participation in the global digital economy. This desire to participate in global trade facilitated by information technology has influenced many countries to adopt a regulatory system that reduces legal hurdles and uncertainty.26 To accomplish this, many African states have drawn inspiration from privacy frameworks that have come to represent internationally accepted best practices, such as the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, the now-repealed EU Data Protection Directive of 1995 and its successor, the EU General Data Protection Regulation (GDPR), and the Council of Europe’s Convention 108 and 108+. Bearing in mind the difference between privacy and data protection, information privacy in many countries is now protected under comprehensive data protection laws, while a handful of countries have drawn the distinction between privacy and data protection directly in their laws. This process has been aided by development partners such as the European Commission under the HIPSSA-ITU project, which sought to harmonize information and communication laws in Sub-Saharan Africa and saw at least two regional data protection frameworks modeled on the EU Directive of 1995.

This process of transplanting the language and substance of foreign privacy and data protection laws has dealt a serious blow to attempts to develop an African conception of privacy, quite apart from the challenges of implementing such a transplanted structure. On the one hand, the source of these implementation challenges is not clear. Could it be the introduction of laws that do not reflect our knowledge of, and commitment to, the underlying values, or a mere lack of political will? Arguably, it may be too early to tell, as most of these laws are still nascent. On the other hand, transplantation can be defended in light of changing perceptions of privacy, as more Africans become aware of the need for privacy protection as a means to protect their dignity and defend their freedoms, especially from the excesses of governments. It is on this basis that many criticize existing privacy and data protection laws as containing illusory and ineffective safeguards.27 

Nonetheless, the infusion of indigenous aspects into privacy and data protection laws indicates that perhaps all is not lost. The Malabo Convention, the continental treaty on data protection and cybersecurity, contains provisions that mandate the recognition of communal rights in the creation of data protection laws.28 Similarly, provisions for class action suits (pointing to recognition of privacy violations affecting groups of people) that permit communities to assert their right to privacy can be found in the SADC Model Law of Southern Africa29 as well as the ECCAS Model Law/CEMAC Directive of Central Africa.30 Such efforts elucidate the kinds of African cultural aspects that should be considered when implementing a privacy and data protection framework. However, this has not cascaded down into many national frameworks, which mostly rely on legal instruments adapted from Europe. 

Given the evolution of privacy in Africa, whether Africa will ever develop its own foundation of privacy, or whether an African conception of privacy is even necessary at this point, remains an open question. Can the external and internal influences that Africa has experienced help define the socio-political foundations of privacy, as in Europe and the U.S., whose values of privacy were founded on ideals with histories reaching back to the revolutionary era of the 18th century? Or has Africa leapfrogged into a conception of privacy that actually suits the stringent privacy requirements of the time? The jury is still out.


1 James Q. Whitman, The Two Western Cultures of Privacy: Dignity Versus Liberty
https://www.yalelawjournal.org/article/the-two-western-cultures-of-privacy-dignity-versus-liberty

2 Hiroshi Miyashita, The evolving concept of data privacy in Japanese law https://academic.oup.com/idpl/article/1/4/229/731520

3 Hanno Olinger, Western Privacy and/or Ubuntu? Some Critical Comments on the Influences in the Forthcoming Data Privacy Bill in South Africa, 2016 https://www.tandfonline.com/doi/abs/10.1080/10572317.2007.10762729 “Ubuntu can be described as a community-based mindset in which the welfare of the group is greater than the welfare of a single individual in the group.”

4 Alex B. Makulilo, African Data Privacy Laws https://link.springer.com/book/10.1007/978-3-319-47317-8

5 Ibrahim Anoba, A Libertarian Thought on Individualism and African Morality, https://www.africanliberty.org/2017/05/21/a-libertarian-thought-on-individualism-and-african-morality-by-ibrahim-anoba/

6 Adeshina Afolayan, Individualism, Communitarianism and African Philosophy: A Review Essay on Exploring the Ethics of Individualism and Communitarianism,   https://news.clas.ufl.edu/individualism-communitarianism-and-african-philosophy-a-review-essay-on-exploring-the-ethics-of-individualism-and-communitarianism/

7 Traces of written history can be found in African countries such as the great Timbuktu manuscripts of Mali https://artsandculture.google.com/experiment/the-timbuktu-manuscripts/BQE6pL2U3Qsu2A?hl=en

8 Acquinatta N. Zimu-Biyela, Taking Stock of Oral History Archives in a Village in KwaZulu-Natal Province, South Africa: Are Preservation and Publishing Feasible? http://www.scielo.org.za/scielo.php?script=sci_arttext&pid=S0259-94222022000300013&lng=en&nrm=iso

9 Article 17(2), African Charter on Human and Peoples’ Rights (1981). 

10 Barbara Potthast-Jutkeit, The history of family and colonialism: Examples from Africa, Latin America, and the Caribbean https://www.tandfonline.com/doi/abs/10.1016/S1081-602X%2897%2990001-4

11 Walter D. Mignolo, How Colonialism Preempted Modernity https://www.tandfonline.com/doi/abs/10.1080/03086534.2011.598039?journalCode=fich20

12 Jaap van der Straaten, Hundred Years of Servitude. From Kipande to Huduma Namba in Kenya  https://www.readcube.com/articles/10.2139%2Fssrn.3543457

13 Kipande Registration System (Kenya) https://api.parliament.uk/historic-hansard/commons/1946/jul/31/kipande-registration-system-kenya

14 Amos J. Beyan, The Development of Kikuyu Politics During the Depression, 1930-1939 https://www.jstor.org/stable/45193123

15 Idem

16 Howard Stein, Structural Adjustment and the African Crisis: A Theoretical Appraisal, https://www.jstor.org/stable/40325948

17 Charles O. H. Parkinson, Bills of Rights and Decolonization: The Emergence of Domestic Human Rights Instruments in Britain’s Overseas Territories,  https://academic.oup.com/icon/article/7/2/355/758671

18 Idem

19 Idem

20 Idem

21 Idem

22 Victor T. Levine, The Fall and Rise of Constitutionalism in West Africa https://www.jstor.org/stable/161678

23 Idem

24 Howard Stein and Machiko Nissanke, Structural Adjustment and the African Crisis: A Theoretical Appraisal https://www.jstor.org/stable/40325948

25 Nicola Gennaioli and Ilia Rainer, The Modern Impact of Precolonial Centralization in Africa https://www.jstor.org/stable/40216120

26 See the HIPSSA-ITU project for the Harmonization of ICT Policies in Sub-Saharan Africa

27 Ogheneruemu Oneyibo, A Zimbabwean Data Protection Act to rule them all. Or not https://techpoint.africa/2021/12/17/zimbabwe-data-protection-act/

28 Article 8(2), Malabo Convention

29 Article 40, SADC Model law

30 Article 38 ECCAS Model Law/ CEMAC Directive

FPF Addresses ‘Opt-Out Preference Signals’ in Comments on California Draft Privacy Regulations

Yesterday, the Future of Privacy Forum (FPF) filed comments with the California Privacy Protection Agency regarding the Agency’s initial set of draft regulations to implement the California Privacy Rights Act amendments to the California Consumer Privacy Act.

FPF’s comments are directed towards ensuring that both individuals and businesses have clarity for the implementation and exercise of consumer rights through an emerging class of privacy tools known as ‘opt-out preference signals.’

Specifically, FPF recommended that the Agency’s final regulations governing preference signals and the mechanisms that transmit signals (such as web browsers and plug-ins) include the following clarifications:

  1. Resolve questions for the exercise of opt-out signals directed to websites while encouraging innovation in privacy controls for emerging digital and physical contexts.
  2. Clarify business disclosures in response to signals to ensure that individuals have access to relevant information about the exercise of their privacy rights.
  3. Encourage the development of signal mechanisms that allow consumers to exercise granular control of their privacy rights with respect to specific businesses.
  4. Ensure that the use of preference signals objectively represents an individual’s intent to invoke their privacy rights.
  5. Establish a multistakeholder process for ongoing Agency approval and review of preference signals and signal mechanisms.