Artificial Intelligence (AI) has become an integral part of our lives, transforming how we interact, work, and make decisions. From virtual assistants and recommendation systems to autonomous vehicles and medical diagnostics, AI technologies have made remarkable progress. However, as AI continues to advance, it is essential to understand how it works and the ethical considerations that accompany it.
This updated report places generative AI within the larger AI landscape to address foundational questions about how the technology operates and is developed, including generative AI’s use of personal information, the ability of individuals to meaningfully exercise access, correction, or deletion rights, and the means available to minimize inaccurate information and hallucinations in outputs.
As generative AI becomes mainstream through tools such as OpenAI’s ChatGPT and Google’s Bard, it introduces new and transformational use cases for AI in everyday life, including the workplace. However, there are also risks and ethical considerations to manage throughout the lifecycle of these systems. A better understanding of the kinds of AI systems and how they relate to one another is essential for organizations, policymakers, and the general public. The re-release of The Spectrum of Artificial Intelligence – Companion to the FPF AI Infographic strives to provide this.
The First Japan Privacy Symposium: G7 DPAs discussed their approach to rein in AI, and other regulatory priorities
The Future of Privacy Forum and S&K Brussels LPC hosted the first Japan Privacy Symposium in Tokyo, on June 22, 2023, following the G7 Data Protection and Privacy Commissioners roundtable. The Symposium brought global thought leadership on the interaction of data protection and privacy law with AI, as well as insights into the current regulatory priorities of the G7 Data Protection Authorities (DPAs) to an audience of more than 250 in-house privacy leaders, lawyers, consultants and journalists from Japan and the region.
The program started with a keynote address from Commissioner Shuhei Ohshima (Japan’s Personal Information Protection Commission), who shared details about the results of the G7 DPAs Roundtable from the day before. Two panels followed, featuring Rebecca Kelly Slaughter (Commissioner, U.S. Federal Trade Commission), Wojciech Wiewiórowski (European Data Protection Supervisor, EU), Philippe Dufresne (Federal Privacy Commissioner, Canada), Ginevra Cerrina Feroni (Vice President of the Garante, Italy), John Edwards (Information Commissioner, UK), and Bertrand du Marais (Commissioner, CNIL, France). Jules Polonetsky, FPF CEO, and Takeshige Sugimoto, Managing Partner at S&K Brussels LPC and FPF Senior Fellow, hosted the Symposium.
The G7 DPA Agenda is built on three pillars: Data Free Flow with Trust, emerging technologies, and enforcement cooperation
The DPAs of the G7 nations started to meet annually in 2020, following an initiative of the UK’s Information Commissioner’s Office during the UK’s G7 Presidency that year. This is a new venue for international cooperation among DPAs, limited to Commissioners from Canada, France, Germany, Italy, Japan, the United Kingdom, the United States, and the European Union. Throughout the year, the DPAs maintain a permanent channel of communication and implement a work plan adopted during their annual Roundtable.
In his keynote at the Japan Privacy Symposium, Commissioner Shuhei Ohshima laid out the results of this year’s Roundtable, held in Tokyo on June 20 and 21. The Commissioner highlighted three pillars guiding the group’s cooperation this year: (I) Data Free Flow with Trust (DFFT), (II) emerging technologies, and (III) enforcement cooperation.
The G7 Commissioners’ Communique expressed overall support for the DFFT political initiative, welcoming the reference to DPAs as stakeholders in the future Institutional Arrangement for Partnership (IAP), a new structure the G7 Digital Ministers announced earlier in April to operationalize the DFFT. However, in the Communique, the G7 DPAs emphasized that they “must have a key role in contributing on topics that are within their competence in this Arrangement.” It is noteworthy that, among their competencies, most G7 DPAs have the authority to order the cessation of data transfers across borders if legal requirements are not met (see, for instance, this case from the CNIL – the French DPA, this case from the European Data Protection Supervisor, or this case from the Italian Garante).
Currently, the IAP seems to reserve a key role for governments themselves, in addition to stakeholders and “the broader multidisciplinary community of data governance experts from different backgrounds,” according to Annex I of the Ministerial Declaration announcing the Partnership. The DPAs are singled out only as an example of such experts.
In the Action Plan adopted in Tokyo, the G7 DPAs included clues as to how they see the operationalization of DFFT playing out: through interoperability and convergence of existing transfer tools. As such, they endeavor to “share knowledge on tools for secure and trustworthy transfers, notably through the comparison of Global Cross-Border Privacy Rules (CBPR) and EU certification requirements, and through the comparison of existing model contractual clauses.” (In an analysis touching broadly beyond the G7 jurisdictions, the Future of Privacy Forum published a report earlier this year emphasizing many commonalities, but also some divergence, among three sets of model contractual clauses proposed by the EU, the Iberoamerican Network of DPAs, and ASEAN).
Arguably, though, DFFT was not the main point on the G7 DPAs’ agenda: they adopted a separate, detailed Statement on generative AI. In his keynote, Commissioner Shuhei Ohshima remarked that “generative AI adoption has increased significantly.” In order to promote trustworthy deployment and use of the new technology, “the importance of DPAs is increasing also on a daily basis,” the Commissioner added.
Generative AI is not being deployed in a legislative void, and data protection law is the immediately applicable legal framework
Top of mind for G7 data protection and privacy regulators is AI, and generative AI in particular. “AI is not a law-free zone,” said FTC Commissioner Slaughter during her panel at the Symposium, being very clear that “existing laws on the books in the US and other jurisdictions apply to AI, just like they apply to adtech, [and] social media.” This is apparent across the G7 jurisdictions: in March, the Italian DPA issued an order against OpenAI to stop processing personal data of users in Italy following concerns that ChatGPT breached the General Data Protection Regulation (GDPR); in May, the Canadian Federal Privacy Commissioner opened an investigation into ChatGPT jointly with provincial privacy authorities; and, in June, Japan’s PIPC issued an administrative letter warning OpenAI that it needs to comply with requirements from the Act on the Protection of Personal Information, particularly regarding the processing of sensitive data.
At the Japan Privacy Symposium, Ginevra Cerrina Feroni, VP of the Garante, shared the key concerns guiding the agency’s enforcement action against OpenAI, which was the first such action in the world. She highlighted several risks, including a lack of transparency about how OpenAI collects and processes personal data to deliver the ChatGPT service; uncertainty regarding a lawful ground for processing personal data, as required by the GDPR; a lack of avenues to comply with the rights of data subjects, such as access, erasure, and correction; and, finally, the potential exposure of minors to inappropriate content, due to inadequate age gating.
After engaging in a constructive dialogue with OpenAI, the Garante suspended the order, seeing improvements in previously flagged aspects. “OpenAI published a privacy notice to users worldwide to inform them how personal data is used in algorithmic training, and emphasized the right to object to such processing,” the Garante Vice President explained. She continued, noting that OpenAI “provided users with the right to reject their personal data being used for training the algorithms while using the service, in a dedicated way that is more easily accessible. They also enabled the ability of users to request deletion of inaccurate information, because – and this is important – they say they are technically unable to correct errors.” However, Vice President Cerrina Feroni mentioned that the investigation is ongoing and that the European Data Protection Board is currently coordinating actions among EU DPAs on this matter.
The EDPS added that purpose limitation is among his chief concerns with services like ChatGPT, and generative AI more broadly. “Generative AI is meant to advance communication with human beings, but it does not provide fact-finding or fact-checking. We should not expect this as a top feature of Large Language Models. These programs are not an encyclopedia; they are just meant to be fluent, hence the rise of possibilities for them to hallucinate,” Supervisor Wiewiórowski said.
Canadian Privacy Commissioner Philippe Dufresne emphasized that how we relate to generative AI from a privacy regulatory perspective “is an international issue.” Commissioner Dufresne also added, “a point worth repeating is that privacy must be treated as a fundamental right.” This is important, as “when we talk about privacy as a fundamental right, we point out how privacy is essential to other fundamental human rights within a democracy, like freedom of expression and all other rights. If we look at privacy like that, we must see that by protecting privacy, we are protecting all these other rights. Insofar as AI touches on these, I do see privacy being at the core of all of it,” Commissioner Dufresne concluded.
The G7 DPAs’ Statement on Generative AI outlines their key concerns, such as lack of legal authority to process personal data at all stages
In the aforementioned Generative AI Statement, the G7 data protection regulators laid out their main concerns in relation to how personal data is processed through this emerging type of computer program and service. First and foremost, the commissioners are concerned that processing of personal data lacks legal authority during all three relevant stages of developing and deploying generative AI systems: for the data sets used to train, validate and test generative AI models; for processing personal data resulting from the interactions of individuals with generative AI tools during their use; and, for the content that is generated by generative AI tools.
The commissioners also highlighted the need for security safeguards to protect against threats and attacks that seek to invert generative AI models, and that would technically prevent extractions or reproductions of personal data originally processed in datasets used to train the models. They also advocated for mitigation and monitoring measures to ensure personal data created by generative AI is accurate, complete, and up-to-date, as well as free from discriminatory, unlawful, or otherwise unjustifiable effects.
It is clear that data protection and privacy commissioners are proactive about ensuring generative AI systems are compatible with privacy and data protection laws. Only two weeks after their roundtable in Tokyo, it was reported that the US FTC initiated an investigation against OpenAI. And this proactive approach is intentional. As the UK’s Information Commissioner, John Edwards, made clear, the commissioners are “keen to ensure” that they “do not miss this essential moment in the development of this new technology in a way that [they] missed the moment of building the business models underpinning social media and online advertising.” “We are here and watching,” he said.
Regardless of whether new AI-focused laws are adopted, DPAs will remain central to AI governance
The Commissioners also discussed the wave of legislative initiatives targeting AI in their jurisdictions. AI systems are not built and deployed in a legislative void: data protection law is largely and immediately relevant, as are consumer protection law, product liability rules, and intellectual property law. In this environment, what is the added value of specific, targeted legislation addressing AI?
Addressing the EU AI Act proposal, European Data Protection Supervisor Wiewiórowski noted that the EU did not initiate the legislation because the legislator thought there was a vacuum. “We saw that there were topics to be addressed more specifically for AI systems. There was a question whether we approach it as a product, service, or some kind of new phenomenon as far as legislation is concerned,” he added. As for the role of the DPAs once the AI Act is adopted, he brought up the fact that in the EU, data protection is a fundamental right, which means that all legislation or policy solutions governing the processing of personal data in one way or another must be viewed through this lens. As supervisory authorities tasked with guaranteeing this fundamental right, DPAs will continue to play a role.
The framework ensuring the enforcement of the AI Act is still under debate, as EU Member States are tasked with designating competent national authorities, and the European Parliament hopes to create a supranational collaborative body to play a role in enforcement. However, one thing is certain: in the proposal, the EDPS has been designated the competent authority to ensure that EU agencies and bodies comply with the EU AI Act.
The CNIL seems to be eyeing designation as an EU AI Act enforcer as well. Commissioner du Marais pointed out that “since 1978, the French Act on IT and Freedom has banned automated decisions. We have a fairly long and established body of case law.” Earlier this year, the CNIL created a dedicated department, staffed with data and computer scientists among others, to monitor how AI systems comply with legal obligations stemming from data protection law. “To be frank, we don’t know yet what will come out of the legislative process, but we have started to prepare ourselves. We have also been designated by domestic law as supervisory and certification authority for AI during the 2024 Olympic Games.”
The Garante has a long track record of enforcing data protection law on algorithmic systems and decision-making that impacted the rights of individuals. “The role of the Garante in safeguarding digital rights has always been prominent, even when the issue was not yet widely recognized by the public,” said Vice President Cerrina Feroni. Indeed, as shown by extensive research published last year by the Future of Privacy Forum, European DPAs have long been enforcing data protection law in cases where automated decision-making was central. The Garante led impactful investigations against several gig economy apps and their algorithms’ impacts on people.
Canada is also in the midst of legislating on AI, having introduced a bill last year that is currently under debate. “There is similarity with the European proposal, but [the Canadian bill] focuses more on high impact AI systems and on preventing harms and biased outputs and decision-making. It provides significant financial fines,” Commissioner Dufresne explained. Under the bill, enforcement is currently assigned to the relevant ministry in the Canadian government. The Privacy Commissioner explained that the regulatory activity would be coordinated with his office, as well as with the competition, media, and human rights regulators in Canada. Contributing recommendations during the legislative process, Commissioner Dufresne noted that he suggested “privacy to be a key principle.” In light of his vision that privacy as a fundamental right is essential for the realization of other fundamental rights, the Commissioner had a clear message that “the DPAs need to be front and center” of the future of AI governance.
UK Commissioner Edwards echoed the value of entrenched collaboration among digital regulators, adding that the UK already has an official “Digital Regulators Cooperation Forum,” established with its own staff. The entity “is important to provide a coherent regulatory framework,” he said.
Children’s privacy is a top priority across borders, with new regulatory approaches showing promising results
One of the key concerns that the G7 DPAs have in relation to generative AI is how the new services are dealing with children’s privacy. In fact, the regulators have made it one of their top priorities to broadly pursue the protection of children’s privacy when regulating social media services, targeted advertising, or online gaming, among others.
Building on a series of recent high-profile cases brought by the FTC in this space, Commissioner Slaughter couldn’t have been clearer: “Kids are a huge priority issue for the FTC.” She reminded the audience that COPPA (Children’s Online Privacy Protection Act) has been around for more than two decades, and it is one of the strongest federal privacy laws in the US: “The FTC is committed to enforcing it aggressively.” Commissioner Slaughter explained that the FTC’s actions, such as their recent case against Epic Games, include considerations related to teenagers as well, even if they are not technically covered by COPPA protections, but are covered by the “unfair practices” doctrine of the FTC.
UK Commissioner John Edwards gave a detailed account of the impact of the UK’s Age Appropriate Design Code, launched by his office in 2020, on the design of online services provided to children. “We have seen genuine changes, including privacy settings being automatically set to very high for children. We have seen children and parents and carers being given more control over privacy settings. And we have seen that children are no longer nudged to lower privacy settings, with clearer tools and steps in place for them to exercise their data protection rights. We have also seen ads blocked for children,” Commissioner Edwards said, pointing out that these are significant improvements to the online experience of children. These results have been obtained primarily through a collaborative approach with service providers, who have implemented changes after their services were subject to audits conducted by the regulator.
Children’s and teenagers’ privacy is also top of mind for the CNIL. Among a series of guidance, recommendations, and actions, the French regulator is adding another layer to its approach – digital education. “We have made education a strategic priority. We have a partnership with the Ministry of Education and we have a platform available to certify digital skills for children, as well as resources for kids and parents,” Commissioner du Marais said. Regarding regulatory priorities, he emphasized attention to age verification tools. Among the principles the French regulator favors for age verification are no direct collection of identity documents, no age estimates based on web browsing history, and no processing of biometric data to recognize an individual. The CNIL has asked websites not to carry out age verification themselves, and to instead rely on third-party solutions.
The discussions of the G7 DPA Commissioners who participated in the first edition of the Japan Privacy Symposium laid out a vibrant and complex regulatory landscape, centered on the new challenges AI technology poses to societal values and the rights of individuals, but also marking advances on perennial topics like cross-border data transfers and children’s privacy. More meaningful and deeper enforcement cooperation is to be expected among the G7 Commissioners, whose Action Plan affirmed their commitment to move towards constant exchanges related to enforcement actions and to revitalize existing global enforcement cooperation networks, like the Global Privacy Enforcement Network (GPEN). Next year, the G7 DPA Commissioners will meet in Rome.
Editor: Alexander Thompson
A New Domicile for Comprehensive Privacy in Delaware
On June 30, 2023, in the final hours of the Delaware legislative session, lawmakers in Dover passed House Bill 154, the Delaware Personal Data Privacy Act (“DPDPA”). If enacted by Governor Carney, the DPDPA will take effect on January 1, 2025. The law follows the general model established by the Connecticut Data Privacy Act (CTDPA), with some notable differences, and Delaware will become the twelfth U.S. state to adopt a comprehensive data privacy law governing the collection, use, and transfer of personal data.
1. Broad Scope
The DPDPA establishes the lowest primary coverage threshold of any state comprehensive privacy law passed so far, applying to organizations that control or process the data of at least 35,000 Delaware residents annually. Typically, state-level comprehensive privacy laws cover organizations that control or process the data of at least 100,000 state residents each year. The DPDPA’s scope was likely tailored to fit Delaware’s small size and population: by land area, Delaware is smaller than every U.S. state save Rhode Island, and it has one of the lowest populations in the country, estimated by U.S. Census data at 1.018 million in 2022.
The Act exempts specific data that is subject to existing laws, including data covered by the Health Insurance Portability and Accountability Act (HIPAA) and the Fair Credit Reporting Act (FCRA), while broadly carving out entities covered by the Gramm-Leach-Bliley Act (GLBA). However, the DPDPA diverges from most other state-level comprehensive privacy laws by not broadly exempting nonprofits or higher education institutions.
2. Timely Sensitive Data Categories
The DPDPA establishes a category of “sensitive” personal information that is subject to greater protections, which includes categories such as “[d]ata revealing racial or ethnic origin,” “religious beliefs,” and “[p]recise geolocation data.” However, the DPDPA expands this list beyond that seen in many other states, including “status as transgender or nonbinary,” which is also recognized as a sensitive information category in Oregon’s recently-passed comprehensive privacy law, and “mental or physical health condition or diagnosis (including pregnancy).”
Although all currently enacted comprehensive privacy laws recognize some version of “mental or physical health condition or diagnosis” as sensitive, the DPDPA is the first state-level comprehensive privacy law to explicitly include pregnancy as a category of sensitive data. The recently-passed Connecticut Senate Bill 3 (SB 3), which partially updates the Connecticut Data Privacy Act (CTDPA), also specifically classifies data related to pregnancy and reproductive health as sensitive. Both SB 3 and the DPDPA likely reflect lawmaker focus on the privacy of reproductive health and pregnancy data in the wake of the Supreme Court’s overturning of Roe v. Wade.
3. Protections for Teens
The DPDPA forbids covered entities from selling, or processing for targeted advertising purposes, the data of consumers that the controller knows, or willfully disregards, are between the ages of 13 and 17, absent consent. This prohibition goes further than similarly structured prohibitions in California, Connecticut, and Montana, which restrict the sale and processing of the data of consumers between the ages of 13 and 15. The DPDPA’s broader coverage of teens’ data reflects the ongoing attention to youth privacy that has permeated state legislatures this session. While this is the first time a state-level comprehensive privacy law has structured its protections to cover teens up to the age of 17 (although CT SB 3 creates similar protections for 13- to 17-year-olds), child-directed privacy and online safety laws, including the California Age-Appropriate Design Code and Utah Senate Bill 152, have increasingly applied to the data and activity of teenagers up to age 17.
4. Expanded Rights to Access and Delete
In line with other comprehensive privacy laws, the DPDPA grants consumers the right to require controllers to delete their personal data. Unlike comparable laws, however, the DPDPA requires controllers to delete data obtained about a person from a third-party source (such as a data broker) except for “a record of the deletion request and the minimum data necessary for the purpose of ensuring the consumer’s personal data remains deleted from the controller’s records,” which they may not use for any other purpose. In contrast, other state privacy laws typically permit controllers to retain data obtained about a person from third-party sources so long as they opt that person out of the processing of their personal data for all non-exempt purposes. The DPDPA also creates a unique affirmative right to “obtain a list of the categories of third parties to whom the controller has disclosed the consumer’s personal data.”
5. Unique Treatment of Nonprofits
Delaware joins Colorado and Oregon in not generally carving out nonprofit organizations in its scope. Like Oregon, however, the Delaware law carves out nonprofits that combat insurance fraud. The DPDPA also creates a novel data-level exemption for the “[p]ersonal data of a victim of or witness to child abuse, domestic violence, human trafficking, sexual assault, violent felony, or stalking that is collected, processed, or maintained by a nonprofit organization that provides services to victims of or witnesses to child abuse, domestic violence, human trafficking, sexual assault, violent felony, or stalking.”
6. UOOM Uncertainty
The DPDPA would be the seventh comprehensive state privacy law to permit consumers to exercise certain rights on a default basis through what is commonly known as a “Universal Opt-Out Mechanism” (UOOM), joining California, Colorado, Connecticut, Montana, Texas, and Oregon. The UOOMs currently in use often take the form of a browser extension that sends an automatic signal to every web page a consumer visits with the extension enabled, notifying the site that the consumer wishes to exercise a certain right.
The DPDPA establishes that consumers have the right to opt out of the processing of their personal information for: targeted advertising, data sales, and profiling for the purposes of automated decision-making with significant impact on the consumer. Drafting ambiguities make it unclear whether the DPDPA permits opting-out of profiling via device signals, which would be a first for a state comprehensive privacy law. The DPDPA does not allow for rulemaking.
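Because the DPDPA itself does not name a particular mechanism, the following is a minimal, illustrative sketch only: it assumes the Global Privacy Control (GPC) signal, one widely deployed UOOM, which participating browsers and extensions transmit as a “Sec-GPC: 1” HTTP request header. The handler logic and the response header are hypothetical.

```typescript
// Sketch: honoring a universal opt-out signal server-side, assuming GPC.
import { createServer, IncomingMessage, ServerResponse } from "http";

function requestsOptOut(req: IncomingMessage): boolean {
  // GPC-enabled browsers send "Sec-GPC: 1" with each request;
  // Node lowercases incoming header names.
  return req.headers["sec-gpc"] === "1";
}

const server = createServer((req: IncomingMessage, res: ServerResponse) => {
  if (requestsOptOut(req)) {
    // Treat the signal as an opt-out of data sales and targeted advertising
    // for this visitor; this flag is purely illustrative.
    res.setHeader("X-Opt-Out-Applied", "sale,targeted-advertising");
  }
  res.end("ok");
});

server.listen(8080);
```

Whether a device signal of this kind could also convey a profiling opt-out under the DPDPA is exactly the drafting ambiguity noted above.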
FPF Paper, “The Thin Red Line …,” Receives the Council of Europe’s 2023 Stefano Rodotà Award
On Friday, June 16th, members of the FPF team joined the 44th Plenary meeting of the Council of Europe’s Committee of Convention 108 in Strasbourg, France to accept a tremendous research honor. On this occasion, Katerina Demetzou, Senior Counsel for Global Privacy, Dr. Gabriela Zanfir-Fortuna, VP for Global Privacy, and Sebastião Barros Vale, former Senior Counsel for Europe at FPF, received the 2023 Stefano Rodotà Data Protection Award in the category of ‘best academic article’ for their paper, “The Thin Red Line: Refocusing Data Protection Law on Automated Decision-Making, A Global Perspective with Lessons from Case-Law.” Demetzou and Barros Vale were present in Strasbourg during the Plenary meeting to present the paper and accept the award.
The Council of Europe (CoE), founded in 1949, is an international organization with 46 Member States and 6 Observer States. All Council of Europe Member States have signed up to the European Convention of Human Rights (ECHR), a treaty designed to protect human rights, democracy, and the rule of law. The European Court of Human Rights (ECtHR) oversees the implementation of the ECHR in the Member States.
Demetzou, Barros Vale, and Dr. Zanfir-Fortuna are honored to receive recognition at the birthplace of the CoE’s historic 1981 treaty, the Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data. Convention 108 was the first legally binding international instrument in data protection, and the CoE adopted the modernized Convention 108+ in 2018. A year later, the Convention 108+ Committee established the Stefano Rodotà Award to honor the memory and legacy of Stefano Rodotà (1933-2017), a leading Italian professor and politician, and one of the founding fathers of data protection law in Europe. The award recognizes precedent-setting and innovative research in the field of data protection. The dedicated academic, intellectual, and political influence of President Rodotà lives on through Demetzou, Barros Vale, and Dr. Zanfir-Fortuna’s global exploration of data protection instruments safeguarding individuals against harms from ADM in emerging technologies.
“The Thin Red Line” analyzes the legal protection that data protection law provides to individuals who are subjected to automated decision-making (ADM) on the basis of their personal data. To that end, the authors dedicate the first part of their research to an analysis of European Data Protection Authorities’ (DPAs) enforcement of the GDPR in ADM cases. The second section looks to Brazil, Mexico, Argentina, Colombia, China, and South Africa to explore protections against harmful ADM found in non-EU general data protection laws. The article concludes that even in cases where a processing operation does not meet the high threshold set by Article 22 GDPR (‘solely by automated means’), DPAs have nonetheless made use of an array of legal principles, rights, and obligations to protect individuals against ADM practices. With the exception of Colombia, all of the non-EU jurisdictions studied have a specific ADM provision. In all cases studied, the general data protection laws provide a broad material scope, such that any automated processing operation, solely automated or not, is regulated according to relevant provisions. Additionally, all laws studied include strong transparency and fairness requirements.
While the debate on a European Regulation for AI is ongoing, this paper aims to contribute to the discussion by highlighting that in cases where algorithms and AI systems process personal data, the GDPR is enforceable and protects individuals. Despite extensive legal scholarship on Article 22 GDPR, FPF’s experts identified a gap in previous literature through their global examination of existing enforcements and interpretations from regulators.
After the award ceremony, Demetzou was especially grateful for “the Committee’s warmth, as well as their committed understanding and appreciation for our research.” Dr. Zanfir-Fortuna underscored the importance of the article’s findings while reflecting on emerging AI regulatory trends: “Data protection law has proved to be one of the most relevant existing legal frameworks to deal with the risks posed by the mass deployment of new AI tools. Existing legal obligations related to processing of personal data, on all continents, are stringent and more pressing than possible future AI legislation, as they are immediately applicable to existing AI systems.” The authors hope this intervention, as well as the paper’s global scan, will support researchers and policymakers in understanding how existing data protection law protects against potential harms from algorithms and AI systems.
Melis Ulusel is a current student at the George Washington University Law School and an FPF Policy Intern.
A significant new chapter for data privacy protections in the United States will commence on July 1, 2023, as broad-based consumer privacy laws in Colorado and Connecticut take effect. At the same time, the California Privacy Rights Act amendments to the California Consumer Privacy Act of 2018 will become enforceable.
Over the next three years, an additional 11 recently enacted significant state privacy laws are scheduled to take effect. To assist stakeholders in tracking the emerging state privacy landscape, the Future of Privacy Forum has prepared a chart listing the primary and secondary effective dates for these statutes.
FPF Files Comments with the U.S. Department of Health and Human Services (HHS) Office for Civil Rights
On June 15, the Future of Privacy Forum (FPF) filed comments with the U.S. Department of Health and Human Services (HHS) Office for Civil Rights (OCR) regarding the Notice of Proposed Rulemaking (NPRM) on extending additional protections to reproductive health care data under the Health Insurance Portability and Accountability Act (HIPAA).
One year ago last week, the Supreme Court issued a decision that has resulted in loss of access to reproductive care for many Americans. Federal and state legislative and regulatory entities have been quick to respond to protect rights to reproductive care, a fundamental aspect of decisional privacy. Rulemakings such as this one by HHS OCR seek to fill the gap left in the wake of the Supreme Court’s 2022 decision that fundamentally shifted the landscape of data and information privacy. With a post-Dobbs lens, FPF has filed comments on this rulemaking based on the following recommendations.
We recommend that HHS bolster privacy safeguards and support the responsible handling of reproductive health care information (RHCI) by specifically:
Ensuring that covered entities are aware of and responsible for information that, directly or indirectly, can reveal data about individuals seeking or receiving reproductive health care;
Providing additional guidance and resources to address the information privacy responsibilities of covered entities for their business associates and vendors;
Distributing privacy education and guidance materials to covered entities and partners on data privacy transparency;
Conducting regulatory analysis and providing compliance support for small clinics and rural/remote providers facing increased legal requests for reproductive and related health information; and
Addressing privacy protections for reproductive health care data collected and generated during and as a part of clinical research.
FPF’s full comments to the HHS are available here.
Nigeria’s New Data Protection Act, Explained
On June 12, 2023, the President of Nigeria signed the Data Protection Bill into law following a successful third reading at the Senate and the House of Representatives. The Data Protection Act, 2023 (the Act) has had executive and legislative support and marks an important milestone in Nigeria’s nearly two-decade journey towards a comprehensive data protection law. Renewed efforts towards a comprehensive law began in September 2022 when the National Commissioner of the Nigeria Data Protection Bureau (NDPB), now the National Data Protection Commissioner (NDPC), announced that the office would seek legal support for a new law as part of the Nigeria Digital Identification for Development Project. The drafting of the law was followed by a validation process that was conducted in October 2022. After validation, the Act was submitted to the Federal Executive Council for approval, which paved the way for its transmission to the National Assembly. The 2022 Data Protection Bill was introduced in both houses of Nigeria’s bicameral legislature as the Nigeria Data Protection Bill, 2023. The Act commenced upon signature by the President.
The Act provides for data protection principles that are common to many international data protection frameworks. It defines “personal data” broadly and it includes legal obligations for “data controllers” and “processors,” defined similarly to the majority of data protection laws around the world. While the structure and content of the Act align with other international frameworks for data protection, the Act contains notable unique provisions:
The Act introduces a new category of “data controllers and processors of major importance,” which seems to be inspired by the tiered approach in the EU’s Digital Services Act and its specific obligations for “very large online platforms and search engines;”
A “duty of care” is among the principles that controllers and processors need to comply with;
Controllers and processors must seek the services of a data protection compliance organization (DPCO) to perform a data protection audit, among other obligations;
The broad extraterritorial provisions under the Act are also notable;
The Act also introduces, and places limitations on, legitimate interest as a legal basis for personal data processing, a basis not present under the NDPR and the Implementation Framework;
Data subject rights under the Act are similar to those found in international data protection frameworks, though with minimal restrictions on their exercise;
The Act provides stronger protections for children and persons without legal capacity, compared to the NDPR. It also introduces a requirement for age verification, where feasible; and
Lastly, the Act sets out structural changes to the existing data protection authority, which will change from a “Bureau” to a Commission, and updates the governing mechanisms for the authority.
Prior to the introduction of the Act, Nigeria’s data protection landscape was governed by the Nigeria Data Protection Regulation, 2019 (NDPR) and the Nigeria Data Protection Regulation 2019: Implementation Framework (Implementation Framework). The need to fill gaps in the NDPR, to create a legal foundation for the existing data protection body, and to satisfy a necessary condition for the rollout of a national digital identification program required a new legislative framework. Nevertheless, the NDPR and its Implementation Framework remain in force alongside the Act. Under Section 64(2)(f), all existing regulatory instruments, including regulations, directives, and authorizations issued by the National Information Technology Development Agency (NITDA) or NDPB, shall remain in force as if they were issued by the Commission until they expire or are repealed, replaced, reassembled, or altered. Per Section 63 of the Act, the new law shall take precedence in any instance of a conflict with pre-existing provisions.
1. Covered Actors: Novel Categories of Data Controllers and Processors
The Act applies to the processing of personal data by data controllers, data processors, and third parties, which may be individuals, private entities, or public entities that process personal data. A data controller is defined as an individual, private entity, public Commission or agency, or any other body which, alone or jointly with others, determines the purposes and means of the processing of personal data. A data processor is defined as an individual, private entity, public authority, or any other body who or which processes personal data on behalf of or at the direction of a data controller or another data processor. The Act does not define third parties.
The Act introduces a novel category of “data controllers and processors of major importance.” A data controller or processor of major importance is defined as one that is “domiciled, resident in, or operating in Nigeria and processes or intends to process personal data of more than such number of data subjects who are within Nigeria, as the Commission may prescribe, or such other class of data controller or data processor that is processing personal data of particular value or significance to the economy, society, or security of Nigeria as the Commission may designate.”
While the practical thresholds of this definition are set to be further clarified by the Commission, they will be based on the number of data subjects whose data are processed and the value or significance of the processed data. This categorization has commonalities with the EU Digital Services Act’s designation of entities as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), and it may be used to create unique and additional obligations for such controllers and processors. The Act currently requires qualifying entities to meet special registration requirements, appoint a data protection officer, and pay different penalty amounts for violations. Future obligations will include the filing of compliance returns (Section 61(2)(g)), as well as any others that may be prescribed through regulations later issued by the Commission.
2. Covered Data: Broad Categories of Sensitive Personal Data
The Act covers both personal and sensitive personal data. It defines personal data as “any information relating to an individual, who can be identified or is identifiable, directly or indirectly, by reference to an identifier such as a name, an identification number, location data, an online identifier or one or more factors specific to the physical, physiological, genetic, psychological, cultural, social, or economic identity of that individual.” The definition closely tracks Article 4(1) of the GDPR.
It further defines sensitive personal data as personal data relating to an individual’s:
genetic and biometric data, for the purpose of uniquely identifying a natural person;
race or ethnic origin;
religious or similar beliefs, such as those reflecting conscience or philosophy;
health status;
sex life;
political opinions or affiliations; and
trade union memberships.
Section 30(2) of the Act envisions a broad, flexible definition for sensitive personal data by authorizing the Commission to prescribe further categories of sensitive personal data. The Act also prohibits the processing of sensitive personal data unless specified conditions are met. Notable allowances for processing of sensitive personal data include:
Where the processing is necessary for reasons of “substantial public interest,” on the basis of a law (Section 30(1)(f)). Substantial public interest is not defined under the Act; and
Where the data subject has consented to such processing (Section 30(1)(a)).
These rules closely track the restrictions in Article 9 of the GDPR on processing “special category” data. Unlike the GDPR, which envisions situations where the prohibition on processing sensitive personal data may not be lifted on the basis of consent, the consent exception under Nigeria’s Act is restricted only in situations where a data subject has given and then withdrawn such consent. Additionally, the Act applies an “explicit consent” requirement only to the potential sharing of sensitive personal data by a “foundation, association, or other not-for-profit body with charitable, educational, literary, artistic, philosophical, religious, or trade union purposes,” while the GDPR’s Article 9(2)(a) requires “explicit consent” for all excepted processing. However, the Act does permit the Commission to create additional regulations that may apply to the processing of sensitive personal data, including regulations expanding the categories of sensitive personal data, additional grounds for processing such data, and safeguards to be applied.
3. Territorial Application: Broad Extraterritorial Application of the Act
Section 2(2)(c) of the Act contains broad extraterritorial authority, covering any form of processing of the personal data of a data subject in Nigeria by controllers or processors not established in Nigeria. This provision does not consider the nature of the processing being conducted, unlike frameworks such as the GDPR, which include a “targeting” criterion.
Exemptions from Application of the Act: Increased Protections for Exempted Processing Activities
Section 3 of the Act provides several different exemptions from the broader application of the law and makes room for the Commission to expand the processing activities that may be exempted from the Act. Processing personal information “solely for personal or household purposes” is exempt, as long as such processing does not violate a data subject’s right to privacy. This is a stark difference from laws such as the GDPR, which wholly exempts processing of personal data by a natural person in the course of a personal or household activity, regardless of whether it touches on the person’s right to privacy. There are therefore instances where personal data processing activities of a non-professional and non-commercial nature may fall under the ambit of the law; the rationale for this condition is not clear. Other exemptions include processing by law enforcement for the prevention, investigation, detection, or prosecution of a crime; processing for the prevention or control of a national public health emergency; processing for national security or public interest purposes; and processing necessary for the establishment, exercise, or defense of legal claims. These activities are exempt from most of the obligations under Part V of the Act, but exempt entities must still comply with some specific provisions under Part V, including:
The principles of personal data processing – Section 24;
Provisions relating to the lawful basis of processing personal data – Section 25;
Provisions relating to the appointment of data protection officers for data controllers and processors of major importance – Section 32; and
Provisions relating to personal data breaches – Section 40.
While the Act reserves for the Commission the authority to prescribe additional exemptions, it includes a greater number of protections for exempt processing activities than the 2022 Bill. In addition to the above-mentioned provisions with which exempt entities must comply, the Act empowers the Commission to issue a Guidance Note on the legal safeguards and best practices for exempted data controllers and processors where such processing violates or is likely to violate Sections 24 and 25 of the law. Some exemptions have been narrowed relative to the 2022 Bill: entities that were exempted from complying with provisions under the 2022 Bill must now comply with the above-mentioned provisions for exempt entities under the 2023 Act, as well as those relating to data security and cross-border data transfers.
4. Obligations of Data Controllers and Processors: Novel Registration Requirements for Data Controllers and Processors of Major Importance
Some of the Act’s obligations for data controllers and processors are novel, while others have been maintained from the NDPR.
Data Controllers and Processors of “Major Importance”
The designation of “data controllers and processors of major importance” and the Commission’s authority to classify and regulate such entities is a key new development. Section 44 of the Act sets out the process and timelines to which such entities must adhere, including registering with the Commission within six months after the commencement of the Act, or upon meeting the statutory criteria for qualifying as a data controller or processor of major importance. Additionally, the Act empowers the Commission to exempt classes of data controllers and processors of major importance from registration where it considers that such registration is unnecessary or disproportionate. The criteria for exemption may be stipulated through Regulations by the Commission.
Another special obligation for controllers and processors of major importance is the requirement to appoint a Data Protection Officer (DPO), which is imposed by Section 32 on such entities only. This requirement substantially differs from the NDPR and the Implementation Framework; under the NDPR, every data controller must appoint a DPO (Article 4.1.2), while the Implementation Framework stipulates conditions for such an appointment (3.4.1).
Other important obligations of all data controllers and processors include:
Compliance with Data Protection Principles
Data controllers and processors are responsible for complying with the principles provided in the Act. The principles are similar to the FIPPs-based sets found in many comprehensive data protection regimes but also include the duty of care as a principle for controllers and processors (Section 24(3)). Specifically, both controllers and processors owe “a duty of care” with respect to data processing, which is linked to demonstrating accountability in complying with the other principles provided by the Act.
Filing of Audit Reports
As discussed in greater detail below, controllers and processors must seek the services of a data protection compliance organization (DPCO) to perform a data protection audit, among other obligations. As the Act does not create new criteria for entities required to conduct such audits, the provisions under the NDPR and Implementation Framework remain in force. While the Implementation Framework provides that the authority may carry out scheduled audits or perform spot checks, the common practice is for controllers and processors that process the personal data of more than 2,000 data subjects in 12 months to engage a DPCO to conduct annual audits on their behalf. This practice is expected to continue.
Provision of Information to a Data Subject Prior to Collection of Personal Data
Where a data controller collects personal data directly or indirectly from a data subject, they must supply the data subject with the following information prior to collection:
The identity, residence or place of business of, and means of communication with the controller;
The specific lawful basis of processing under either Section 25(1) or 30(1) of the Act, and the specific purposes of processing;
The categories of recipients of the personal data;
The existence of the data subject rights;
The retention period of the data;
The right to lodge a complaint with the Commission; and
The existence of any automated decision-making, including profiling, its significance, the envisaged consequence of such processing for the data subject, and the right to object to/challenge such processing.
Where personal data is collected indirectly, a controller may be exempted from providing this information to a data subject if it had already been provided or where it would involve a disproportionate effort (Section 27(2)). The transparency obligations imposed by Section 27(1) (listed above) shall form part of the content of a privacy policy that a controller is obliged to have under the law and must be expressed in “clear, concise, transparent, intelligible, and easily accessible form.” In providing this information, a controller is obligated to take into account the intended class of data subjects. This implies that a privacy notice may need to be adjusted to cater to, among other issues, the literacy levels and language differences among data subjects.
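As a purely illustrative aid, the Section 27(1) disclosure items can be modeled as a typed record that a controller might validate before collection. The field names below are our own shorthand, not terminology from the Act, which prescribes the content of the notice but no particular schema:

```typescript
// Illustrative sketch of the Section 27(1) pre-collection disclosures.
interface PreCollectionNotice {
  controllerIdentity: string;      // identity, residence or place of business
  controllerContact: string;       // means of communication with the controller
  lawfulBasis: string;             // specific basis under Section 25(1) or 30(1)
  processingPurposes: string[];    // specific purposes of processing
  recipientCategories: string[];   // categories of recipients of the personal data
  dataSubjectRights: string[];     // rights available to the data subject
  retentionPeriod: string;         // how long the data will be kept
  commissionComplaintInfo: string; // right to lodge a complaint with the Commission
  automatedDecisionMaking?: {      // present only where ADM or profiling exists
    significance: string;          // its significance for the data subject
    envisagedConsequences: string; // envisaged consequences of the processing
    rightToObject: string;         // right to object to or challenge it
  };
}
```

A notice of this shape would still need to be rendered in the “clear, concise, transparent, intelligible, and easily accessible form” the Act requires, adjusted for the intended class of data subjects.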
Conducting a Data Protection Impact Assessment
Section 28(1) mandates that a data controller conduct a data protection impact assessment (DPIA) prior to the processing of personal data where such processing is likely to result in a high risk to the rights and freedoms of a data subject. The Act does not specify how far in advance of such processing a DPIA must be conducted. Laws such as Kenya’s Data Protection Act require a DPIA to be conducted 60 days prior to the processing; this obligation may be clarified under future Regulations.
The Commission may designate, by Regulation or Directive, categories of processing or persons that automatically trigger the requirement to conduct a DPIA. At a minimum, a DPIA must include:
A systematic description of the envisioned processing and its purpose, including any legitimate interest pursued by the controller, processor, or third party;
An assessment of the necessity and proportionality of the processing related to those purposes;
An assessment of the risks to the rights and freedoms of the data subject; and
Measures envisioned to address those risks, along with any safeguards, security measures, and mechanisms in place to ensure the protection of personal data.
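As a hedged illustration only, the four required elements could be captured in a simple structure with a completeness check; the type and field names below are assumptions, not drawn from the Act:

```typescript
// Illustrative sketch: the four elements a Section 28 DPIA must contain.
interface Dpia {
  processingDescription: string; // systematic description of the processing and its purpose
  necessityAssessment: string;   // necessity and proportionality relative to the purpose
  riskAssessment: string;        // risks to the rights and freedoms of data subjects
  mitigations: string[];         // safeguards, security measures, and mechanisms
}

// Returns true only when every required element is present.
function isComplete(dpia: Dpia): boolean {
  return (
    dpia.processingDescription.trim().length > 0 &&
    dpia.necessityAssessment.trim().length > 0 &&
    dpia.riskAssessment.trim().length > 0 &&
    dpia.mitigations.length > 0
  );
}
```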
Overseeing the Conduct of Data Processors and Sub-Processors
Controllers engaging processors, or processors engaging sub-processors, must take “reasonable measures” to ensure that the engaged party complies with the requirements of the Act set out in Section 29(1). These measures must take the form of a written agreement and ensure that the engaged party:
Assists the data controller or processor, as the case may be, in the fulfillment of the controller’s obligations to honor the rights of a data subject;
Implements appropriate technical and organizational measures to ensure the security, integrity, and confidentiality of personal data;
Provides the controller or processor, where applicable, with information reasonably required to demonstrate compliance with the Act; and
Notifies the engaging controller or processor when a new processor is engaged.
Data Security and Data Breach Notification Requirements
Controllers and processors are required to implement security measures and safeguards for personal data. The level of such measures shall take into account several factors, including:
The amount and sensitivity of personal data involved;
The period of data retention; and
The availability and cost of technologies to be implemented.
The measures that controllers and processors may implement are further described under Section 39(2), and include pseudonymization, encryption, and periodic assessments of risks to processing systems.
Where a data breach affects a data processor, the processor is required to notify the data controller or processor that engaged it as soon as it becomes aware of the incident, and must respond to information requests regarding the breach (Section 40(1)).
Where a data controller suffers a breach that is likely to cause a risk to the rights and freedoms of data subjects as defined by Section 40(7), several steps are required, including:
Notifying the Commission within 72 hours of becoming aware of the breach;
Notifying the affected data subjects without undue delay (the Act sets no specific time limit for this notification).
The requirements for communications to the Commission and to affected data subjects also differ. Communication to the Commission should be as detailed as possible and include a description of the nature of the breach, while notice to data subjects should be in plain and clear language and include steps to take to mitigate any adverse effects. Section 40(4) highlights the common information that should be present in both cases, such as the name and contact details of a point of contact for the data controller. Information relating to a breach may be provided in a phased manner, where it is impossible to provide all information in a single communication.
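A minimal sketch of these timing rules follows; the 72-hour figure and the high-risk trigger come from Section 40, while the type and function names are purely illustrative:

```typescript
// Illustrative sketch of the Section 40 notification timing rules.
interface Breach {
  detectedAt: Date;        // when the controller became aware of the breach
  likelyHighRisk: boolean; // risk to rights and freedoms per Section 40(7)
}

// The Commission must be notified within 72 hours of awareness.
function commissionDeadline(breach: Breach): Date {
  return new Date(breach.detectedAt.getTime() + 72 * 60 * 60 * 1000);
}

// Data subjects must be notified "without undue delay" when the breach is
// likely to pose a high risk; the Act sets no fixed clock, so only the
// trigger condition is modeled here.
function mustNotifyDataSubjects(breach: Breach): boolean {
  return breach.likelyHighRisk;
}
```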
5. Lawful Grounds for Processing Personal Data, Consent Requirements, and Children’s Personal Data
The Act provides for six lawful grounds for processing personal data, similar to those under the GDPR:
With consent of a data subject for the processing of personal data for a specific purpose;
For performance of a contractual obligation in which the data subject is a party;
Compliance with a legal obligation to which the data controller or data processor is subject;
For protection of the vital interest of the data subject or another person;
For performance of a task in the public interest or exercise of official authority by a data controller or data processor; and
To fulfill the legitimate interests of a data controller or processor, or by a third party to whom the data is disclosed, considering that a data subject would have a reasonable expectation that personal data would be processed in the stipulated manner and the processing does not override their fundamental rights and freedoms. This basis is, however, not permissible where a controller or processor’s legitimate interests are incompatible with other lawful bases under the Act.
Consent Requirements
The Act requires that consent be freely given, specific, informed, and unambiguous, similar to the consent requirements under the NDPR and Implementation Framework. The Act prohibits implied consent, i.e., the inference of consent from a data subject’s inactivity or from the use of pre-ticked boxes. This corresponds with most of the consent provisions under the Implementation Framework, except that the Framework provides exceptions for consent relating to cookies: the Framework (5.6) provides that consent for cookies may be implied from continued surfing of a website and does not mandate explicit consent, which effectively limits the extent of the direct marketing consent required under 5.3.1(a) of the Implementation Framework.
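To illustrate the affirmative-action requirement noted above, here is a minimal, hypothetical sketch of a consent record that can only be created by a positive act; all names are our own, and a real implementation would need far more context:

```typescript
// Illustrative sketch: consent may only be recorded from an affirmative act,
// never inferred from inactivity or a pre-ticked default.
interface ConsentRecord {
  dataSubjectId: string;
  purpose: string;    // the specific purpose consented to
  givenAt: Date;      // timestamp of the affirmative action
  withdrawnAt?: Date; // withdrawing must be as easy as giving consent
}

function recordConsent(
  userAffirmativelyTickedBox: boolean,
  dataSubjectId: string,
  purpose: string
): ConsentRecord {
  if (!userAffirmativelyTickedBox) {
    // A box ticked by default, or no action at all, is not valid consent.
    throw new Error("Consent must be a freely given, affirmative act");
  }
  return { dataSubjectId, purpose, givenAt: new Date() };
}
```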
Children’s Privacy
The Act expands the protections accorded to children and persons lacking legal capacity compared to the NDPR and its Implementation Framework. It raises the age threshold under which a data subject is considered a “child” to 18 years, in alignment with the Nigeria Child Rights Act (which not all states have domesticated), in contrast with the Implementation Framework, which categorizes a child as a person under 13 years of age. The Act also includes specific consent requirements for children and persons lacking the legal capacity to consent. While the NDPR and Implementation Framework are silent on whom to obtain such consent from, under the Act, consent shall be obtained explicitly from parents or legal guardians (Section 31(1)). To effect this, the Act requires controllers and processors to adopt consent verification mechanisms. To guarantee stronger privacy protections for children, the Commission will create Regulations to guide the processing of the personal data of children aged 13 and above in the course of their use of online products and services.
However, there are instances where a controller or processor may process the personal data of children and persons lacking legal capacity without the consent of a parent or legal guardian, such as:
Where the processing is necessary to protect the vital interests of the child or person lacking the legal capacity to consent;
Where the processing is carried out for educational, medical, or social care purposes, and undertaken by or under the responsibility of a professional or similar service provider owing a duty of confidentiality; or
Where the processing is necessary for proceedings before a court relating to the individual.
Further Protection for Processing of Personal Data Relating to Children and Persons Lacking Legal Capacity
In addition to the consent requirements, the Act further requires controllers and processors to adopt age verification mechanisms. Age verification is required “where feasible,” taking into consideration available technology. Presentation of any government-approved identification documents will be permitted as a verification mechanism.
6. Data Subject Rights: Robust Rights with No Implementation Mechanisms for Data Subjects and Narrow Restrictions on Exercise of Rights
The Act provides for data subject rights, which data controllers and processors must comply with prior to and during the processing of personal data, including the rights to:
Obtain information regarding the personal data held by a controller or processor about the requestor, in a commonly used electronic format;
Know the source of information where the personal data has been collected from a source other than the data subject;
Lodge a complaint with the Commission;
Know of the existence of automated decision-making (ADM), including profiling, and of the significance and consequences of such processing for the data subject, and not be subject to a decision based solely on the automated processing of personal data;
Correct – or, where correction is not feasible or suitable, delete – inaccurate, out-of-date, incomplete, or misleading information;
Request erasure where the personal data is no longer required in relation to the purpose for which it was collected and where the data controller has no other lawful basis to retain the personal data;
Request the restriction of processing personal data where there is a pending resolution of a request, where a data subject objects to processing, and where a data subject seeks to establish, exercise or defend a legal claim;
Object to the processing of personal data, including where processing is for the purpose of direct marketing;
Data portability – the Act makes it possible for a data subject to receive personal data concerning them from a data controller and transmit it to another controller, or to have the data transferred directly from one controller to another. Given the importance of this right to Nigeria’s thriving fintech ecosystem, the Central Bank of Nigeria has already issued operational guidelines in the context of open banking; and
Withdraw consent to the processing of personal data at any time.
The Act does not provide comprehensive mechanisms for implementing these rights, such as parameters and modalities for responding to data subject requests. However, the Implementation Framework (2.3.2(c)) requires controllers to inform data subjects, before obtaining consent, of the method for withdrawing it. The Act states that “a controller should ensure that it is as easy for the data subject to withdraw as it is to give consent.”
The Act does not provide general restrictions or limits on these rights, except in specific cases such as:
The right to object – where a controller may still process personal data in light of an objection if there is a public interest or other legitimate ground which overrides the fundamental rights and freedoms of the data subject; and
Exceptions to the right not to be subject to ADM systems, including where the processing is necessary for contractual obligations, is authorized under a written law, or is based on a data subject’s consent. While a person may not object to processing by ADM systems where one of these three exemptions applies, data subjects retain the rights to (i) request human intervention in the ADM system, (ii) have an opportunity to express their point of view, and (iii) contest a decision based on an ADM system. This differs from the 2022 Bill, under which a controller using an ADM system on the basis of another existing law did not have to guarantee these three rights.
7. Cross-Border Data Transfers: Broad Grounds for Transfers of Personal Data as well as Parliamentary Authorizations to Protect Data Sovereignty
The Act establishes as a rule that personal data should not be transferred outside of Nigeria, allowing for two exceptions. First, personal data can be transferred when the recipient of the personal data (the data importer) is subject either to (1) a law, (2) Binding Corporate Rules (‘BCRs’), (3) contractual clauses, (4) a Code of Conduct, or (5) a certification mechanism that “affords an adequate level of protection” to that provided by the Act. In the absence of such adequate protection through one of the enumerated means, personal data can also be transferred outside of Nigeria in exceptional situations, listed in Section 43 and mapping precisely to the set of derogations under Article 49 GDPR (consent of the individual, or for the performance of a contract, among others).
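Read structurally, this amounts to a two-step decision procedure: first check whether the data importer is covered by an enumerated instrument affording adequate protection, and only then fall back on a Section 43 derogation. The minimal Python sketch below is our own illustration of that structure; the instrument and derogation labels are shorthand, not terms defined in the Act.

```python
# Instruments enumerated in the Act that may afford an adequate level of
# protection (labels are our shorthand, for illustration only).
ADEQUATE_INSTRUMENTS = {
    "law",
    "binding_corporate_rules",
    "contractual_clauses",
    "code_of_conduct",
    "certification_mechanism",
}

# Abridged stand-ins for the Section 43 derogations (e.g., consent,
# performance of a contract); the Act lists more.
DEROGATIONS = {"consent", "contract_performance"}

def transfer_permitted(importer_instruments: set, derogation: str = None) -> bool:
    """Illustrative two-step test for transferring personal data outside Nigeria."""
    # Step 1: the data importer is covered by at least one enumerated
    # instrument affording adequate protection.
    if importer_instruments & ADEQUATE_INSTRUMENTS:
        return True
    # Step 2: absent adequate protection, a Section 43 derogation must apply.
    return derogation in DEROGATIONS

# No adequate instrument, but the data subject has consented to the transfer.
print(transfer_permitted(set(), derogation="consent"))  # True
```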
Controllers are under an obligation to keep a record of the legal basis for transferring personal data outside Nigeria, as well as to record “the adequacy of protection,” according to the criteria described in detail under Section 42 of the Act. This wording suggests that the adequacy of the means of transfer used can be validly assessed by each controller. This is a departure from other existing adequacy regimes, which usually require an official body to declare a specific jurisdiction adequate.
The Commission is tasked with issuing guidelines on how to assess the adequacy of a particular means of transfer, under the criteria established by Section 42 of the Act. This section explains that an adequate level of protection means “upholding principles that are substantially similar (emphasis ours) to the conditions for processing personal data” under the Act. The criteria relevant for adequacy include “access of a public authority to personal data,” potentially complicating such assessments in line with the broader global debate on government access to data held by private companies.
Of note, the Commission is given the possibility under the Act to determine whether “a country, region, or specified sector within a country, or standard contractual clauses, affords an adequate level of protection.” In this sense, it is important to recall that the NDPR and Annex C of the Implementation Framework already provide a white list of 41 countries whose laws are considered adequate. Interestingly, the Act specifically allows the Commission to make an adequacy determination under Nigerian law based on an adequacy decision “made by a competent authority of other jurisdictions,” if such adequacy is based on criteria similar to those listed in Section 42 of the Act. This opens the door for Nigeria to recognize adequacy decisions made by foreign bodies, such as the European Commission, as equivalent, creating a functional “adequacy network effect.” The Commission is also empowered to approve BCRs, Codes of Conduct, and certification mechanisms for data transfers.
Finally, and particularly interesting in the context of emerging certification frameworks like the Global Cross Border Privacy Rules (CBPR) framework, the Act requires that any specific “international, multinational cross-border data transfer codes, rules, or certification mechanisms” relating to data subject protection or data sovereignty must be approved by the National Assembly of Nigeria. This provision on data sovereignty aligns with the Nigeria National Data Strategy, 2022, which incorporates data sovereignty as one of its enabling pillars. Under the Strategy, data sovereignty will facilitate data residency and ensure that data is treated in accordance with national laws and regulations.
In this sense, the Act also empowers the Commission to “designate categories of personal data that are subject to additional specified restrictions on transfer to another country.” This designation would be based on “the nature” of such personal data and on “risks” to data subjects. This provision opens the door to potential future data localization requirements for specific categories of personal data.
8. Enforcement: Legal Foundation for the Nigeria Data Protection Bureau, Creation of a Governing Council and Expected Regulations
Establishment of the Commission
Originally created through an Executive Order in February 2022, the NDPB has now been renamed the “Nigeria Data Protection Commission” and will operate as an independent and impartial body to oversee the Act’s implementation and enforcement. Previously, data protection enforcement in Nigeria was conducted under the auspices of the National Information Technology Development Agency (NITDA). However, concerns that the NITDA lacked the powers to oversee data protection in the country may have necessitated the creation of a new agency. The Commission will function as a successor agency, and all persons engaged in the activities of the Commission shall, upon enactment of the Act, have the same rights, powers, and remedies held by the NDPB before the commencement of the law (Section 64(1)). All regulatory instruments issued by the NITDA, including the NDPR, shall remain in force and shall have the same weight as if they had been issued by the Commission, until they expire or are repealed, replaced, re-enacted, or altered (Section 64(2)(f)).
Functions and Powers of the Commission
Some of the key functions and powers of the Commission include:
Accrediting, Licensing, and Registering Suitable Bodies to Provide Data Protection Compliance Services (Section 5(c)).
Section 28 of the Act provides the Commission with the power to delegate the duty to monitor, audit, and report on compliance with the law to licensed data protection compliance organizations (DPCOs). This model was introduced under the NDPR and allows the data protection authority to delegate some of its functions under existing regulations to monitor, audit, and report on compliance by data controllers and data processors. Detailed provisions on the operation of DPCOs can be found under the NDPR and Implementation Framework and shall continue to apply to controllers and processors.
Designating, Registering, and Collecting Fees from Data Controllers and Processors of Major Importance (Section 5(d)).
Following successful registration of a controller or processor of major importance, the Commission is tasked with publishing a register of duly registered controllers and processors on its website. The Commission is also expected to prescribe fees and levies to be paid by this class of controllers and processors.
Participating in international fora and engaging with national and regional authorities responsible for data protection to develop efficient strategies for the regulation of cross-border transfers of personal data (Section 5(j)).
Issuing Regulations, Rules, Directives, and Guidance.
The Commission is expected to develop certain regulations as prescribed under the law and as detailed above, including in relation to designating new categories of sensitive data, adequate steps for data breach notification, conducting DPIAs, or issuing data localization regulations for specific categories of personal data.
Other functions of the Commission include promoting public awareness and understanding of personal data protection, the rights and obligations imposed under the law, and the risks to personal data; receiving complaints alleging violations of the Act or subsidiary legislation; and ensuring compliance with national and international personal data protection obligations and good practice.
In a bid to ensure that the services of the Commission are accessible beyond urban areas, the Commission may establish offices in other parts of Nigeria (Section 3(b)). This should help raise awareness of data protection across the country.
The Commission will be governed by a “Governing Council” (the Council), whose members will be appointed by the President, on the recommendation of the Minister, on a part-time basis. Members will be drawn from the public and private sectors to serve a five-year term, renewable once. The exception is the National Commissioner, who will serve as the Secretary to the Council.
The Council is tasked with providing overall policy direction for the affairs of the Commission, approving the strategic plans, action plans, and budget support programs submitted by the National Commissioner, and providing advice and counsel to the National Commissioner.
9. Offenses, Sanctions, and Compensation: Higher Penalties for Data Controllers and Processors of Major Importance
The Act provides a data subject who has suffered injury, loss, or harm arising from a violation of the law with a private right of action that allows recovery of damages in a civil proceeding. Where a controller or processor violates the provisions of the Act or subsidiary legislation, the Commission may issue a compliance order requiring them to take specific measures to remedy the situation within a specified period as well as inform them of their right to a judicial review. The Commission may also impose an enforcement order or a sanction. In issuing an enforcement order or a sanction, the Commission may:
Require the data controller or processor to remedy the violation;
Order for the compensation of data subjects;
Order the controller or processor to account for profits realized from the violation; or
Impose a penalty.
However, it is not clear from the Act what conditions trigger an enforcement order or sanction, and thus a penalty or other such measure. Under laws such as Kenya’s Data Protection Act (Section 62), failure to comply with the requirements of an enforcement order (referred to as a compliance order under the Act) triggers a penalty notice. The Act also does not specify the period within which complaints must be heard and concluded.
The penalty amount depends on whether or not the violator is a data controller or processor of major importance. Penalties against data controllers or processors of major importance shall be the higher of N10,000,000 (approximately 22,000 USD) or 2% of the annual gross revenue of the preceding financial year. Penalties against other data controllers and processors shall be the higher of N2,000,000 (approximately 4,300 USD) or 2% of the annual gross revenue of the preceding financial year.
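In formula terms, the cap is simply the greater of a fixed floor and 2% of prior-year gross revenue, with the floor depending on the controller’s classification. The minimal Python sketch below illustrates the computation under that reading; the function name and the assumption that revenue is expressed in Naira are ours, for illustration only.

```python
def max_penalty_ngn(annual_gross_revenue_ngn: float, major_importance: bool) -> float:
    """Illustrative penalty cap under the Act's two-tier structure.

    The cap is the higher of a fixed floor and 2% of the preceding
    financial year's annual gross revenue, all in Naira (NGN).
    """
    fixed_floor = 10_000_000 if major_importance else 2_000_000
    revenue_based = 0.02 * annual_gross_revenue_ngn
    return max(fixed_floor, revenue_based)

# A controller of major importance with NGN 2bn in revenue: 2% of revenue
# (NGN 40m) exceeds the NGN 10m floor, so the cap is NGN 40m.
print(max_penalty_ngn(2_000_000_000, major_importance=True))   # 40000000.0
# A smaller controller with NGN 50m in revenue: the NGN 2m floor applies.
print(max_penalty_ngn(50_000_000, major_importance=False))     # 2000000.0
```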
The Commission is empowered to create regulations that create new offenses and that impose penalties not exceeding those prescribed under the Act (Section 56(3)).
Conclusion
As Nigeria continues to make its mark within the global digital economy and rapidly expand its technology ecosystem, this Act represents a continued focus on protecting the personal data of Nigerian citizens, in alignment with common internationally accepted principles of data protection.
However, the Act contains unique provisions that should not be overlooked, including a new classification of data controllers and processors “of major importance” and the specific obligations attached to them, as well as broad exemptions for certain processing activities. Overall, the Act represents a significant step in Nigerian data protection and notably resolves the long-running dispute regarding the identity and institutional authority of Nigeria’s primary data protection regulator.
FPF Launches Cybersecurity and Data Privacy Expert Group, Bringing Together Top Leaders for Advisory Committee
As the world becomes more intertwined with and dependent on digital systems, the need to explore the challenges posed by emerging technologies and to develop ethical norms and workable best practices grows. Today, FPF launched its Privacy and Cybersecurity Expert Group and announced the Inaugural Advisory Committee that will lead FPF’s exploration of the intersection of privacy and security.
Composed of FPF’s Advisory Board members, the Privacy & Cybersecurity Expert Group will examine the overlap between data privacy and cybersecurity and how different global laws and policy regimes address that overlap. The group will also provide a space for privacy and cybersecurity experts to work together, ultimately creating opportunities to elevate practices and approaches.
“Often, privacy and cybersecurity are mistaken as separate, and sometimes even competing, motivators,” said Amie Stepanovich, Vice President for U.S. Policy at FPF, who leads the working group. “However, both cybersecurity and privacy experts deal with large amounts of personal information. Our work to convene the Expert Group, and with the advice of our Inaugural Advisory Committee of innovative thinkers, provides an invaluable opportunity for a wide variety of experts to work together and become powerful allies moving toward the common interest of crafting norms and protections for society at large.”
The Inaugural Advisory Committee consists of top cyber and privacy executives at industry-leading companies and representatives from civil society and academia.
Advisory committee members include:
Emily Hancock, Cloudflare
Stephenie Handler, Gibson Dunn (Chair)
David Hoffman, Duke University, Sanford School of Public Policy
Anitha Ibrahim, Amazon Web Services
Andy Serwin, DLA Piper
Chad Sniffen, National Network to End Domestic Violence
Melanie Tiano, T-Mobile
Heng Xu, American University
To join FPF’s Privacy & Cybersecurity Expert Working Group as an FPF member, please email [email protected].
We’re On to Oregon: Sixth State Privacy Law of 2023 Creates New Consumer Rights and Protections
On June 22nd, lawmakers in Salem passed SB 619, the Oregon Consumer Privacy Act (“OCPA”). If Governor Kotek signs the bill, Oregon will become the eleventh U.S. state (and the sixth in 2023) to adopt broad-based data privacy legislation governing the collection, use, and transfer of consumer data. The bulk of OCPA’s requirements will take effect on July 1, 2024 (with a July 1, 2025 effective date for nonprofit organizations).
OCPA is the product of a multi-year stakeholder task force held under the auspices of Attorney General Rosenblum’s office.* The bill shares a common underlying framework, rooted in the proposed Washington Privacy Act, with every non-California state to enact comprehensive privacy legislation. Nevertheless, stakeholders should pay particular attention to OCPA as it would join the Texas Data Privacy and Security Act as the only two comprehensive state privacy laws enacted so far in 2023 that extend privacy rights and protections beyond the existing high-water marks established by states such as Colorado and Connecticut in modest but meaningful ways.
Below, we identify the key provisions that distinguish OCPA from comparable state privacy laws:
1. Broad Scope
State privacy laws typically exclude entities that are subject to existing federal privacy laws from coverage; however, OCPA carves out only specific data held by organizations that is subject to laws such as the Health Insurance Portability and Accountability Act (HIPAA) and the Gramm-Leach-Bliley Act (GLBA). Furthermore, OCPA does not include an entity-level exception for non-profit organizations, joining only the Colorado Privacy Act in applying to such organizations. Notably, Oregon’s bill does carve out nonprofits “established to detect and prevent fraudulent acts in connection with insurance” as well as organizations designated as “financial institutions” under state law.
2. Expansive Definitions of Covered Data
Consistent with the ten existing state privacy laws, OCPA applies to “personal data” and creates a subcategory of “sensitive data” that is subject to heightened protections. However, OCPA’s definition of “personal data” is unique in explicitly including “derived data” (presumably covering inferences about a customer) and data associated with a “device” that is reasonably linkable to one or more consumers in a “household.” Furthermore, contrary to other Washington Privacy Act-style laws, OCPA lacks a definition of and dedicated exemptions for personal data that is maintained in a “pseudonymous” format.
The Oregon bill’s definition of “sensitive data” is also broader than other state laws, covering “national origin,” “status as transgender or nonbinary,” and “status as a victim of crime.” OCPA also contains a comparatively broad definition of “biometric data” (a category of sensitive data), including information that may allow the unique identification of an individual, not just data collected or used for the purpose of such identification. However, the scope of “biometric information” is also limited by a novel exception providing that “facial mapping or facial geometry” only qualifies as biometric data if collected for or used for the purpose of uniquely identifying an individual, likely carving out technologies used solely for facial detection and characterization.
3. Novel Consumer Rights
OCPA provides for the now-routine individual rights to obtain confirmation of data processing; to access, correct, and delete personal data; to receive personal information in a portable format; and to opt out of targeted advertising, data sales, and significant profiling decisions. However, Oregon’s bill also allows individuals to obtain a list of the “specific third parties” to whom a controller discloses personal data. This may be the most operationally complicated novel aspect of the Act, and it mirrors similar requirements in recently enacted health privacy laws in Washington State and Nevada. OCPA would also be the first comprehensive state privacy law to explicitly provide that individuals have a right to request the deletion of “derived data.”
4. Slightly Stricter Controller Obligations
Oregon’s bill includes a number of routine obligations for covered organizations, including maintaining reasonable data security, imposing contractual requirements on processors, posting privacy notices, and obtaining consent for the processing of sensitive data. However, OCPA does establish slightly stricter data controller obligations than comparable state laws. First, controllers are required to obtain affirmative consent in order to profile adolescents (individuals 13 to 15 years of age) for significant decisions. Second, while OCPA does not explicitly use the term “dark patterns,” it provides that design mechanisms deployed with the purpose of frustrating consumer choice, not just those that have the “substantial effect” of doing so, will invalidate consumer consent under the law. Finally, the Act requires that data protection impact assessments be retained for a period of five years, joining only Colorado, which established a similar retention period through its implementing regulations.
5. Enabling Data Use and Sharing for Research
Finally, OCPA includes a familiar list of uses of personal data that are exempt from the Act’s consumer rights and obligations, including processing for the purposes of internal operations reasonably aligned with consumer expectations, complying with law enforcement inquiries, and maintaining data security. However, unlike other state privacy laws, which provide that the use of data for research must be governed by an IRB-like entity and conducted in the “public interest” (a subjective term that may be applied differently across jurisdictions), SB 619 exempts identifiable data used for research so long as the research is consistent with applicable law.
*The Future of Privacy Forum submitted written feedback to the Oregon task force in early 2022.
New FPF Infographic Analyzes Age Assurance Technology & Privacy Tradeoffs
As a growing number of federal and state children’s online privacy and safety proposals seek to age-restrict social media and other online experiences, FPF released a new infographic, Unpacking Age Assurance: Technologies and Tradeoffs. The infographic analyzes the risks and potential harms associated with attempting to discern someone’s age online, as well as potential mitigation tools. FPF also outlines the privacy and accuracy tradeoffs of specific age assurance methods and technologies.
“Age assurance is highly contextual, and the most privacy-protective approach requires choosing a method, or sometimes multiple methods, in a layered approach, that is proportional to the risks of each specific use case,” said Jim Siegl, FPF Youth & Education Senior Technologist, and a co-author of the infographic. “If the potential for user harm associated with the service is high, a higher level of certainty may be appropriate. We hope this analysis and visualization of the key decision points will serve as a helpful guide as policymakers, regulators, service providers, and others continue to evaluate options and potential solutions.”
The analysis outlines the three categories of age assurance, finding that:
Age declaration, including age gate and parental consent/vouching, generally offers the lowest degree of privacy risks to the user and the lowest level of accuracy to the service provider. Parental consent provides more assurance than age-gating but may impact a teen’s autonomy.
Age estimation, such as facial characterization and other algorithmic estimation methods based on browsing history, voice, gait, or data points/signals from a VR game, can be particularly effective for determining whether a user meets an age threshold (e.g., under 13 or 21+) but is less accurate within narrow age ranges (e.g., determining whether a user is 17 or 18); the sketch after this list illustrates why.
Age verification, such as government ID plus biometrics or digital ID, is more appropriate for higher-risk, regulated, or age-restricted services, and it provides the greatest level of assurance but also poses the highest degree of privacy risks.
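To see why estimation suits coarse thresholds better than narrow ranges, consider a model that returns a point estimate with an error margin: a threshold decision is confident only when the entire margin falls on one side of the threshold. The short Python sketch below is a hypothetical illustration of this logic; the margin value and function are ours, not drawn from the infographic.

```python
def threshold_decision(estimated_age: float, margin: float, threshold: int) -> str:
    """Illustrative decision rule for an age-estimation system.

    `estimated_age` is the model's point estimate and `margin` its error
    band (e.g., +/- 2 years). The call is confident only when the whole
    band lies on one side of the threshold.
    """
    if estimated_age - margin >= threshold:
        return "meets threshold"
    if estimated_age + margin < threshold:
        return "below threshold"
    return "uncertain: escalate to a higher-assurance method"

# A 30-year-old clears a 21+ threshold comfortably, even with a wide margin...
print(threshold_decision(30.0, margin=2.0, threshold=21))  # meets threshold
# ...but distinguishing 17 from 18 falls entirely inside the error band.
print(threshold_decision(17.5, margin=2.0, threshold=18))  # uncertain: escalate...
```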
Balancing the privacy and equity implications of age assurance with the potential for harm to minors is an ongoing challenge for policymakers and online service providers. The infographic highlights some of those risks, including limiting legitimate access to online content, a loss of anonymity, limiting teen autonomy, and sensitive data collection and/or unexpected data uses. FPF also identifies some risk management tools, including on-device processing, data minimization, immediate deletion of ID data, and using a third party to separate data processing so that one company doesn’t control/access all of the data.
“Age assurance impacts all users on a service, not just minors. In addition to considering the tradeoffs around privacy and the potential for harm with each particular method, it’s important to balance those against the risk of barring access to content – especially if content restrictions have inequitable impacts,” said Bailey Sanchez, FPF Youth & Education Privacy Senior Counsel and a co-author of the infographic. “While age restrictions such as gambling are a matter of law, risk of harm is subjective – and that’s where things are starting to become really difficult. Different people make different calculations about potential risks posed by online gaming, for example.”
Unpacking Age Assurance builds on a series of new federal and state policy resources from FPF’s Youth & Education Privacy team. FPF recently released a report on verifiable parental consent, a form of age declaration and requirement of the Children’s Online Privacy Protection Act, and its analyses of new children’s privacy laws in Utah, California, Florida, and Connecticut highlight their respective approaches to age assurance.