FPF Files COPPA Comments with the Federal Trade Commission

Today, the Future of Privacy Forum (FPF) filed comments with the Federal Trade Commission (Commission) in response to its request for comment on the Children’s Online Privacy Protection Act (COPPA) proposed rule.

Read our comments in full.

As technology evolves, so must the regulations designed to protect children online, and FPF commends the Commission’s efforts to strengthen COPPA. In our comments, we outlined a number of recommendations and considerations that seek to further refine and update the proposed rule, from how it would interact with multiple provisions of a key student privacy law to the potential implications of a proposed secondary verifiable parental consent requirement. 

To amplify the questions about how COPPA would interact with the Family Educational Rights and Privacy Act (FERPA), FPF was also one of 12 signatories to a multistakeholder letter addressed to the Commission and Department of Education urging the development of joint guidance.

Read the letter here.

Considerations Applicable to All Operators

Children today increasingly rely on online services to connect with peers, seek out entertainment, and engage in educational activities. While these services offer great benefits, they also pose risks to privacy and personal data protection, and we applaud the Commission for its ongoing efforts to balance these tradeoffs. Our comments and recommendations focused on areas where we believe there is further opportunity to strike that balance.

Unique Considerations for Schools and Educational Technology

FPF commends the Commission’s effort to provide better clarity regarding how the rule should be applied in a school context; however, there are several areas where the proposed rule does not fully align with the Family Educational Rights and Privacy Act (FERPA), the primary federal law governing the use and disclosure of educational information. Both laws are complex, and the potential impact of confusion and misalignment is significant for the more than 13,000 school districts across the country and for the edtech vendor community.

With that in mind, our comments on the proposed rule’s implications for student privacy focused in large part on identifying areas where greater alignment and clarity around the interaction between COPPA and FERPA would be particularly instructive for both schools and edtech companies.

To read FPF’s COPPA comments in full, click here.

To download the joint letter to the FTC and U.S. Department of Education signed by FPF and 11 others, click here.

Little New About Hampshire

On March 6, 2024, Governor Sununu signed SB 255 into law, making New Hampshire the fourteenth U.S. state to adopt a comprehensive privacy law governing the collection, use, and transfer of personal data. SB 255 is the second comprehensive privacy law enacted in 2024; the first was New Jersey’s S332, likewise a holdover from the 2023 legislative session. Another example of states following the “Connecticut model,” SB 255 bears a strong resemblance to other laws built on the Washington Privacy Act (WPA) framework. The law will take effect on January 1, 2025. This blog post addresses two novel facets of SB 255, its narrow rulemaking authority and its provision addressing conflicts with other laws, before reflecting on how SB 255 is arguably the first “boring” state comprehensive privacy law.

1. Two Novel Provisions in New Hampshire

a. Narrow Rulemaking Authority

Prior to New Hampshire joining the fray, there were two approaches to rulemaking in the state comprehensive privacy landscape. In the first category are laws that provide no rulemaking authority, which covers a majority of enacted legislation. A handful of states (California, Colorado, and New Jersey) fall into a second category in which the legislation provides broad rulemaking authority, either to promulgate regulations for the purpose of carrying out the law or, in California’s case, to issue regulations on a variety of important topics.

SB 255 breaks this trend by including two narrow rulemaking provisions. The first, section 507-H:6, provides that the secretary of state will establish standards for privacy notices. The second, section 507-H:4(II), specifies that the secretary of state will establish a “secure and reliable means” for individuals to exercise their rights under the law. Most other states (e.g., Delaware) task controllers with establishing their own means for individuals to exercise their rights. California is slightly more prescriptive (e.g., requiring that businesses offer a toll-free telephone number for exercising rights) but ultimately leaves much to the discretion of businesses. New Hampshire’s requirement that the secretary of state establish a uniform means of exercising data rights could make it easier for individuals to submit requests, given that the mechanism will not vary from controller to controller. Businesses interact with their customers in a variety of ways, however, and this standardization could pose challenges if it is overly rigid.

b. Compliance with Other Law

SB 255 contains a unique provision regarding compliance with “other law.” Section 507-H:12 provides that anyone covered by SB 255 and “other law regarding third party providers of information and services” must comply with both laws, and, where there is a “direct conflict” between the two laws, the individual or entity “shall comply with the statute that provides the greater measure of privacy protection to individuals.” For the purposes of that provision, opt-in consent for disclosing personal information is deemed more protective than the opt-out rights in SB 255. 

This language was added while SB 255 was in committee to prevent potential conflicts between SB 255 and HB 314, a distinct bill that was being considered in parallel to SB 255. Originally intended to curtail government acquisition of personal information, HB 314 was expanded significantly by the House Judiciary Committee to place strict limits on the disclosure of personal information by a “third-party provider of information,” defined broadly under that bill to encompass telephone companies, utilities, internet service providers, streaming services, social media services, email service providers, banks and financial institutions, insurance companies, and credit card companies. 

HB 314 passed the New Hampshire House of Representatives in early January 2024, but as of this writing it has not progressed in the Senate. Retaining this conflict provision in SB 255 without also passing HB 314 raises questions about the provision’s function, given that “third-party provider of information or services” is not currently defined in law.

2. The First “Boring” Privacy Law?

Perhaps what is most interesting about SB 255 is how uninteresting it is—at least in regard to comprehensive privacy law, there is very little new in New Hampshire.

That SB 255 adds little new to the state comprehensive privacy landscape is indicative of the maturity of state privacy law. Once upon a time, a state enacting comprehensive privacy legislation warranted an emergency blog post with detailed analysis and lofty questions about a looming “patchwork” of incompatible laws. In the almost six years since the California Consumer Privacy Act was enacted, the number of states with comprehensive privacy laws has grown to fourteen. As noted in FPF’s forecast of the 2024 privacy landscape, while there has been a general regulatory convergence on the WPA framework, there are still meaningful differences between most of the post-California comprehensive state privacy laws. Many have wondered whether any states would buck the consensus trend in 2024 and adopt a novel approach to data privacy. That may yet happen, as several states are currently considering bills inspired by the American Data Privacy and Protection Act. But if New Hampshire is anything to go by, perhaps 2024 will instead be a year of greater convergence and uniformity among the states. Time will tell.

FPF Statement on President Biden’s 2024 State of the Union Address

“At this critical moment in time, the U.S. is positioned to demonstrate leadership to develop and regulate emerging technologies such as AI. These tools, while incredibly advantageous when deployed responsibly, also carry tremendous potential to cause harm. We commend the Biden administration for recognizing the multifaceted challenges and opportunities presented by AI technologies. 

We’re also encouraged to hear President Biden reaffirm his commitment to enacting stronger privacy protections for kids online. Technology creates both terrific opportunities and real risks for young people, and as kids spend more time online and as AI and other technologies continue to evolve, finding that balance has become more difficult ― and more important ― than ever before. We stand by the fact that a comprehensive federal privacy law would address some of the most pressing privacy concerns associated with AI, including algorithms’ use of mass amounts of sensitive data.”

 – Jules Polonetsky, CEO, Future of Privacy Forum 

Read the full State of the Union Address here.

Event Recap: FPF x nasscom Webinar Series – Breaking Down Consent Requirements under India’s DPDPA

Following the enactment of India’s Digital Personal Data Protection Act 2023 (DPDPA), the Future of Privacy Forum (FPF) and nasscom (National Association of Software and Service Companies), India’s largest industry association for the information technology sector, co-hosted a two-part webinar series focused on the consent-centric regime under the DPDPA. Held on two days (November 9, 2023, and January 29, 2024), the webinar series comprised four panels that brought together experts from industry, governments, civil society, and the global data privacy community to share their perspectives on operationalizing consent under the DPDPA. This blog post provides an overview of these discussions.

Panel 1 – Designing notices and requests for meaningful consent 

The first panel was co-moderated by Bianca Marcu (Policy Manager for Global Privacy, FPF) and Ashish Aggarwal (Vice President for Public Policy, nasscom). They were joined by the following panelists:

  1. Paul Breitbarth, Data Protection Lead, Catawiki & Member of the Data Protection Authority, Jersey.
  2. Eduardo Ustaran, Partner, Global Co-Head of Privacy & Cybersecurity, Hogan Lovells.
  3. Eunjung Han, Consultant, Rouse, Vietnam.
  4. Swati Sinha, APAC, Japan and China Privacy Officer & Senior Counsel, Cisco.

The panel began with a short presentation by Priyanshi Dixit (Senior Policy Associate, nasscom) that introduced the concepts of notice and consent under the DPDPA. During the discussion, panelists emphasized the importance of clear, understandable written notices and discussed other design choices to ensure that consent is “free, specific, informed, unconditional, and unambiguous.” To this end, Swati Sinha highlighted consent notices for different categories of cookies under the EU General Data Protection Regulation (GDPR) and granular notices with separate tick boxes in South Korea and China as examples of how data fiduciaries under the DPDPA could design notices that enable individuals to make informed decisions. Swati also stressed that consent forms should not bundle different purposes or come with pre-ticked boxes. Eduardo Ustaran observed that the introduction of strict consent requirements in many new data protection laws internationally has transformed the act of giving consent from a passive action into a more active and affirmative one. Eduardo also stressed the importance of ensuring that consent is clearly and freely given and of maintaining clear records.

Adding to this, Paul Breitbarth suggested that visuals such as videos and images could help make the information in notices more accessible, particularly given that long text-based notices might not be convenient for individuals using mobile devices. Paul used the example of airline safety videos as an effective method for presenting notices, with voiceovers and subtitles to ensure accessibility for a broader audience. However, Paul cautioned that it is always advisable to include written notices alongside such visual representations. 

The panelists also highlighted challenges to relying on consent as a basis for processing personal data, such as varying levels of digital literacy, the risk of “consent fatigue,” and the use of deceptive design choices (such as pre-ticked consent boxes). The discussions therefore considered alternatives to consent under different data protection laws. The panelists highlighted that in Europe, consent is not always the most popular legal basis for processing personal data, as the GDPR treats consent as one of several equal bases. The panelists also considered that in jurisdictions whose data protection laws emphasize consent over other legal bases, organizations may face difficulties in ensuring that consent is meaningful. Eunjung Han cited Vietnam’s recent Personal Data Protection Decree as an example of a framework that emphasizes consent and could potentially limit businesses’ ability to process personal data for their operations. She also noted that industry stakeholders in Vietnam are engaging in conversations with the government to share global practices where business necessity serves as a legal basis for processing.

Regarding regulatory actions, the panelists noted that regulators initially offer guidance and support to industry but, over time, may transition to initiating enforcement actions. As final takeaways, panelists stressed the importance of accountability and emphasized the need to clearly identify how personal data is used, to collect only the personal data necessary for a specific purpose, and to adhere to data protection principles.

Panel 2 – Examining consent and its alternatives

The second panel was co-moderated by Gabriela Zanfir-Fortuna (Vice President for Global Privacy, FPF) and Ashish Aggarwal (Vice President for Public Policy, nasscom). They were joined by the following panelists:

  1. Francis Zhang, Deputy Director, Data Policy, PDPC Singapore.
  2. Leandro Y. Aguirre, Deputy Privacy Commissioner, Philippines National Privacy Commission.
  3. Kazimierz Ujazdowski, Member of Cabinet, European Data Protection Supervisor.

Varun Sen Bahl (Manager, nasscom) set the context for the panel discussion through a brief presentation, outlining various alternatives to consent under the DPDP Act: legitimate uses (section 7) and exemptions (sections 17(1) and 17(2)).

Throughout the discussion, the panelists drew from their experiences with their respective data protection laws: Singapore’s Personal Data Protection Act (PDPA), the Philippines’ Data Privacy Act (DPA), and the EU’s GDPR. In particular, all three panelists shared a common experience: each had faced questions about the interpretation of alternative bases to consent in their respective jurisdictions. They noted that this was an evolving trend and suggested that it would likely extend to India as well.

Panelists noted that some data protection authorities were proactively promoting alternative legal bases to consent. This need arose because organizations in their jurisdictions were over-relying on consent as the de facto default legal basis for processing personal data, leading to “consent fatigue” for data subjects. For instance, Francis Zhang explained that Singapore amended its PDPA in 2020 to include new alternatives to consent that aim to strike a balance between individual and business interests. 

Gabriela highlighted the similarities between section 15(1) of Singapore’s PDPA and section 7(a) of the DPDP Act. Both provisions allow consent to be deemed where an individual voluntarily shares their personal data with an organization. In this context, Francis Zhang shared Singapore’s experience with this provision and explained that it was intended to apply in scenarios where consent can be inferred from the individual’s conduct, such as sharing payment details in a transaction or health information during a health check-up.

Reflecting on his experience in Europe, Kazimierz Ujazdowski observed that data protection authorities tend to be reactive, as they are constrained by the resources at their disposal. He suggested that Indian regulators could be better prepared than their European counterparts were at the time the GDPR took effect by proactively identifying practices that are likely to adversely affect users. He also highlighted the importance of taking a strategic approach to mapping areas of risk requiring regulatory attention. Deputy Commissioner Aguirre emphasized the need for India’s Data Protection Board to establish effective mechanisms for offering guidance on the interpretation of key legal provisions and how to comply with them. He highlighted that effective communication between regulators and industry is crucial for anticipating lapses and promoting compliance. He also explained that complaints and awareness efforts during the transition period before the Philippines’ DPA took effect helped to refine the country’s data protection legal framework.

Panel 3 – Realizing the ‘consent manager’ model

The third panel focused on the novel concept of consent managers introduced under the DPDPA and was co-moderated by Malavika Raghavan (Senior Fellow, FPF) and Varun Sen Bahl (nasscom). They were joined by the following panelists:

  1. Vikram Pagaria, Joint Director, National Health Authority of India. 
  2. Bertram D’Souza, CEO, Protean AA and Convener, AA Steering Committee, Sahamati Foundation. 
  3. Malte Beyer-Katzenberger, Policy Officer, European Commission. 
  4. Rahul Matthan, Partner – TMT, Trilegal.
  5. Ashish Aggarwal, Head of Public Policy, nasscom.

To kick off the discussions, Varun Sen Bahl provided a quick overview of the provisions on “consent managers” under the DPDPA. The law defines a “consent manager” as a legal entity or individual who acts as a single point of contact for data principals (i.e., data subjects) to give, manage, review, and withdraw consent through an accessible, transparent, and interoperable platform. Consent managers must be registered with the Data Protection Board of India (once established) and will be subject to obligations under forthcoming subordinate legislation to the DPDPA.
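For technical readers, the statutory role can be made more concrete by reading a consent manager’s core functions as an interface. The sketch below is our own illustrative rendering of the statutory language in TypeScript; the names and method signatures are hypothetical, not anything prescribed by the DPDPA or its forthcoming rules.

```typescript
// Illustrative reading of the DPDPA's "consent manager" role: a single
// point of contact through which a data principal gives, manages, reviews,
// and withdraws consent. All names here are ours, for illustration only.
interface ConsentRecord {
  consentId: string;
  fiduciaryId: string;   // the data fiduciary to whom consent was given
  purpose: string;       // the processing purpose consented to
  givenAt: string;       // ISO 8601 timestamp
  withdrawnAt?: string;  // set once consent is withdrawn
}

interface ConsentManager {
  giveConsent(principalId: string, fiduciaryId: string, purpose: string): Promise<ConsentRecord>;
  reviewConsents(principalId: string): Promise<ConsentRecord[]>;
  withdrawConsent(principalId: string, consentId: string): Promise<void>;
}
```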

As the concept of a consent manager is not found in other legislation in India or internationally, there has been a great deal of speculation as to what form consent managers will take, and what role they will play in India’s technology ecosystem, once the DPDPA and its subordinate legislation are fully implemented. 

The discussion among panelists touched upon the evolving role of consent managers and their potential impact under the DPDPA. 

Rahul Matthan highlighted two existing consent management frameworks in India that could serve as potential operational models for consent managers under the DPDPA: the “account aggregator” framework in the financial sector and the National Health Authority’s Ayushman Bharat Digital Mission (ABDM) in the health sector. He suggested that these initiatives could facilitate data portability, even though the DPDPA does not expressly recognize such a right, and anticipated that forthcoming subordinate legislation would clarify how these existing initiatives will interface with consent managers under the DPDPA.

Bertram D’Souza and Vikram Pagaria provided background on how these two sectoral initiatives function in India.

Bertram noted that in India’s financial sector, account aggregators currently enable users to manage their consent with over 100 financial institutions, including banks, mutual funds, and pension funds. Several different account aggregators exist on the market today, but each must register with the Reserve Bank of India to obtain an operational license.

Vikram highlighted how ABDM enables users in the health sector to access their health records and consent to requests from various entities (such as hospitals, laboratories, clinics, or pharmacies) to access that data. Users can also control the type of health record to be shared and the duration for which it is shared. Vikram noted that approximately 500 million individuals have consented to create Health IDs (Ayushman Bharat Health Accounts), with around 300 million health records linked to these IDs.

Malte Beyer-Katzenberger drew parallels between these existing sectoral initiatives in India and the EU’s Data Governance Act (DGA), a regulation that establishes a framework to facilitate data-sharing across sectors and between EU countries. He explained how the DGA evolved from business models trying to solve problems around personal data management and consent management. In this context, he noted that EU regulators are keen to collaborate with India on the shared objectives of empowering users with their data and enabling data portability.  

Ashish highlighted that the value of consent managers lies in providing users a technological means to seamlessly give and withdraw consent. He also saw scope for data fiduciaries to rely on consent managers as a safeguard against liability and regulatory action. When asked what business model consent managers would adopt, Bertram noted that it is an evolving space and that the market in which consent managers will operate is extremely fragmented. Based on his experience with account aggregators, he anticipated that consent managers would initially be funded by India’s technology ecosystem but may eventually shift to a user-paid model. The panelists also highlighted the need to obtain “buy-in” from data fiduciaries and to ensure that they are accountable to users. Malte also pondered how consent managers could achieve scale in the absence of a legislative mandate requiring their use.

Rahul Matthan highlighted the immense potential of the market for consent managers in India, noting that as of January 2024, account aggregators had processed 40 million consent requests, twice the number from August of the previous year. Though account aggregators are not mandatory for users, Rahul noted that the convenience and efficiency they offer are likely to encourage people to opt into these services, whether they are within the formal financial system or outside it. Agreeing, Bertram highlighted the need for consent managers to focus on enhancing user experience and fostering cross-sectoral collaboration.

In his concluding remarks, Ashish underscored the importance of striking a balance by allowing the industry to develop the existing account aggregator framework while ensuring that its use remains optional for consumers. He agreed that the account aggregator framework is likely to influence the development of consent managers under the DPDPA, and suggested that there may also be use cases for similar frameworks in other areas and sectors, such as e-commerce, to address deceptive design patterns.

Panel 4 – Operationalizing ‘verifiable parental consent’ in India

The final panel in the webinar series examined the requirements for verifiable consent for processing the personal data of children under the DPDPA. The panel was co-moderated by Christina Michelakaki (Policy Counsel for Global Privacy, FPF) and Varun Sen Bahl. They were joined by the following panelists:

  1. Kieran Donovan, Founder, k-ID. 
  2. Rakesh Maheshwari, Former Head of the Cyber Laws and Data Governance Division, Ministry of Electronics and Information Technology.
  3. Iqsan Sirie, Partner, TMT, Assegaf Hamzah & Partners, Indonesia. 
  4. Vrinda Bhandari, Advocate – Supreme Court of India. 
  5. Bailey Sanchez, Senior Counsel, Youth & Education Privacy, Future of Privacy Forum. 

Varun Sen Bahl presented a brief overview of verifiable parental consent under the DPDPA. Specifically, the legislation requires data fiduciaries to obtain verifiable consent from the parent or lawful guardian when processing the personal data of minors under eighteen years of age or of persons with disabilities. However, the Act affords India’s Central Government flexibility in how these requirements will apply.

The forthcoming subordinate legislation under the DPDPA is expected to provide further detail on how these provisions will be implemented.

Building on the presentation, the panelists shed light on the complexities surrounding parental consent requirements under different data protection laws. Iqsan Sirie drew parallels between India’s DPDPA and Indonesia’s recently enacted Personal Data Protection Law, which also introduced parental consent requirements for processing children’s data whose details will be clarified only through secondary regulation. Iqsan cited guidelines issued by Indonesia’s Child Protection Commission as “soft law” that businesses could refer to when developing online services.

Rakesh Maheshwari explained that the Indian Government’s intent in introducing these measures in the DPDPA was to address concerns regarding children’s safety, while providing the Central Government flexibility in implementing these measures.

Vrinda Bhandari focused on the forthcoming subordinate legislation to the DPDPA and stressed that any method for verifying parental consent must be risk-based and proportionate. Specifically, she highlighted privacy risks and low digital literacy as challenges in introducing such tech-based solutions. First, she pointed out that biometric-based verification methods, such as India’s national resident ID number (Aadhaar) or any other government-issued ID that captures sensitive personal data, could pose security risks, depending on who can access this information. Second, she noted that the majority of Indians belong to a mobile-first generation, where parents may not be digitally literate. Although Vrinda cited tokenization as a good alternative, she questioned whether it would be feasible to implement it in India, given the costs and technical complexity of deploying this solution.

Drawing from his experience at k-ID, which helps developers safely authenticate children and safeguard their online privacy, Kieran Donovan highlighted the array of methods for implementing age-gating, ranging from simple email verification to advanced third-party services aimed at privacy preservation. He discussed the use of payment transactions, SMS two-factor authentication, electronic signatures, and question-based approaches designed to gauge user maturity. He pointed out that only 4 of the 103 countries requiring parental consent specify the exact method for verifying it. He also spoke about the challenges businesses face in implementing age-gating measures, including the cost per transaction and user resistance to sophisticated verification methods.

Comparing India’s DPDPA with the Children’s Online Privacy Protection Act (COPPA), Bailey Sanchez noted that in the U.S. the parental consent requirement applies to children under 13 and only to services directed at children. Bailey also observed that it is not straightforward to demonstrate compliance under COPPA. However, the Federal Trade Commission proactively updates the approved methods for parental verification and works with industry to review new methods that reflect technological advancements. Christina spoke about the legal position on children’s consent in the EU under the GDPR, and the challenges in relying on other legal bases for processing children’s data.

As final takeaways, the discussion touched on the importance of regulatory guidance and risk-based intervention that incentivizes stakeholders to participate actively. Overall, panelists noted that a nuanced approach balancing privacy protection and practical considerations is essential for effective implementation of parental consent requirements globally.

To conclude the webinar series, Josh Lee Kok Thong (Managing Director for APAC, FPF) thanked all the panelists, viewers, and hosts (from FPF and nasscom) for their active participation and contributions.

Conclusion

In the run-up to the notification of the subordinate legislation that will give effect to key provisions of the DPDPA, the FPF x nasscom webinar series aimed to foster an active discussion capturing the insights of regulators, industry, academia, and civil society from within India and beyond. Going forward, FPF will play an active role in building on these conversations.

The DNA of Genetic Privacy Legislation: Montana, Tennessee, Texas, and Virginia Enter 2024 with New Genetic Privacy Laws Incorporating FPF’s Best Practices

In 2023, four states enacted new genetic privacy laws regulating direct-to-consumer genetic testing companies. This blog post details what these new laws cover and how they compare to FPF’s widely adopted Best Practices for Consumer Genetic Testing Services.

Genetic privacy has been under increasing scrutiny at the state and federal levels, and regulators are prioritizing efforts to examine how businesses handle and disclose genetic data. For instance, the Federal Trade Commission (FTC) obtained orders against genetic testing providers Vitagene (2023) and CRI Genetics (2023) over alleged deceptive trade practices, including a claim that Vitagene had left sensitive data unsecured and retroactively changed its privacy policy without user consent. The White House has also taken a keen interest in genetic data privacy protections; genetic data privacy was flagged as an area of interest in the Biden Administration’s recent executive order that seeks to restrict “countries of concern” from accessing Americans’ sensitive personal data in bulk. The Department of Justice has also indicated that genetic data will be a focus of an upcoming Advance Notice of Proposed Rulemaking related to the executive order.

While federal agencies and lawmakers have been active in this area, state legislators have led in mandating protections for this particularly sensitive category of personal information. In 2023, Montana, Tennessee, Texas, and Virginia joined six other states (Arizona, California, Kentucky, Maryland, Utah, and Wyoming) that have enacted privacy laws for direct-to-consumer genetic testing companies. These four newly enacted laws follow the trend of the six existing laws in adopting baseline requirements, including requirements to publish privacy notices and to create consumer rights of access and deletion, in line with FPF’s Privacy Best Practices for Consumer Genetic Testing Services, first released in 2018.

However, the four state laws leave out key elements of the best practices around transparency about law enforcement access to data, children’s and teens’ online privacy, and consent for revised privacy policies that reflect the use of emerging technologies in genetic testing. As these privacy issues take center stage in 2024, states should consider expanding the scope of direct-to-consumer genetic testing privacy laws to address emerging technologies like artificial intelligence and persistent concerns about law enforcement access to data and minors’ rights to their genetic data.

New State Laws on Genetics Privacy Include Strong, Important Protections for Individuals

These four new state genetic privacy laws largely incorporate the foundational principles of the Future of Privacy Forum’s 2018 best practices. All four states’ genetic privacy laws create a consumer right to access and delete personal data, prohibit sharing genetic information with insurers and employers, and require companies to create a comprehensive security program to protect individuals’ data. All four laws also require companies to collect separate express consent to use data for marketing, research, and third-party sharing, with some laws extending this requirement to any secondary use or additional retention of individuals’ genetic data.

Laws in Tennessee, Texas, and Virginia exclude de-identified data from their definitions of “genetic data.” This is in line with FPF’s best practices on de-identified data, which note that de-identified data is not subject to the remaining best practices, as long as “de-identification measures taken establish strong assurance that the data is not identifiable.” In addition, Tennessee, Texas, and Virginia follow the guidance from the FTC and the Department of Health and Human Services (HHS) for de-identified data; the three state laws require that companies (1) take measures to ensure that individuals’ data cannot be linked to them, (2) commit to maintain and use data only in its de-identified form, and (3) contractually obligate data recipients to do the same.

Montana and Texas, meanwhile, each go beyond the existing consumer genetic privacy laws and the scope of FPF’s best practices to create additional requirements for direct-to-consumer genetic testing companies: Montana imposed data localization requirements for its residents’ genetic data, and Texas established a property right for its residents over their genetic samples and data.

New State Laws Differ on Key Privacy Issues, Including Law Enforcement Access to Data, Kids’ Privacy Needs, and Transparency

The four state genetic privacy laws passed in 2023 are the first such laws enacted in the wake of the Supreme Court’s 2022 decision in Dobbs v. Jackson Women’s Health Organization, which overruled the precedent set in Roe v. Wade and negated constitutional protections for reproductive health services. These four new laws create essential genetic data privacy protections in line with the existing direct-to-consumer genetic privacy laws, but they differ on some key privacy issues that are the subjects of intense debate, including law enforcement access to data, children’s and teens’ online privacy, and transparency requirements around changing privacy policies to account for emerging technologies, including AI.

Law Enforcement Access to Data

FPF’s best practices call for genetic testing companies to notify individuals when their personal data is shared with law enforcement agencies and to publicly report on data requests from law enforcement on at least an annual basis. In the wake of Dobbs, the processes by which law enforcement agencies may gain access to health data have come under increased public and regulatory scrutiny. Data collected by direct-to-consumer genetic testing companies may reveal relationship and health data that could be used in abortion prosecutions; for example, fetal tissue samples could be compared to genetic data held by direct-to-consumer genetic testing companies to determine paternity or maternity, and retained biological samples could be repurposed by law enforcement for saliva-based pregnancy tests. As a result, even though none of the four laws specifically refer to reproductive health data or post-Dobbs privacy issues, some of them may impact how law enforcement can access genetic data to enforce restrictions on abortion and how direct-to-consumer genetic testing companies may respond to law enforcement requests for data.

Of the four laws, only Montana’s specifies that government agencies must provide a warrant to access genetic data after June 1, 2025, unless the disclosure is otherwise permitted by a specific state law. Two of the remaining new genetic privacy laws (Tennessee and Texas) explicitly permit law enforcement and government agencies to access individuals’ genetic data with valid legal process, which may include a warrant or subpoena, depending on the specific data being requested. While legal process may require notification to the impacted individual, in practice individuals can be prevented from receiving that notice under non-disclosure provisions. Only Virginia’s law does not specify detailed procedural requirements for genetic testing companies to share data with government agencies. 

While the four state laws diverge in their requirements for valid legal process and consumer notification, none of them requires companies to publish reports on data requests from law enforcement agencies. Leading direct-to-consumer genetic testing companies, including 23andMe and Ancestry, voluntarily publish reports on government requests for consumers’ data multiple times a year, though those reports are often not broken out by topic or type of data. Notably, some of the disclosures in these reports may be limited by law, including the U.S. Foreign Intelligence Surveillance Act.

Children’s and Teens’ Online Privacy

In recognition of the need for heightened privacy protections for children, FPF’s best practices recommend that direct-to-consumer genetic testing companies not market or directly offer their services to minors (under age 18). When parents and guardians provide consent for minors to submit their DNA samples, FPF recommends that genetic testing companies provide minors with a right to access their data and become the primary account holder once they reach age 18. 

2023 was also a banner year for debate around children’s online privacy and safety issues, including a unanimous vote by the Senate Commerce Committee to advance a bill to expand children’s privacy protections and cover teens aged 13 to 16. However, despite FPF’s recommendations and the recent attention given to children’s online privacy, none of the four state genetic privacy laws explicitly address children’s privacy interests when engaging with direct-to-consumer genetic testing companies, including scenarios where parents and guardians may submit genetic samples on behalf of their children.

Emerging Technologies and New Privacy Policies

Consent is an important part of all of the new genetic privacy laws, in line with the baseline standards for consent established in the six other existing state laws and in FPF’s best practices. Montana, Tennessee, and Virginia establish a specific requirement for direct-to-consumer genetic testing companies to collect initial express consent from users seeking genetic testing products and services; this initial consent must specify the inherent contextual uses of the data. Texas does not specifically require initial express consent but does require separate express consent for several different types of data processing.

FPF’s best practices state that companies should notify individuals and seek their consent before making any changes to privacy policies; over the past year, this has also become a major topic for regulatory enforcement. For instance, in 2023, the FTC issued its first genetic privacy enforcement action. In the Vitagene case, the FTC argued that the company engaged in deceptive behavior when it updated its privacy policy in 2020 and retroactively expanded third-party data sharing without notifying existing consumers or seeking their consent for the policy change. In the press release about the settlement order, Director of the FTC Bureau of Consumer Protection Samuel Levine noted that “[c]ompanies that try to change the rules of the game by re-writing their privacy policy are on notice” for any unilateral application of new privacy policies to existing consumer data.

The practice of obtaining consent for updates to privacy policies and practices is becoming more important as new technologies are incorporated into genetic testing business models. As AI becomes increasingly integrated into direct-to-consumer genetic testing companies’ platforms and product offerings, the inherent contextual uses of individuals’ genetic data may evolve, requiring updates to privacy policies.

All four laws also require entities to collect separate express consent for any secondary uses of individuals’ genetic data that are beyond the scope of the initial genetic testing product or service. However, none of the four laws explicitly include any procedural requirements for how companies should collect consent before implementing policy changes. The absence of an explicit provision in the laws means that the need to notify individuals of policy changes and seek consumer consent to implement those changes will largely be a matter of judicial or regulatory interpretation, and may vary from state to state.

State Legislatures Should Consider Expanded Genetic Privacy Protections in 2024

In addition to the four states that enacted genetic privacy laws in 2023, eight other states considered bills to regulate direct-to-consumer genetic testing companies’ privacy practices, demonstrating state lawmakers’ growing appetite for state genetic privacy legislation in the absence of comprehensive federal legislation. The 2024 legislative session is another opportunity for additional states to establish new protections, and state legislatures in Alabama, Indiana, Nebraska, and West Virginia have already considered legislation largely based on FPF’s best practices. 

2024 is also an opportunity for states with existing laws, including the four that passed laws in 2023, to establish additional protections for individuals’ genetic data and adopt FPF’s best practices around law enforcement access to data, minors’ rights to their genetic data, and transparency for privacy policy changes. While the 2023 laws establish baseline protections consistent with existing state genetic privacy laws, they leave space for future legislators to consider additional protections in the areas of law enforcement access to data post-Dobbs, children’s and teens’ online privacy, and direct-to-consumer genetic testing companies’ embrace of emerging technologies.

By fully incorporating FPF’s best practices, states can promote a more privacy-protective genetic testing ecosystem and strive to better address the privacy issues that emerged in 2023 and continue to be a priority in 2024. In doing so, states can also raise the standard for genetic data privacy and effectively complement the federal government’s approach to regulating direct-to-consumer genetic testing companies.

FPF Awarded DOE and NSF Grants to Advance Privacy Enhancing Technologies & AI

The Future of Privacy Forum (FPF) has been awarded grants by the Department of Energy (DOE) and the National Science Foundation (NSF) to support FPF’s establishment of a Research Coordination Network (RCN) for Privacy-Preserving Data Sharing and Analytics. FPF’s work will support the development and deployment of Privacy Enhancing Technologies (PETs) for beneficial data sharing and analytics. Most notably, the RCN will bring together a multi-stakeholder community of academic researchers, industry practitioners, policymakers, and other stakeholders to advance the adoption of PETs in the context of AI and other key technologies.

“Since its founding, FPF’s work has been driven by a belief in the fair and ethical use of technology to improve people’s lives,” said John Verdi, FPF’s Senior Vice President for Policy, who will serve as the RCN’s principal investigator. “We are convening a multidisciplinary, cross-sector, and international group of experts to better understand the risks of data sharing and analytics and how PETs can and cannot mitigate those risks.”

The DOE-NSF grants will enable FPF to establish a robust expert network with members from academia, industry, government, and civil society to discuss and develop best practices for advancing PETs. Its goals are to facilitate ethical data use, stimulate responsible scientific research and innovation, and enable individuals and society to benefit from data sharing and analytics.

The RCN will operate in two interrelated parts.

“Privacy-enhancing technologies are increasingly important in today’s data-driven landscape. They allow us to safeguard sensitive datasets and information needed to advance a broad research, development, and demonstration portfolio,” said Asmeret Asefaw Berhe, Director of DOE’s Office of Science. “This Research Coordination Network will help us move toward the shared goal of establishing new standards for data safety and security that will allow us to continue to develop the innovations and scientific discoveries we need to achieve our clean energy and industrial goals.” 

The awarded grants build on FPF’s years-long track record of convening private-sector stakeholders and regulators to discuss responsible data sharing and the deployment and regulation of PETs, including its Privacy Research and Data Responsibility RCN and Global PETs Network.

“This crucial investment represents our commitment to advancing the foundations of responsible AI and privacy-enhancing technologies,” said Dilma DaSilva, Acting Assistant Director for NSF’s Computer and Information Science and Engineering Directorate. “This effort supports research and development that enables individuals and society to benefit from the value derived from privacy preserving data sharing and analytics.”

The RCN will inform the public debate on PETs, provide useful information to policymakers, and contribute to the development of systems and products to support AI. For more information about the RCN and how to get involved, please contact [email protected]. To keep updated on similar issues and emerging topics, apply to join the Ethics and Data in Research Working Group.

The Research Coordination Network (RCN) for Privacy-Preserving Data Sharing and Analytics is supported by U.S. National Science Foundation (Award #2413978) and the Department of Energy (Award #DE-SC0024884).

RECs Report: Towards a Continental Approach to Data Protection in Africa

On July 28, 2022, the African Union (AU) released its long-awaited African Union Data Policy Framework (DPF), which strives to advance the use of data for development and innovation, while safeguarding the interests of African countries. The DPF’s vision is to unlock the potential of data for the benefit of Africans, to “improve people’s lives, safeguard collective interests, protect (digital) rights and drive equitable socio-economic development.” One of the key mechanisms that the DPF seeks to leverage to achieve this vision is the harmonization of member states’ digital data governance systems to create a single digital market for Africa. It identifies a range of focus areas that would greatly benefit from harmonization, including data governance, personal information protection, e-commerce, and cybersecurity.  

To promote cohesion and harmonization of data-related regulations across Africa, the DPF recommends leveraging existing regional institutions and associations to create unified policy frameworks for their member states. In particular, the framework emphasizes the role of Africa’s eight Regional Economic Communities (RECs) in harmonizing data policies and serving as a strong pillar for digital development by drafting model laws, supporting capacity building, and engaging in continental policy formulation.

This report provides an overview of these regional and continental initiatives, seeking to clarify the state of data protection harmonization in Africa and to educate practitioners about future harmonization efforts through the RECs. Section 1 begins with a brief history of policy harmonization in Africa before introducing the RECs and explaining their connection to digital regulation. Section 2 dives into the four regional data protection frameworks created by some of the RECs and identifies key similarities and differences between the instruments. Finally, Section 3 offers a comparative and critical analysis of regional developments in the context of the Malabo Convention and provides a roadmap for understanding future harmonization trends. It concludes that while policy harmonization remains a key imperative on the continent, divergences and practical limitations exist in the current legal frameworks of member states.

Brussels Privacy Symposium 2023 Report

The seventh edition of the Brussels Privacy Symposium, co-organized by the Future of Privacy Forum and the Brussels Privacy Hub, took place at the U-Residence of the Vrije Universiteit Brussel campus on November 14, 2023. The Symposium presented a key opportunity for a global, interdisciplinary convening to discuss one of the most important topics facing Europe’s digital society today and in the years to come: “Understanding the EU Data Strategy Architecture: Common Threads – Points of Juncture – Incongruities.”

With the Symposium’s program, the organizers aimed to explore, transversally, three key topics that cut across the EU’s Data Strategy legislative package and the General Data Protection Regulation (GDPR), painting an intricate picture of interplay that leaves room for tension, convergence, and the balancing of the different interests and policy goals pursued by each new law. Throughout the day, participants debated the possible paradigm shift introduced by the push for access to data in the Data Strategy Package, examined the network of impact assessments extending from the GDPR to the Digital Services Act (DSA) and the EU AI Act, and discussed the future of enforcement of the new set of data laws in Europe.

Attendees were welcomed by Dr Gianclaudio Malgieri, Associate Professor of Law & Technology at Leiden University and co-Director of the Brussels Privacy Hub, and Jules Polonetsky, CEO of the Future of Privacy Forum. In addition to three expert panels, the Symposium opened with keynote addresses by Didier Reynders, European Commissioner for Justice, and Wojciech Wiewiórowski, the European Data Protection Supervisor. Commissioner Reynders highlighted that the GDPR remains the “cornerstone of the EU digital regulatory framework” when it comes to the processing of personal data, while Supervisor Wiewiórowski cautioned that “we need to ensure the data protection standards that we fought for, throughout many years, will not be adversely impacted by the new rules.” In the afternoon, attendees engaged in a brainstorming exercise across four breakout sessions, and the Vice-Chair of the European Data Protection Board (EDPB), Irene Loizidou Nikolaidou, delivered closing remarks to end the conference.

The following Report outlines some of the most important outcomes from the day’s conversations, highlighting the ways and places in which the EU Data Strategy Package overlaps, interacts, supports, or creates tension with key provisions of the GDPR. The Report is divided into six sections: the general introduction above; a summary of the Opening Remarks; three sections providing insights into the panel discussions; and a final section briefly summarizing the EDPB Vice-Chair’s Closing Remarks.

Editor: Alexander Thompson

Colorado’s Approval of Global Privacy Control: Implications for Advertisers and Publishers

The privacy laws of both Colorado and California require organizations to recognize Universal Opt-Out Mechanisms (UOOMs), tools through which a person can broadly invoke their opt-out rights across all the websites they visit. While California has required responding to certain UOOMs since July 2021, the Colorado Attorney General has only recently approved its first tool, the Global Privacy Control (GPC), as valid within the scope of the state law. This sets the stage for organizations within the law’s jurisdiction to take the action necessary to ensure that they recognize and respond to any person’s use of the GPC. Below, we outline what organizations need to know about UOOMs going forward, including particular implementation challenges that must be addressed to avoid enforcement actions for running afoul of the law.

Background

Governor Polis signed the Colorado Privacy Act (CPA) in July 2021, making Colorado the third state to pass a comprehensive privacy law. Among other things, the Act requires the Colorado Attorney General to conduct a special process for approving UOOMs that people can use to invoke their opt-out rights. Under Colorado law, covered entities will be required to honor these UOOMs beginning July 1, 2024.

The Colorado AG’s office closed applications for UOOM tools on November 6, 2023. After a public comment period, the Colorado AG announced that only one tool, the GPC, would be included on the exclusive public list of acceptable UOOMs in Colorado.

The recognition of the GPC as a valid UOOM in Colorado leaves adtech vendors, advertisers, and publishers in a broadly similar place in both California and Colorado once enforcement begins this summer: publishers will have to respond to valid GPC requests in both states, and advertisers and vendors will have to adjust business practices accordingly. Although implementations of the GPC must still satisfy the requirements of the CPA, Colorado’s decision aligns its enforcement of opt-out rights with California’s, creating momentum toward a national standard.

What Should Advertisers, Publishers, and Other Organizations Know About the GPC and UOOMs in U.S. Law?

1. Implementations of the GPC must still satisfy the requirements of the CPA

Under the CPA, UOOMs in Colorado must satisfy three categories of rules. By selecting a single UOOM tool, the Colorado AG’s office has indicated that this is the only tool “recognized in so far as the UOOM or any authorized implementations meet the requirements of [the Colorado Privacy Act].” 

The first and second of these rules relate to Notice and Choice under Rule 5.03 and Default Settings under Rule 5.04. The notice and choice requirements ask UOOM vendors to ensure that the signal represents an “affirmative, freely given, and unambiguous choice to opt out” of targeted advertising and data sales. The default settings requirements seek to ensure that the choice remains a genuine opt-out with respect to the device: a browser that comes pre-installed on a device cannot simply enable the signal by default, which would convert the user-facing mechanism into what would effectively be an opt-in. For browsers or browser extensions that do not come pre-installed on the device and that are marketed as tools for exercising a user’s opt-out rights, the consumer’s decision to install and use them is considered an affirmative, freely given, and unambiguous choice.

The final requirement for UOOMs in the CPA is to follow Technical Specifications under Rule 5.06. The technical specification requirements make the tool “universal” in the sense that it can automatically transmit the opt-out to multiple publishers while remaining in compliance with other requirements, like the notice and choice requirements and the default settings requirements, and without unfairly disadvantaging controllers.

It is noteworthy that the AG’s office distinguishes between “the UOOM” – the GPC in this case – and “any authorized implementations” of the UOOM. Several organizations, including FPF, expressed broad support for the GPC while correctly observing that the GPC is a protocol-level technical specification that can be implemented in valid and invalid ways in user-facing tools. Actual implementations of the GPC vary significantly in their interface and functionality. However, it is not clear what is required for an implementation to be “authorized.” One may read the language to require some additional recognition by the Colorado AG’s office (which has not produced a list of authorized implementations), or instead to cover the implementations recognized by the creators of the GPC, who list several supporting implementations on the GPC website. It is even possible that “authorized implementations” refers to other authorized, yet-to-be-approved UOOMs and has nothing to do with the GPC.
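For context, the GPC signal itself is simple at the protocol level: a user agent with the control enabled sends a Sec-GPC: 1 request header and exposes a navigator.globalPrivacyControl property to scripts. The sketch below shows one way a publisher might detect the signal; the Express-style server and the flag name are illustrative assumptions on our part, not anything the specification or the CPA prescribes.

```typescript
import express from "express";

const app = express();

// Server side: per the GPC specification, a user agent with GPC enabled
// sends the request header "Sec-GPC: 1" on every request.
app.use((req, res, next) => {
  const gpcOptOut = req.header("Sec-GPC") === "1";
  // Flag the request so downstream ad-serving logic can suppress targeted
  // advertising and data sales for this user.
  res.locals.gpcOptOut = gpcOptOut;
  next();
});

app.listen(3000);

// Client side, in the browser, the spec exposes the same signal to scripts.
// The property is not yet in TypeScript's default DOM typings, so a cast
// is needed: (navigator as any).globalPrivacyControl === true
```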

Based on this analysis, it is technically possible for publishers to receive an invalid GPC signal originating from a tool that fails to implement other requirements of the CPA. However, discerning the validity of GPC signals as they are received may require publishers to implement otherwise invasive means, like browser fingerprinting.

2. GPC will be a multi-state enforcement priority for 2024

Despite the limitations of approving a technical specification, Colorado’s decision to recognize only the Global Privacy Control aligns the state with California and makes the GPC a clear priority for organizations looking to avoid an enforcement action in 2024. Controllers in Colorado and businesses in California should earnestly implement appropriate means to receive these signals and respond within their advertising technology stacks. Industry preparation should include some mechanism for differentiating data whose sale or sharing has been opted out of from data that has not, as sketched below.
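As a loose illustration of that kind of differentiation, a controller might record the signal state observed at collection time so that downstream systems can filter records before any sale or share. The schema and function below are hypothetical, not an industry standard:

```typescript
// Hypothetical record schema: store the opt-out state observed when the
// data was collected, so the sale/share pipeline can filter on it later.
interface UserDataRecord {
  userId: string;
  collectedAt: string;   // ISO 8601 timestamp of collection
  gpcOptOut: boolean;    // Sec-GPC state observed at collection time
  payload: Record<string, unknown>;
}

// Only records collected without an opt-out signal are eligible for sale
// or sharing under the CPA's opt-out rights.
function eligibleForSaleOrSharing(records: UserDataRecord[]): UserDataRecord[] {
  return records.filter((record) => !record.gpcOptOut);
}
```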

The Colorado AG also indicated that the current public list (which, again, consists solely of the GPC) will be “prioritized for enforcement,” meaning publishers will likely be required to respond to GPC opt-out requests as soon as the July 1, 2024 enforcement date arrives. In California, no ongoing or concluded investigations since the AG’s settlement with Sephora have resulted in publicly announced enforcement actions. However, the GPC has remained an area of active interest, including recent discussions by the California Privacy Protection Agency (CPPA) regarding the possibility of requiring browser vendors to implement a feature allowing users to express their opt-out preferences to publishers.1

3. Novel mechanisms may still be reconsidered in upcoming years

In naming the GPC as the sole UOOM currently recognized in Colorado, the Colorado AG also indicated that this did “not exclude additional UOOMs from meeting the requirements” in the future. This means the other shortlisted opt-out mechanisms (i.e., the OptOut Code or the Opt-Out Machine), or some tool that has not yet been developed, could be approved later. However, the process for submitting applications is uncertain: the website is no longer accepting submissions, and although it may be reopened to new submissions in the future, no plans for doing so are currently public.

The Colorado AG also indicated that when it does accept new applications, it will seek public comments on them in a similar process. The three applications on the shortlist each took a different approach to standardizing the expression of user opt-out preferences: the OptOut Code proposal focused on prepending a code to human-readable device names, the Opt-Out Machine proposed an automated email-based opt-out mechanism, and the Global Privacy Control (GPC) proposed using its HTTP-based, protocol-level specification in Colorado, having already been recognized as a UOOM in California.

Challenges Ahead for Enforcement

Enforcement of the Colorado Privacy Act’s requirements for opt-outs will begin later this year. Although the Colorado AG selected the GPC, the office did not reveal its rationale or respond substantively to the concerns raised during the comment process. As a result, specific enforcement techniques and investigative approaches are hard to predict. At least four enforcement challenges exist for Colorado: (1) responding to the GPC alone may not be enough to ensure compliance with the CPA, (2) confirmation of signals by controllers is not required, making verification of the receipt of valid signals difficult, (3) invalid GPC signals are difficult to detect definitively, and (4) the move toward enforcement is happening at a time of transition in the industry at large.

First, responding to the GPC alone is not enough for compliance with the CPA. Although the GPC specification includes optional features allowing publishers to confirm to users that they have received the GPC signal, this confirmation is not technically tied to any advertising that appears on the publisher site. In other words, it is possible for a publisher site, intentionally or accidentally, to continue serving targeted ads while confirming to users that their GPC opt-out signal has been received. The Colorado AG will need some mechanism for discerning whether any advertising displayed was targeted or not. For people who have invoked the GPC, publishers are likely to replace targeted advertising with contextual advertising, and these ads may be served by similar ad servers, making that discernment challenging. (The opt-out also applies to the sale of personal data, but a sale would not be immediately obvious to an enforcement agency in a single web browsing session, regardless of the GPC configuration.)

Second, the optional confirmation features in the GPC specification are not required by the CPA. Although confirmation may be useful for users, advertisers, and publishers seeking to test the configuration of their GPC tool of choice, its utility as part of regulatory enforcement remains unclear, and without it, it is hard to see how Colorado enforcement agencies will determine whether a signal has been received and responded to. It is worth noting here that California’s recently proposed revisions to the California Consumer Privacy Act (CCPA) would require businesses to display the status of the consumer’s choice.2
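For reference, the GPC proposal’s optional confirmation takes the form of a support resource served at a well-known URL. Below is a minimal sketch of serving it; the path and field names follow the public GPC proposal, while the Express handler itself is an illustrative assumption:

```typescript
// Optional GPC support resource, served at /.well-known/gpc.json per the
// GPC proposal. It announces that the origin abides by GPC signals, but
// nothing technically ties this announcement to the ads actually served.
import express from "express";

const app = express();

app.get("/.well-known/gpc.json", (_req, res) => {
  res.json({
    gpc: true,                // this origin abides by received GPC signals
    lastUpdate: "2024-02-01", // date the statement was last updated
  });
});

app.listen(3000);
```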

Third, invalid implementations of the GPC can transform the opt-out into a user-facing opt-in. Developers of privacy-oriented browsers and browser extensions have evinced a desire to make setting up both the browser and the GPC as fast and easy as possible, but the legal environment is inherently complex. The installation and configuration process for these tools will be critical to ensuring that GPC signals are valid in each jurisdiction where they are intended to apply. The GPC signal does not embed information about which browser, extension, or tool sent it, which complicates matters both for organizations seeking to determine a mechanism’s validity and for investigators responding to GPC signals sent by an invalid mechanism or configuration. Investigators will also have to determine whether the person covered by the signal is a Colorado resident.

Finally, enforcement of the CPA comes at a time when the industry is transitioning away from the third-party cookie and toward new advertising APIs, presenting an additional challenge for discerning targeting information. Publishers will need to connect receipt of the GPC signal to their new advertising API infrastructure during this transition, and Colorado’s enforcers will need to be able to verify compliance with the CPA, including responses to valid GPC signals, while it is underway. Many other states are considering comprehensive privacy laws, some with subtly different opt-out rights. Colorado has indicated that it prefers a harmonious, multi-state approach where possible, but whether that is achievable remains an open question as states consider new approaches to privacy.

Conclusion

Colorado’s adoption of the GPC as the only valid universal opt-out mechanism, for now at least, represents a critical step for vendors, advertisers, publishers, and users. Broad alignment with California makes this development important outside of Colorado as well, particularly as other states adopt or consider comprehensive privacy laws. Although some challenges and open questions remain, covered entities should work earnestly toward compliance so they can honor these signals beginning July 1, 2024.

1 Note that this requirement may complicate the default setting requirements discussed earlier given Colorado’s differentiation between a browser that comes pre-installed on a device and one that does not.

2 See page 40, in § 7025 on Opt-out Preference Signals.

FPF Health & Wellness: Mapping the 2024 Health Privacy Landscape, A 2023 Retrospective

In 2024, health and wellness-focused companies are increasingly integrating AI to streamline their services. With the expansion of AI-enabled digital health, the universe of potential health inferences will also expand, triggering new concerns about patient and consumer privacy. At this intersection of health privacy and AI concerns, state legislators and federal regulators appear poised to take more action on health data privacy, with specific attention to reproductive health privacy and genetic data privacy. As we look ahead to further developments, it is prudent to look back and understand exactly where the regulatory landscape stands and how we got here…

In 2023, health data privacy developments were nearly all tied to continuing privacy law responses to the Supreme Court’s Dobbs decision and to subsequent moves by states to bar access to certain reproductive health care services and to criminally prosecute individuals seeking that care. As reproductive health care remains in jeopardy in several states, we expect reproductive health data privacy to continue driving broader action on health data privacy. In this 2023 retrospective, FPF has identified the top themes of health legislation and regulation while looking ahead to 2024.