FPF Releases Infographic to Explore Implications of Open Banking Data Flows and Security for Individuals

Today, the Future of Privacy Forum (FPF), a global non-profit focused on privacy and data protection, is pleased to release an infographic, “Open Banking And The Customer Experience,” visualizing the US open banking ecosystem. FPF’s open banking infographic is supported by over a year of meetings and outreach with leaders in banking, credit management, financial data aggregators, and solution providers to comprehensively understand the developing industry of open banking. 

Open banking involves customer-permissioned data transfers between organizations holding data and entities that provide financial products and services (e.g., wealth management, payments, and loan access). Open banking proceeds in four main steps: (i) signing up for and initiating a service; (ii) authenticating identity; (iii) authorizing data sharing; and (iv) receiving the product or service. 

Open banking can be a catalyst for greater competition by enabling new products and services that depend on the sharing of personal data. While the sharing of personal data is integral to realizing these benefits, it is not without privacy and security risks, including the risk of data breaches and unauthorized transactions. The US open banking ecosystem can also be confusing both for customers who wish to use these products and services and for the organizations that provide them, including in areas related to:

The Consumer Financial Protection Bureau (CFPB) sought comments this year regarding data portability for financial products and services, a prerequisite to issuing a proposed rule later in 2023 implementing Section 1033 of the Dodd-Frank Wall Street Reform and Consumer Protection Act. Subject to rules created by the CFPB, Section 1033 requires covered entities to make certain information related to a person’s requested products and services available to that person upon request.

In response to the CFPB request regarding data portability for financial products and services, FPF submitted comments in January 2023, which address the main pain points raised in this infographic in greater detail. FPF has also released a paper, Data Portability in Open Banking: Privacy and Other Cross-Cutting Issues, detailing how different jurisdictions’ laws impacted open banking activities and intersected with data protection law, including issues surrounding consent, security, and data subject portability rights. The paper provided grounds for discussion at an event FPF organized in 2022 with the Organization for Economic Co-Operation and Development (OECD). In February 2023, the OECD issued a paper of the same name about the event.

If you wish to speak with FPF about this infographic or would like to learn more about the organization’s Open Banking Working Group, please reach out to Zoe Strickland ([email protected]) and Daniel Berrick ([email protected]). For media inquiries, please reach out to [email protected].

This infographic would not have been possible without the work of Hunter Dorwart, former FPF Policy Counsel, who devoted significant hours to this project during his time at FPF.


Brussels Privacy Symposium 2022 Report

On November 15, 2022, the Future of Privacy Forum (FPF) and the Brussels Privacy Hub (BPH) of Vrije Universiteit Brussel (VUB) jointly hosted the sixth edition of the Brussels Privacy Symposium on the topic of “Vulnerable People, Marginalization, and Data Protection.” Participants explored the extent to which data protection and privacy laws, including the EU’s General Data Protection Regulation (GDPR) and others such as Brazil’s General Data Protection Law (LGPD), safeguard and empower vulnerable and marginalized people. Participants also debated balancing the right to privacy with the need to process sensitive personal information to uncover and prevent bias and marginalization. Stakeholders discussed whether prohibiting the processing of personal data related to vulnerable people serves as a protection mechanism or, on the contrary, potentially deepens bias.

The event also marked the launch of VULNERA, the International Observatory on Vulnerable People in Data Protection, coordinated by the Brussels Privacy Hub and the Future of Privacy Forum. The observatory aims to promote a mature debate on the multifaceted connotations surrounding the notions of human “vulnerability” and “marginalization” existing in the data protection and privacy domains.

The Symposium opened with short introductory remarks by Jules Polonetsky, FPF’s CEO, and Gianclaudio Malgieri, Associate Professor at Leiden eLaw and BPH’s Co-Director. Polonetsky stressed the importance of understanding that privacy increasingly intersects with other rights and issues. Malgieri offered an overview of VULNERA and encouraged participants to reflect on important questions, such as whether data protection law could serve as a means to address human vulnerability and marginalization.

Throughout the day, there were two keynote addresses, by Scott Skinner-Thompson, Associate Professor at the University of Colorado Boulder, and Hera Hussain, Founder and CEO of Chayn, a nonprofit providing online resources for survivors of gender-based violence. These were followed by three panel discussions, a brainstorming exercise with the Symposium’s attendees in four breakout sessions, and closing remarks delivered by FPF’s Vice President for Global Privacy, Gabriela Zanfir-Fortuna, and the European Data Protection Supervisor (EDPS), Wojciech Wiewiórowski.

This Report outlines some of the most noteworthy points raised by the speakers during the day-long Symposium. It is divided into seven sections: the general introductions above; a section covering the keynote speeches; three sections summarizing the panel discussions; a sixth touching on the exchanges audience members had during the breakout sessions; and a seventh providing highlights of the EDPS’s closing remarks.

Editor: Isabella Perera

Iowa Senate Advances Comparatively Weak Consumer Privacy Bill

By Keir Lamont & Mercedes Subhani

Update: On March 28, Governor Kim Reynolds signed SF 262 into law, making Iowa the 6th state to enact a baseline consumer privacy framework. 

Lawmakers in Iowa are considering the adoption of a new consumer privacy framework that would fall far short of comparable state privacy laws in terms of consumer rights, business obligations, and enforcement. On Monday, March 6th, the Iowa Senate passed SF 262, an Act relating to consumer data protection, by a 47-0 vote. Companion legislation, HF 346, is currently eligible for a vote in the Iowa House.

Iowa is one of several states that are currently seriously considering the enactment of privacy legislation, demonstrating a commendable focus on the protection of consumer data. However, the Iowa bill, while modeled after frameworks adopted in other states, nevertheless diverges significantly from the most protective state privacy laws.

In order to help stakeholders and policymakers assess Iowa’s privacy proposal, the Future of Privacy Forum is releasing a chart comparing SF 262 to the Connecticut Data Protection Act, which currently stands as one of the strongest and most interoperable state approaches for establishing privacy rights and protections.

At a high level, Iowa’s privacy proposal contains the following protections and notable omissions as compared to bills with similar models:

FPF Files Comments with the National Telecommunications and Information Administration (NTIA) on Privacy, Equity, and Civil Rights

On March 6, the Future of Privacy Forum filed comments with the National Telecommunications and Information Administration (NTIA) in response to its request for comment on privacy, equity, and civil rights.

NTIA’s “Listening Sessions on Privacy, Equity, and Civil Rights” drew attention to the well-documented and ongoing history of data-driven discrimination in the digital economy. When the digital economy functions properly, all individuals, regardless of race, gender, or other historically discriminated-against attributes, are able to equally access the benefits of technology, including better access to education and employment opportunities, while trusting that their personal data is protected from misuse. Individuals and communities can benefit from digital tools that provide important services regarding education, employment, housing, credit, insurance, and government benefits. In addition, society as a whole benefits from diverse individuals contributing different perspectives, ideas, and objectives without cause to fear discrimination or harm. 

But when the digital economy reinforces human bias rather than combats discrimination, individuals suffer concrete harms, including artificially limited educational opportunities, reduced access to jobs and financial services, and more. At the same time, misuse of personal information can contribute to biased outcomes, undermine trust in digital services, or both.

FPF’s comments urge NTIA to include these three recommendations in its forthcoming report to advance protections for data privacy, equity, and civil rights in the US:

1. Support for Congressional efforts to pass a comprehensive federal privacy law that includes limitations on data collection, heightened safeguards for sensitive data use, support for socially beneficial research, and protections for civil rights, including protections regarding automated decision-making that has legal or similarly significant effects.

2. Support administration efforts to promote a common approach to privacy, AI, and civil rights among executive agencies, both in the agencies’ guidance to private entities and internally in their processing of personal information and tech procurement processes.

3. Continue proactive engagement and meaningful consultation with marginalized groups in conversations regarding privacy and automated decision-making across the administration and federal agencies.

Race Equity, Surveillance in the Post-Roe Era, and Data Protection Frameworks in the Global South are Major Topics During This Year’s Privacy Papers for Policymakers Event

Author: Randy Cantz, U.S. Policy Intern, Ethics and Data in Research and former Communications Intern at FPF

The Future of Privacy Forum (FPF) hosted a Capitol Hill event honoring 2022’s must-read privacy scholarship at the 13th annual Privacy Papers for Policymakers Awards ceremony. This year’s event featured an opening keynote by FTC Commissioner Alvaro Bedoya as well as facilitated discussions with the winning authors: Anita Allen, Anupam Chander, Eunice Park, Pawel Popiel, Laura Schwartz-Henderson, Rebecca Kelly Slaughter, and Kate Weisburd. Experts from academia, industry, and government, including Olivier Sylvain, Neil Chilson, Amanda Newman, Gabriela Zanfir-Fortuna, Maneesha Mithal, and Chaz Arnett, moderated these policy discussions.

Stephanie Wong, FPF’s Elise Berkower Fellow, provided welcome remarks and emceed the night, noting this was the first time the award ceremony had been held in person after two years of virtual events due to the pandemic. Ms. Wong noted she was excited to present leading privacy research relevant to Congress, federal agencies, and international data protection authorities (DPAs).


FPF’s Stephanie Wong

In his keynote, Commissioner Bedoya was excited to celebrate privacy by highlighting some of the finest minds working on relevant technology policy issues. Commissioner Bedoya noted that events like Privacy Papers for Policymakers bring together the “brightest minds in technology speaking with the brightest minds in Congress” to determine which issues deserve special sectoral treatment. Commissioner Bedoya concluded his remarks by highlighting the aspects of the winning authors’ work that he finds most salient: Professor Park’s work forces us to reckon with the reality that our health information cannot be confined to a neat box that lives at our doctor’s office; Professor Weisburd’s work dispels the fallacy that surveillance is passive, showing it to be a physical, lived experience and a form of punishment; Professor Allen is a luminary who underscores the importance of seeing surveillance not merely as data collection but by tracing its differential use; Professor Popiel and Laura Schwartz-Henderson grapple with the challenges faced by DPA regulators; Professors Anupam Chander and Paul Schwartz reveal a divergence between international trade and privacy norms; and Commissioner Slaughter’s paper describes the harms of algorithmic bias and how to protect consumers from them.

FTC Commissioner Alvaro Bedoya

Following Commissioner Bedoya’s keynote address, the event shifted to discussions between the winning authors and expert discussants. The 2022 PPPM Digest includes summaries of the papers and more information about the authors.

Professor Anita Allen (University of Pennsylvania Carey Law School) kicked off the first discussion of the night with Olivier Sylvain (Senior Advisor at the Federal Trade Commission) by talking about her paper, Dismantling the “Black Opticon”: Privacy, Race Equity, and Online Data-Protection Reform. Professor Allen’s paper analyzes how African-Americans endure discriminatory oversurveillance, discriminatory exclusion, and discriminatory predation, which she calls the “Black Opticon,” and also highlights the role of privacy’s unequal distribution. During her conversation, Professor Allen discussed discriminatory surveillance and oversurveillance of the poor, why race-neutral policies are not enough to address discrimination, how commercial practices and reforms are unequally distributed along racial lines, and how the right to privacy emerged in the U.S.’s founding era, when the concept of privacy was used to protect slave owners and, more recently, those who commit domestic abuse under the same guise. 

Professor Anita Allen and Olivier Sylvain

Next, Professor Anupam Chander (Georgetown University Law Center) discussed his paper, Privacy and/or Trade, with Neil Chilson (Stand Together). Professor Chander’s paper, co-written with Professor Paul Schwartz of the UC Berkeley School of Law, explores the conflict between international privacy and trade. At the outset, Mr. Chilson outlined the paper’s four contributions to privacy law and policy: an analysis of the long history of trade and privacy; an account of the rapid growth of international regulation; the observation that privacy and trade are not necessarily exclusive and can both affect humanitarian issues; and, rather than merely observing the problem, a potential path to fixing it through a regulatory regime. The conversation centered on privacy and transborder data flows, the concept that “privacy is not bananas,” and the practical effect of kicking the proverbial can down the road to other countries, amongst other salient topics.

Professor Anupam Chander and Neil Chilson

Professor Eunice Park (Western State College of Law) discussed her paper, Reproductive Health Care Data Free or For Sale: Post-Roe Surveillance and the “Three Corners” of Privacy Legislation Needed, with Amanda Newman (Office of Congresswoman Sara Jacobs). Professor Park’s paper recommends placing overdue limits on state surveillance to protect the privacy of personal health data. Ms. Newman acknowledged that state criminalization has fallen hardest on marginalized communities post-Roe and that the availability of reproductive health data on the open market has made the issue of privacy more concrete and topical. Professor Park explained why the post-Roe era is harsher than the 1960s pre-Roe era, the disconnect between legal protection and digital reality in the context of the First Amendment, and the three corners of protections she recommends: an inclusive definition of reproductive healthcare data, a substantive prohibition on certain practices, and procedural protections that would require a warrant for any type of information that could criminalize an individual.

Professor Eunice Park and Amanda Newman

During the next panel, Dr. Pawel Popiel (University of Pennsylvania Annenberg School for Communication) and Laura Schwartz-Henderson (Internews) discussed their paper, Understanding the Challenges Data Protection Regulators Face: A Global Struggle Towards Implementation, Independence, & Enforcement, with FPF’s Dr. Gabriela Zanfir-Fortuna. Their paper examines the challenges facing DPAs in Africa and Latin America and identifies two prominent factors as key obstacles to effective data protection oversight: resource constraints and threats to independence. The co-authors discussed why they pursued this research and why it is valuable, how they chose the data they worked with and what the key challenges were, how lack of independence can create cross-cutting challenges across organizations, and why independence is so important. They also stressed the importance of balancing local values and needs with baseline data protections drawn from the European model.

Dr. Pawel Popiel, Laura Schwartz-Henderson, and Dr. Gabriela Zanfir-Fortuna

Next, FTC Commissioner Rebecca Kelly Slaughter discussed her paper, Algorithms and Economic Justice: A Taxonomy of Harms and a Path Forward for the Federal Trade Commission, with Maneesha Mithal (Wilson Sonsini Goodrich & Rosati). Commissioner Slaughter’s paper provides a taxonomy of algorithmic harms that portend injustice, describes her view of how the Commission’s existing tools can and should be aggressively applied to thwart injustice, and explores how new legislation or an FTC rulemaking could help structurally address the harms generated by algorithmic decision-making. Commissioner Slaughter reiterated that it is important that “we’re not awed by the concept of algorithmic decision making, into thinking it’s an amazing technology.” Commissioner Slaughter discussed consumer protection and “AI snake oil,” the privacy harms of generative AI, potential harms to kids, and the importance of transparency and explainability to consumers.


FTC Commissioner Rebecca Kelly Slaughter and Maneesha Mithal

In the evening’s final presentation, Professor Kate Weisburd (George Washington University Law School) was joined by Professor Chaz Arnett (University of Maryland Carey Law School) to discuss her paper, Punitive Surveillance. Professor Weisburd’s paper analyzes “punitive surveillance” practices, which allow government officials, law enforcement, and companies to track, record, share, and analyze the location, biometric data, and other metadata of thousands of people on probation and parole. Professor Weisburd noted the “shift to digitizing and datifying the carceral state, entrenching and deepening carceral surveillance measures.” Professor Weisburd then discussed the concept of punitive surveillance, the disconnect between how e-monitors are presented in popular culture and how they play out in practice, and what policymakers could do to connect more with this subject matter.


Professor Kate Weisburd and Professor Chaz Arnett

FPF CEO Jules Polonetsky closed the event by thanking the audience, including a special shout-out to Professor Paul Ohm for helping inspire this conference thirteen years ago and to FPF’s Stephanie Wong for making the event happen.

FPF CEO Jules Polonetsky

Thank you to Commissioner Alvaro Bedoya and Honorary Co-Hosts Congresswoman Diana DeGette and Senator Ed Markey, Co-Chairs of the Congressional Privacy Caucus. We would also like to thank our winning authors, expert discussants, those who submitted papers, and event attendees for their thought-provoking work and support. We hope to see you next year!

ITPI Law and Technology Policy Paper Competition In Collaboration with the Israel Ministry of Justice

On January 11, 2023, the Israel Tech Policy Institute held the final event of the ‘Law and Technology Policy Paper Competition,’ attended by Adv. Sharon Afek, Deputy Attorney General (Director of the Consulting and Legislative Division), and Saar Abramovich, a patent examiner from the Patent Office, along with students from the four university and academic college law faculties in Israel that reached the final stage of the competition – Sapir Academic College, the University of Haifa, Bar-Ilan University, and the Ono Academic College.

The winning students from Sapir Academic College were awarded a prize worth NIS 5,000 by the Ministry of Justice. The winning paper was also translated into English – courtesy of the Israel Technology Policy Institute.

The competition’s research question was: Should artificial intelligence be recognized as an “inventor” for the purposes of patent law, or as a “patent owner”? The question was examined against the background of the legal challenges and complexities that arise from the need to regulate the encounter between smart technologies and existing law.

The competition judges included:

The competition aroused great interest among scholars, academic institutions, and government officials and will continue next year. 

To read the policy papers in Hebrew from the four law faculties that reached the final stage of the competition, click here.

The English version of the winning paper by Sapir Academic College, written by Uri Avigal, Galit Cohen, Sharon Pustilnik, and Matt Weiser under the supervision of Dr. Sharon Bar-Ziv, is available here.


New Report Highlights LGBTQ+ Student Views on School Technology and Privacy

The Future of Privacy Forum and LGBT Tech outline recommendations for schools and districts to balance inclusion and student safety in technology use.

Today, the Future of Privacy Forum (FPF), a global non-profit focused on privacy and data protection, and LGBT Tech, a national, nonpartisan group of LGBT organizations, academics, and high technology companies, released a new report, Student Voices: LGBTQ+ Experiences in the Connected Classroom. The report builds on FPF and LGBT Tech research, including interviews with recent high school graduates who identify as LGBTQ+, to gather firsthand accounts of how student monitoring impacted their feelings of privacy and safety at school.

Student Voices: LGBTQ+ Experiences in the Connected Classroom

“LGBTQ+ students are at greater risk for their online activity being flagged or blocked by some content filtering and monitoring systems, and hearing first-hand about their experiences was incredibly important and informative,” said Jamie Gorosh, FPF Youth & Education Privacy policy counsel and an author of the report. “We hope that the recommendations outlined in this report can help schools and districts develop policies that better reflect the views of LGBTQ+ students and ultimately create a safe learning environment for all students.”

LGBTQ+ Experiences in the Connected Classroom analyzes the concerns that arose in conversations with LGBTQ+ students regarding school technologies and privacy, including concerns about identity, parent/caregiver access to data, health information, and student safety. The report then outlines a series of recommendations and actionable steps for schools and districts to address those concerns and others, including to:

Read the complete list of recommendations.

“Our hope would be that every young LGBTQ+ individual would have a supportive family home when it comes to their sexual orientation or gender identity, but unfortunately, over 60% of youth in a recent Trevor Project study did not feel their home was an LGBTQ+ affirming space,” said Christopher Wood, Executive Director, LGBT Tech. “Their public school, its faculty, and technology is most likely the closest line of support that young LGBTQ+ person has access to. With a wave of anti-LGBTQ legislation being passed across the US, it is more important than ever that school monitoring technology is not further inflicting harm on young LGBTQ+ individuals seeking information and support. We hope this report can help schools and districts identify areas for improvement and develop policies that create a safe, supportive learning environment for all students, especially those who might identify as LGBTQ+.”

Student Voices Infographic


LGBTQ+ Experiences in the Connected Classroom adds to the FPF Youth & Education Privacy team’s portfolio of work on the privacy implications of student monitoring, which also includes “The Privacy and Equity Implications of Using Self-Harm Monitoring Technologies: Recommendations for Schools” and an accompanying infographic, “Understanding Student Monitoring.”

To access the Youth & Ed team’s child and student privacy resources, visit www.StudentPrivacyCompass.org and follow the team on Twitter at @SPrivacyCompass.

Knowledge is Power: The Future of Privacy Forum launches FPF Training Program

“An investment in knowledge always pays the best interest”
–Ben Franklin

Let’s make 2023 the year we invest in ourselves, our teams, and the knowledge needed to best navigate this dynamic world of privacy and data protection.  I am fortunate to know many of you who will read this blog post, but for those who I don’t know, please allow me to introduce myself.  My name is Alyssa Rosinski, FPF Senior Director of Business Development, and I am leading the relaunch of the FPF Training Program.  

Perhaps when you think of FPF, the first thing that comes to mind is the deep in-house subject matter expertise on topics such as Artificial Intelligence, Ad Tech, Legislation (both US and Global), Biometrics and De-Identification, to name a few. Or you think of the brilliant infographics used to explain complex privacy and technology issues in a digestible format. Maybe it’s the deep-dive reports that you wait for to help you understand the intricacies of the ever-evolving privacy and data protection landscape.

If FPF training didn’t make your list, I’m here to tell you it should. The FPF Training Program combines all of these strengths and delivers them to you and your teams in interactive, educational training sessions. The same subject matter experts you rely on to understand the relevant laws, policies, underlying technologies, and business practices develop and teach our training. 

Are you looking to understand the evolution of online advertising and how data is used for targeted advertising purposes? Our Fundamentals of Online Advertising training was designed with you in mind. Are you curious about the ethical ramifications of how biometrics are used? FPF experts will explore how biometrics work and weigh in on the social and ethical considerations of this technology during our Biometrics – Head to Toe training session. Perhaps your team needs to brush up on the technical and legal definitions of de-identification and the risks associated with re-identification. Look no further than our De-Identification training.

The FPF Training Program is intentionally designed to provide you and your teams with training that lasts no more than two hours and can be done virtually or in-person, in a public training setting or as a closed-door in-house training, all culminating with a Certificate of Completion from Credly, qualifying for CPE credits. During our public training sessions, there’s an open dialogue allowing you to discuss relevant issues with your peers and the trainer as you work your way through the session. 

In-house training sessions provide a closed-door forum for collaborative discussions between internal stakeholders and the trainer, making the training session highly relevant to those who attend. Our training is for those of you who play a role in developing policies for your organization, are working with clients on complex privacy issues, or for those of you who are interested in emerging technologies and their implications for privacy and data protection. 

Joining one of our training sessions will pay dividends. Check out our public training calendar to reserve a spot for one of our upcoming training sessions, or email me at [email protected] if you’re looking to bring your team together for private, in-house training. I look forward to working with you and your teams this year and participating in the investment you make in professional development.

Evolving Enforcement Priorities in Times of Debate – Overview of Regulatory Strategies of European Data Protection Authorities for 2023 and Beyond

Today, the Future of Privacy Forum released a report that explores “Evolving enforcement priorities in times of debate – Overview of regulatory strategies of European Data Protection Authorities for 2023 and beyond.” It is the third in a series that explores European DPAs’ evolving regulatory priorities, following the 2021 Report “Insights into the Future of Data Protection Enforcement: Regulatory Strategies of European Data Protection Authorities for 2021-2022,” and the 2020 Report “New Decade, New Priorities: A summary of twelve European Data Protection Authorities’ strategic and operational plans for 2020 and beyond.”

At a time when the effectiveness of the EU General Data Protection Regulation (GDPR) enforcement model is being challenged by the European Parliament, Data Protection Authorities (DPAs), civil society, and policymakers, the European Data Protection Board (EDPB) has launched several initiatives to reform the way DPAs work together. These include aligning DPAs’ national enforcement strategies, selecting cases of strategic importance for regulators to pursue in a coordinated fashion, and launching yearly coordinated enforcement initiatives dedicated to specific topics. The European Commission also seems to agree that cooperation between DPAs should be improved, as its Work Program for 2023 plans “to harmonize some national procedural aspects.”

On the other hand, the majority of national DPAs have recently reported a shortage of adequate human and financial resources, which impacts the performance of their supervisory duties. Nonetheless, they seem increasingly willing to closely cooperate with other DPAs – both bilaterally and multilaterally – and also regulators from other fields. They also seem ready to ramp up investigatory and sanctioning efforts, both on their own initiative and following individuals’ complaints. In this regard, it is noteworthy that the highest nine administrative fines since the GDPR became applicable in May 2018 were issued between July 2021 and December 2022.

Since the 2021 edition of the Report, most DPAs have published their 2021 and 2022 annual reports, as well as new short- and long-term strategies. These documents shed light on the areas to which DPAs are likely to devote significant regulatory efforts and resources, whether for guidance creation, awareness-raising, or enforcement actions.

For this year’s Report, FPF compiled and analyzed these novel strategic documents, describing where different DPAs’ priorities have common trends or notable deviations. The report also contains links to and translated summaries of strategic documents from nine DPAs in Belgium (BE), the Czech Republic (CZ), Denmark (DK), Estonia (EE), France (FR), Ireland (IE), Spain (ES), Sweden (SE), and the United Kingdom (UK). The analysis also includes documents published by the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS). These documents should be read together with other complementary multi-year strategies that FPF included in its 2020 and 2021 reports.

Table 1 – Overview of strategic and operational topics per DPA/jurisdiction

Some of the main conclusions include:

Utah Considers Proposals to Require Web Services to Verify Users’ Ages, Obtain Parental Consent to Process Teens’ Data

Update: On March 23, Governor Spencer Cox signed SB 152 and HB 311. While amendments were made to both bills, the concerns raised in FPF’s analysis remain. SB 152 leaves critical provisions, such as methods to verify age or obtain parental consent, to be established in further rulemaking, but questions remain regarding whether these can be done in a privacy-preserving manner. SB 152’s requirement that social media companies provide parents and guardians access to their teenager’s accounts, including messages, has raised security concerns and questions about what sort of parental access mandates are appropriate for teens online.

Utah lawmakers are weighing legislation that would require social media companies to conduct age verification for all users and extend a parental consent requirement to teens. 

The Utah legislature has introduced two similar, competing bills that seek to regulate online experiences for Utah users. SB 152 would require social media companies to verify the age of all Utah users and obtain parental consent for users under 18 to have an account. The bill would also require social media companies to provide a parent or guardian access to the content and interactions of an account held by a Utah resident under the age of 18. On February 13, SB 152 was amended to replace the prescriptive requirements for age verification (e.g., requirements that companies obtain and retain a government-issued ID) with verification methods established through rulemaking, but concerns remain that a rulemaking process could nonetheless require privacy-invasive age verification methods.

Utah HB 311, as originally introduced, would have required social media companies to verify the age of all Utah residents, required parental consent before users under 18 create an account, and prohibited social media companies from using “addictive” designs or features. On February 9, the bill was amended to remove the age verification and parental consent provisions; the provisions regarding design features remain, as does a private right of action. The amended bill passed the Utah House and moved to the Senate, where it will be considered alongside Utah SB 152.

FPF shared our analysis of these bills last week, focusing on three areas:

Parental consent under COPPA has longstanding challenges:

FPF published an analysis and accompanying infographic regarding verifiable parental consent (VPC) under COPPA, which was informed by research and insights from parents, COPPA experts, industry leaders, and other stakeholders. The white paper and infographic highlight key friction points that emerge in the VPC process, including:

Age verification requires additional data collection: 

As written, Utah’s proposed legislation would require companies to affirmatively verify the age of all Utah residents. A key pillar of privacy legislation and best practices is the principle of data minimization and not collecting information beyond what is necessary to provide a service. Requiring social media companies or their agents to collect this data would increase the risk of identity theft resulting from a data breach. We also note that since some social media companies are based outside of the United States (with some located in jurisdictions that have few effective privacy rules), there is an inherent security risk in the increased collection of sensitive data for age verification purposes.

Additionally, as written, Utah’s proposed legislation requires that ages be verified but does not define what “verify” means. Companies would benefit from clarity on whether age verification or age estimation is required. An example of age estimation might include capturing a “selfie” of a user to estimate the user’s age range. Verifying someone’s exact age almost always requires increased data collection compared with estimating an age range or age bracket. Some current age estimation technology can accurately distinguish a 12 year old from someone over 25, resulting in a much smaller number of users who would be required to provide sensitive identity documentation. Although methods of verification and forms or methods of identification will be established by further administrative rulemaking, compliance with the proposed legislation as written may still compel companies to require government-issued ID to access their services.

Protecting children and teens online should include increasing privacy protections: 

FPF knows that children and teens deserve privacy protections and has highlighted Utah’s leadership in this space, notably in the education context. However, a one-size-fits-all approach may not be appropriate given developmental differences between young children and teens. Just as children under 13 can access services with safeguards under COPPA, teens stand to benefit from online services such as socializing with peers, distant family, and communities. Utah’s legislation proposes to restrict access to services rather than enhancing privacy protections on those services. Enhanced privacy protections could benefit not only children but adults as well. Because many parents may ultimately choose to provide consent, it is important to consider how privacy protections could be implemented on online services.

These youth-focused proposals follow last year’s passage of the Utah Consumer Privacy Act – a comprehensive privacy law that created some new rights for Utah residents but provides fewer protections than other state frameworks. Adding privacy protections for young people would not only help Utah align with other states but also address several of the privacy risks the social media bills would create. Examples of privacy-protective provisions include: