FPF Participates in FTC Event on “Commercial Surveillance and Data Security” Proposed Rulemaking
Yesterday, FPF Senior Director for U.S. Policy Stacey Gray participated in a panel discussion hosted by the Federal Trade Commission (“FTC”) regarding its Advance Notice of Proposed Rulemaking (“ANPR”) on “Commercial Surveillance and Data Security” (comments start at 1:39:00). Feedback from the public forum is intended to help inform the Commission’s decision on whether to proceed with rulemaking and what form a new market-wide rule governing consumer privacy could take.
As a panelist, Stacey Gray urged the Commission to move forward with its rulemaking proposal, noting that exponential increases in both the benefits and harms of data collection in our daily lives make this the right time to establish national rules on what constitutes unlawful behavior with respect to the collection and use of personal data. Highlighting potential regulatory solutions, Gray urged the Commission to codify the standards set in existing case settlements, including requirements for accurate disclosures and reasonable data security practices, and to apply the Commission’s “unfairness authority” to reform business practices that result in data-driven discrimination and harmful secondary uses of personal information.
The public forum included two expert panels, one presenting industry perspectives and one presenting consumer advocate perspectives on the consumer data issues implicated by the rulemaking. In addition, remarks from the Commissioners, as well as the questions posed by the panel moderators, may offer further insight into how the FTC is approaching rulemaking on consumer harms in the present digital ecosystem.
Panel 1: Industry Perspectives
The first panel was moderated by Olivier Sylvain, senior advisor to FTC Chair Khan. In addition to asking about the restrictions that a new privacy rule should create, Mr. Sylvain’s questions covered existing industry best practices (including for the retention of sensitive data), ways the Commission can incentivize best practices short of rulemaking, and current market incentives to collect data.
While the ANPR broadly defines “commercial surveillance” to include “collection, aggregation, analysis, retention, transfer, or monetization of consumer data,” industry panelists stressed that there is a wide range of uses of personal data that create different risks, depending on context. For example, Digital Content Next’s Jason Kint argued that while first-party use of data to tailor experiences is expected by consumers, secondary uses (including targeted advertising) tend to violate these expectations. The National Retail Federation’s Paul Martino agreed that there are greater risks inherent to data collection and processing by third-party businesses, which may lack incentives to develop long-term customer relationships.
On the topic of best practices, panelists paid particular attention to data security. Mozilla’s Marshall Erwin described a “universally accepted” (though not universally adopted) consensus set of data security practices that includes the encryption of personal information in transit, employee access controls, and password standards. Mr. Martino further pointed to controls like multi-factor authentication, anti-malware and antivirus software, and patching, though he stressed that there is no “one size fits all” approach to cybersecurity standards.
The Partnership on AI’s Rebecca Finlay encouraged the Commission to review data governance models emerging in jurisdictions outside the U.S. to evaluate the merits of different regulatory approaches. She specifically highlighted the privacy interests of children and the United Kingdom’s recent Age Appropriate Design Code, which includes transparency and data minimization standards. Mr. Erwin also highlighted the need to protect children’s privacy, while cautioning that some approaches can result in “privacy theater” with minimal tangible benefit.
Panel 2: Consumer Perspectives
The second panel was moderated by Rashida Richardson, Attorney Advisor to the FTC. Ms. Richardson’s questions underscored the Commission’s focus on civil rights and on children’s and teenagers’ privacy, as well as its interest in ensuring that requirements placed on industry are in fact privacy- and security-protective. She asked the panelists about the unique impacts of online tracking and data collection on members of protected classes and on children and teenagers, and about the extent to which data minimization and transparency requirements are effective tools to combat the harms associated with widespread collection of personal data. Finally, she asked about the limitations of the traditional notice-and-consent model for protecting consumer privacy.
Members of the panel signaled strong support for the FTC’s efforts to establish clear, national standards regarding what constitutes unfair or deceptive data collection, storage, and use. EPIC’s Caitriona Fitzgerald spoke to the inability of many individuals to understand or protect themselves from harmful data collection online in the absence of regulatory intervention. Upturn’s Harlan Yu and the Joint Center for Political and Economic Studies’ Spencer Overton focused on marketplace harms borne by members of historically marginalized and protected groups in critical areas such as housing, education, and voting. Citing examples of housing and employment discrimination enabled by widespread data collection, they urged the Commission to place limits on the ability of data brokers and other parties to collect and aggregate certain sensitive types of data. The German Marshall Fund of the U.S.’s Karen Kornbluh added that online data collection and aggregation, when deployed to interfere with elections or track members of the armed services, poses national security as well as privacy risks.
FPF’s Stacey Gray noted that, when applying the unfairness standard, the Commission should be mindful that fairness determinations “inherently involve balancing, context, and policy tradeoffs,” emphasizing that “many secondary uses of data can and should enable academic research, support for public health, fraud detection, and perhaps, to a reasonable extent, advertising-supported content.” Mr. Overton returned to this theme, noting that data-enabled targeted messaging can be positive when it provides individuals with information that is particularly relevant to them, such as messaging about sickle cell disease aimed at African-American audiences.
Commissioners Weigh In
In opening the public forum, Chair Khan noted that digital tools can deliver “huge conveniences” but also contribute to the tracking and surveillance of individuals in entirely new ways. She further emphasized the legal tests that the Commission must satisfy if it is to proceed with rulemaking. Commissioner Slaughter spoke favorably of efforts to enact comprehensive federal privacy legislation, but emphasized that until there is a law on the books, the Commission must make use of all its enforcement tools to investigate and address unlawful behavior. Her comments highlighted harms to adolescents who are not covered by existing children’s privacy laws, as well as harms resulting from AI and advanced algorithms.
Commissioner Bedoya spoke following the panel presentations, stressing the importance of the Commission receiving a broad array of first-hand consumer accounts of unfair and deceptive practices. Picking up on points raised by FPF’s Stacey Gray on the history of “unfairness” in U.S. privacy law, Bedoya also noted that the ANPR’s broad scope reflects the sum total of historical privacy frameworks in the United States, such as the Brandeis-Warren ‘Right to Privacy’ and the Fair Information Practice Principles (FIPPs), which go beyond mere ‘notice and consent’ protections. Commissioners Wilson and Phillips, who both voted against the FTC’s ANPR, did not participate in the event.
Next Steps
In addition to the public forum, the Commission will consider written responses to the ANPR in determining whether to proceed with a new privacy and data security rulemaking; the deadline for public comment is October 21, 2022.
The Commission’s 95-question ANPR covers a broad range of topics, seeking information on the prevalence and harms of particular industry practices (including in advanced algorithms, children’s data, and targeted advertising), potential regulatory interventions (such as data minimization, consent, and transparency), and remedies (such as first-time fining authority and “algorithmic disgorgement”).
Due to its expansive nature, the ANPR has been heralded for attempting to rein in invasive and unfair business practices, while critics have alleged that the proposal exceeds the Commission’s statutory authority. The Commission could pursue a range of possible directions in crafting new privacy and security rules for U.S. businesses, and stakeholders will be closely watching for additional indications from the Commission on what will come next.
View a video and transcript of the public forum here.