FPF to Present First-Ever Research Data Stewardship Award

Nominations Requested by March 12, 2020

Today, the Future of Privacy Forum (FPF) is announcing a first-of-its-kind award recognizing privacy protective research collaboration between a company and academic researchers. When privately held data is responsibly shared with academic researchers, it can support significant progress in medicine, public health, education, social science, and other fields.

With this in mind, FPF is requesting nominations for its Award for Research Data Stewardship. The goal is to promote the safe use and transfer of privately held company data to academic institutions for study and analysis. The award is supported by the Alfred P. Sloan Foundation, a not-for-profit grantmaking institution that supports high-quality, impartial scientific research and institutions.

“Increasingly, the challenges facing our society – health, transportation, education – are being addressed by independent research on consumer data collected by private companies,” said Jules Polonetsky, CEO of the Future of Privacy Forum. “This award recognizes projects that minimize potential privacy risks while helping academics access corporate data for research that benefits society.”

Academics and their corporate partners are invited to nominate a successful data-sharing project that reflects privacy-protective approaches to data protection and ethical data sharing. Nominations will be reviewed and selected by an Award Committee composed of representatives from FPF, leading foundations, academia, and industry. Nominated projects will be judged on several factors, including their adherence to privacy protection in the sharing process, the quality of the data handling process, and the company’s commitment to supporting the academic research. The award winner will be notified by Monday, March 16, 2020.

FPF will present the award at an April 6, 2020 gala in Washington, DC to a member of the academic research team and a senior executive at the company that provided the data. Applicants should apply by filling out the corporate and academic nomination forms by Thursday, March 12, 2020. Self-nominations and nominations from the public are welcome. Read more about the award and the event in the call for nominations, and email Kelsey Finch, FPF Senior Counsel, at [email protected] with any questions.

FPF Director of AI & Ethics Testifies Before Congress on Facial Recognition

WASHINGTON, D.C. – In a hearing today before the House Committee on Oversight and Reform, Future of Privacy Forum (FPF) Senior Counsel and Director of AI and Ethics Brenda Leong testified on the privacy and ethical implications of the commercial use of facial recognition technology.

“Technology has only accelerated the practice of identification and tracking of people’s movements, whether by governments, commercial businesses, or some combination thereof, leading to the real concerns about an ultimate state of ubiquitous surveillance,” wrote Leong. “How our society faces these challenges will determine how we move further into the conveniences of a digital world, while continuing to embrace our fundamental ideals of personal liberty and freedom.”

In her testimony, Leong emphasized that “not every camera-based system is a facial recognition system,” and that the term facial recognition is often broadly and confusingly used in reference to other image-based technologies that do not necessarily involve individual identification.

“Understanding how particular image-analysis technology systems work is a critical foundation for effectively understanding and evaluating the risks of facial recognition,” Leong noted in her written testimony. To help educate policymakers, consumers, and others about the varying levels of facial image software and associated benefits and risks, and privacy implications of each, FPF created the infographic, Understanding Facial Detection, Characterization, and Recognition Technologies.

Leong outlined a set of privacy principles created by FPF that should be considered as the foundation of any facial recognition-specific legislation, writing, “consent remains the critical factor, and should be tiered based on the level of personal identification collected or linked, and the associated increasing risk levels.” Leong highlighted that the default standard for consent should be an “opt-in” or “affirmative consent” model consistent with existing FTC guidelines.

As educational institutions across the country, including colleges and public school districts, consider the use of facial recognition technology on campus, Leong pointed to guidance in the privacy principles that calls for policymakers to: “Give special consideration to the age, sophistication, or degree of vulnerability of those individuals, such as children, in light of the purposes for which facial recognition technology is used, including whether additional levels of transparency, choice, and data security are required.” She also testified that “there is no good justification for the use of facial recognition in a K-12 school.”

In 2019, FPF held a webinar about facial recognition in schools and wrote to the New York State Legislature in support of a well-crafted moratorium on facial recognition systems for security uses in public schools, while cautioning against overly broad bans or language that might have unintended consequences on other security programs.

In her written testimony, Leong cited controversial developments surrounding the implementation of passports and the requirement that they include a photo, resistance to calls for a federally issued national ID card, and REAL ID requirements for state licensing as precedent for policymakers seeking to balance individual rights and freedoms with efficiencies and security.

“These historical discussions reflect the ongoing need to determine the appropriate balance of technological, legal and policy standards and protections, along with the underlying threshold question of whether some systems are simply too high risk to implement regardless of perceived benefits,” wrote Leong.

To read Leong’s written testimony, click here. For an archived livestream of the committee hearing, visit https://oversight.house.gov/. To learn more about the Future of Privacy Forum, visit www.fpf.org, where you can also explore FPF’s facial recognition and biometrics work, including relevant resources, recommended reading, and other materials.

CONTACT: [email protected]

Award-Winning Paper: "The Many Revolutions of Carpenter"

For the tenth year, FPF’s annual Privacy Papers for Policymakers program is presenting to lawmakers and regulators award-winning research representing a diversity of perspectives. Among the papers to be honored at an event at the Hart Senate Office Building on February 6, 2020 is The Many Revolutions of Carpenter by Paul Ohm of Georgetown University Law Center. The paper’s detailed assessment of the 2018 Supreme Court opinion in Carpenter v. United States is an essential read for those interested in the changing conception of privacy in the criminal justice system.


The Supreme Court’s 2018 majority opinion in Carpenter v. United States, the author argues, is the most important Fourth Amendment opinion in decades. The opinion requires the police to obtain a warrant to access an individual’s historical whereabouts from the records of a cell phone provider.

Ohm states that Carpenter represents a new approach to the “reasonable expectation of privacy” test: “Until now, the Supreme Court has tended to pay more attention to the nature of the police intrusion required to obtain information than to the nature of the information obtained.” In Carpenter, the justices argued that individuals have a “reasonable expectation of privacy in the whole of their physical movements,” suggesting that data tracking those movements should be considered private and subject to warrant requirements.

Ohm notes that the Carpenter opinion serves as the death of the “third party doctrine” – an idea that holds that information a person voluntarily discloses to a third party is not protected by a reasonable expectation of privacy. The justices write: “the fact that the Government obtained the information from a third party does not overcome Carpenter’s claim to Fourth Amendment protection.” Ohm points out that the justices focused on the nature of the information rather than the structure of the database or its relation to the individual, likely ensuring that this opinion will apply to other massive collections of historical geolocation information.

Finally, Carpenter creates a previously unrecognized rule of “technological equivalence.” Ohm explains: “If a technology, or a near-future improvement, gives police the power to gather information that is the ‘modern-day equivalent’ of activity that has been held to be a Fourth Amendment search, the use of that technology is also a search.” The justices acknowledge that information technology is exceptional – different in kind, not merely in degree, from what has come before.

If you’re interested in reading more about how Carpenter v. United States represents an inflection point in Fourth Amendment court cases concerning privacy, you’ll want to check out the full paper.


The Privacy Papers for Policymakers project’s goal is to put diverse academic perspectives in front of policymakers to inform the development of privacy legislation. You can view all of this year’s award-winning papers on the FPF website.

Future of Privacy Forum Releases Analysis of Washington Privacy Act

FPF CEO: “Most comprehensive state privacy legislation proposed to date”

WASHINGTON, DC – January 13, 2020 – The Future of Privacy Forum today released an in-depth analysis of the Washington Privacy Act (Washington State Senate Bill 6281), as well as the following statement by Future of Privacy Forum CEO Jules Polonetsky about the bill:

“The Washington Privacy Act is the most comprehensive state privacy legislation proposed to date. The bill addresses concerns raised last year and proposes strong consumer protections that go beyond the California Consumer Privacy Act. It includes provisions on data minimization, purpose limitations, privacy risk assessments, anti-discrimination requirements, and limits on automated profiling that other state laws do not.”

According to the FPF analysis, the Act would be a holistic, GDPR-like comprehensive law that: (1) provides protections for residents of Washington State; (2) grants individuals core rights to access, correct, delete, and port data; (3) creates rights to opt out of sale, profiling, and targeted advertising; (4) imposes obligations to perform risk assessments; (5) requires opt-in consent for the processing of sensitive data; and (6) creates collection and use limitations. In addition, the Act contains provisions for controllers and processors utilizing facial recognition services.

READ THE FPF ANALYSIS OF THE WASHINGTON PRIVACY ACT.

It’s Raining Privacy Bills: An Overview of the Washington State Privacy Act and other Introduced Bills

By Pollyanna Sanderson (Policy Counsel), Katelyn Ringrose (Christopher Wolf Diversity Law Fellow) & Stacey Gray (Senior Policy Counsel)


Today, on the first day of a rapid-fire 2020 legislative session in the state of Washington, State Senator Carlyle has introduced a new version of the Washington Privacy Act (WPA). Legislators revealed the Act during a live press conference on January 13, 2020 at 2:00pm PST. Meanwhile, nine other privacy-related bills were introduced into the House today by Representative Hudgins and Representative Smith. 

If passed, the Washington Privacy Act would enact a comprehensive data protection framework for Washington residents that includes individual rights that mirror and go beyond the rights in the California Consumer Privacy Act (CCPA), as well as a range of other obligations on businesses that do not yet exist in any U.S. privacy law.

“The Washington Privacy Act is the most comprehensive state privacy legislation proposed to date,” said Jules Polonetsky, CEO of the Future of Privacy Forum. “The bill addresses concerns raised last year and proposes strong consumer protections that go beyond the California Consumer Privacy Act. It includes provisions on data minimization, purpose limitations, privacy risk assessments, anti-discrimination requirements, and limits on automated profiling that other state laws do not.”

Earlier Senate and House versions of the Washington Privacy Act narrowly failed to pass last year in the 2019 legislative session. Read FPF’s comments on last year’s proposal. The version introduced today contains strong provisions that largely align with the EU’s General Data Protection Regulation (GDPR), and commercial facial recognition provisions that start with a legal default of affirmative consent. Nonetheless, legislators must work within a remarkably short timeframe to pass a law that can be embraced by both the House and Senate within the next six weeks of Washington’s legislative session.

Below, FPF summarizes the core provisions of the bill, which if passed would go into effect on July 31, 2021. The Act would be a holistic, GDPR-like comprehensive law that: (1) provides protections for residents of Washington State; (2) grants individuals core rights to access, correct, delete, and port data; (3) creates rights to opt out of sale, profiling, and targeted advertising; (4) creates a nuanced approach to pseudonymized data; (5) imposes obligations on processors and controllers to perform risk assessments; (6) creates collection, processing, and use obligations; and (7) requires opt-in consent for the processing of sensitive data. In addition, the Act contains provisions for controllers and processors utilizing facial recognition services.

Read the Bill Text HERE. The nine other bills introduced today are listed at the end of this blog post.

Update (1/21/20): A substitute bill was released on January 20 by Senator Carlyle and cosponsors (see PSSB 6281). At 10:00am on January 23, the Senate committee on Environment, Energy & Technology will hold a hearing on this and other bills.

1. Jurisdictional and Material Scope

The Act would provide comprehensive data protections to Washington State residents, and would apply to entities that (1) conduct business in Washington or (2) produce products or services targeted to Washington residents. To be covered, such entities must control or process the data of at least 100,000 consumers, or derive 50% of gross revenue from the sale of personal data and process or control the personal data of at least 25,000 consumers (with “consumers” defined as natural persons who are Washington residents, acting in an individual or household context). The Act would not apply to state and local governments or municipal corporations.

The Act would regulate companies that process “personal data,” defined broadly as “any information that is linked or reasonably linkable to an identified or identifiable natural person.” The definition excludes de-identified data and publicly available information, meaning “information that is lawfully made available from federal, state, or local government records,” and the Act includes specific provisions for pseudonymous data (see below, Obligations for De-identified and Pseudonymous Data).

2. Individual Rights to Access, Correct, Delete, Port, and Opt-Out of Data Processing

The Act would require companies to comply with basic individual rights to request access to their data, correct or amend that data, delete their data, and access it in portable format (“portable and, to the extent technically feasible, readily usable format that allows the consumer to transmit the data… without hindrance, where the processing is carried out by automated means”). These rights could not be waived in contracts or terms of service, and would be subject to certain limitations (for example, retaining data for anti-fraud or security purposes).

Along with these core rights, the Act would also grant consumers the right to explicitly opt out of the processing of their personal data for the purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that produce legal, or similarly significant, effects. Such effects include the denial of financial and lending services, housing, insurance, education enrollment, employment opportunities, health care services, and more. Unlike the CCPA, the Act would not prescribe specific opt out methods (like a “Do Not Sell My Information” button on websites), but instead require that opt-out methods be “clear and conspicuous.” It would also commission a government study on the development of technology, such as a browser setting, browser extension, or global device setting, for consumers to express their intent to opt out. 

For all of these individual rights, companies would be required to take action free of charge, up to twice per year, within 45-90 days (except in cases where requests cannot be authenticated or are “manifestly unfounded or excessive”). Importantly, the law would also require that companies establish a “conspicuously available” and “easy to use” internal appeals process for refusals to take action. With the consumer’s consent, the company must submit the appeal to the Washington Attorney General, along with whether any action was taken and a written explanation of the outcome; the Attorney General must make such information publicly available on its website. When consumers make correction, deletion, or opt-out requests, the Act would oblige controllers to take “reasonable steps” to notify third parties to whom they have disclosed the personal data within the preceding year.

Finally, the Act would prohibit companies from discriminating against consumers for exercising these individual rights. Such discrimination could include denying goods or services, charging different prices or rates for goods or services, or providing a different level or quality of goods or services.

3. Obligations for De-identified and Pseudonymous Data

Under the Act, companies processing “pseudonymous data” would not be required to comply with the bulk of the core individual rights (access, correction, deletion, and portability) when they are “not in a position” to identify the consumer, subject to reasonable oversight. Notably, the Act defines pseudonymous data consistently with the GDPR’s definition of pseudonymization, as “personal data that cannot be attributed to a specific natural person without the use of additional information, provided that such additional information is kept separately and is subject to appropriate technical and organizational measures to [protect against identification].” This is also consistent with the Future of Privacy Forum’s Guide to Practical Data De-Identification. Pseudonymous data is often harder to authenticate or link to individuals, and can carry lessened privacy risks. For example, unique pseudonyms are frequently used in scientific research (e.g., in a HIPAA Limited Dataset, John Doe = 5L7T LX619Z). 
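For illustration only, here is a minimal sketch of how a dataset holder might generate pseudonyms of this kind, assuming a keyed hash (HMAC) whose secret key serves as the “additional information” kept separately from the data; the field names, key handling, and pseudonym format are hypothetical and are not drawn from the bill.

```python
import hmac
import hashlib

# Hypothetical secret key; under the pseudonymization model described above,
# this "additional information" is stored separately from the dataset and
# protected by technical and organizational measures.
SECRET_KEY = b"kept-separately-under-strict-controls"

def pseudonymize(direct_identifier: str) -> str:
    """Map a direct identifier (e.g., a name or patient ID) to a stable pseudonym.

    Without access to SECRET_KEY, the pseudonym cannot reasonably be
    attributed back to a specific natural person.
    """
    digest = hmac.new(SECRET_KEY, direct_identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:12].upper()

# Example record: the direct identifier is replaced before the data is shared.
record = {"name": "John Doe", "visit_date": "2020-01-13", "diagnosis_code": "J45"}
shared_record = {**record, "name": pseudonymize(record["name"])}
print(shared_record)
```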

In addition, companies may refuse to comply with requests to access, correct, delete, or port data if the company: (A) is not reasonably capable of associating the request with the personal data, or it would be unreasonably burdensome to associate the request with the personal data; (B) does not use the personal data to recognize or respond to the data subject, or associate the personal data with other data about the same specific consumer; and (C) does not sell personal data to any third party or otherwise voluntarily disclose the personal data to any third party other than a processor (service provider). 

Importantly, other requirements of the overall bill, including Data Protection Assessments (below), and the right to Opt Out of data processing for targeted advertising, sale, and profiling (above) would still be operational for pseudonymous data.

Finally, the Act would not apply to de-identified data, defined as “data that cannot reasonably be used to infer information about, or otherwise be linked to, an identified or identifiable natural person, or a device linked to such person,” subject to taking reasonable measures to protect against re-identification, including contractual and public commitments. This definition aligns with the FTC’s longstanding approach to de-identification. 


4. Obligations of Processors (Service Providers)

In a structure that parallels the GDPR, the Act distinguishes between data “controllers” and data “processors,” establishing different obligations for each. Almost all of the Act’s obligations fall on the controller, defined as the “natural or legal person which, alone or jointly with others, determines the purposes and means of the processing of personal data.”

Data processors, defined as any “natural or legal person who processes personal data on behalf of a controller,” must adhere (as service providers) to controllers’ instructions and help them meet their obligations. Notwithstanding controller instructions, processors must maintain security procedures that take into account the context in which personal data is processed, ensure that individuals processing the data understand their duty of confidentiality, and engage a subcontractor only after the controller has had the chance to object. At the request of the controller, processors must delete or return personal data. Processors must also aid in the creation of data protection assessments.

5. Transparency (Privacy Policies)

The Act would require companies to provide a Privacy Policy to consumers that is “reasonably accessible, clear, and meaningful,” including a set of required disclosures about their data practices.

Additionally, if a controller sells personal data to third parties or processes data for certain purposes (i.e. targeted advertising), they would be required to clearly and conspicuously disclose such processing, as well as how consumers may exercise their right to opt out of such processing. 

6. Data Protection Assessments

Companies would be required under the Act to conduct confidential Data Protection Assessments for all processing activities involving personal data, and again any time there are processing changes that materially increase risks to consumers. In contrast, the GDPR requires Data Protection Impact Assessments only when profiling leads to automated decision-making having a legal or significant effect upon an individual (such as credit approval), when profiling is used for evaluation or scoring based on aspects concerning an individual’s economic situation, health, personal preferences or interests, reliability or behavior, location or movements, or when it is conducted at large-scale on datasets containing sensitive personal data.

Under the WPA, in weighing benefits against the risks, controllers must take into account factors such as reasonable consumer expectations, whether data is deidentified, the context of the processing, and the relationship between the controller and the consumer. If the potential risks of privacy harm to consumers are substantial and outweigh other interests, then the controller would only be able to engage in processing with the affirmative consent of the consumer (unless another exemption applies, such as anti-fraud measures and research). 

7. Sensitive Data 

Companies must obtain affirmative, opt-in consent to process any personal data the Act defines as “sensitive.”

Although the Act requires consent to process the data of a “known child,” without specifying when a child is “known” to a controller, it notably also exempts data covered by the Family Educational Rights and Privacy Act (FERPA) and entities that are compliant with the Children’s Online Privacy Protection Act (COPPA). The Act defines a child as a natural person under age thirteen, meaning it does not follow the approach of the CCPA and other bills around the country that extend child privacy protections to teenagers.

8. Collection, Processing, and Use Limitations 

In addition to consumer controls and individual rights, the Act would create additional obligations on companies that align with the GDPR, including purpose limitation and data minimization requirements.

The obligations imposed by the Act would not restrict processing personal data for a number of specified purposes. Those exemptions include cooperating with law enforcement agencies, performing contracts, providing requested products or services to consumers, processing personal data for research, consumer protection purposes, and more. If processing falls within an enumerated exception, that processing must be “necessary, reasonable, and proportionate” in relation to a specified purpose. Controllers and processors are also not restricted from collecting, using, or retaining data for specific purposes such as conducting internal product research, improving product and service functionality, or performing internal operations reasonably aligned with consumer expectations. 

9. Enforcement

The Act would not grant consumers a private right of action. Instead, it would give the Attorney General exclusive authority to enforce the Act, with civil penalties for controllers and processors in violation capped at $7,500 per violation. Penalties would be deposited into a “Consumer Privacy Account” in the state treasury and used solely to fund the Office of Privacy and Data Protection. The Attorney General would also be tasked with compiling a report evaluating the effectiveness of enforcement actions, along with any recommendations for changes.

10. Commercial Facial Recognition 

In addition to its baseline requirements, the Act contains provisions specifically regulating commercial uses of facial recognition. The Act would require affirmative, opt-in consent as a default requirement, and place heightened obligations on both controllers and processors of commercial facial recognition services, particularly with respect to accuracy and auditing, with a focus on preventing unfair performance impacts. A limited exception is provided for uses such as tracking the unique number of users in a space, when data is not maintained for more than 48 hours and users are not explicitly identified.

Definitions

The Act provides a number of core definitions that are relevant only to the facial recognition provisions (Section 18, the final section of the bill). Given the standalone nature of this section of the overall bill, these definitions are particularly consequential. The term “facial recognition service” is defined as technology that analyzes facial features and is used for identification, verification, or persistent tracking of consumers in still or video images.

The Act also sets out several additional definitions specific to these provisions.

Additional Duties on “Processors” and “Controllers” of Facial Recognition Services

The Act would place affirmative duties on processors, or service providers (see above for definitions of controller and processor under the Act), when they provide facial recognition services. Those duties include enforcing current provisions against illegal discrimination, as well as providing an API or other means for controllers and third parties to conduct fairness and accuracy tests. If such tests reveal unfair performance differences (e.g. bias based on a protected characteristic), the processor must develop and implement a plan to address those differences.
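As a rough illustration of the kind of fairness and accuracy testing such an API is meant to enable, the sketch below compares match accuracy across demographic groups on labeled test data and flags large gaps. The group labels, the tolerance threshold, and the structure of the test results are hypothetical stand-ins, not anything specified by the bill.

```python
from collections import defaultdict

# Hypothetical labeled test set: each item records the demographic group,
# whether the two images show the same person, and the service's prediction.
test_results = [
    {"group": "group_a", "same_person": True,  "predicted_match": True},
    {"group": "group_a", "same_person": False, "predicted_match": False},
    {"group": "group_b", "same_person": True,  "predicted_match": False},
    {"group": "group_b", "same_person": False, "predicted_match": False},
    # ... in practice, thousands of labeled pairs per group
]

MAX_ACCURACY_GAP = 0.05  # hypothetical tolerance for cross-group differences

def accuracy_by_group(results):
    """Compute simple match accuracy for each demographic group."""
    totals, correct = defaultdict(int), defaultdict(int)
    for r in results:
        totals[r["group"]] += 1
        if r["predicted_match"] == r["same_person"]:
            correct[r["group"]] += 1
    return {g: correct[g] / totals[g] for g in totals}

scores = accuracy_by_group(test_results)
gap = max(scores.values()) - min(scores.values())
if gap > MAX_ACCURACY_GAP:
    # Under the Act, unfair performance differences would require the
    # processor to develop and implement a plan to address them.
    print(f"Unfair performance difference detected (gap={gap:.2%}): {scores}")
else:
    print(f"Accuracy within tolerance: {scores}")
```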

Controllers must also take affirmative steps to post notice in public spaces where facial recognition services are deployed; obtain consent from consumers prior to enrollment in a service operating in physical premises open to the public; ensure meaningful review for potentially harmful uses of the service; test the service and take reasonable steps to ensure quality standards; and engage in staff training. Conspicuous public notice includes, at a minimum, the purpose for which the technology is deployed and information about where consumers can obtain additional information (e.g. a link for consumers to exercise their rights). 

Consent would not be required for enrolling images for security or safety purposes, but the consumer must have engaged in or be suspected of engaging in criminal activity (e.g. shoplifting); the controller must review the safety/security database no less than biannually and remove templates from individuals no longer under suspicion or who have been in the database for more than three years; and, finally, the controller must have an internal process whereby a consumer may correct or challenge enrollment. Furthermore, controllers must ensure that decisions which could pose legal or significant harms (e.g. the loss of employment opportunities, housing, etc.) are subject to meaningful human review. 

Finally, the Act would prohibit controllers from disclosing personal data obtained from a facial recognition service to law enforcement, unless: required by law in response to a warrant, subpoena, or legal order; the controller has a good faith belief that disclosure is necessary to prevent or respond to an emergency involving danger of death or serious physical injury to any person; or the disclosure is made to send information to the national center for missing and exploited children. In addition to these duties, controllers must also comply with consumer requests outlined elsewhere in the Act.

Insight: Senator Nguyen (jointly with Senator Carlyle and others) has introduced a separate bill regulating state and local government agency uses of facial recognition technologies. In a recent news article, he stated that he did so in order to avoid getting “caught up in any potential political fight.”

OTHER WASHINGTON STATE HOUSE BILLS INTRODUCED TODAY

Washington legislators have been busy drafting a number of other consumer privacy bills. The following nine House Bills, filed by Representatives Smith (D) and Hudgins (D), were also introduced on January 13, 2020 and are intended to accompany the WPA.

Did we miss anything? Let us know at [email protected] as we continue tracking developments in Washington State.

Statement by Future of Privacy Forum CEO Jules Polonetsky on the Washington Privacy Act

WASHINGTON, DC – January 13, 2020 – Statement by Future of Privacy Forum CEO Jules Polonetsky regarding the introduction of the Washington Privacy Act (Washington State Senate Bill 6281):

“The Washington Privacy Act is the most comprehensive state privacy legislation proposed to date. The bill addresses concerns raised last year and proposes strong consumer protections that go beyond the California Consumer Privacy Act. It includes provisions on data minimization, purpose limitations, privacy risk assessments, anti-discrimination requirements, and limits on automated profiling that other state laws do not.”

READ THE FPF ANALYSIS OF THE WASHINGTON PRIVACY ACT.

# # #

ICYMI: National PTA, Future of Privacy Forum Host Student Privacy Briefing for Parents

On December 12, the Future of Privacy Forum (FPF) and National PTA co-hosted a webinar for parents to learn more about the critical importance of safeguarding their child’s data privacy at school. FPF Director of Youth & Education Privacy Amelia Vance led the discussion about key student privacy laws and trends.

As school districts across the country cope with a wave of new privacy requirements – more than 130 new laws specific to student privacy have passed in 41 states since 2013 – Vance highlighted the ongoing but critical challenge for states to balance access to student data with privacy and security, noting several instances where new laws have inadvertently created limitations on students’ educational opportunities.

“Oftentimes legislatures have to go back and fix the laws… because they didn’t check in with teachers, they didn’t check in with administrators, they didn’t talk to parents about what are the most important things to them,” Vance said. “[Not only about] what are the privacy protections that are important, but also what are the services being provided to your kids that are most important?”

Specifically, Vance cited examples in New Hampshire and Louisiana where strict new privacy laws raised questions about whether schools were permitted to conduct routine activities such as hanging student artwork in hallways, sharing classroom recordings with special education or homebound students, or even producing a yearbook.

Parents play a critical role in bringing these types of issues to light, and the webinar included a review of key questions for parents to ask schools to better understand their privacy policies. Additionally, parents received tips for navigating practical scenarios such as keeping photos of a child off of social media. The webinar highlighted FPF and National PTA’s Parent’s Guide to Student Data Privacy with additional tools and resources for parents.

Vance observed that policymakers’ recent focus on new restrictions for districts, schools, and edtech companies is shifting. “We now see fewer bills imposing more privacy protections, and more bills adding data sharing or ways of surveilling students that could potentially cause harm down the road,” Vance noted.

During the presentation, Vance pointed to a particularly concerning example of overly broad data sharing: Florida’s new “School Safety Portal.” In an effort to prevent school violence, the controversial portal allows school threat assessment teams to access students’ personal information, including whether a child is in foster care or has been bullied due to their disability or sexual orientation.

However, Vance cautioned: “We know that data is not relevant to whether or not a student is a threat, and so it has the potential to cause additional bias and harm students, instead of pointing out actual threats.” FPF and 32 other disability, privacy, education, and civil rights groups first sounded the alarm about this database in a July 2019 letter to Florida Governor Ron DeSantis.

Click here to watch the full webinar, and access additional student privacy resources for parents here.

To learn more about the Future of Privacy Forum, visit www.fpf.org and subscribe to FPF’s student privacy newsletter.

CONTACT

[email protected]

Award-Winning Paper: "Privacy's Constitutional Moment and the Limits of Data Protection"

For the tenth year, FPF’s annual Privacy Papers for Policymakers program is presenting to lawmakers and regulators award-winning research representing a diversity of perspectives. Among the papers to be honored at an event at the Hart Senate Office Building on February 6, 2020 is Privacy’s Constitutional Moment and the Limits of Data Protection by Woodrow Hartzog of Northeastern University School of Law and Neil Richards of the Washington University School of Law. Whatever your perspective on potential federal privacy legislation, you’ll find this paper to be thought-provoking.


The authors present a case for national privacy legislation that looks beyond data protection and fair information practices (FIPs) – the central elements of the EU’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA).

They argue that privacy faces a “constitutional moment” that presents an opportunity to define the structure of our budding digital society. Their position is that while data-protection-focused legislation represents an important step, data protection alone is insufficient. Instead, they suggest an approach to privacy that addresses corporate matters, trustworthy relationships, data collection and processing, and personal data’s externalities.

Corporate Privacy Matters

Hartzog and Richards argue that a framework for privacy legislation should address market power and corporate structures. For example, they write: “Privacy law should be concerned with a number of corporate matters, including limiting how the corporate form is used to shield bad actors from personal liability.” By empowering privacy officers within the corporate structure with more decision-making abilities and protection from executive pushback, policymakers could promote improved corporate responsibility. Additionally, the paper proposes increased antitrust enforcement to broadly limit the power of corporations.

Relational Privacy

The authors recommend passing legislation designed to protect the trust that people place in companies when they share personal information. Legislation could foster discretion, honesty, and loyalty to create trusting relationships between data collectors and individuals. Hartzog and Richards explain that “If companies are to keep the trust they have been given, it is not enough to be merely passively ‘open’ or ‘transparent.’ Trust requires an affirmative obligation of honesty to correct misinterpretations and to actively dispel notions of mistaken trust.”

Information Collection and Processing

Though the authors argue that the GDPR alone would be inadequate in the United States, they recognize the importance of data protection. They recommend stricter limits on data collection, rigid mandatory deletion requirements, and the prioritization of obscurity to improve upon the data protection foundations present in the GDPR.

The External Impacts of Data Collection

The authors claim that the effects of data collection go far beyond the individual, impacting the environment, mental health, civil rights, and democratic values – concerns that could be addressed in privacy legislation.

If you’re interested in “outside-the-box” thinking about the foundations of potential privacy legislation, you’ll want to read the full paper.


The Privacy Papers for Policymakers project’s goal is to put diverse academic perspectives in front of policymakers to inform the development of privacy legislation. You can view all of this year’s award-winning papers on the FPF website. For more information or to RSVP, please visit this page.