The Future of Privacy Forum (FPF) and 23 other education, healthcare, disability rights, data protection, and civil liberties organizations today released Education During a Pandemic: Principles for Student Data Privacy and Equity (available here). The Principles offer 10 guiding recommendations for schools as they rely on new technologies and data to facilitate remote, in-person, or […]
By Katelyn Ringrose, Christopher Wolf Diversity Law Fellow at the Future of Privacy Forum, and Christopher Wood, Executive Director of LGBT Tech, with thanks to Connor Colson, FPF Policy Intern. LGBTQ+ rights are, and have always been, linked with privacy. Over the years, privacy-invasive laws, practices, and norms have been used to oppress LGBTQ+ individuals […]
Authors: John Verdi (Vice President of Policy) and Katelyn Ringrose (Christopher Wolf Diversity Law Fellow) In July 2018, the Future of Privacy Forum released Privacy Best Practices for Consumer Genetic Testing Services. FPF developed the Best Practices following consultation with technical experts, regulators, leading consumer genetic and personal genomic testing companies, and civil society. The […]
Future of Privacy Forum (FPF) has received a grant to create an independent panel of experts for an ethical review process that can provide trusted vetting of corporate-academic research projects. FPF will establish a pool of respected reviewers to operate as a standalone, on-demand review board to evaluate research uses of personal data, and will create a set of transparent policies and processes to be applied to such reviews.
Yesterday, the Federal Commission on School Safety released a report detailing its conclusions, after holding a series of meetings and hearings in the wake of school shootings. Nearly every aspect of the Commission’s report focuses on sharing data and, thus, has privacy implications for students, teachers, and the public.
The Future of Privacy Forum has released a new guide, Disclosing Student Information During School Emergencies: A Primer for Schools, which offers four best practices for information disclosure and answers five frequently asked questions about FERPA’s requirements for sharing information during health or safety emergencies. Read more about this guide in the Future of Privacy Forum’s […]
Analysis of personal data can be used to improve services, advance research, and combat discrimination. However, such analysis can also create valid concerns about differential treatment of individuals or harmful impacts on vulnerable communities. These concerns can be amplified when automated decision-making uses sensitive data (such as race, gender, or familial status), impacts protected classes, or affects individuals’ eligibility for housing, employment, or other core services. When seeking to identify harms, it is important to appreciate the context of interactions between individuals, companies, and governments—including the benefits provided by automated decision-making frameworks, and the fallibility of human decision-making.
Today, the Future of Privacy Forum (FPF) and the Ohio State University’s Program on Data and Governance are holding a discussion of ethics, privacy, and practical research reviews in corporate settings. This timely event, which follows the White House’s call to develop strong data ethics frameworks, convenes corporate and academic leaders to discuss how to integrate ethical and privacy considerations into innovative data projects and research.
Please join the Future of Privacy Forum (FPF) and the Ohio State University’s Program on Data and Governance in Washington, DC, on Tuesday, June 14, 2016, for a discussion of ethics, privacy, and practical research reviews in corporate settings.
As the volume of consumer data grows, an increasing number of decisions previously made by humans are now made by algorithms. Many thought leaders have called for algorithmic transparency to ensure that these decisions aren’t leading to unfair or discriminatory outcomes, but algorithmic transparency is tricky to implement. Last December, FTC Commissioner Julie Brill acknowledged […]