Press Releases and Statements
Washington, DC – Today, Future of Privacy Forum, along with leading consumer genetic and personal genomic testing companies 23andMe, Ancestry, Helix, MyHeritage, and Habit, released Privacy Best Practices for Consumer Genetic Testing Services. The Best Practices provide a policy framework for the collection, protection, sharing, and use of Genetic Data generated by consumer genetic testing services. These services are commonly offered to consumers for testing and interpretation related to ancestry, health, nutrition, wellness, genetic relatedness, lifestyle, and other purposes.
College Park, MD – June 26, 2018 – Immuta and the Future of Privacy Forum (FPF) today announced the first-ever framework for practitioners to manage risk in artificial intelligence (AI) and machine learning (ML) models. Their joint whitepaper, Beyond Explainability: A Practical Guide to Managing Risk in Machine Learning Models, provides business executives, data scientists, and compliance professionals with a strategic guide for governing the legal, privacy, and ethical risks associated with this technology. Beyond Explainability aims to provide a template for effectively managing this risk in practice, with the goal of giving lawyers, compliance personnel, data scientists, and engineers a framework to safely create, deploy, and maintain ML, and to enable effective communication between these distinct organizational perspectives.
Washington, DC – Today, Future of Privacy Forum’s (FPF) Amelia Vance, Director of the Education Privacy Project, will deliver testimony in a hearing before the House Committee on Education and the Workforce, “Protecting Privacy, Promoting Data Security: Exploring How Schools and States Keep Data Safe.” In her prepared testimony, Vance will comment on how states, districts, and ed tech companies can work together to ensure student privacy.
Last week, the Future of Privacy Forum filed written comments in response to the California Public Utilities Commission’s proposed decision authorizing pilot programs for passenger service in Autonomous Vehicles. The CPUC is a consumer protection agency that oversees, among other areas, the provision of passenger service in the state. The proposed decision called for a number of criteria to be met by companies seeking to operate AV passenger service, including reporting of communications between passengers and remote operators of driverless AVs, as well as aggregated operations data.
Yesterday, the Future of Privacy Forum submitted written comments to members of the Minnesota House of Representatives in response to the pending student privacy bill, the Student Data Privacy Act (HF 1507). FPF expressed concerns about the proposed language of the bill, which would create conflicting requirements for schools and education technology companies, and likely cause unintended consequences for Minnesota schools and students.
We are thrilled to announce four new members of FPF’s Education Privacy Project. Led by Amelia Vance, Director of Education Privacy, the Project works to equip and connect parents, educators, state and local education agencies, ed tech companies, and other stakeholders with substantive practices, policies, and other solutions to address education privacy challenges.
Washington, DC – Today, the Future of Privacy Forum announced the launch of a new fellowship in memory of Elise Berkower. Elise was a senior privacy executive at global measurement and data analytics company Nielsen for nearly a decade and was a valued, longtime member of the FPF Advisory Board. FPF gratefully acknowledges the Berkower Family and the Nielsen Foundation as founding sponsors of the Elise Berkower Memorial Fellowship.
Today, the Future of Privacy Forum released its City of Seattle Open Data Risk Assessment. The Assessment provides tools and guidance to the City of Seattle and other municipalities navigating the complex policy, operational, technical, organizational, and ethical standards that support privacy-protective open data programs.