Washington, DC – Today, Future of Privacy Forum and Actionable Intelligence for Social Policy released Nothing to Hide: Tools for Talking (and Listening) About Data Privacy for Integrated Data Systems. Nothing to Hide provides governments and their partners working to integrate data for policy and program improvement with the necessary tools to lead privacy-sensitive, inclusive engagement efforts. In addition to a narrative step-by-step guide to communication and engagement on data privacy, the toolkit is supplemented with action-oriented appendices, including worksheets, checklists, exercises, and additional resources.
These resources will help businesses and policymakers better understand and evaluate the growing use of face-based biometric systems in consumer applications. Facial recognition technology can help users organize and label photos, improve online services for visually impaired users, and help stores and stadiums better serve customers. At the same time, the technology often involves the collection and use of sensitive biometric data, requiring careful assessment of the data protection issues raised. Understanding the technology and building trust are necessary to maximize the benefits and minimize the risks.
Washington, DC – Today, Future of Privacy Forum, along with leading consumer genetic and personal genomic testing companies 23andMe, Ancestry, Helix, MyHeritage, and Habit, released Privacy Best Practices for Consumer Genetic Testing Services. The Best Practices provide a policy framework for the collection, protection, sharing, and use of Genetic Data generated by consumer genetic testing services. These services are commonly offered to consumers for testing and interpretation related to ancestry, health, nutrition, wellness, genetic relatedness, lifestyle, and other purposes.
Beyond Explainability aims to provide a template for effectively managing this risk in practice, giving lawyers, compliance personnel, data scientists, and engineers a framework to safely create, deploy, and maintain ML models, and enabling effective communication among these distinct organizational perspectives.
College Park, MD – June 26, 2018 – Immuta and the Future of Privacy Forum (FPF) today announced the first-ever framework for practitioners to manage risk in artificial intelligence (AI) and machine learning (ML) models. Their joint whitepaper, Beyond Explainability: A Practical Guide to Managing Risk in Machine Learning Models, provides business executives, data scientists, and compliance professionals with a strategic guide for governing the legal, privacy, and ethical risks associated with this technology.
Washington, DC – Today, Future of Privacy Forum’s (FPF) Amelia Vance, Director of the Education Privacy Project, will deliver testimony in a hearing before the House Committee on Education and the Workforce, “Protecting Privacy, Promoting Data Security: Exploring How Schools and States Keep Data Safe.” In her prepared testimony, Vance will comment on how states, districts, and ed tech companies can work together to ensure student privacy.
Last week, the Future of Privacy Forum filed written comments in response to the California Public Utilities Commission’s proposed decision authorizing pilot programs for passenger service in Autonomous Vehicles. The CPUC is a consumer protection agency that oversees, among other areas, the provision of passenger service in the state. The proposed decision set out a number of criteria to be met by companies seeking to operate AV passenger service, including reporting of communications between passengers and remote operators of driverless AVs, as well as aggregated operations data.
Yesterday, the Future of Privacy Forum submitted written comments to members of the Minnesota House of Representatives in response to the pending student privacy bill, the Student Data Privacy Act (HF 1507). FPF expressed concerns about the proposed language of the bill, which would create conflicting requirements for schools and education technology companies, and likely cause unintended consequences for Minnesota schools and students.
We are thrilled to announce four new members of FPF’s Education Privacy Project. Led by Amelia Vance, Director of Education Privacy, the Project works to equip and connect parents, educators, state and local education agencies, ed tech companies, and other stakeholders with substantive practices, policies, and other solutions to address education privacy challenges.