The Best Practices provide a policy framework for the collection, protection, sharing, and use of Genetic Data generated by consumer genetic testing services. These services are commonly offered to consumers for testing and interpretation related to ancestry, health, nutrition, wellness, genetic relatedness, lifestyle, and other purposes.
John Verdi, the Future of Privacy Forum’s Vice President of Policy, testified today before the Federal Commission on School Safety at its meeting, “Curating a Healthier & Safer Approach: Issues of Mental Health and Counseling for Our Young.”
This article examines the potential for bias and discrimination in automated algorithmic decision-making. As a group of commentators recently asserted, “[t]he accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology.” Yet this article rejects an approach that depicts every algorithmic process as a “black box” that is inevitably plagued by bias and potential injustice.
Strava’s location data controversy demonstrates the unique challenges of publicly releasing location datasets (open data), even when the data is aggregated.
Analysis of personal data can be used to improve services, advance research, and combat discrimination. However, such analysis can also create valid concerns about differential treatment of individuals or harmful impacts on vulnerable communities. These concerns can be amplified when automated decision-making uses sensitive data (such as race, gender, or familial status), impacts protected classes, or affects individuals’ eligibility for housing, employment, or other core services. When seeking to identify harms, it is important to appreciate the context of interactions between individuals, companies, and governments—including the benefits provided by automated decision-making frameworks, and the fallibility of human decision-making.
With this event, we aim to assess the current state of the art in privacy engineering; in particular, we will focus on those areas where the “art” needs further development. The goal of this trans-Atlantic initiative is to identify the open research and development tasks needed to fully achieve the GDPR’s ambitions.
Today, FPF released a new infographic, “Microphones & the Internet of Things: Understanding Uses of Audio Sensors in Connected Devices” (read the press release here). From Amazon Echo devices to smart TVs, a growing number of home devices integrate microphones, often to provide a voice user interface powered by cloud-based speech recognition.
On June 27, 2017, the Future of Privacy Forum released an infographic, “Data and the Connected Car – Version 1.0,” describing the basic data-generating devices and flows in today’s connected vehicles. The infographic will help consumers and businesses alike understand the emerging data ecosystems that power incredible new features—features that can warn drivers of an accident before they see it, or jolt them awake if they fall asleep at the wheel.
Washington, DC – Today, the Future of Privacy Forum (FPF) and the Data Quality Campaign (DQC) relaunched FERPA|Sherpa, the leading resource for information about education privacy issues. Named after the core federal law that governs education privacy, FERPA|Sherpa provides students, parents, schools, ed tech companies, and policymakers with easy access to the resources, best practices, and guidelines that are essential to understanding the complex privacy issues arising at the intersection of kids, schools, and technology.
There are many lessons to learn from the spread of the WannaCry ransomware attacks across the globe. One lesson that deserves more attention is the danger posed when a government attempts to mandate backdoors into computer software and systems.