Research often requires using sensitive data to answer important questions. Collecting and analyzing personal information ethically is challenging: researchers must protect the privacy of the individuals involved, honor informed consent, and comply with other legal obligations. The technology, policies, and ethical considerations are constantly shifting, making them difficult to keep up with. That's why FPF engages stakeholders across academia and industry to produce recommendations, best practices, and ethical review structures that promote responsible research respecting essential privacy and ethical considerations throughout the research lifecycle. FPF also works with policymakers to develop legislative protections that support effective, responsible research with strong privacy safeguards, including hosting events that allow policymakers and regulators to engage directly with practitioners from academia, advocacy, and industry.
FPF also convenes an Ethics and Data in Research Working Group. This group receives late-breaking analysis of emerging legislation affecting research and data, meets to discuss the ethical and technological challenges of conducting research, and collaborates to create best practices that protect privacy, decrease risk, and increase data sharing for research, partnerships, and infrastructure.
Featured
FPF Comments on the California Consumer Privacy Act (CCPA)
On Friday, the Future of Privacy Forum submitted comments to the Office of the California Attorney General (AG), Xavier Becerra.

- Read FPF's Full Comments (11-page letter)
- Attachment 1: Comparing Privacy Laws: GDPR vs. CCPA
- Attachment 2: A Visual Guide to Practical De-identification

In FPF's outreach to the AG, we commended the office for its […]
FPF, EFPIA, and CIPL Workshop Report Now Available: "Can GDPR Work for Health Scientific Research?"
On October 22, 2018, the Future of Privacy Forum (FPF), the European Federation of Pharmaceutical Industries and Associations (EFPIA), and the Centre for Information Policy Leadership (CIPL) hosted a workshop in Brussels, “Can GDPR Work for Health Scientific Research?” to discuss the processing of personal data for scientific research purposes under the European Union’s General Data Protection Regulation (GDPR).
New FPF Study Documents More Than 150 European Companies Participating in the EU-US Data Transfer Mechanism
EU Companies' Participation Grew by One Third Over the Past Year

By Jeremy Greenberg

Yesterday, the European Commission published its second annual review of the EU-U.S. Privacy Shield, finding that "the U.S. continues to ensure an adequate level of […]
Nothing to Hide: Tools for Talking (and Listening) About Data Privacy for Integrated Data Systems
Data-driven and evidence-based social policy innovation can help governments serve communities better, smarter, and faster. Integrated Data Systems (IDS) use data that government agencies routinely collect in the normal course of delivering public services to shape local policy and practice. They can use data to evaluate the effectiveness of new initiatives or bridge gaps between public services and community providers.
FPF Publishes Report Supporting Stakeholder Engagement and Communications for Researchers and Practitioners Working to Advance Administrative Data Research
The ADRF Network is an evolving grassroots effort among researchers and organizations seeking to collaborate on improving access to, and promoting the ethical use of, administrative data in social science research. As a supporter of evidence-based policymaking and research, FPF has been an integral part of the Network since its launch and has chaired the Network's Data Privacy and Security Working Group since November 2017.
Taming The Golem: Challenges of Ethical Algorithmic Decision-Making
This article examines the potential for bias and discrimination in automated algorithmic decision-making. As a group of commentators recently asserted, “[t]he accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology.” Yet this article rejects an approach that depicts every algorithmic process as a “black box” that is inevitably plagued by bias and potential injustice.
New Future of Privacy Forum Study Finds the City of Seattle’s Open Data Program a National Leader in Privacy Program Management
Today, the Future of Privacy Forum released its City of Seattle Open Data Risk Assessment. The Assessment provides tools and guidance to the City of Seattle and other municipalities navigating the complex policy, operational, technical, organizational, and ethical standards that support privacy-protective open data programs.
Unfairness By Algorithm: Distilling the Harms of Automated Decision-Making
Analysis of personal data can be used to improve services, advance research, and combat discrimination. However, such analysis can also create valid concerns about differential treatment of individuals or harmful impacts on vulnerable communities. These concerns can be amplified when automated decision-making uses sensitive data (such as race, gender, or familial status), impacts protected classes, or affects individuals’ eligibility for housing, employment, or other core services. When seeking to identify harms, it is important to appreciate the context of interactions between individuals, companies, and governments—including the benefits provided by automated decision-making frameworks, and the fallibility of human decision-making.
Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers
Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, we aim to contribute to the literature by seeking the "ground truth" from the corporate sector about the challenges companies encounter when they consider making data available for academic research. We hope that the impressions and insights gained from this first look at the issue will help formulate further research questions, inform the dialogue between key stakeholders, and identify constructive next steps and areas for further action and investment.
New Study: Companies are Increasingly Making Data Accessible to Academic Researchers, but Opportunities Exist for Greater Collaboration
Washington, DC – Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, FPF reveals findings from research and interviews with experts in the academic and industry communities. Three main areas are discussed:

1) The extent to which leading companies make data available to support published research that contributes to public knowledge;
2) Why and how companies share data for academic research; and
3) The risks companies perceive to be associated with such sharing, as well as their strategies for mitigating those risks.