Analysis of personal data can be used to improve services, advance research, and combat discrimination. However, such analysis can also raise valid concerns about differential treatment of individuals or harmful impacts on vulnerable communities. These concerns can be amplified when automated decision-making uses sensitive data (such as race, gender, or familial status), impacts protected classes, or affects individuals’ eligibility for housing, employment, or other core services. When seeking to identify harms, it is important to appreciate the context of interactions between individuals, companies, and governments—including the benefits provided by automated decision-making frameworks and the fallibility of human decision-making.
Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, we aim to contribute to the literature by seeking the “ground truth” from companies about the challenges they encounter when considering making data available for academic research. We hope that the impressions and insights gained from this first look at the issue will help formulate further research questions, inform the dialogue between key stakeholders, and identify constructive next steps and areas for further action and investment.
Washington, DC – Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, FPF reveals findings from research and interviews with experts in the academic and industry communities. Three main areas are discussed: 1) The extent to which leading companies make data available to support published research that contributes to public knowledge; 2) Why and how companies share data for academic research; and 3) The risks companies perceive to be associated with such sharing, as well as their strategies for mitigating those risks.
Today, the Future of Privacy Forum (FPF) released “Law Enforcement Access to Student Records: A Guide for School Administrators & Ed Tech Service Providers,” written by Amelia Vance and Sarah Williamson. This guide helps to answer some of the basic questions that we have heard from key stakeholders about law enforcement access to data over the past nine months.
The Future of Privacy Forum conducted a study of the companies enrolled in the US-EU Privacy Shield program and determined that 114 European-headquartered companies are active Privacy Shield participants. These European companies rely on the program to transfer data to their US subsidiaries or to essential vendors that support their business needs.
In a rare moment of bipartisanship, the House Energy and Commerce Committee yesterday unanimously approved the SELF DRIVE Act, H.R. 3388, sending it to the full House of Representatives for consideration. The bill facilitates the introduction and testing of autonomous cars by clarifying federal and state roles and by granting exemptions from motor vehicle standards that have impeded the deployment of new automated vehicle technologies. This vote was an important step forward in enabling new technologies that have the potential to transform the future of mobility and maximize consumer safety.
Jules Polonetsky (CEO, Future of Privacy Forum), Omer Tene (Senior Fellow, Future of Privacy Forum), and Daniel Goroff (Vice President and Program Director, Alfred P. Sloan Foundation) authored a paper titled Privacy Protective Research: Facilitating Ethically Responsible Access to Administrative Data. This paper will be featured in an upcoming edition of The Annals of the American Academy of Political and Social Science.
This week, the Federal Trade Commission (FTC) updated its guidance on COPPA, the Children’s Online Privacy Protection Act, to clarify that the 1998 statute applies not just to websites and online service providers that collect data from children, but also to Internet of Things devices, including children’s toys.
Kelsey Finch, FPF Policy Counsel, presented FPF’s 2016 Mobile Apps Study at the Federal Trade Commission’s annual PrivacyCon on January 12, 2017. Kelsey presented a visual representation of the App Study designed by FPF Fellow Carolina Alonso.
Today, FPF is pleased to make available the Conference Proceedings from our Beyond IRBs: Designing Ethical Review Processes for Big Data Research workshop. The workshop, co-hosted by the Washington & Lee School of Law and supported by the National Science Foundation and the Alfred P. Sloan Foundation, aimed to identify processes and commonly accepted ethical principles for data research in academia, government and industry.