Beyond Explainability provides a template for effectively managing this risk in practice, giving lawyers, compliance personnel, data scientists, and engineers a framework to safely create, deploy, and maintain ML systems, and enabling effective communication among these distinct organizational perspectives.
Researchers at Princeton University’s Center for Information Technology Policy (CITP) have demonstrated that many websites are using third-party tools to track visitors’ individual browsing sessions. Such “session replay scripts” can raise serious privacy concerns if implemented incorrectly, but with the right safeguards they can be part of a range of ordinary, useful web analytics tools. FPF has published a 3-page guide to help privacy professionals decide whether and how to implement session replay scripts.
FPF requested feedback from the public on its proposed Draft Open Data Risk Assessment for the City of Seattle. In 2016, the City of Seattle declared in its Open Data Policy that the city’s data would be “open by preference,” except when doing so may affect individual privacy. To ensure its Open Data program effectively protects individuals, Seattle committed to performing an annual risk assessment and tasked FPF with creating and deploying an initial privacy risk assessment methodology for open data.
Analysis of personal data can be used to improve services, advance research, and combat discrimination. However, such analysis can also create valid concerns about differential treatment of individuals or harmful impacts on vulnerable communities. These concerns can be amplified when automated decision-making uses sensitive data (such as race, gender, or familial status), impacts protected classes, or affects individuals’ eligibility for housing, employment, or other core services. When seeking to identify harms, it is important to appreciate the context of interactions between individuals, companies, and governments—including the benefits provided by automated decision-making frameworks, and the fallibility of human decision-making.
Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, we aim to contribute to the literature by seeking the “ground truth” from the corporate sector about the challenges they encounter when they consider making data available for academic research. We hope that the impressions and insights gained from this first look at the issue will help formulate further research questions, inform the dialogue between key stakeholders, and identify constructive next steps and areas for further action and investment.
Washington, DC – Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, FPF reveals findings from research and interviews with experts in the academic and industry communities. Three main areas are discussed: 1) The extent to which leading companies make data available to support published research that contributes to public knowledge; 2) Why and how companies share data for academic research; and 3) The risks companies perceive to be associated with such sharing, as well as their strategies for mitigating those risks.
Today, the Future of Privacy Forum (FPF) released “Law Enforcement Access to Student Records: A Guide for School Administrators & Ed Tech Service Providers,” written by Amelia Vance and Sarah Williamson. This guide answers some of the basic questions we have heard from key stakeholders over the past nine months about law enforcement access to data.
The Future of Privacy Forum conducted a study of the companies enrolled in the US-EU Privacy Shield program and determined that 114 European-headquartered companies are active Privacy Shield participants. These European companies rely on the program to transfer data to their US subsidiaries or to essential vendors that support their business needs.
In a rare moment of bipartisanship, the House Energy and Commerce Committee yesterday unanimously approved the SELF DRIVE Act (H.R. 3388), sending it to the full House of Representatives for consideration. The bill facilitates the introduction and testing of autonomous cars by clarifying federal and state roles, and by granting exemptions from motor vehicle standards that have impeded the introduction of new automated vehicle technologies. The vote was an important step toward enabling new technologies that have the potential to transform the future of mobility and maximize consumer safety.
Jules Polonetsky (CEO, Future of Privacy Forum), Omer Tene (Senior Fellow, Future of Privacy Forum), and Daniel Goroff (Vice President and Program Director, Alfred P. Sloan Foundation) authored a paper titled Privacy Protective Research: Facilitating Ethically Responsible Access to Administrative Data. The paper will be featured in an upcoming edition of The Annals of the American Academy of Political and Social Science.