One key method for ensuring privacy while processing large amounts of data is de-identification. De-identified data is data that can no longer be linked to a particular individual. This often involves “scrubbing” the identifiable elements of personal data, making it “safe” in privacy terms while attempting to retain its commercial and scientific value.
In the era of big data, the debate over the definitions of personal information, de-identification, and re-identification has never been more important. Privacy regimes often rely on data being considered personal in order to require the application of privacy rights and protections. Data that is anonymous is considered free of privacy risk and available for public use.
Yet much data that is collected and used exists somewhere on a spectrum between these two poles. FPF’s De-ID Project has examined practical frameworks for applying privacy restrictions to data based on the nature of the data that is collected, the risks of re-identification, and the additional legal and administrative protections that may be applied.
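To make the "scrubbing" idea concrete, here is a minimal sketch of one common approach: dropping direct identifiers and replacing a stable ID with a keyed hash (pseudonymization). The field names and the HMAC-based technique are illustrative assumptions, not an FPF-endorsed method, and the result is not risk-free: remaining quasi-identifiers (such as a ZIP code) can still enable re-identification through linkage, which is exactly why data sits on a spectrum rather than at either pole.

```python
# Illustrative de-identification sketch (assumed field names; not an FPF method).
import hmac
import hashlib

# Assumed set of direct identifiers to strip from each record.
DIRECT_IDENTIFIERS = {"name", "email", "ssn"}

def pseudonymize(value: str, secret_key: bytes) -> str:
    """Replace a value with a keyed hash: the key holder can still link
    records across datasets, but others cannot reverse the mapping."""
    return hmac.new(secret_key, value.encode("utf-8"), hashlib.sha256).hexdigest()

def deidentify(record: dict, secret_key: bytes) -> dict:
    """Drop direct identifiers and pseudonymize the stable user ID."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "user_id" in cleaned:
        cleaned["user_id"] = pseudonymize(str(cleaned["user_id"]), secret_key)
    return cleaned

record = {"user_id": 42, "name": "Ada", "email": "ada@example.com", "zip": "90210"}
safe = deidentify(record, secret_key=b"hold-this-key-separately")
# 'safe' keeps analytically useful fields (e.g. zip) but no direct identifiers;
# the zip code remains a quasi-identifier and a residual re-identification risk.
```

Note that the secret key must be stored separately from the released data; anyone holding it can re-link the pseudonyms, which is one reason legal and administrative controls matter alongside the technical scrubbing.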
Featured
FPF Comments on the California Consumer Privacy Act (CCPA)
On Friday, the Future of Privacy Forum submitted comments to the Office of the California Attorney General (AG), Xavier Becerra. Read FPF’s Full Comments (11-page letter) See Attachment 1: Comparing Privacy Laws: GDPR vs. CCPA See Attachment 2: A Visual Guide to Practical De-identification In FPF’s outreach to the AG, we commended the office for its […]
Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers
Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, we aim to contribute to the literature by seeking the “ground truth” from the corporate sector about the challenges they encounter when they consider making data available for academic research. We hope that the impressions and insights gained from this first look at the issue will help formulate further research questions, inform the dialogue between key stakeholders, and identify constructive next steps and areas for further action and investment.
New Study: Companies are Increasingly Making Data Accessible to Academic Researchers, but Opportunities Exist for Greater Collaboration
Washington, DC – Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, FPF reveals findings from research and interviews with experts in the academic and industry communities. Three main areas are discussed: 1) The extent to which leading companies make data available to support published research that contributes to public knowledge; 2) Why and how companies share data for academic research; and 3) The risks companies perceive to be associated with such sharing, as well as their strategies for mitigating those risks.
Privacy Protective Research: Facilitating Ethically Responsible Access to Administrative Data
Jules Polonetsky, CEO, Future of Privacy Forum, Omer Tene, Senior Fellow, Future of Privacy Forum, and Daniel Goroff, Vice President and Program Director, Alfred P. Sloan Foundation authored a paper titled Privacy Protective Research: Facilitating Ethically Responsible Access to Administrative Data. This paper will be featured in an upcoming edition of The Annals of the American Academy of Political and Social Science.
Announcing the Inaugural Issue of Future of Privacy Forum's Privacy Scholarship Reporter
Future of Privacy Forum is pleased to announce it has published the inaugural issue of the Privacy Scholarship Reporter. This regular newsletter will highlight recent privacy research and is published by the Privacy Research and Data Responsibility Research Coordination Network (RCN), an FPF initiative supported by the National Science Foundation.
Advancing Knowledge Regarding Practical Solutions for De-Identification of Personal Data: A Call for Papers
De-identification of personal information plays a central role in current privacy policy, law, and practice. Yet there are deep disagreements about the efficacy of de-identification to mitigate privacy risks. Some critics argue that it is impossible to eliminate privacy harms from publicly released data using de-identification because other available data sets will allow attackers to identify individuals through linkage attacks.
FPF Welcomes New Senior Fellow – Ira Rubinstein
FPF is proud to welcome its newest Senior Fellow, Ira Rubinstein. Ira will be working with FPF staff, fellows and members on a number of cross-Atlantic privacy issues and will be collaborating with EU academics and institutions on projects focused on de-identification, ethics, big data, and other issues. Ira Rubinstein is a Senior Fellow at […]
Controlling the Future of Privacy
Last week, I was fortunate enough to see several cool new applications of location technology and social data at two conferences that bookended my week. Privacy issues were addressed at the end of each conference, which I understand: a lecture about privacy is the last thing entrepreneurs and researchers want to hear. Unfortunately, privacy can […]
Comments to NTIA on Big Data and Privacy
Today, FPF submitted comments to the NTIA as it begins its exploration of how big data impacts the Consumer Privacy Bill of Rights. While the NTIA sought comment on over a dozen key questions, our filing focused largely on four issues: (1) the need for additional clarity surrounding the flexible application of the Consumer Privacy […]
Comments for the White House "Big Data Review"
This afternoon, FPF submitted comments to help inform the White House Office of Science and Technology Policy’s “Big Data Review.” Announced in January, the White House Big Data Review has been a helpful exercise in scoping out how big data is changing our society. Through public workshops at MIT, NYU, and Berkeley, the review has […]