The proliferation of data collection has encouraged innovative uses of data in all corners of society. The analysis of individuals’ personal data, once conducted primarily by researchers in academic institutions, has been embraced by a rapidly growing number of companies and not-for-profit organizations. High-profile cases of experimentation and manipulation of user data, such as those involving Facebook and OkCupid, threaten to lead organizations to keep research results confidential to avoid public scrutiny or potential legal liability. FPF is bringing together stakeholders to help develop frameworks and standards that protect the valuable research these organizations produce and ensure the ethical collection and use of data for research purposes. Additionally, FPF has a number of projects directly related to encouraging the responsible use and sharing of corporate data for academic research, including its annual Award for Research Data Stewardship and a pilot Ethical Data Sharing Review Committee to facilitate the trusted transfer of data between corporate and research organizations.
Featured
Overcoming Hurdles to Effective Data Sharing for Researchers
In 2021, the challenges academics face in accessing corporate data sets for research, and the difficulties companies experience in making privacy-respecting research data available, broke into the news. With its long history of research data sharing, FPF saw an opportunity to bring together leaders from the corporate, research, and policy communities for a conversation […]
Data Sharing … By Any Other Name
The term “data sharing” is used in many different ways to describe a relationship in which one organization shares data with another for a new purpose. Some uses of the term relate to academic and scientific research, while others relate to the transfer of data for commercial or government purposes. It is therefore imperative that we be more precise about which forms of sharing we are referencing, so that the interests of the parties are adequately considered and the various risks and benefits are appropriately contextualized and managed.
Five Things Lawyers Need to Know About AI
Lawyers are trained to respond to risks that threaten the market position or operating capital of their clients. When it comes to AI, however, it can be difficult for lawyers to provide the best guidance without some basic technical knowledge. This article shares key insights from our collective experience to help lawyers feel more at ease responding to AI questions when they arise.
Brain-Computer Interfaces: Privacy and Ethical Considerations for the Connected Mind
Brain-computer interfaces (BCIs) are computer-based systems that directly record, process, analyze, or modulate human brain activity in the form of neurodata, which is then translated into an output command from human to machine. Neurodata is data generated by the nervous system, composed of the electrical activities between neurons or proxies of this activity. When neurodata is linked, or reasonably linkable, to an individual, it is personal neurodata.
Blog Summary: Ethical Concerns and Challenges in Research using Secondary Data
Digital data is a strategic asset for business. It is also an asset for researchers seeking to answer socially beneficial questions using company-held data. Research using secondary data introduces new challenges and ethical concerns for research administrators and research ethics committees, such as IRBs. FPF Senior Researcher, AI & Ethics, Dr. Sara Jordan, analyzes some […]
FPF Ethical Data Use Committee will Support Research Relying on Private Sector Data
FPF has launched an independent ethical review committee to provide oversight for research projects that rely on the sharing of corporate data with researchers. Whether researchers are studying the impact of platforms on society, supporting evidence-based policymaking, or understanding issues from COVID to climate change, personal data held by companies is increasingly essential to advancing scientific knowledge.
FPF Report Outlines Opportunities to Mitigate the Privacy Risks of AR & VR Technologies
A new report from the Future of Privacy Forum (FPF), Augmented Reality + Virtual Reality: Privacy & Autonomy Considerations in Emerging, Immersive Digital Worlds, provides recommendations to address the privacy risks of augmented reality (AR) and virtual reality (VR) technologies. The vast amount of sensitive personal information collected by AR and VR technologies creates serious risks […]
Manipulative UX Design & the Role of Regulation: Event Highlights
On March 24, FPF hosted “Dark Patterns:” Manipulative UX Design and the Role of Regulation. So-called “dark patterns” are user interface design choices that benefit an online service by coercing, manipulating, or deceiving users into making unintended or potentially harmful decisions. The event provided a critical examination of the ways in which manipulative interfaces can […]
FPF Health and AI & Ethics Policy Counsels Present a Scientific Position at ICML 2020 and at 2020 CCSQ World Usability Day
On November 12, 2020, FPF Policy Counsels Drs. Rachele Hendricks-Sturrup and Sara Jordan presented privacy-by-design alongside human-centered design concepts during the 2020 CCSQ World Usability Day virtual conference. The presentation followed Drs. Hendricks-Sturrup and Jordan’s July 2020 scientific position paper, presented at the International Conference on Machine Learning (ICML) 2020, entitled “Patient-Reported Outcomes: A Privacy-Centric and Federated Approach […]
A Look Back at the Role of Law and the Right To Privacy in LGBTQ+ History
By Katelyn Ringrose, Christopher Wolf Diversity Law Fellow at the Future of Privacy Forum, and Christopher Wood, Executive Director of LGBT Tech, with thanks to Connor Colson, FPF Policy Intern. LGBTQ+ rights are, and have always been, linked with privacy. Over the years, privacy-invasive laws, practices, and norms have been used to oppress LGBTQ+ individuals […]