The proliferation of data collection has encouraged innovative uses of data in all corners of society. The analysis of individuals’ personal data, once conducted primarily by researchers in academic institutions, has been embraced by a rapidly growing number of companies and not-for-profit organizations. High-profile cases of experimentation on and manipulation of user data, as in the cases of Facebook and OkCupid, may lead organizations to keep research results confidential to avoid public scrutiny or potential legal liability. FPF is bringing together stakeholders to help develop frameworks and standards that protect the valuable research these organizations produce and ensure the ethical collection and use of data for research purposes. Additionally, FPF has a number of projects directly related to encouraging the responsible use and sharing of corporate data for academic research, including its annual Award for Research Data Stewardship and a pilot Ethical Data Sharing Review Committee to facilitate the trusted transfer of data between corporate and research organizations.
Lawyers are trained to respond to risks that threaten the market position or operating capital of their clients. However, when it comes to AI, it can be difficult for lawyers to provide the best guidance without some basic technical knowledge. This article offers key insights from our collective experience to help lawyers feel more at ease responding to AI questions when they arise.
BCIs are computer-based systems that directly record, process, analyze, or modulate human brain activity in the form of neurodata, which is then translated into output commands from human to machine. Neurodata is data generated by the nervous system, composed of the electrical activity between neurons or proxies of that activity. When neurodata is linked, or reasonably linkable, to an individual, it is personal neurodata.
Digital data is a strategic asset for business. It is also an asset for researchers seeking to answer socially beneficial questions using company-held data. Research using secondary data introduces new challenges and ethical concerns for research administrators and research ethics committees, such as institutional review boards (IRBs). FPF Senior Researcher, AI & Ethics, Dr. Sara Jordan, analyzes some […]
FPF has launched an independent ethical review committee to provide oversight for research projects that rely on the sharing of corporate data with researchers. Whether researchers are studying the impact of platforms on society, supporting evidence-based policymaking, or understanding issues from COVID-19 to climate change, personal data held by companies is increasingly essential to advancing scientific knowledge.
A new report from the Future of Privacy Forum (FPF), Augmented Reality + Virtual Reality: Privacy & Autonomy Considerations in Emerging, Immersive Digital Worlds, provides recommendations to address the privacy risks of augmented reality (AR) and virtual reality (VR) technologies. The vast amount of sensitive personal information collected by AR and VR technologies creates serious risks […]
On March 24, FPF hosted “Dark Patterns:” Manipulative UX Design and the Role of Regulation. So-called “dark patterns” are user interface design choices that benefit an online service by coercing, manipulating, or deceiving users into making unintended or potentially harmful decisions. The event provided a critical examination of the ways in which manipulative interfaces can […]
FPF Health and AI & Ethics Policy Counsels Present a Scientific Position at ICML 2020 and at 2020 CCSQ World Usability Day
On November 12, 2020, FPF Policy Counsels Drs. Rachele Hendricks-Sturrup and Sara Jordan presented on privacy-by-design and human-centered design concepts during the 2020 CCSQ World Usability Day virtual conference. This presentation followed Drs. Hendricks-Sturrup’s and Jordan’s July 2020 scientific position paper presented at the International Conference on Machine Learning (ICML) 2020, entitled “Patient-Reported Outcomes: A Privacy-Centric and Federated Approach […]
By Katelyn Ringrose, Christopher Wolf Diversity Law Fellow at the Future of Privacy Forum, and Christopher Wood, Executive Director of LGBT Tech, with thanks to Connor Colson, FPF Policy Intern. LGBTQ+ rights are, and have always been, linked with privacy. Over the years, privacy-invasive laws, practices, and norms have been used to oppress LGBTQ+ individuals […]
Last week, FPF hosted a virtual event honoring the winners of the first-ever FPF Award for Research Data Stewardship: University of California, Irvine Professor of Cognitive Sciences Mark Steyvers and Lumos Labs, represented by General Manager Bob Schafer. In addition to the awardees, the event featured Daniel L. Goroff, Vice President and Program Director at […]
FPF Submits Comments Regarding Data Protection & COVID-19 Ahead of National Committee on Vital and Health Statistics Hearing
Yesterday, FPF submitted comments to the National Committee on Vital and Health Statistics (NCVHS) ahead of a Virtual Hearing of the Subcommittee on Privacy, Confidentiality, and Security on September 14, 2020. The hearing will explore considerations for data collection and use during a public health emergency, in light of the deployment of new technologies for […]