The CCPA, Face to Face with the GDPR: An In-Depth Comparative Analysis Guide
The General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) and the California Consumer Privacy Act of 2018 (‘CCPA’) both aim to guarantee strong protection for individuals regarding their personal data and apply to businesses that collect, use, or share consumer data, whether the information was obtained online or offline.
Nothing to Hide: Tools for Talking (and Listening) About Data Privacy for Integrated Data Systems Report
Data-driven and evidence-based social policy innovation can help governments serve communities better, smarter, and faster. Integrated Data Systems (IDS) use data that government agencies routinely collect in the normal course of delivering public services to shape local policy and practice. They can use data to evaluate the effectiveness of new initiatives or bridge gaps between public services and community providers.
Future of Privacy Forum and Actionable Intelligence for Social Policy: ‘Nothing to Hide: Tools for Talking (and Listening) About Data Privacy for Integrated Data Systems’
Washington, DC – Today, Future of Privacy Forum and Actionable Intelligence for Social Policy released Nothing to Hide: Tools for Talking (and Listening) About Data Privacy for Integrated Data Systems. Nothing to Hide provides governments and their partners working to integrate data for policy and program improvement with the necessary tools to lead privacy-sensitive, inclusive engagement efforts. In addition to a narrative step-by-step guide to communication and engagement on data privacy, the toolkit is supplemented with action-oriented appendices, including worksheets, checklists, exercises, and additional resources.
The Privacy Expert’s Guide to AI and Machine Learning Report
Today, FPF announces the release of The Privacy Expert’s Guide to AI and Machine Learning. This guide explains the technological basics of AI and ML systems at a level of understanding useful for non-programmers, and addresses certain privacy challenges associated with the implementation of new and existing ML-based products and services.
Facial Detection Technologies Report & Infographic
The infographic, Understanding Facial Detection, Characterization, and Recognition Technologies, along with the report Privacy Principles for Facial Recognition Technology in Consumer Applications, helps businesses and policymakers better understand and evaluate the growing use of face-based biometric systems in consumer applications. The consumer-facing applications of facial recognition technology continue to evolve, and the technology […]
Communicating about Data Privacy and Security Report
The ADRF Network is an evolving grassroots effort among researchers and organizations who are seeking to collaborate around improving access to and promoting the ethical use of administrative data in social science research. As supporters of evidence-based policymaking and research, FPF has been an integral part of the Network since its launch and has chaired the network’s Data Privacy and Security Working Group since November 2017.
Beyond Explainability: A Practical Guide to Managing Risk in Machine Learning Models Report
Beyond Explainability aims to provide a template for effectively managing the risks of machine learning models in practice, giving lawyers, compliance personnel, data scientists, and engineers a framework to safely create, deploy, and maintain ML systems, and enabling effective communication between these distinct organizational perspectives.
Understanding Session Replay Scripts – a Guide for Privacy Professionals
Researchers at Princeton University’s Center for Information Technology Policy (CITP) have demonstrated that many websites use third-party tools to track visitors’ individual browsing sessions. “Session replay scripts” can raise serious privacy concerns if implemented incorrectly, but with the right safeguards they can be part of a range of ordinary, useful web analytics tools. FPF has published a three-page guide to assist privacy professionals in deciding whether and how to implement session replay scripts.
City of Seattle Open Data Risk Assessment Report
FPF requested feedback from the public on its proposed Draft Open Data Risk Assessment for the City of Seattle. In 2016, the City of Seattle declared in its Open Data Policy that the city’s data would be “open by preference,” except when releasing data might affect individual privacy. To ensure its Open Data program effectively protects individuals, Seattle committed to performing an annual risk assessment and tasked FPF with creating and deploying an initial privacy risk assessment methodology for open data.
Unfairness By Algorithm: Distilling the Harms of Automated Decision-Making Report & Infographic
Analysis of personal data can be used to improve services, advance research, and combat discrimination. However, such analysis can also create valid concerns about differential treatment of individuals or harmful impacts on vulnerable communities. These concerns can be amplified when automated decision-making uses sensitive data (such as race, gender, or familial status), impacts protected classes, or affects individuals’ eligibility for housing, employment, or other core services. When seeking to identify harms, it is important to appreciate the context of interactions between individuals, companies, and governments—including the benefits provided by automated decision-making frameworks, and the fallibility of human decision-making.