Nothing to Hide: Tools for Talking (and Listening) About Data Privacy for Integrated Data Systems

Data-driven and evidence-based social policy innovation can help governments serve communities better, smarter, and faster. Integrated Data Systems (IDS) use data that government agencies routinely collect in the course of delivering public services to shape local policy and practice. They can inform the design and implementation of programs, help measure and evaluate outcomes across the lifecourse, and enable policy-makers to better address complex social problems.

Respecting privacy is paramount to IDS’ success. The use of IDS to link sensitive personal data is typically governed by stringent local, state, and federal privacy laws and regulations, as well as rigorous technical safeguards and ethical norms. Nevertheless, individuals and communities routinely have questions and concerns about how their information is used and protected.

For lasting success, IDS need to develop “social license” to integrate data. Ultimately, societal acceptance and approval depend not merely on legal compliance with privacy rules, but on legitimacy, credibility, and public trust. Inclusive public engagement and effective communications around privacy are necessary for IDS to build trust in the public sector and to create strong, sustainable relationships with the communities they serve.

In order to help IDS and government leaders engage stakeholders and increase communities’ trust in the value of IDS, Future of Privacy Forum (FPF) and Actionable Intelligence for Social Policy (AISP) have created the Nothing to Hide toolkit.

This toolkit provides IDS stakeholders with the necessary tools to support and lead privacy-sensitive, inclusive engagement efforts. A narrative step-by-step guide to IDS communication and engagement is supplemented with action-oriented appendices, including worksheets, checklists, exercises, and additional resources, available below.

READ FULL REPORT

Download Handouts

Appendix B – Communications Talking Points and Exercises

Appendix C – Engagement Worksheets, Checklists, and Sample Materials

Future of Privacy Forum and Actionable Intelligence for Social Policy Release ‘Nothing to Hide: Tools for Talking (and Listening) About Data Privacy for Integrated Data Systems’

FOR IMMEDIATE RELEASE

October 31, 2018

Contact:

Kelsey Finch, Policy Counsel, [email protected]

FPF Communications, [email protected]


Washington, DC – Today, Future of Privacy Forum and Actionable Intelligence for Social Policy released Nothing to Hide: Tools for Talking (and Listening) About Data Privacy for Integrated Data Systems. Nothing to Hide provides governments and their partners working to integrate data for policy and program improvement with the necessary tools to lead privacy-sensitive, inclusive engagement efforts. In addition to a narrative step-by-step guide to communication and engagement on data privacy, the toolkit is supplemented with action-oriented appendices, including worksheets, checklists, exercises, and additional resources.

Integrated Data Systems leverage the data that agencies routinely collect in the course of delivering public services to help governments serve communities better, smarter, and faster. By linking data across government silos, IDS can inform the evidence-based design and implementation of programs, help measure and evaluate outcomes across the lifecourse, and enable policy-makers to better address complex social problems.

Respecting privacy is paramount to successful data sharing and integration efforts. The use of sensitive personal data is governed by local, state, and federal privacy laws and regulations, as well as rigorous technical safeguards and ethical norms. Nevertheless, individuals and communities routinely have questions and concerns about how their information is used and protected. The strongest IDS lean into opportunities to talk about why data are necessary for social policy improvement and innovation—and also make time to listen to and address stakeholders’ concerns, expectations, and priorities.

“The path to lasting success for IDS is establishing sound, two-way communications; empowering stakeholders; and continually serving the public good,” said Kelsey Finch, Policy Counsel at FPF. “Ultimately, societal acceptance and approval for evidence-based policy depend not merely on legal compliance with privacy rules, but on each IDS’ legitimacy, credibility, and public trust.”

“The state and local governments we work with take their role as data stewards very seriously. Like us, they believe that an ethical imperative exists to respectfully share and use data as a public asset, with the appropriate safeguards in place,” said Della Jenkins, Executive Director of AISP.

FPF and AISP hope this toolkit will help government leaders and IDS stakeholders to articulate that commitment, and do the hard work of both talking and listening about data privacy. In doing so, they are bound to increase both communities’ trust in the value of data sharing and their long-term impact.

###

This material is based upon work supported by the Corporation for National and Community Service (CNCS). Opinions or points of view expressed in this document are those of the authors and do not necessarily reflect the official position of, or a position that is endorsed by, CNCS or the Social Innovation Fund.

Thanks to our partners at Third Sector Capital Partners and the members of our Empowering Families and AISP Learning Community for sharing experiences and insights about data privacy and engagement.

About Future of Privacy Forum

Future of Privacy Forum is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.

About Actionable Intelligence for Social Policy

Actionable Intelligence for Social Policy is an initiative that focuses on the development, use, and innovation of integrated data systems (IDS) for policy analysis and program reform. AISP encourages social innovation and social policy experimentation so government can work better, smarter, and faster. Learn more about AISP by visiting www.aisp.upenn.edu.

FPF Privacy Book Club – The Known Citizen: A History of Privacy in Modern America (December 5, 2018)

The FPF Privacy Book Club provides members with the opportunity to read a wide range of books — on privacy, data, ethics, academic scholarship, and other important data-related topics — and have an open discussion of the selected literature.

We are excited to share that The Known Citizen: A History of Privacy in Modern America by Professor Sarah E. Igo was chosen as the popular favorite by our readers. We are thrilled that Professor Igo will be joining us for the December book club to introduce her book and answer questions.

Please join us on Wednesday, December 5, at 2:00 pm (EST) for the next FPF Privacy Book Club. If you are an existing member of the Book Club, you will receive the virtual conference dial-in information near the discussion date. You can join the Book Club here. Please feel free to forward this sign-up link to friends who may also be interested.

JOIN PRIVACY BOOK CLUB


The Privacy Expert's Guide to AI and Machine Learning

Today, FPF announces the release of The Privacy Expert’s Guide to AI and Machine Learning. This guide explains the technological basics of AI and ML systems at a level of understanding useful for non-programmers, and addresses certain privacy challenges associated with the implementation of new and existing ML-based products and services.

Contents Include:

Introduction:

Advanced algorithms, machine learning (ML), and artificial intelligence (AI) are appearing across digital and technology sectors from healthcare to financial institutions, and in contexts ranging from voice-activated digital assistants and traffic routing to identifying at-risk students and generating purchase recommendations on online platforms. Embedded in new technologies like autonomous cars and smartphones to enable cutting-edge features, AI is equally being applied to established industries such as agriculture and telecom to increase accuracy and efficiency. Moving forward, machine learning is likely to be the foundation of many of the products and services in our daily lives, becoming unremarkable in much the same way that electricity faded from novelty to background during the industrialization of modern life 100 years ago.

Understanding AI and its underlying algorithmic processes presents new challenges for privacy officers and others responsible for data governance in companies ranging from retailers to cloud service providers. In the absence of targeted legal or regulatory obligations, AI poses new ethical and practical challenges for companies that strive to maximize consumer benefits while preventing potential harms.

For privacy experts, AI is more than just Big Data on a larger scale. Artificial intelligence is differentiated by its interactive qualities – systems that collect new data in real time via sensory inputs (touchscreens, voice, video, or camera inputs) and adapt their responses and subsequent functions based on those inputs. The unique features of AI and ML include not just big data’s defining characteristic of tremendous volumes of data, but the additional uses of that data and, most importantly, the multi-layered processing models developed to harness and operationalize it.

AI-driven applications offer beneficial services and research opportunities, but pose potential harms to individuals and groups when not implemented with a clear focus on protecting individual rights and personal information. The scope of impact of these systems makes it critical that associated privacy concerns are addressed early in the design cycle, as lock-in effects make it more difficult to modify harmful design choices later. The design must also include ongoing monitoring and review, because these systems are built to morph and adapt over time. Rigorous privacy reviews must occur for existing systems as well, as design decisions entrenched in current systems shape the future updates built upon those models.

As AI and ML programs are applied across new and existing industries, platforms, and applications, policymakers and corporate privacy officers will want to ensure that individuals are treated with respect and dignity, and retain the awareness, discretion, and controls necessary to manage their own information.

Read the full guide here.

Learning from Europe but looking beyond for privacy law

FPF’s CEO, Jules Polonetsky, recently published an opinion piece in The Hill that discusses the need for comprehensive federal privacy legislation. Jules explains:

Any legislation should also consider the increasingly sophisticated privacy tools that are emerging, including differential privacy to measure privacy risk, homomorphic encryption that can enable privacy safe data analysis, and many new privacy compliance tools that are helping companies better manage data. A law that will stand the test of time and successfully protect privacy rights while enabling valuable uses of data should include mechanisms to incentivize such technology measures.
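To make the reference to differential privacy above concrete: the core idea is to add calibrated random noise to query results so that the presence or absence of any single individual's record has only a bounded effect on the output. The sketch below implements the Laplace mechanism for a simple counting query; the function name and parameters are illustrative, not drawn from any particular library's API.

```python
import numpy as np

def dp_count(values, predicate, epsilon=1.0):
    """Return a differentially private count via the Laplace mechanism.

    Adding or removing one record changes a count by at most 1
    (sensitivity = 1), so Laplace noise with scale 1/epsilon
    yields epsilon-differential privacy for this query.
    Smaller epsilon means more noise and stronger privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: privately count records meeting a condition
ages = [23, 35, 41, 29, 52, 61, 38]
noisy_count = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
```

In practice, an analyst tracks the cumulative "privacy budget" spent across all queries; production systems rely on vetted libraries rather than hand-rolled mechanisms, but the noise-calibration principle is the same.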

READ FULL ARTICLE

FPF Perspective: Limit Law Enforcement Access to Genetic Datasets

By John Verdi and Carson Martinez

Today, researchers published a paper detailing how governments can use public genetic databases to identify criminal suspects. These activities raise real questions about when it’s appropriate for law enforcement to analyze genetic information, and how best to protect individuals whose genetic data has been analyzed as part of a commercial service, but who are not accused of a crime.

FPF recently published Privacy Best Practices for Consumer Genetic Testing Services. The document includes strong protections for genetic data, particularly with regard to government access: law enforcement should obtain a warrant before seeking the disclosure of genetic data from companies; firms should demand valid legal process before disclosing genetic data and publish annual transparency reports. If government representatives are deviating from this approach, lawmakers or courts should impose clear, common-sense restrictions.

The sort of law enforcement search described in the paper would not be permitted by the FPF Best Practices. The search involves uploading genetic information discovered at crime scenes to an online database to identify an individual or an individual’s relatives. We require that companies only process genetic information uploaded with an individual’s permission; a crime scene upload necessarily occurs without the unknown subject’s permission. Further, leading companies require legal process before they will disclose genetic information to the government and have vigorously pushed back against law enforcement access to genetic data, many times declining to provide data in response to what they determined to be inappropriate requests. It is hard to imagine judges issuing warrants for the sort of general searches contemplated in the paper.

It’s worth noting that the public database at the center of the current discussion – GEDMatch – explicitly tells users that their genetic data will be shared with others without granular consent and subject to government access without a warrant. GEDMatch’s policies conflict with FPF’s Best Practices and are out of step with leading companies, which restrict this kind of access. GEDMatch’s practices are also different in another key way: the service permits users to upload digital genetic profiles. Prominent companies do not typically provide for such uploads, making crime scene genetic data less susceptible to matching without valid legal process.

There are clear benefits of using genetic data to help consumers better understand their health and ancestry. Genetic data, properly obtained and analyzed, can also help law enforcement solve crimes and improve public safety. However, unfettered law enforcement access to genetic information on commercial services would present substantial privacy risks. FPF’s Best Practices articulate a framework that can prevent many of these risks while preserving the public safety value of limited, narrow genetic searches predicated on probable cause and conducted pursuant to appropriate process. As we move forward, it is worth considering additional measures, including restrictions on government activities, technical safeguards, or other steps, that could bolster trust and safety.