Genetic Testing Will Be the Talk of the Table this Thanksgiving
This Thanksgiving, as families gather around the dinner table and discuss heritage and history, genetic testing is sure to be on the menu. Genetic testing companies are offering Black Friday and Cyber Monday discounts on kits to help you discover your genealogy and are sure to report record sales.
It is no surprise that as families come together this week, heritage, health, and the other fascinating information that can be drawn from DNA will be the talk of the table. From conversations about new family connections and serious health conditions to what types of wines best fit your genetic taste profile, DNA insights are becoming an important part of family discussions. And as with any family discussion, navigating serious or sensitive topics takes thoughtfulness and diplomacy; choosing a genetic testing provider also calls for careful consideration.
While today it is easier than ever to learn about family history, individuals should also be aware that genetic data is one of the most sensitive categories of personal information and warrants a high standard of privacy protection. Genetic data may be used to identify risk of future medical conditions, contain unexpected information that may be unsettling, and reveal information about the test taker’s family members. Because genetic information is so sensitive, you’ll want to know how a company will protect and use genetic data before buying Grandpa a kit on Black Friday.
One key way to assess a company’s genetic privacy practices is to look to the principles highlighted in the Future of Privacy Forum’s Privacy Best Practices for Consumer Genetic Testing Services, a set of standards for the collection, use, and sharing of genetic data. Companies that currently support the Best Practices include: Ancestry, 23andMe, Helix, MyHeritage, Habit, African Ancestry, FamilyTreeDNA, and Living DNA.
You also should carefully examine the company’s privacy policy to be sure you are choosing a company that has your genetic privacy in mind. Here are five important questions you should consider when deciding which genetic test to purchase (hint: all the answers should be YES):
Does the Company Ask for Your Consent Before Sharing Your Individual-Level Genetic Data with Third Parties? People choose to share their genetic data with third parties for a range of purposes (e.g., to participate in scientific research or to connect with potential relatives). However, genetic testing companies should never share your individual-level genetic data with third parties without your knowledge and consent, particularly with insurers, employers, and educational institutions.
Does the Company Provide You the Ability to Delete Your Genetic Data and Destroy Your Biological Sample If You Choose? Companies may have default policies to destroy all samples once testing is completed, retain data or samples for only a finite period of time, or retain data and samples indefinitely or until you close your account. Companies should be clear about their retention practices and offer prominent ways to delete your genetic data and destroy your biological sample.
Does the Company Require Valid Legal Process before Disclosing Your Genetic Data to the Government? As we have seen in recent cases like the Golden State Killer, genetic data can be a powerful investigative tool for government. However, government access to your genetic data should not be as easy as pumpkin pie, as it presents substantial privacy risks. Companies should require that government entities obtain valid legal process before they disclose genetic data.
Does the Company Notify You of Material Changes to Its Privacy Statement and Ask You to Agree to the Changes? Companies may modify their privacy statements occasionally, and sometimes they significantly change how genetic data is collected, used, and stored. Companies may also be bought, sold, or go out of business. But before changes are implemented, you should be notified and given an opportunity to review the changes and choose whether or not you want to continue using the services.
Does the Company Have Strong Data Security Practices? With more than 12 million individuals having had their DNA tested, the potential for hacking and data breaches has become an increasing concern. Given the uniqueness of genetic data, companies should maintain a comprehensive security program that includes secure storage of biological samples and genetic data, encryption, data-use agreements, contractual obligations, and accountability measures.
As we gather this week to give thanks for our families and heritage, let us also take a moment to consider the ways that genetic data can bring us closer together … and why it is important to protect it.
Limor Shmerling Magazanik Joins Israel Tech Policy Institute as Managing Director and Future of Privacy Forum as Senior Fellow
Former Senior Official at Israel’s Privacy Protection Authority to Lead ITPI
Washington, DC – November 20, 2018 – The Israel Tech Policy Institute and Future of Privacy Forum today announced Limor Shmerling Magazanik as ITPI Managing Director and FPF Senior Fellow. In this role, Magazanik will provide leadership on day-to-day operational matters of ITPI, including directing ITPI’s policy agenda; engaging policymakers, regulators, academics, and business leaders; convening multi-stakeholder groups for discussion; and overseeing communications with the public and the advisory board.
“We are thrilled that Limor has joined our team,” said Jules Polonetsky, FPF CEO and ITPI Co-Founder. “She has a proven track record of success bringing together senior leaders from government, academia, civil society and the private sector to shape data governance principles and practices. We look forward to expanding our footprint in Israel under her thoughtful leadership.”
Major projects for ITPI in 2019 include work on data protection law and digital economy issues, supporting Israel’s emerging leadership in privacy technologies, and enabling smart city and connected transportation deployments.
Magazanik comes to ITPI and FPF after a decade with the Privacy Protection Authority, serving most recently as Director of Strategic Alliances and previously as Director of Licensing & Inspection. She led policy initiatives and regulation in technology-driven sectors and promoted compliance with data protection, privacy, cybersecurity, and digital identity regulation. She is an adjunct lecturer at the Hebrew University Faculty of Law and the IDC Herzliya School of Law, holds LL.B., M.A., and LL.M. degrees from Tel Aviv University, and is certified as a CIPP/E, CIPP/US, and CIPM.
“After 10 years with the Privacy Protection Authority, I am excited to help connect the Israeli tech community to the Future of Privacy Forum’s world-class tech policy expertise,” said Magazanik. “I believe Israel can be a leader in developing technologies that enhance privacy protection.”
ITPI Co-Founder Omer Tene said, “Limor, who in her previous position coordinated extensively with European data protection regulators, is perfectly placed to bridge between regulators and policymakers on the one hand and tech innovators from Tel Aviv to Silicon Valley on the other hand.”
Magazanik has deep experience tackling information society issues such as the Internet of Things, autonomous vehicles, smart cities, biometrics, social networks, digital health care, credit data, fintech and more. She has a multifaceted background in both government and the private sector, having practiced corporate, property and banking law, as well as working in product and project management in the Israeli high-tech industry.
###
About Israel Tech Policy Institute
Israel Tech Policy Institute is an incubator for tech policy leadership and scholarship, advancing ethical practices in support of emerging technologies. Learn more about ITPI by visiting www.techpolicy.org.il.
About Future of Privacy Forum
Future of Privacy Forum is a global non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.
Long Overdue: Comprehensive Federal Privacy Law
FPF has long supported a comprehensive federal consumer privacy law. We believe that both businesses and consumers will gain from one clear standard that provides consumers with needed protections and industry with certainty and guidance.
On Friday, FPF filed comments to the National Telecommunications and Information Administration (NTIA) in response to the Administration’s September 2018 Request for Comments on a federal approach to consumer privacy. The NTIA has requested input on the best approach to strengthen existing consumer data protections in the United States while promoting the administration’s high-level goals, including: enhancing legal clarity; reducing legal fragmentation; and increasing national and global interoperability.
In our comments, we called on Congress to draft and pass a national comprehensive consumer privacy law that would create baseline legal protections for individuals in the United States. In doing so, we recommend that such a law address issues of interoperability with existing federal sectoral laws and global privacy frameworks, and avoid creating conflicting requirements with existing frameworks in order to promote beneficial cross-border data flows (as an example, we have previously addressed the unintended consequences of various nations’ data localization laws).
We also note that a national privacy law would be likely to preempt some similar state legislative efforts, both as a natural outcome of the Supremacy Clause and as a matter of policy to support clarity and consistency in businesses’ compliance obligations. We also believe that consumers should not expect to have fewer privacy rights — such as the right to access, correct, or delete information, or to exercise meaningful control over whether that information is used for unexpected purposes, shared with others, or sold — simply because they live in one state rather than another. However, we flag for the Administration certain key implementation questions that should be carefully considered — such as the effect of a national law on the role of state attorneys general, enforcement actions under generally applicable business practices laws, and existing state constitutional rights to privacy.
We also recommend that the Administration address a range of important substantive considerations of a draft bill, including:
treating covered data with nuance in crafting legislative definitions;
promoting internal accountability, oversight, and training;
recognizing distinctions between sensitive and non-sensitive data; and
creating incentives for socially beneficial uses of data and for technical solutions that can resolve privacy issues while supporting data utility.
We commend the NTIA and the Department of Commerce for their engagement on this important issue, and look forward to continuing to engage with stakeholders on a federal approach to guaranteeing clear, consistent, and meaningful privacy and security protections in the United States.
“So the truth is, that yes, our cars are learning more about us, but what they learn may save our lives.”
Video Credit: CBS Interactive Inc.
Nothing to Hide: Tools for Talking (and Listening) About Data Privacy for Integrated Data Systems
Data-driven and evidence-based social policy innovation can help governments serve communities better, smarter, and faster. Integrated Data Systems (IDS) use data that government agencies routinely collect in the course of delivering public services to shape local policy and practice. They can inform the design and implementation of programs, help measure and evaluate outcomes across the lifecourse, and enable policy-makers to better address complex social problems.
Respecting privacy is paramount to IDS’ success. The use of IDS to link sensitive personal data is typically governed by stringent local, state, and federal privacy laws and regulations, as well as rigorous technical safeguards and ethical norms. Nevertheless, individuals and communities routinely have questions and concerns about how their information is used and protected.
For lasting success, IDS need to develop “social license” to integrate data. Ultimately, societal acceptance and approval depend not merely on legal compliance with privacy rules, but on legitimacy, credibility, and public trust. Inclusive public engagement and effective communications around privacy are necessary for IDS to build trust in the public sector and to create strong, sustainable relationships with the communities they serve.
In order to help IDS and government leaders engage stakeholders and increase communities’ trust in the value of IDS, Future of Privacy Forum (FPF) and Actionable Intelligence for Social Policy (AISP) have created the Nothing to Hide toolkit.
This toolkit provides IDS stakeholders with the necessary tools to support and lead privacy-sensitive, inclusive engagement efforts. A narrative step-by-step guide to IDS communication and engagement is supplemented with action-oriented appendices, including worksheets, checklists, exercises, and additional resources, available below.
Future of Privacy Forum and Actionable Intelligence for Social Policy Release ‘Nothing to Hide: Tools for Talking (and Listening) About Data Privacy for Integrated Data Systems’
Washington, DC – Today, Future of Privacy Forum and Actionable Intelligence for Social Policy released Nothing to Hide: Tools for Talking (and Listening) About Data Privacy for Integrated Data Systems. Nothing to Hide provides governments and their partners working to integrate data for policy and program improvement with the necessary tools to lead privacy-sensitive, inclusive engagement efforts. In addition to a narrative step-by-step guide to communication and engagement on data privacy, the toolkit is supplemented with action-oriented appendices, including worksheets, checklists, exercises, and additional resources.
Integrated Data Systems leverage the data that agencies routinely collect in the course of delivering public services to help governments serve communities better, smarter, and faster. By linking data across government silos, IDS can inform the evidence-based design and implementation of programs, help measure and evaluate outcomes across the lifecourse, and enable policy-makers to better address complex social problems.
Respecting privacy is paramount to successful data sharing and integration efforts. The use of sensitive personal data is governed by local, state, and federal privacy laws and regulations, as well as rigorous technical safeguards and ethical norms. Nevertheless, individuals and communities routinely have questions and concerns about how their information is used and protected. The strongest IDS lean into opportunities to talk about why data are necessary for social policy improvement and innovation—and also make time to listen to and address stakeholders’ concerns, expectations, and priorities.
“The path to lasting success for IDS is establishing sound, two-way communications; empowering stakeholders; and continually serving the public good,” said Kelsey Finch, Policy Counsel at FPF. “Ultimately, societal acceptance and approval for evidence-based policy depend not merely on legal compliance with privacy rules, but on each IDS’ legitimacy, credibility, and public trust.”
“The state and local governments we work with take their role as data stewards very seriously. Like us, they believe that an ethical imperative exists to respectfully share and use data as a public asset, with the appropriate safeguards in place,” said Della Jenkins, Executive Director of AISP.
FPF and AISP hope this toolkit will help government leaders and IDS stakeholders to articulate that commitment, and do the hard work of both talking and listening about data privacy. In doing so, they are bound to increase both communities’ trust in the value of data sharing and their long-term impact.
###
This material is based upon work supported by the Corporation for National and Community Service (CNCS). Opinions or points of view expressed in this document are those of the authors and do not necessarily reflect the official position of, or a position that is endorsed by, CNCS or the Social Innovation Fund.
Thanks to our partners at Third Sector Capital Partners and the members of our Empowering Families and AISP Learning Community for sharing experiences and insights about data privacy and engagement.
About Future of Privacy Forum
Future of Privacy Forum is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.
About Actionable Intelligence for Social Policy
Actionable Intelligence for Social Policy is an initiative that focuses on the development, use, and innovation of integrated data systems (IDS) for policy analysis and program reform. AISP encourages social innovation and social policy experimentation so government can work better, smarter, and faster. Learn more about AISP by visiting www.aisp.upenn.edu.
FPF Privacy Book Club – The Known Citizen: A History of Privacy in Modern America (December 5, 2018)
The FPF Privacy Book Club gives members the opportunity to read a wide range of books on privacy, data, ethics, and other important data-related topics, including academic works, and to hold an open discussion of the selected literature.
We are excited to share that our readers chose The Known Citizen: A History of Privacy in Modern America by Professor Sarah E. Igo as the popular favorite. We are thrilled that Professor Igo will join us for the December book club to introduce her book and answer questions.
Please join us on Wednesday, December 5, at 2:00 pm (EST) for the next FPF Privacy Book Club. If you are an existing member of the Book Club, you will receive the virtual conference dial-in information near the discussion date. You can join the Book Club here. Please feel free to forward this sign-up link to friends who may also be interested.
The Privacy Expert's Guide to AI And Machine Learning
Today, FPF announces the release of The Privacy Expert’s Guide to AI and Machine Learning. This guide explains the technological basics of AI and ML systems at a level of understanding useful for non-programmers, and addresses certain privacy challenges associated with the implementation of new and existing ML-based products and services.
Advanced algorithms, machine learning (ML), and artificial intelligence (AI) are appearing across digital and technology sectors from healthcare to financial services, in contexts ranging from voice-activated digital assistants and traffic routing to identifying at-risk students and generating purchase recommendations on online platforms. Embedded in new technologies like autonomous cars and smartphones to enable cutting-edge features, AI is equally being applied to established industries such as agriculture and telecom to increase accuracy and efficiency. Moving forward, machine learning is likely to be the foundation of many of the products and services in our daily lives, becoming unremarkable in much the same way that electricity faded from novelty to background during the industrialization of modern life 100 years ago.
Understanding AI and its underlying algorithmic processes presents new challenges for privacy officers and others responsible for data governance in companies ranging from retailers to cloud service providers. In the absence of targeted legal or regulatory obligations, AI poses new ethical and practical challenges for companies that strive to maximize consumer benefits while preventing potential harms.
For privacy experts, AI is more than just Big Data on a larger scale. Artificial intelligence is differentiated by its interactive qualities – systems that collect new data in real time via sensory inputs (touchscreens, voice, video, or camera inputs) and adapt their responses and subsequent functions based on those inputs. The unique features of AI and ML include not just big data’s defining characteristic of tremendous amounts of data, but also the additional uses and, most importantly, the multi-layered processing models developed to harness and operationalize that data. AI-driven applications offer beneficial services and research opportunities, but pose potential harms to individuals and groups when not implemented with a clear focus on protecting individual rights and personal information. The scope of these systems’ impact makes it critical that associated privacy concerns be addressed early in the design cycle, as lock-in effects make it more difficult to modify harmful design choices later. Design must also include ongoing monitoring and review, as these systems are built to morph and adapt over time. Rigorous privacy reviews must occur for existing systems as well, as design decisions entrenched in current systems shape the future updates built upon them.
As AI and ML programs are applied across new and existing industries, platforms, and applications, policymakers and corporate privacy officers will want to ensure that individuals are treated with respect and dignity, and that they retain the awareness, discretion, and controls necessary to manage their own information.
Learning from Europe but looking beyond for privacy law
FPF’s CEO, Jules Polonetsky, recently published an opinion piece in The Hill that discusses the need for comprehensive federal privacy legislation. Jules explains:
Any legislation should also consider the increasingly sophisticated privacy tools that are emerging, including differential privacy to measure privacy risk, homomorphic encryption that can enable privacy safe data analysis, and many new privacy compliance tools that are helping companies better manage data. A law that will stand the test of time and successfully protect privacy rights while enabling valuable uses of data should include mechanisms to incentivize such technology measures.
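To make the first tool Jules mentions concrete: differential privacy protects individuals by adding calibrated random noise to aggregate query results, so that any one person's presence in a dataset has only a bounded effect on what is released. The sketch below is a hypothetical illustration (not drawn from the op-ed or any FPF tool) of the classic Laplace mechanism applied to a counting query.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via the inverse-CDF method."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count, epsilon):
    """Answer a counting query with epsilon-differential privacy.

    A counting query changes by at most 1 when one person is added or
    removed (sensitivity 1), so Laplace noise with scale 1/epsilon suffices.
    """
    return true_count + laplace_noise(1.0 / epsilon)

# Example: release the number of records matching some condition.
noisy = dp_count(100, epsilon=1.0)
```

Smaller values of epsilon add more noise and give stronger privacy guarantees; the noisy answer typically remains accurate enough for aggregate analysis, which is precisely the privacy-versus-utility trade-off such tools help legislators and companies measure.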
Today, researchers published a paper detailing how governments can use public genetic databases to identify criminal suspects. These activities raise real questions about when it’s appropriate for law enforcement to analyze genetic information, and how best to protect individuals whose genetic data has been analyzed as part of a commercial service, but who are not accused of a crime.
FPF recently published Privacy Best Practices for Consumer Genetic Testing Services. The document includes strong protections for genetic data, particularly with regard to government access: law enforcement should obtain a warrant before seeking the disclosure of genetic data from companies; firms should demand valid legal process before disclosing genetic data and publish annual transparency reports. If government representatives are deviating from this approach, lawmakers or courts should impose clear, common-sense restrictions.
The sort of law enforcement search described in the paper would not be permitted by the FPF Best Practices. The search involves uploading genetic information discovered at crime scenes to an online database to identify an individual or an individual’s relatives. We require that companies only process genetic information uploaded with an individual’s permission; a crime scene upload necessarily occurs without the unknown subject’s permission. Further, leading companies require legal process before they will disclose genetic information to the government and have vigorously pushed back against law enforcement access to genetic data, many times declining to provide data in response to what they determined to be inappropriate requests. It is hard to imagine judges issuing warrants for the sort of general searches contemplated in the paper.
It’s worth noting that the public database at the center of the current discussion – GEDMatch – explicitly tells users that their genetic data will be shared with others without granular consent and subject to government access without a warrant. GEDMatch’s policies conflict with FPF’s Best Practices and are out of step with leading companies, which restrict this kind of access. GEDMatch’s practices are also different in another key way: the service permits users to upload digital genetic profiles. Prominent companies do not typically provide for such uploads, making crime scene genetic data less susceptible to matching without valid legal process.
There are clear benefits of using genetic data to help consumers better understand their health and ancestry. Genetic data, properly obtained and analyzed, can also help law enforcement solve crimes and improve public safety. However, unfettered law enforcement access to genetic information on commercial services would present substantial privacy risks. FPF’s Best Practices articulate a framework that can prevent many of these risks while preserving the public safety value of limited, narrow genetic searches predicated on probable cause and conducted pursuant to appropriate process. As we move forward, it is worth considering additional measures, including restrictions on government activities, technical safeguards, or other steps, that could bolster trust and safety.