Privacy Best Practices for Consumer Genetic Testing Services

The Future of Privacy Forum, along with leading consumer genetic and personal genomic testing companies 23andMe, Ancestry, Helix, MyHeritage, and Habit, released Privacy Best Practices for Consumer Genetic Testing Services. These companies have been joined by African Ancestry, FamilyTreeDNA,* and Living DNA in supporting the Best Practices as a clear articulation of how leading firms can build trust with consumers.

Consumer genetic tests, which are marketed to consumers by private companies, have empowered consumers to learn more about their biology and to take a proactive role in their health, wellness, ancestry, and lifestyle. When consumers expressly grant permission and provide informed consent, they can choose to share their genetic data with responsible researchers, helping to drive important breakthroughs in biomedical research, healthcare, and personalized medicine.

The Best Practices establish standards for genetic data generated in the consumer context, recommending that companies’ privacy practices include, among other protections, express consent, transparency reports, and strong security requirements.

Supporters of the Best Practices include Ancestry, 23andMe, Helix, MyHeritage, Habit, African Ancestry, FamilyTreeDNA,* and Living DNA.

The Association for Molecular Pathology (AMP), the premier global molecular diagnostic professional society, recently revised its official position on consumer genomic testing. Among other conditions, AMP will support consumer genetic testing if test providers adhere to FPF’s Best Practices; see AMP’s Announcement and Position Statement.

Should you have questions about this report, please contact the lead author, Carson Martinez, at [email protected]

*In January 2019, Family Tree DNA revealed an agreement with the FBI that conflicts with FPF’s Best Practices. FPF immediately removed Family Tree DNA as a supporter.


Press


Future of Privacy Forum and Leading Genetic Testing Companies Announce Best Practices to Protect Privacy of Consumer Genetic Data

FOR IMMEDIATE RELEASE

July 31, 2018

Contact:

Carson Martinez, Health Policy Fellow, [email protected]

Melanie Bates, Director of Communications, [email protected]

Future of Privacy Forum and Leading Genetic Testing Companies Announce Best Practices to Protect Privacy of Consumer Genetic Data

23andMe, Ancestry, Helix, and other leading consumer genetic and personal genomic testing companies back strong protections, including express consent, transparency reports, and strong security requirements

Washington, DC – Today, Future of Privacy Forum, along with leading consumer genetic and personal genomic testing companies 23andMe, Ancestry, Helix, MyHeritage, and Habit, released Privacy Best Practices for Consumer Genetic Testing Services. The Best Practices provide a policy framework for the collection, protection, sharing, and use of Genetic Data generated by consumer genetic testing services. These services are commonly offered to consumers for testing and interpretation related to ancestry, health, nutrition, wellness, genetic relatedness, lifestyle, and other purposes.

“Supporting strong and transparent industry-wide guidelines that provide people with confidence that companies in this growing field will protect their privacy is critical to the continued success of this nascent business sector,” said Jules Polonetsky, CEO, FPF. “That is why we have been working with the industry leaders for the past year to develop privacy and data principles that we and our peers in the personal genomics industry can embrace. We believe that these best practices are essential to engendering trust so that all people can safely access their genetic information.”

Consumer genetic tests, which are marketed to consumers by private companies, have empowered consumers to learn more about their biology and to take a proactive role in their health, wellness, ancestry, and lifestyle. When consumers expressly grant permission and provide informed consent, they can choose to share their genetic data with responsible researchers, helping to support a better understanding of the role of genetic variation in our ancestry, health, well-being, and much more.

“Protecting our customers’ privacy is Ancestry’s highest priority,” said Eric Heath, Chief Privacy Officer, Ancestry. “As a leader in the direct to consumer DNA testing market, Ancestry recognizes the important role that our industry can play in protecting the privacy and data of all customers. We understand the sensitive nature of the information our industry handles and our responsibility as stewards. We are grateful for the Future of Privacy Forum’s leadership in working to get these Best Practices drafted, vetted and aligned, and look forward to seeing these Best Practices broadly adopted across the industry.”

Today, more consumer genetic testing services are available than ever before, prices for testing are becoming increasingly affordable, and the speed at which testing is completed is accelerating. As the industry continues to expand and the technology becomes more accessible, it is vital that the industry acknowledge and address the risks posed to individual privacy when Genetic Data is generated in the consumer context.

“We’re seeing such a rapid progression of the industry, owing to both the advances in technology and the increasing accessibility of genomic information for personal and research use,” said Elissa Levin, Head of Policy and Clinical Affairs, Helix. “We think it’s essential to take a leadership position to continue to grow the industry responsibly, in ways that keep consumer safety at the forefront of action, and pave the way for better experiences and learnings that ultimately help people lead better lives.”

“Everyone who participates in a genetic testing service deserves to have their information protected, no matter which service or product they use. It’s imperative that all consumer genetic testing companies adhere to comprehensive privacy protections, and clearly communicate their policies to consumers in a transparent manner,” said Kate Black, Global Privacy Officer, 23andMe. “With over a decade of experience as a leader in consumer genetic testing, we’ve built incredibly strong privacy practices. We are happy to now work with the industry and an organization like the FPF to solidify best practices, and help ensure proper protection of consumers’ genetic information more broadly.”

The Best Practices are also supported by other consumer genetic testing companies including African Ancestry and FamilyTreeDNA.*

The Best Practices establish standards for genetic data generated in the consumer context, recommending that companies’ privacy practices include, among other protections, express consent, transparency reports, and strong security requirements.

“The Best Practices recognize that Genetic Data is sensitive information that warrants a high standard of privacy protection,” said Carson Martinez, Policy Fellow, FPF. “Genetic Data may be used to identify predispositions and potential risk for future medical conditions; may reveal information about the individual’s family members, including future children; may contain unexpected information or information of which the full impact may not be understood at the time of collection; and may have cultural significance for groups or individuals. It is therefore critical that the appropriate level of privacy protections is implemented.”

In producing the Best Practices, FPF and privacy leaders at the companies incorporated input from the FTC, a wide variety of genetics experts, and privacy and consumer advocates.

To request comment from FPF or the leading consumer genetic testing companies involved with the Best Practices released today, please use the contact information above.

###

The Future of Privacy Forum is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.


*In January 2019, Family Tree DNA revealed an agreement with the FBI that conflicts with FPF’s Best Practices. FPF immediately removed Family Tree DNA as a supporter.

PrivacyNews.TV

PrivacyNewsTV videos can be viewed below. You can navigate here via privacynews.tv and watch the videos on Facebook or on FPF’s YouTube channel.

https://www.facebook.com/FutureofPrivacy/videos/10156312557271327/

July 31, 2018

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum and Carson Martinez, Health Policy Fellow, Future of Privacy Forum

Topic: Consumer Genetics – Getting Privacy Right!


https://www.facebook.com/FutureofPrivacy/videos/vl.126800454699972/10156294134311327/?type=1

July 23, 2018

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Ethical Research and Dropbox


https://www.facebook.com/FutureofPrivacy/videos/10156280349601327/

July 17, 2018

Featuring: Tyler Park, Education Privacy Project Fellow and Erika Ross, Education Privacy Project Communications Associate

Topic: Protecting Student Privacy and School Safety


https://www.facebook.com/FutureofPrivacy/videos/10156266195201327/

July 11, 2018

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Federal Privacy Legislation and Book Recommendations


https://www.facebook.com/FutureofPrivacy/videos/10156185218491327/

June 7, 2018

Featuring: Amelia Vance, Sara Collins, and Erika Ross

Topic: Education Edition


https://www.facebook.com/FutureofPrivacy/videos/10156120400156327/

May 9, 2018

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum and Lauren Smith, Policy Counsel, Future of Privacy Forum

Topic: PrivacyNewsTV: California and Autonomous Car Data, Should Robots Insist You Say Please?


https://www.facebook.com/FutureofPrivacy/videos/vl.126800454699972/10156085638336327/?type=1

April 24, 2018

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Psychographic Election Targeting of You


March 13, 2018

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Does GDPR Apply to You? 


February 28, 2018

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Are Kids Addicted to Technology


February 14, 2018

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Messenger for Kids


https://www.facebook.com/FutureofPrivacy/videos/10155807821736327/

January 11, 2018

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum and Lauren Smith, Policy Counsel, Future of Privacy Forum

Topic: Automated Decision Making


https://www.facebook.com/FutureofPrivacy/videos/10155784905446327/

January 2, 2018

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Monitoring online content, no hate speech, no bullying, no porn, no spam!


https://www.facebook.com/FutureofPrivacy/videos/10155729990406327/

December 12, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: PrivacyNewsTV: Hi Alexa, OK Google, Hello Siri, Hi Bixby


https://www.facebook.com/FutureofPrivacy/videos/10155676241611327/

November 21, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Location Privacy: We know where you are shopping on Black Friday. And we know that some of you are NOT going home, because of painful political discussions.


https://www.facebook.com/FutureofPrivacy/videos/10155658077311327/

November 14, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Facebook does NOT want your nude pictures PLUS academic research using corporate data


https://www.facebook.com/FutureofPrivacy/videos/10155606432891327/

October 25, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Artificial Intelligence – What are companies and top researchers doing to ensure AI is a force for good?


https://www.facebook.com/FutureofPrivacy/videos/10155590876421327/

October 19, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum and Stacey Gray, Policy Counsel, Future of Privacy Forum

Topic: Can an individual track you leveraging mobile ads? And what can you do to prevent this?


https://www.facebook.com/FutureofPrivacy/videos/10155569384541327/

October 11, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: To Be Honest App — And Facebook, Google and Russian election ads.


https://www.facebook.com/FutureofPrivacy/videos/10155533699896327/

September 27, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Creepy ad or lifesaving ad? (and more)


https://www.facebook.com/FutureofPrivacy/videos/10155497506511327/

September 14, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: The Equifax breach – Are you thinking about your tax return?!


https://www.facebook.com/FutureofPrivacy/videos/10155454726251327/

August 29, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: City of Seattle + Best Papers


https://www.facebook.com/FutureofPrivacy/videos/10155306950076327/

July 11, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum and Lauren Smith, Policy Counsel, Future of Privacy Forum

Topic: Data and the Future of Mobility


https://www.facebook.com/FutureofPrivacy/videos/10155266442591327/

June 30, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum and Special Guest!

Topic: Do Teens Care About Privacy? Let’s Find Out!


https://www.facebook.com/FutureofPrivacy/videos/10155216468911327/

June 16, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Alexa, bring me a dozen eggs, right away!


https://www.facebook.com/FutureofPrivacy/videos/10155194363576327/

June 9, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: A sherpa for FERPA


https://www.facebook.com/FutureofPrivacy/videos/10155178745826327/

June 5, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Apple announces Intelligent Tracking Protection for Safari


https://www.facebook.com/FutureofPrivacy/videos/10155147764416327/

May 26, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Homomorphic Encryption and Online Ads


https://www.facebook.com/FutureofPrivacy/videos/10155086415501327/

May 5, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum and Brenda Leong, Senior Counsel & Director of Strategy, Future of Privacy Forum

Topic: Should you use your smartphone fingerprint sensor? Yes!


https://www.facebook.com/FutureofPrivacy/videos/10155065853646327/

April 28, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Privacy and the Bible


https://www.facebook.com/FutureofPrivacy/videos/10154996230026327/

April 7, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum and Kelsey Finch, Policy Counsel, Future of Privacy Forum

Topic: smart, smart, and not so smart


https://www.facebook.com/FutureofPrivacy/videos/10154911077581327/

March 10, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum and Stacey Gray, Policy Counsel, Future of Privacy Forum

Topic: Did the CIA hack YOUR tv?


https://www.facebook.com/FutureofPrivacy/videos/10154866998096327/

February 23, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Announcing the Privacy Calendar! 


https://www.facebook.com/FutureofPrivacy/videos/10154822448806327/

February 7, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Law Enforcement and Your Email 


https://www.facebook.com/FutureofPrivacy/videos/10154799162131327/

January 30, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Connected Car Privacy Guide 


https://www.facebook.com/FutureofPrivacy/videos/10154726733796327/

January 6, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum and Lauren Smith, Policy Counsel, Future of Privacy Forum

Topic: NYC Taxi Commission Don’t Track Me 


https://www.facebook.com/FutureofPrivacy/videos/10154721508381327/

January 4, 2017

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Alexa, do you have any evidence? 


https://www.facebook.com/FutureofPrivacy/videos/10154626694701327/

December 1, 2016

Featuring: Melanie Bates, Director of Communications, Future of Privacy Forum and Amelia Vance, Policy Counsel, Future of Privacy Forum

Topic: Parents Support School Tech and Data, But Want Privacy Assurances


https://www.facebook.com/FutureofPrivacy/videos/10154604306296327/

December 1, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Connected Dolls, Talking Dinosaurs and Battling Robots


https://www.facebook.com/FutureofPrivacy/videos/10154515864851327/

November 2, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Security Tip: The One Thing You Should Do Right Now!


https://www.facebook.com/FutureofPrivacy/videos/10154456104011327/

October 14, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: More on Apple Privacy plus EU Update


https://www.facebook.com/FutureofPrivacy/videos/10154434383571327/

October 7, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: FPF 8 Years: A Privacy Agenda from the Obama Administration >>> the new Administration


https://www.facebook.com/FutureofPrivacy/videos/10154366875186327/

September 13, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: iOS Limits Ad Tracking


https://www.facebook.com/FutureofPrivacy/videos/10154347428331327/

September 6, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Post Labor Day Privacy Thoughts


https://www.facebook.com/FutureofPrivacy/videos/10154334508326327/

September 2, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Don’t Get Scammed! Plus Privacy Papers for Policymakers 


https://www.facebook.com/FutureofPrivacy/videos/10154313591811327/

August 26, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: WhatsApp Facebook What’s Up? 


https://www.facebook.com/FutureofPrivacy/videos/10154274691436327/

August 12, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Facebook, Ad-Blocking, and Privacy 


https://www.facebook.com/JulesPolonetsky/videos/vb.500456196/10154331373841197/?type=2&theater

July 20, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Dino and Cognitoys


https://www.facebook.com/FutureofPrivacy/videos/10154055684416327/

May 16, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Trending on Facebook: Ben Carson, Penis Transplant, and Supreme Court


https://www.facebook.com/FutureofPrivacy/videos/10154049609711327/

May 13, 2016

Featuring: Staff, Future of Privacy Forum

Topic: FPF Tech Lab


April 22, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: FTC, Cookies, and PII


https://www.facebook.com/JulesPolonetsky/videos/10154105375451197/

April 21, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Internet of Things Privacy: Alexa


https://www.facebook.com/JulesPolonetsky/videos/10154102986011197/

April 20, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Moderating Extremists on Social Media


https://www.facebook.com/JulesPolonetsky/videos/10154100837401197/

April 19, 2016

Featuring: Jules Polonetsky, CEO, Future of Privacy Forum

Topic: Outsourcing the Right to Be Forgotten


Policy Brief: European Commission’s Strategy for AI, explained

The European Commission published a Communication on “Artificial Intelligence for Europe” on April 24, 2018. It highlights the transformative nature of AI technology for the world and calls for the EU to lead the way in developing AI within a fundamental rights framework. AI for good and for all is the motto the Commission proposes. The Communication can be summed up as announcing concrete funding for research projects and clear social goals, with more thinking to come about everything else.

The Communication lays out proposed actions for the coming years, fully taking into account that cooperation with Member States and at EU level is crucial. Several Member States have already developed AI strategies. France presented its national strategy for AI on March 29, and President Emmanuel Macron has been quite vocal about it. Germany has set up a platform on learning systems to enable a strategic dialogue between academia, industry, and the government, and has put forward a report on the ethics of automated and connected driving. Finland has put forward a strategy as well. The Commission does not want to see a fragmented Single Market when it comes to AI, and the Communication makes clear that this was one of the main reasons for taking action at this stage.

The Strategy proposed by the Commission contains several streams of action, of which the major ones are:

1) financial support, including for the creation of an “AI Toolbox”;
2) initiatives to make data available for researchers;
3) dealing with the impact of AI on the EU workforce;
4) ensuring both legal and ethical frameworks for the development of AI; and
5) establishing partnerships with Member States.

Each of them is briefly detailed below.

1) Financial support, including for the creation of an “AI Toolbox”

The Commission pledged 1.5 billion euros (1.75 billion dollars) for 2018-2020 for research and innovation in AI technologies under the Horizon 2020 program, primarily to support applications that address societal challenges in sectors such as health, transport, and agrifood. Through public-private partnerships, this amount is expected to increase by 2.5 billion euros over the same period. The Commission will also support the strengthening of AI research excellence centers. One ambitious target is to stimulate the uptake of AI across Europe through a toolbox for potential users, with a focus on small and medium-sized enterprises, non-tech companies, and public administrations.

The toolbox will include:

In addition, the Commission aims to stimulate more private investments in AI under the European Fund for Strategic Investments (at least 500 million euros in 2018-2020).

2) Initiatives to make data available for researchers: a new support centre for data sharing

Acknowledging that data is essential for the development of AI, the Commission wants to act to grow the European data space. It is important to mention, however, that the focus of the data-sharing initiatives is primarily on non-personal data and public sector information (traffic, meteorological, economic, and financial data, or business registers). As for personal data and privately held data, the Commission emphasizes that any sharing initiative must ensure full respect for legislation on the protection of personal data.

To support making data available in a responsible way, the Commission published, together with the Communication, a series of new initiatives and guidance:

3) Dealing with the impact of AI on the EU workforce

Preparing society as a whole for the impact of AI is the first main challenge identified by the Communication in this area. The aim is to develop programs that will help all Europeans gain basic digital skills, as well as skills that are complementary to, and cannot be replaced by, any machine, such as critical thinking, creativity, or management.

The second challenge is the impact of AI on jobs and workers. The Commission announced that the EU will focus its efforts on helping workers in jobs that are likely to be transformed the most, or to disappear, due to automation, robotics, and AI. This also means ensuring access for all citizens, including workers, to social protection. The Commission intends to set up dedicated retraining schemes, in connection with the Blueprint on sectoral cooperation on skills, for professional profiles that are at risk of being automated, with financial support from the European Social Fund.

The third challenge is training more specialists in AI, including by attracting talent from abroad. The Commission estimates that there are about 350,000 vacancies for such professionals in Europe, pointing to a significant skills gap. To this end, the European Institute of Innovation and Technology will integrate AI across the curricula of the education courses it supports.

4) Ensuring both legal and ethical frameworks for the development of AI

The Communication does not announce any new legislative proposal, but it emphasizes how the proposals currently debated in Brussels (the Regulation on the free flow of non-personal data, the ePrivacy Regulation, and the Cybersecurity Act) are relevant for AI, and that they need to be adopted as soon as possible so that citizens and businesses alike can trust the technology they interact with. A revision of the Product Liability Directive is announced as a possibility, but the only concrete step mentioned for the time being is an analysis of whether the current provisions are fit to deal with AI technology.

In addition, the Commission highlights the role the GDPR plays in regulating the use of personal data for AI-related purposes, including individuals’ right to receive meaningful information about the logic involved in automated decision-making and their right not to be subject to solely automated decision-making except in limited circumstances. The Commission intends to support national and EU-level consumer organisations and data protection supervisory authorities in building an understanding of AI-powered applications, with the input of the European Consumer Consultative Group and of the European Data Protection Board.

As for ethics, the Communication announces that “AI Ethics Guidelines” will be adopted by the end of the year, in collaboration with “all relevant stakeholders”. The Guidelines are expected to address issues such as the future of work, fairness, safety, security, social inclusion, and algorithmic transparency. To this end, the Commission has set up the High-Level Expert Group on AI and the European AI Alliance.

5) Establishing partnerships with Member States

Finally, the Commission focuses on establishing partnerships at EU level. By the end of the year, the Commission will work with Member States on a coordinated plan to maximise the impact of investments at EU and national levels, to exchange information on how governments can best prepare Europeans for the AI transformation, and to address legal and ethical considerations. At the same time, the Commission plans to systematically monitor AI-related developments at Member State level.

It is relevant to note that a week before the Commission published its strategy on AI, 25 EU Member States signed a Declaration of Cooperation on Artificial Intelligence; they were joined one month later by the rest of the Member States. Currently, all 28 EU Member States are signatories to the Declaration.

For more information on the Future of Privacy Forum’s work on AI, or if you have questions related to this Policy Brief, contact:

Brenda Leong, Senior Counsel and Director of Strategy at [email protected]

Gabriela Zanfir-Fortuna, PhD, Policy Counsel at [email protected]

FPF Publishes Report Supporting Stakeholder Engagement and Communications for Researchers and Practitioners Working to Advance Administrative Data Research

The ADRF Network is an evolving grassroots effort among researchers and organizations seeking to collaborate on improving access to, and promoting the ethical use of, administrative data in social science research. As a supporter of evidence-based policymaking and research, FPF has been an integral part of the Network since its launch and has chaired the Network’s Data Privacy and Security Working Group since November 2017.

This summer, the ADRF Network published its first set of working group reports to help advance standards and best practices for administrative data researchers and practitioners. The reports address priority issues in administrative data research: Data Quality and Standards; Data Sharing Governance and Management; and Communicating about Data Privacy and Security. The working groups engaged over 30 experts from universities, government agencies, and other institutions.

FPF CEO Jules Polonetsky and FPF Policy Counsel Kelsey Finch led the work on Communicating about Data Privacy and Security as part of FPF’s ongoing efforts to support proactive, privacy-focused stakeholder engagement and communications around administrative data research. While strong privacy safeguards are the foundation of any administrative data research, effectively communicating how and why administrative data are being used and protected, and giving stakeholders meaningful input into the research process, are essential to maintaining public trust.

In the report, we identify the “why, when, who, and how” of communicating about data privacy and security while doing administrative data research. The report highlights the importance of engaging a diversity of stakeholders at multiple stages in the research lifecycle, and includes an initial matrix model building on the GovLab’s People-Led Innovation framework to ensure active engagement. We also apply the model to a hypothetical research project to further inspire researchers and practitioners to think creatively about meaningful opportunities for stakeholder engagement.

Publication of these reports is a pivotal step toward developing industry-wide best practices for researchers and practitioners working to advance administrative data research. We believe that stakeholder engagement and communicating about data privacy and security are crucial to the future success of administrative data research.

Full reports can be downloaded here.

FPF is thankful to the Alfred P. Sloan Foundation for making this work possible; to Monica King for her leadership of the ADRF Network; and to our fellow Data Privacy & Security Working Group members for their thoughtful contributions. Working Group participants included: Elizabeth Dabney, Data Quality Campaign; Tanvi Desai, Data Strategy Consultant; Valerie Holt, ECDataWorks; Della Jenkins, Actionable Intelligence for Social Policy; Stefaan Verhulst, GovLab; and Evan White, California Policy Lab.

The Top 10: Student Privacy News (Feb – July 2018)

The Future of Privacy Forum tracks student privacy news very closely and shares relevant news stories with our newsletter subscribers. Approximately every month, we post “The Top 10,” a blog with our top student privacy stories. This blog is cross-posted at studentprivacycompass.org.

The Top 10

1. School Safety and Student Privacy, Part 1: how do we prevent these tragedies while protecting student privacy?

The horrific shooting at Marjory Stoneman Douglas High School in February has highlighted numerous student privacy issues. As I discussed in FPF’s statement to the Federal Commission on School Safety, there are many legitimate reasons, including attempting to prevent acts of violence, for schools to surveil students, but there are also significant privacy and equity concerns that must be considered. In the wake of the shooting:

2. School Safety and Student Privacy, Part 2: does the law need to change?

President Trump’s school security plan called for a review of FERPA. The Commission held a meeting on “Curating a Healthier & Safer Approach: Issues of Mental Health and Counseling for Our Young” on July 11, and FPF’s Vice President of Policy, John Verdi, was invited to testify.

Other Related News Stories:

3. Facebook, Cambridge Analytica, and the Anti-Tech Wave

Privacy has been in the headlines non-stop over the past couple of months with Facebook and Cambridge Analytica, and numerous news articles asked stakeholders what all of this means for student privacy and schools, especially around Mark Zuckerberg’s testimony to Congress. In the wake of those hearings, schools across the country are assessing their own data policies. Facebook has announced it will not seek any additional parental consent from teenage users in the United States. Common Sense Media filed a request with the FTC to investigate how Facebook’s “data mishandling” in the wake of Cambridge Analytica has specifically impacted teenagers.

In the meantime, the tech backlash has continued to impact ed tech: Wired reports that “It’s time for a serious talk about the science of tech addiction” and its impact on well-being and health; EdWeek reports that “Ninety-five percent of principals said students spend too much time on digital devices when they’re not in school”; two higher ed professors write that “There Are No Guardrails on Our Privacy Dystopia” in Motherboard; and “screen time” also continues to be a big topic of debate on blogs and in the news, with implications for privacy.

4. New Department of Education guidance says that parents, not minor students, must consent to college admissions pre-test surveys and data sharing.

FPF explained the guidance here.

5. Privacy and Research

6. Senators Blumenthal and Daines reintroduced their 2015 student privacy bill aimed at vendors, the SAFE KIDS Act, in March.

Meanwhile, in May, the House Education and Workforce Committee held its fourth hearing on student privacy in three years. FPF’s Amelia Vance was invited to testify. You can watch the hearing here, read the testimony of the speakers here, or read the write-up in EdScoop.

7. Can students or parents record what happens in school?

Three news stories raised this question:

8. Fordham University’s Center on Law and Information Policy (CLIP) released the study, “Transparency and the Marketplace for Student Data,” which examines the practices of data brokers who buy and sell information about students.

In the study, the authors describe existing privacy laws, map the commercial marketplace, and describe the challenges of understanding how data about students is collected and used. FPF released a blog responding to the concerns raised in the study, and expanding on how various federal and state privacy laws – from FERPA to FCRA to PPRA – may or may not apply to the practices described.

9. NAICU has come out in favor of the “Right to Know Before You Go” Act.

However, it remains opposed to the College Transparency Act, which is favored by most public colleges, advocates, and policymakers.

10. Common Sense Media released a report on the “2018 State of EdTech Privacy.”

Common Sense Media also recently introduced simpler privacy ratings for education apps to make “privacy and security more accessible.”

Just for Fun

There is a fantastic new video from the Utah State Board of Education on the Other FERPA Exceptions (see their original break-out hit here!).

FPF Testifies Before Federal Commission on School Safety

By Amelia Vance, Sara Collins, Tyler Park, and Erika Ross

John Verdi, the Future of Privacy Forum’s Vice President of Policy, testified today before the Federal Commission on School Safety at its meeting, “Curating a Healthier & Safer Approach: Issues of Mental Health and Counseling for Our Young.” He recommended that, rather than changing current federal student privacy law, the Commission should explore opportunities to educate school officials and other stakeholders regarding the existing legal authorities for sharing data to support school safety.

He provided three concrete recommendations that the Commission could follow to improve student safety and safeguard privacy:

  1. Be mindful of the full range of privacy risks and harms, as well as the importance of privacy safeguards, as it considers options to improve school safety;
  2. Support efforts to better educate and communicate with stakeholders regarding existing legal authorities that permit data sharing to promote health and safety within a framework that mitigates privacy risks to students; and
  3. Call for neutral, expert analysis of empirical data regarding the nature, extent, and leading causes of the key privacy risks and safety risks facing students and schools.

Mr. Verdi stated that privacy risks pose particular challenges when they arise in the context of children’s or students’ personal information. Physical harm and loss of liberty are especially egregious when the victim is a child. Financial fraud and identity theft increasingly target young Americans, who are often unable to discover or combat the crimes until years later. Children are also susceptible to specialized schemes, including medical identity theft, that can create substantial health risks when multiple individuals’ medical records are merged as a result of the crime.

FERPA already contains a specific exception that permits information to be shared to protect the health and safety of students, whether the child in question is a threat to themselves or to others. In 2008, the Department of Education amended the FERPA regulations to remove the language requiring strict construction of this exception and to permit disclosure when an articulable and significant safety threat exists. The Department assured school officials that it would support a disclosure if there was a “rational basis for the school’s determination” at the time it was made.

The 2008 amendments adopted a “totality of the circumstances” test and the “rational basis” approach to Department review of school officials’ decisions. The “totality of the circumstances” test authorizes disclosure of protected student information when the totality of the circumstances suggest that disclosure would mitigate a health or safety threat; this test broadened schools’ authority, replacing the previous “strict construction” standard, which suggested that disclosure was only authorized when strictly necessary to preserve health and safety. The “rational basis” approach assures districts that the Department does not second-guess disclosure decisions from a perspective of perfect hindsight; instead, the Department will view assertion of the health and safety exception as appropriate if the district identifies an articulable threat that serves as the rational basis for the disclosure.

Mr. Verdi testified that untethering disclosure authority from the “totality of the circumstances” or “rational basis” tests would necessarily increase privacy risks to students. He also noted that a dramatic broadening of authority could increase sharing of student information in a way that overwhelms administrators with data, casts suspicion on students who show no signs of violent behavior, and fails to promptly identify individuals who pose genuine threats to school safety. In particular, he mentioned that students with mental illness can be deterred from seeking help if they fear that their privacy will not be protected.

Mr. Verdi advocated that the Commission instead focus on educating school officials and other stakeholders regarding the existing legal authorities for sharing data to support school safety. The Department of Education’s Privacy Technical Assistance Center (PTAC) has been vital for schools seeking practical guidance on FERPA. PTAC could publish guidance, hold training sessions, and provide additional technical assistance on this issue.

Finally, Mr. Verdi recommended that the Commission call for further research into the nature, extent, and leading causes of the key privacy risks and safety risks facing students and schools.

At a previous Federal Commission on School Safety listening session, FPF’s Amelia Vance, Director of the Education Privacy Project, spoke about the balance between schools’ obligation to protect student privacy and their duty to provide a safe learning environment. She touched on the importance of schools being transparent about their interactions with law enforcement, and said that data sharing should occur only when there is a serious threat of violence, not for a minor infraction of a school code.

FPF is a non-profit organization focused on consumer privacy issues. FPF primarily equips and convenes key stakeholders to find actionable solutions to the privacy concerns raised by the speed of technological development. FPF’s Education Privacy Project works to ensure student privacy while supporting technology use and innovation in education that can help students succeed. Among other initiatives, FPF maintains the education privacy resource center website, FERPA|Sherpa, and co-founded the Student Privacy Pledge.

Read Mr. Verdi’s full testimony here.