California SB 980 Would Codify Many of FPF’s Best Practices for Consumer Genetic Testing Services, but Key Differences Remain
Authors: John Verdi (Vice President of Policy) and Katelyn Ringrose (Christopher Wolf Diversity Law Fellow)
In July 2018, the Future of Privacy Forum released Privacy Best Practices for Consumer Genetic Testing Services. FPF developed the Best Practices following consultation with technical experts, regulators, leading consumer genetic and personal genomic testing companies, and civil society. The FPF Best Practices include strict standards for the use and sharing of genetic information generated in the consumer context. Companies that pledged to follow the guidelines, including Ancestry, 23andMe, and Helix, promised to:
- provide safeguards for how genetic information is collected, used, shared, and retained;
- implement consent requirements for the initial collection and certain subsequent disclosures of genetic information;
- guarantee consumer rights to access, correction, and deletion;
- ban sharing genetic information absent consent or legal process; and
- implement strong data security protections and privacy by design principles.
California lawmakers are currently considering SB 980 (the “Genetic Information Privacy Act”). SB 980 would establish obligations for direct-to-consumer genetic testing companies and others that collect or process genetic information. If passed by the legislature and approved by the Governor, the bill would become effective on January 1, 2021.
Many of SB 980’s provisions align closely with FPF’s Best Practices, including the bill’s emphasis on consumers’ rights to notice, choice, and transparency. Leading direct-to-consumer genetic testing companies are already obliged to follow the Best Practices, as they have made public commitments that are enforceable by the Federal Trade Commission and state Attorneys General. SB 980’s provisions would extend these requirements to all covered entities that do business in California.
Some of SB 980’s provisions diverge from the FPF Best Practices. For example, FPF’s Best Practices and SB 980 both require companies to obtain opt-in consent before they use DNA test results for marketing, but proposed amendments to SB 980 would further require companies to provide consumers with an opportunity to opt out of contextual marketing – ads placed on web pages and apps based on page content rather than sensitive personal information. SB 980’s treatment of contextual advertising is also inconsistent with the California Privacy Rights Act of 2020 (CPRA) – the comprehensive privacy ballot initiative that would govern the use of much sensitive health data and would not require companies to provide an opt-out for non-personalized, contextual advertising. In addition, SB 980 diverges from FPF’s Best Practices regarding government access to DNA information, with SB 980 preserving an option for companies to voluntarily provide genetic data to law enforcement in the absence of a court order or consumer consent; FPF’s Best Practices would prohibit such disclosures in most cases.
Below, we analyze SB 980’s approach to: (1) consent; (2) marketing; (3) privacy policies; (4) research; and (5) penalties and enforcement. We also examine (6) several other federal and state laws that currently regulate genetic privacy.
- Consent for Genetic Data //
FPF’s Best Practices and SB 980 take similar approaches – requiring different methods of consent (express opt-in vs. opt-out), depending on the sensitivity and uses of the data. Both SB 980 and the Best Practices emphasize express, affirmative consent as a baseline requirement for collecting genetic information. They each require that companies provide opt-out consent mechanisms for consumers regarding use of non-genetic information, such as purchase histories or web browsing information.
FPF’s Best Practices require initial express consent for genetic information collection, as well as separate express consent for the use of genetic material outside of the initial scope of collection. Secondary express consent is also required before a company engages in the onward transfer of individual-level information or the use of genetic information for incompatible or materially different secondary uses. Companies are also required to provide additional consent measures for consumers or organizations that submit genetic information on behalf of others. In a similar vein, SB 980 would require prior authorization from consumers for the initial collection of their genetic information and separate authorization for each subsequent disclosure.
FPF’s Best Practices define express consent as a consumer’s statement or clear affirmative action in response to a clear, meaningful, and prominent notice, while encouraging companies to use flexible consent mechanisms that are effective within the context of the service, in-app or in-browser experience, and relationship between the company and individual.
- Marketing //
The FPF Best Practices and SB 980 differ in their approach to consent for marketing and advertising purposes, including marketing on the basis of non-genetic information. The Best Practices prohibit companies from marketing to consumers on the basis of their genetic information unless the consumer provides separate express consent for such marketing or marketing is clearly described in the initial express consent as a primary function of the product or service. Marketing to a consumer on the basis of their purchase history is permitted if the consumer is given the option to opt out of such marketing. Marketing to anyone under the age of 18 is prohibited.
The Best Practices do not require companies to obtain opt-in consent or provide an opt-out for “customized content or offers by the company on its own websites and services.” This provision is intended to permit 1) contextual advertising (i.e., advertising that is tailored to the other content on a particular page on a website, rather than targeted to a particular user); and 2) first-party offers displayed to users on the basis of information within the same platform, such as when a logged in user receives an email offer based on information they viewed on the company’s own website while logged in. This approach aligns with leading privacy norms, including the approach taken by the Department of Health and Human Services in interpreting the Health Insurance Portability and Accountability Act (HIPAA), which exempts certain first-party communications related to treatment and health-related products from its definition of “marketing.” It is also consistent with the California Privacy Rights Act of 2020 (CPRA), the privacy ballot initiative that would establish rights to opt out of the sale and uses of sensitive health data and would codify a narrow exemption for non-personalized, contextual advertising.
Like FPF’s Best Practices, SB 980 also requires companies to obtain opt-in consent before marketing based on a consumer’s genetic data. A recent amendment would align SB 980’s and FPF’s approaches to marketing based on purchase history, requiring provision of an opt-out. However, a related SB 980 amendment would require companies to provide users with mechanisms to opt out of contextual advertising. This approach would be inconsistent with most leading norms, including HIPAA and the California Privacy Rights Act. This is because, in contrast to targeted or behavioral advertising, contextual advertising is not typically viewed as implicating significant privacy risks. Indeed, privacy advocates have cited contextual advertising as a privacy-protective model that displays marketing messages on web pages based on the content of the page, not information about an individual.
- Privacy Policies //
Like the FPF Best Practices, SB 980 would require all direct-to-consumer genetic or illness testing services companies to provide consumers with “clear and complete information regarding the company’s policies and procedures for the collection, use, and disclosure, of genetic data” through “a summary of its privacy practices, written in plain language” and “a prominent and easily accessible privacy notice.”
- Research //
FPF’s Best Practices encourage the socially beneficial use of genetic information in research while providing strong privacy protections. This nuanced approach strikes a careful balance between the societal benefits of genetic research and individuals’ privacy interests. The Best Practices require companies to obtain informed consent before using identifiable data for research and promote research on strongly deidentified datasets. They also require companies to engage in consumer education and make resources available regarding the implications and consequences of research.
The consumer genetic and personal genomic testing industry produces an unprecedented amount of genetic information, which in turn gives the research community the ability to analyze large and diverse genetic datasets. Genetic research enables scientists to better understand the role of genetic variation in our ancestry, health, well-being, and more. Recognizing the role of big data in corporate research and the difficulty of obtaining individual consent (see Omer Tene and Jules Polonetsky’s “Beyond IRBs: Ethical Guidelines for Data Research,” identifying the regulatory gaps between federally funded human subject research and corporate research), the Best Practices acknowledge the important role of Institutional Review Boards (IRBs) and ethical review processes.
FPF’s Best Practices also provide incentives for researchers and others to deidentify genetic data when practical. Deidentification of genetic information is an incredibly complex issue (see FPF and Privacy Analytics’s “A Practical Path Toward Genetic Privacy”), and the risk of reidentification of genetic data can be limited by rigorous technical, legal, and organizational controls.
SB 980 also requires informed consent before using data for research, “in compliance with the federal policy for the protection of human research subjects” — effectively the same standard as the FPF Best Practices. Similarly, SB 980 promotes strong deidentification of data, meaning data that “cannot be used to infer information about, or otherwise be linked to, a particular identifiable individual,” provided it is also subject to public commitments and contractual obligations not to attempt to reidentify the data.
- Penalties and Enforcement //
Companies that have publicly committed to comply with FPF’s Best Practices are subject to enforcement by the Federal Trade Commission (FTC) under the agency’s Section 5 authority to prohibit deceptive trade practices. State Attorneys General and other authorities have similar powers to bring enforcement actions against companies that violate broadly applicable consumer protection laws.
SB 980 includes a tiered penalty structure: negligent violations of the act would be subject to civil penalties not to exceed one thousand dollars ($1,000), and willful violations to penalties between $1,000 and $10,000 plus court costs. Penalties for willful violations would be paid to the individual to whom the genetic information pertains. Penalties could add up quickly, as they are calculated on a per-violation, per-consumer basis. Earlier versions of SB 980 included criminal penalties; the bill’s sponsors recently removed criminal liability in favor of a higher civil penalty, raising the maximum fine from $5,000 to $10,000.
- Other Federal and State Laws //
In the United States, a growing number of sectoral laws apply to companies that process genetic information. The federal Genetic Information Nondiscrimination Act (GINA) prevents genetic discrimination in health insurance and employment, but it does not prohibit discrimination in life, disability, or long-term care insurance, nor does it provide general privacy protections or limits on law enforcement uses. In an attempt to close these regulatory gaps, several states have enacted legislation addressing law enforcement access to genetic information and discriminatory practices by life insurers.
Key state laws governing genetic information include:
- Alaska’s Genetic Privacy Act (2004), which prohibits access to, retention of, and disclosure of genetic information without the “informed and written consent” of the consumer; recognizes that both the genetic information and the DNA samples collected are the property of the consumer; and provides both civil and criminal penalties for violations of genetic privacy rights. Alaska’s law does not require valid legal process (such as a court order) for law enforcement access to genetic information.
- Florida’s House Bill 1189, Genetic Information for Insurance Purposes (passed and awaiting the Governor’s approval as of March 2020), would bar life, disability and long-term care insurance companies from using consumer genetic test results for coverage purposes.
- Nevada’s comprehensive Genetic Information Act (2013) prohibits the collection, retention, or disclosure of genetic information without prior consent from the individual; requires law enforcement to obtain a court order prior to accessing genetic information; provides consumers the right to inspect and obtain genetic records; requires entities holding genetic information to destroy that information if consent is withdrawn; and provides criminal penalties and a private right of action for violations of the law.
Genetic and personal genomic tests increase consumers’ access to and control of their genetic information; empower consumers to learn more about their biology and take a proactive role in their health, wellness, ancestry, and lifestyle; and enhance biomedical research efforts. The consumer genetic and personal genomic testing industry is producing an unprecedented amount of genetic information, giving the research community the ability to analyze a significantly larger and more diverse range of genetic data and to observe and discover new patterns and connections. Access to genetic information enables researchers to gain a better understanding of the role of genetic variation in our ancestry, health, well-being, and much more. But while genetic information offers incredible benefits, it is also sensitive information that warrants a high standard of privacy protection.
FPF’s Best Practices provide a model for strong privacy safeguards with detailed provisions that support clinical research and public health. Key portions of California SB 980 are consistent with the Best Practices, and would require all companies to provide consumers with important transparency, choice, and security safeguards. Several SB 980 amendments and provisions diverge from the Best Practices in important ways, including how the bill would treat contextual advertising and government access to data.
Supreme Court Rules that LGBTQ Employees Deserve Workplace Protections–More Progress is Needed to Combat Unfairness and Disparity
Authors: Katelyn Ringrose (Christopher Wolf Diversity Law Fellow) and Dr. Sara Jordan (Policy Counsel, Artificial Intelligence and Ethics)
Today’s Supreme Court ruling in Bostock v. Clayton County—clarifying that Title VII of the Civil Rights Act bans employment discrimination on the basis of sexual orientation and gender identity—is a major victory in the fight for LGBTQ civil rights. Title VII established the Equal Employment Opportunity Commission (EEOC) and bans discrimination on the basis of sex, race, color, national origin, and religion by employers, schools, and trade unions involved in interstate commerce or doing business with the federal government. Today’s 6-3 ruling aligns with Obama-era protections, including a 2014 executive order extending Title VII protections to LGBTQ individuals working for federal contractors.
In this post, we examine the impact of today’s decision, as well as (1) voluntary anti-discrimination efforts adopted by companies for activities not subject to federal protections; (2) helpful resources on the nexus of privacy, LGBTQ protections, and big data; and (3) the work FPF has done to identify and mitigate potential harms posed by automated decision-making.
In Bostock, the Supreme Court determined that discrimination on the basis of sexual orientation or transgender status is a form of sex discrimination, holding: “Today, we must decide whether an employer can fire someone simply for being homosexual or transgender. The answer is clear. An employer who fires an individual for being homosexual or transgender fires that person for traits or actions it would not have questioned in members of a different sex. Sex plays a necessary and undisguisable role in the decision, exactly what Title VII forbids.”
Bostock resolved the issue through analysis of three cases:
- R.G. & G.R. Harris Funeral Homes Inc. v. Equal Employment Opportunity Commission, where Aimee Stephens worked as a funeral director at R.G. & G.R. Harris Funeral Homes. When Aimee informed the funeral home’s owner that she was transgender, the business owner fired her, saying it would be “unacceptable” for her to appear and behave as a woman.
- Altitude Express v. Zarda, where Donald Zarda, a skydiving instructor on Long Island, N.Y., was fired from his job because of his sexual orientation.
- Bostock v. Clayton County, where Gerald Lynn Bostock was fired from his job as a county child welfare services coordinator when his employer learned Gerald is gay.
“Today is a great day for the LGBTQ community and LGBTQ workers across the nation. The United States Supreme Court decision could not have come at a better time given the current COVID-19 crisis and the protests taking place across the country. However, there still remains much work to be done, especially around the areas of data and surveillance tools. The well-documented potential for abuse and misuse of these tools by unregulated corporations as well as government and law enforcement agencies should give serious pause to anyone who values their privacy–especially members of communities like ours that have been historically marginalized and discriminated against,” says Carlos Gutierrez, Deputy Director & General Counsel of LGBT Tech. “Today’s decision will protect over 8 million LGBT workers from work discrimination based on their sexual orientation or gender identity. This is especially heartening given that 47% or 386,000 of LGBTQ health care workers, people on the frontlines of the COVID-19 battle, live in states that had no legal job discrimination protections.”
We celebrate today’s win. However, it is now more critical than ever to address data-driven unfairness that remains legally permissible and harmful to the LGBTQ community.
Bostock should also influence a range of anti-discrimination efforts. In recent years, many organizations have engaged in various efforts to combat discrimination even when their activities are not directly regulated by the Civil Rights Act. When implementing such anti-discrimination programs, organizations often look to the Act to identify protected classes and activities. Bostock provides clarity — organizations should include sexual orientation and gender identity in the list of protected classes even if their activities wouldn’t otherwise be regulated under Title VII.
- Anti-Discrimination Efforts //
Title VII of the Civil Rights Act has historically barred discrimination on the basis of sex, race, color, national origin, and religion; the Civil Rights Act, including Title VII, is the starting point for anti-discrimination compliance programs. Even companies that do not have direct obligations under the Act (including ad platforms) have used the Act to guide their anti-discrimination efforts (see the Network Advertising Initiative’s Code of Conduct). According to the Human Rights Campaign, the share of Fortune 100 companies with non-discrimination employment policies covering gender identity and sexual orientation rose from 11% and 96%, respectively, in 2003 to 97% and 98% by 2018.
We caution that simply declining to collect, or ignoring, sensitive information will not always prevent discrimination. Even without explicit data, proxy information can reveal sensitive attributes. Furthermore, in order to assess whether protected classes are treated unfairly, it will sometimes be important to collect information that can identify discrimination. While sensitive data collection has both benefits and risks, a lack of data available to researchers can mean that policymakers do not have the information necessary to understand disparities in enough depth to create responsive policy solutions.
- Helpful Resources //
- “Big Data” and the Risk of Employment Discrimination, by Allan King & Marko Mrkonich, grappling with the many ways employers use correlative methods of analyzing big data, and how those efforts are in tension with Title VII to the extent the correlations those methods discover overlap with protected employee characteristics.
- Big Data for All: Privacy and User Control in the Age of Analytics, by Omer Tene and Jules Polonetsky, noting that inaccurate, manipulative, or discriminatory conclusions may be drawn from perfectly innocuous, accurate data—and like any interpretative process, algorithms are subject to error, inaccuracy, and bias.
- The Real Cost of LGBT Discrimination, by the World Economic Forum, recognizing that discrimination doesn’t just harm individuals, but also families, companies, and entire countries.
- Challenges for Mitigating Bias in Algorithmic Hiring, by The Brookings Institution, recognizing that, in instances of disparate impact caused by automated decision making, there are numerous bars to relief, including the fact that (1) plaintiffs may not have sufficient information to suspect or demonstrate disparate impact; (2) it is unclear whether predictive validity is sufficient to defend against a claim of disparate impact; and (3) many proposed solutions to mitigating disparities from screening decisions require knowledge of legally protected characteristics.
- Lessons from Fair Lending Law for Fair Marketing and Big Data, by Peter Swire, explaining that fair lending laws provide guidance on how to approach discrimination that allegedly has an illegitimate, disparate impact on protected classes; furthermore, data can play an important role in assessing whether a disparate impact exists.
- Algorithmic Fairness, by the Software & Information Industry Association, explaining that automated decision making can be designed and used in ways that preserve fairness for all, but this will not happen automatically–getting those outcomes requires designing those features and using them in ways that preserve these values.
- Unfairness by Algorithm //
While discriminatory decisions made by a human are clearly regulated, the full range of potentially discriminatory decisions made by a computer is not yet well understood. Yet algorithmic harms may be similarly pernicious, while being more difficult to identify and less amenable to redress using available legal remedies.
In a 2017 Future of Privacy Forum report, Unfairness by Algorithm: Distilling the Harms of Automated Decision Making, we identified four types of harms—loss of opportunity, economic loss, social detriment, and loss of liberty—to depict the various spheres of life where automated decision-making can cause injury. The report recognizes that discriminatory decisions made by algorithms, and the unfairness that results, can lead to distinct collective and societal harms. For example, the use of proxies, such as “gayborhood” ZIP codes or resume clues regarding LGBTQ community activism, can lead to employment discrimination and differential access to job opportunities.
As organizations commit to LGBTQ protections, adherence to data protection and fairness principles is one way to combat systemic discrimination. These principles include ensuring fairness in automated decisions, enhancing individual control over personal information, and protecting people from inaccurate and biased data.
Today’s decision regarding workplace protections could not be more welcome, particularly now as data from the Human Rights Campaign shows that 17% of LGBTQ people and 22% of LGBTQ people of color have reported becoming unemployed as a result of COVID-19. However, the fight for inclusivity and equality does not stop with law and legislation. Further work is necessary to ensure that data-driven programs uncover and redress discrimination, rather than perpetuate it.
FPF and Privacy Analytics Identify “A Practical Path Toward Genetic Privacy”
Paper highlights de-identification standards, re-identification research, and emerging technical, contractual, and policy protections that can safeguard genetic data while supporting research.
Genomic data is arguably the most personal of all personally identifiable information (“PII”). Techniques to de-identify genomic data to limit privacy and security risks to individuals–while that data is used for research and statistical purposes–are at the center of discussions among stakeholders engaged in genetic research.
The Future of Privacy Forum (FPF) and Privacy Analytics have partnered to publish “A Practical Path Toward Genetic Privacy in the United States.” The white paper highlights the personal nature of genetic data, describes existing regulatory requirements, and discusses emerging developments regarding the de-identification and re-identification of genetic data, while outlining consensus practices organizations are taking to safeguard genomic information.
“Genetics has become increasingly valuable to cutting-edge medical research, with implications from public health to rare disease diagnostics,” said Katelyn Ringrose, FPF Policy Fellow. “Observing this evolution, FPF and Privacy Analytics collaborated to create a practical path forward; one which will protect the privacy of those individuals who contribute their genomes to fuel such incredible discoveries.”
The white paper explores and drives discussion around two prominent examples of privacy engineering solutions applicable to genetic privacy: differential privacy and secure (multi-party) computation. Although technical solutions like these show promise in protecting genetic data, companies should also follow emerging privacy and security-centric norms that are evolving in the space, including the use of:
- Access Controls – Depending on the nature of the data and its identifiability, access controls can limit access to certain individuals and institutions.
- Contractual Controls – Researchers and institutions can be required to enter into a data use agreement before accessing data, in order to ensure that data is accessed only for legitimate purposes and that identifiability remains low.
- Security Protocols – Organizations sharing genetic data can create specific security protocols dictating how researchers utilize data in open access or controlled-access data repositories.
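To make the first of the privacy engineering solutions mentioned above concrete, here is a minimal, illustrative sketch of differential privacy applied to an aggregate query: releasing a noisy count of research participants who carry some variant. The function name, parameters, and scenario are our own illustration, not drawn from the white paper:

```python
import random

def dp_count(true_count, epsilon=1.0):
    """Release a count with epsilon-differential privacy.

    One person joining or leaving the dataset changes a count by at
    most 1 (sensitivity 1), so adding Laplace(1/epsilon) noise masks
    any single individual's presence. A Laplace sample is generated
    here as the difference of two exponential samples.
    """
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Example: a noisy count of carriers of a hypothetical variant.
# Smaller epsilon means more noise and stronger privacy.
noisy_carriers = dp_count(1234, epsilon=0.5)
```

The key design point is that the noise is calibrated to how much one individual can change the answer, so the released statistic remains useful in aggregate while bounding what it reveals about any one participant.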
FPF hopes that this white paper will help guide stakeholders in the genetics arena, including those providing and utilizing genetic data to identify health risks, learn more about rare diseases, and create new treatments and precise diagnostics. We look forward to continuing to support cutting-edge research while aiming to mitigate the risks associated with the use of genetic data.
The Future of Privacy Forum works on issues regarding de-identification, ethics, and health data.
Read the White Paper
For additional information about this publication or the Future of Privacy Forum’s health working group, please contact Rachele Hendricks Sturrup ([email protected]) and Katelyn Ringrose ([email protected]).
Privacy and Pandemics: A Thoughtful Discussion
As the COVID-19 virus spreads, governments, researchers, and healthcare institutions are seeking to obtain and deploy consumer data to track the spread of the virus, deliver emergency supplies, target travel restrictions and quarantines, and develop vaccines and cures. But can data collected from phones, credit cards, and other sources be used in this emergency without opening the door to lasting or limitless surveillance?
Yesterday, FPF convened a Virtual Workshop with a dozen ethicists, academics, government officials, and corporate leaders, and over 100 corporate attendees, to discuss responsible data sharing in times of crisis. It’s the first in a series of events about privacy and pandemics that FPF will use to develop best practices and policy recommendations for decision makers.
Participants discussed how recent “data for good” initiatives have informed data sharing during the crisis, concerns about data sharing in a time of low trust, lessons learned from past pandemics, how to effectively protect privacy and civil liberties, and what the COVID-19 pandemic means for the future of data sharing between companies, academics, and governments.
A more detailed workshop report is forthcoming, but in the interest of urgency we share the most important advice that arose in the Workshop for companies with data that could be of value to public health:
- Understand how your own data sets relate to the needs of health experts. Any data set should be just one input into a broader epidemiological model. Some sets are not large enough, accurate enough, or relevant enough to be useful. Several participants warned that sharing flawed data or treating one data set as a “silver bullet” can lead decision-makers astray. Instead, companies should be sure to understand both the best ways that their data can be used and the risks associated with sharing their specific data. It is essential to work with medical and public health partners to understand their data needs, rather than merely provide analysis based on data collected for commercial uses.
- Continue to follow your guidelines for data protection during the crisis, and recognize that your standards for sharing have not changed. Participants agreed that data protection principles should not be abandoned because there is a crisis, but pointed out that the standards for prioritizing review of projects have changed because of pandemic-driven urgency. Many companies regularly face smaller scale emergency requests for data. Some companies have established expedited processes to quickly elevate exigent data-sharing decisions to the highest levels.
- Establish clear boundaries. History tells us that it is difficult to discontinue practices started in an emergency. In the absence of clear systemic rules, organizations should establish an exit strategy up front to protect against continued “emergency” practices after the crisis. Companies must be clear that data shared now should not be kept forever or used for other purposes; clear rules help maintain and build public trust in their programs.
- Use data protection safeguards, such as anonymization and aggregation. These are established techniques, but there is no standard definition of what they mean, and there is much skepticism about the ability to guarantee anonymization. Companies should explain how they apply these techniques in specific situations. While data is being shared during this emergency, organizations must continue to follow principles such as data minimization and proportionality, and destroy data after it is transferred or used.
- Work with a partner that has controls in place. Companies with established data for good programs have been working with partners to ensure data sets are appropriate, anonymized, and aggregated as much as possible. Participants expressed that working through existing arrangements is preferable to developing new partnerships in the midst of a crisis. For example, many university research groups already have data sharing agreements in place that have been vetted by Institutional Review Boards. These groups could act as trusted intermediaries between companies and public agencies.
- Be transparent. To maintain public trust, companies need to clearly explain what data is being shared, with whom, and for what purpose.
The Workshop’s participants agreed that it would be better if more companies, non-profits, governments, and academics had been working collaboratively on the technical infrastructure, governance structures, and legal frameworks for data sharing in an emergency before the COVID-19 pandemic hit.
Some participants recommended ways to strengthen the “data for good” ecosystem over time, including standing up new trust structures. One participant recommended strengthening the “data enablers” in the system, such as institutional or ethical review boards, which can serve as checks on ill-advised data sharing and also facilitate connecting data sources – often, companies that have data with socially beneficial uses – with data users, like researchers and policymakers.
Participants also agreed that data protection and humanitarian action are completely compatible. While the trade-offs for decisions about sharing data have changed, there still should be a thoughtful and legally justified process for considering what data to share, with whom, for what purposes, and how it should be protected.
Many more insights and details were gathered and will inform FPF’s ongoing work with stakeholders to identify best practices and policy recommendations for decision makers.
Questions to Ask Before You Buy a Genetic Testing Kit on Black Friday
By Rachele Hendricks-Sturrup and Katelyn Ringrose
On Black Friday and Cyber Monday, millions of consumers will hurry to their nearest doorbuster sale or boot up their favorite sales portal to buy a price-slashed consumer genetic testing kit. Some genetic testing kits will be up to half off this year, and the market as a whole is projected to more than triple, from a valuation of $99 million this past year to $310 million in 2022.
Last year on Black Friday, AncestryDNA alone sold about 1.5 million testing kits. According to Wired, that means that consumers sent in around 2,000 gallons of saliva—enough spit to fill a modest above-ground swimming pool. Consumers are drawn to the tests for genealogical purposes, and new market offerings are being positioned as ways to raise consumer awareness of genetic health risks.
With that much genetic material changing hands, it is important for consumers to think carefully about which kit provider will prioritize consumer privacy. DNA contains deeply personal information that can be incredibly beneficial to consumers. But that same data may also reveal unexpected and unsettling facts, including information about the test taker's family members. It deserves a high standard of protection.
However, laws like the Health Insurance Portability and Accountability Act (HIPAA), the central U.S. health privacy law, do not apply to genetic information collected and housed by consumer genetic testing companies. Due to this regulatory gap, consumers should find out from the companies themselves, and prior to buying a test for themselves or a loved one, how the companies will protect and use the genetic data they provide and collect.
Here are five important questions consumers should ask before buying a genetic testing kit on Black Friday or Cyber Monday:
- Does the Company Ask for Your Consent Before Sharing Your Individual-Level Genetic Data with Third Parties? People choose to share their genetic data with third parties for a range of purposes (e.g., to participate in scientific research or connect with unknown biological relatives). However, genetic testing companies should never share your individual-level genetic data with third parties without your knowledge and consent, particularly with insurers, employers, and educational institutions.
- Do You Have the Ability to Delete Your Genetic Data and Destroy Your Biological Sample If You Choose? Companies may have default policies to destroy all samples once testing is completed, retain data or samples for only a finite period of time or in accordance with regulations, or retain data and samples indefinitely or until you close your account. Companies should be clear about their retention practices and offer prominent ways to delete your genetic data from their databases and destroy your biological sample.
- Does the Company Require Valid Legal Process Before It Will Disclose Your Genetic Data to Law Enforcement? As we have seen in high-profile cases like the Golden State Killer investigation, genetic data can be a powerful investigative tool for government. However, government access to your genetic data presents substantial privacy risks. Companies should require that government entities obtain valid legal process, such as a warrant, subpoena, or court order, before they disclose genetic data.
- What are the Company’s Notification Practices When it Comes to Conveying Material Changes to Their Privacy Policies? Companies may modify their privacy policies or statements occasionally, and sometimes they significantly change how genetic data is collected, used, and stored. But before changes are implemented, you should be notified and given an opportunity to review the changes to decide if you want to continue using the company’s services.
- Has the Company Committed to Strong Technical Data Security Practices? As more than 26 million individuals have had their DNA tested, the potential for hacking and data breaches is an increasing concern. Given the uniqueness of genetic data, companies should maintain a comprehensive security program through practices such as secure storage of biological samples and genetic data, encryption, data-use agreements, contractual obligations, and accountability measures.
For consumers who are interested in learning more, the Future of Privacy Forum’s Privacy Best Practices for Consumer Genetic Testing Services set forth standards for the collection, use, and sharing of genetic data. The standards call for express consent for the transfer of data to third parties and include provisions restricting marketing based on genetic data, among other privacy-centric protections. Companies that currently support these best practices include Ancestry, 23andMe, Helix, MyHeritage, Habit, African Ancestry, and Living DNA.
Before you buy a genetic test kit as a gift or for yourself for this holiday season, take a moment to consider how our genetic information shapes who we are… and whether you are dealing with a company that promises to protect it.
For more information and to learn how to become involved with FPF’s health privacy efforts, please contact Katelyn Ringrose at [email protected] or Rachele Hendricks-Sturrup at [email protected].