India: Proposed Data Regulation Overhaul Includes New Draft Rules for Processing Non-Personal Data

Authors: Sameer Avasarala

——-

Disclaimers

This guest post is by Sameer Avasarala, a Data Protection and Technology Lawyer in Bengaluru. The material and opinions expressed are exclusively those of the author and do not express the views of Cyril Amarchand Mangaldas or any other firm or organization with which the author is associated. He can be contacted at [email protected]

Data protection and informational privacy have been gaining mainstream momentum in India, with significant movement around the Aadhaar project, the forthcoming comprehensive regime for the protection of personal data, and evolving data market trends. Legislators are now also considering regulating the processing of non-personal data, as shown in a new Report released by a Committee of experts constituted for this purpose by the Ministry of Electronics and Information Technology. This contribution first sets out the general background of data-related regulatory efforts in India (1), and then looks closely at the proposed rules for processing non-personal data: (2) its definition and classification, (3) the data localization requirement for sensitive and critical non-personal data, (4) guidance on anonymization, and (5) proposed data sharing obligations for organizations.

1) Setting the scene: a fundamental right to privacy and a growing data market

The recognition of a fundamental right to privacy by the Supreme Court[1] (Puttaswamy), the devising of a triple test[2] as a basis to evaluate laws that may restrict the right to privacy, and the application of that test to the Aadhaar project[3] have been instrumental in triggering mainstream discourse around privacy in India.

At the same time, the data market in India is growing rapidly, with some studies estimating it will be a USD 16 billion industry by 2025, at a staggering 26% compound annual growth rate. The government recognizes the need for a data governance framework to act as a catalyst for the growth of the data economy in India.

Based on the normative foundation in Puttaswamy, the Ministry of Electronics and Information Technology (‘MEITY’) constituted a Committee of Experts for a data protection framework, whose report and the resulting draft bill led to the introduction of the Personal Data Protection Bill, 2019 (‘PDP Bill’) in the Parliament. The PDP Bill is currently being reviewed by a joint parliamentary committee, which is expected to present its report to the Parliament in the upcoming monsoon session; that session may in turn be rescheduled owing to the COVID-19 pandemic.

Separately, the MEITY also constituted a committee of experts to deliberate on a data governance framework for India (‘Committee’), with a view to studying various issues relating to non-personal data and making specific suggestions on its regulation. The Committee released its report on July 12, 2020 (‘Report’), which makes substantive recommendations on the scope, classification, ownership and other issues related to non-personal data. It also makes a clarion call for a comprehensive non-personal data regulation in India, to complement the future law dedicated to personal data. In addition, the Committee recommends the establishment of an overarching, cross-sectoral Non-Personal Data Authority (‘NPDA’).

2) Proposed definition and classification of non-personal data

The Report identifies existing issues such as entry barriers for startups and new businesses owing to the first-mover advantage of market leaders and data monopolies. Business, innovation and research are identified as cornerstones for furthering an inclusive framework for India’s data economy. The Report is also in line with the Draft National e-Commerce Policy in identifying data of Indian residents as an important ‘national resource’.

‘Non-personal data’ has been defined in the Report as any data that is not personal data[4], or is without any personally identifiable information. This includes personal data that has been anonymized[5] and aggregated data in which individual specific events are no longer identifiable, apart from data that was never personally identifiable. The Report classifies non-personal data into public non-personal data, community non-personal data and private non-personal data.

The Report recognizes natural persons, entities and communities to whom non-personal data (prior to anonymization or aggregation) relates as ‘data principals’ and entities which undertake collection, storage and processing of non-personal data as ‘data custodians’. It also enables communities or groups of data principals to exercise their rights through ‘data trustees’.

3) Data localization requirements for sensitive and critical non-personal data

The Report classifies the individuals to whom the data relates (before anonymization) as the ‘owners’ of private non-personal data, and it recommends obtaining the data principal’s consent, at the time of collection, for anonymization and for use thereafter.

Private non-personal data is further sub-classified along a sensitivity spectrum, taking into account considerations of national security, collective harm, invasion of collective privacy, business-sensitive information and anonymized data. Private non-personal data is thus categorized into ‘sensitive non-personal data’ and ‘critical non-personal data’. Sensitive personal data[6] and critical personal data[7] which have been anonymized will be considered ‘sensitive non-personal data’ and ‘critical non-personal data’ respectively. The Report recommends localization of sensitive non-personal data and critical non-personal data, in line with the localization requirements[8] applicable to sensitive personal data and critical personal data under the PDP Bill.

4) Guidance on anonymization

Though an offshoot of the regulation of non-personal data, the Report provides new insight into the regulatory perspective on the anonymization of personal data in India. From the lack of an anonymization standard under the current information technology law[9] to an indicative list of de-identifiers for ‘totally anonymized data’ applicable to health records, the regulatory viewpoint on anonymization has been vastly inconsistent. Recently, a protocol released by the MEITY in relation to Aarogya Setu, a contact tracing mobile application, indicated a high anonymization standard, based on the ‘means likely to be used to identify’ individuals, broadly similar to the standard under the General Data Protection Regulation.

Against this background, the Report recognizes the residual risk of re-identification associated with anonymized information and treats anonymized sensitive personal data and critical personal data as sensitive non-personal data and critical non-personal data respectively. While the Report suggests techniques and tools for anonymization as part of an anonymization primer, the introduction of data localization requirements and the classification of non-personal data in a manner similar to personal data may, in practice, deter the use of anonymized information.
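To illustrate the kind of technique an anonymization primer might cover, and why residual re-identification risk persists even after direct identifiers are removed, below is a minimal k-anonymity check. It is a sketch only: the quasi-identifier fields, the sample data and the threshold k are illustrative assumptions, not drawn from the Report.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if every combination of quasi-identifier values appears
    in at least k records, i.e. no individual stands out in the dataset."""
    combos = Counter(
        tuple(record[attr] for attr in quasi_identifiers) for record in records
    )
    return all(count >= k for count in combos.values())

# Hypothetical, already "anonymized" records: names removed, but age band and
# PIN code retained as quasi-identifiers.
dataset = [
    {"age_band": "30-39", "pin_code": "560001", "spend": 1200},
    {"age_band": "30-39", "pin_code": "560001", "spend": 900},
    {"age_band": "40-49", "pin_code": "560002", "spend": 700},
]

# The lone 40-49 / 560002 record is unique, so the dataset is not 2-anonymous:
# a residual re-identification risk persists despite the removal of names.
print(is_k_anonymous(dataset, ["age_band", "pin_code"], k=2))  # False
```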

5) Data sharing and registration obligations

The Report recognizes ‘data businesses’ as a horizontal category of businesses involved in data collection and processing. Based on specific threshold requirements, the Report proposes a compliance regime to govern such data businesses, including registration and mandatory disclosure of specific information to the NPDA. Interestingly, a similar requirement for ‘data fiduciaries’ is included in the PDP Bill[10]: they may be directed to provide anonymized personal data or other non-personal data to enable better targeting of service delivery or the formulation of evidence-based policies by the Government.

The Report also proposes that data custodians may be required to share non-personal metadata about users and communities, to be stored digitally in metadata directories in India and made available on an open-access basis, to encourage the development of novel products and services.

The Report contemplates three broad purposes for data sharing:

  a) Non-personal data shared for sovereign purposes may be used by the Government, regulators and law enforcement authorities, inter alia, for cyber security, crime and investigation, public health and sectoral developments.
  b) Non-personal data shared for core public interest purposes may be used for general and community use, research and innovation, delivery of public services, policy development etc.
  c) Non-personal data shared for economic purposes may be used by business entities for research, innovation and doing business. It may also be leveraged as training data for AI/ML systems.

A ‘checks-and-balances’ system is proposed for ensuring compliance with data sharing and other requirements, based on measures such as expert probing for vulnerabilities. The Report also recommends the establishment of data spaces, data trusts, cloud innovation labs and research centers, which may act as physical environments to test and implement digital solutions and to promote intensive data-based research. It also includes guiding principles for a technology architecture to digitally implement rules for data sharing, ranging from mechanisms for accessing data through data trusts and standardized data exchange processes to techniques to prevent re-identification of anonymized information and distributed storage for data security.

The Report recommends a three-tiered system architecture including legal safeguards, technology and compliance to enable data sharing, in addition to a policy switch which enables a single digital clearing house for regulatory management of non-personal data.

Finally, the Report proposes classification of high value or special public interest data sets, for instance, geospatial, telecommunications and health data. However, it does not specifically indicate any implications of such classification.

Compliance with non-personal data processing requirements would be ensured by the newly created NPDA. The Report recognizes the need to harmonize guidance issued by the NPDA with sectoral regulations. The NPDA is sketched out to have an enabling role (to ensure a level playing field) in addition to an enforcement role (to address market failures).

6) Conclusion: More to come

The Committee is currently inviting public comments[11] and is likely to hold public consultations on the policy options proposed. While there is no clear timeline for framing and enacting a data governance framework for non-personal data, it is likely that the PDP Bill will be enacted by the Parliament before any such framework. The PDP Bill may also be relevant in setting context for the forthcoming non-personal data framework, given the ability of the Government to solicit non-personal and anonymized personal data under it.

While the Report is helpful in setting context for the forthcoming regulations for non-personal data and in proposing a data governance regime, the Government is likely to evaluate its content, hold wider consultations and consider other policy aspects prior to formulating a comprehensive data framework governing non-personal data in India.

[1] Justice K. S. Puttaswamy v. Union of India, (2017) 10 SCC 1

[2] Modern Dental College & Research Centre & Ors v. State of Madhya Pradesh & Ors, AIR 2016 SC 2601

[3] Justice K. S. Puttaswamy v. Union of India, (2019) 1 SCC 1

[4] Rule 2(1)(i), Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011; Section 3(28), Personal Data Protection Bill, 2019

[5] Section 3(3), Personal Data Protection Bill, 2019

[6] Section 3(36), Personal Data Protection Bill, 2019

[7] Section 33, Personal Data Protection Bill, 2019

[8] Section 34, Personal Data Protection Bill, 2019

[9] Rule 2(1)(i), The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (SPDI Rules)

[10] Section 91, Personal Data Protection Bill, 2019

[11] MyGov ‘Share your Inputs on the Draft Non-Personal Data Governance Framework’, available at https://www.mygov.in/task/share-your-inputs-draft-non-personal-data-governance-framework/

Student Privacy, Special Education, and Online Learning: Navigating COVID-19 and Beyond

COVID-19 continues to disrupt education and has forced many schools to pivot to virtual, or online, learning for the fall semester. And virtual learning poses unique student privacy challenges, particularly for students with disabilities, such as:

Can a student’s family members and parents be present during a live virtual class?

Can a class be recorded for a student to view later?

Can a student receive one-on-one services or teletherapy via video conferencing?

How do you know if a video conferencing platform is safe to use?

To help educators navigate some of these common challenges and concerns, the Future of Privacy Forum (FPF) and the National Center for Learning Disabilities (NCLD) partnered to develop a new resource, Student Privacy and Special Education: An Educator’s Guide During and After COVID-19. The Educator’s Guide covers how federal laws including the Family Educational Rights and Privacy Act (FERPA), Individuals with Disabilities Education Act (IDEA), Children’s Online Privacy Protection Act (COPPA), and Health Insurance Portability and Accountability Act (HIPAA) apply to virtual learning.

To help address a common source of frustration and confusion for many educators, students, and their families, the Educator’s Guide outlines best practices for educators around the use of video classrooms, including:

As part of their reopening plans, many schools plan on collecting new levels of sensitive health information from staff, students, and their families to assess and mitigate health risks. This information will be collected in a variety of ways, including self-reported surveys or screenings such as temperature checks. Students with disabilities or special health care considerations who may be more vulnerable to COVID-19 may be at risk of discrimination based on their health or disability status. For more insight on the student privacy implications of this data collection, read FPF’s issue brief on this topic here. Educators seeking further support may be interested in FPF’s new series of “Student Privacy and Pandemics” professional development trainings for educators, which includes a module focused on video classrooms. FPF is also developing an ongoing series of issue briefs on the student privacy implications of various reopening strategies schools are considering, including the use of temperature checks and thermal scans, as well as wearable technologies, to help identify potential COVID-19 cases.

Future of Privacy Forum, National Center for Learning Disabilities Release New Student Privacy and Virtual Learning Guide

The Future of Privacy Forum (FPF) and the National Center for Learning Disabilities (NCLD) today released a new resource designed to help educators navigate the unique student privacy challenges raised by COVID-19 and the shift to virtual learning, particularly for students with disabilities. Student Privacy and Special Education: An Educator’s Guide During and After COVID-19 is available for download and use at this link.

“As educators teach virtually even more, there will be many barriers to instructing every student, but concerns about data privacy shouldn’t be one,” said Lindsay Jones, President and CEO of the National Center for Learning Disabilities. “It’s important for educators to have clear information about how privacy laws impact the delivery of virtual instruction, because now more than ever we must use ed tech in a way that helps us effectively teach all students in a virtual world.”

As part of their reopening plans, many schools are also collecting new levels of sensitive health information from staff, students, and their families, through self-reported surveys and/or screenings such as temperature checks, to assess and mitigate health risks. Students with disabilities or special health care considerations who may be more vulnerable to COVID-19 may be at risk of discrimination based on their health or disability status.

“Reopening plans must balance protecting health and protecting student privacy and educational rights,” said Amelia Vance, FPF’s Director of Youth and Education Privacy. “It’s a difficult – but incredibly important – balance. Schools and districts should have a clear plan in place for how they will collect, use and store health data to ensure it is not ultimately used to limit educational access or opportunities for vulnerable students.”

Student Privacy and Special Education: An Educator’s Guide During and After COVID-19 covers how key privacy laws like the Family Educational Rights and Privacy Act (FERPA), Individuals with Disabilities Education Act (IDEA), Children’s Online Privacy Protection Act (COPPA), and Health Insurance Portability and Accountability Act (HIPAA) apply to distance learning, and answers common questions about virtual learning and special education services including:

Educators seeking further support can visit FPF’s student privacy-focused website, Student Privacy Compass, which houses additional resources for educators such as The Educator’s Guide to Student Privacy and a new “Privacy and Pandemics” resource collection including a series of professional development trainings for educators.

NCLD’s website has additional Resources & Tools on COVID-19 for educators and parents, including an Educator’s Guide to Virtual Learning, a brief on Relevant Laws & Best Practices for Online Learning and a guide to Inclusive Technology During the COVID-19 Crisis.

To learn more about the Future of Privacy Forum, visit www.fpf.org. To learn more about the National Center for Learning Disabilities, visit www.ncld.org.

 

About FPF

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts and includes an advisory board comprised of leading figures from industry, academia, law, and advocacy groups. For more information, visit www.fpf.org.

About NCLD

The National Center for Learning Disabilities (NCLD) is a Washington, DC-based national policy, advocacy, and research organization that works to improve the lives of the 1 in 5 children and adults nationwide with learning and attention issues — by empowering parents and young adults, transforming schools, and advocating for equal rights and opportunities. www.ncld.org.

 

FPF Releases Follow-Up Report on Consumer Genetics Companies and Practice of Transparency Reporting


Today, the Future of Privacy Forum (FPF) published “Consumer Genetic Testing Companies & The Role of Transparency Reports in Revealing Government Requests for User Data,” examining how leading consumer genetic testing companies require valid legal processes before disclosing consumer genetic information to the government, as well as how companies publish transparency reports around such disclosures. The report analyzes transparency reports from leading consumer genetic testing companies to draw conclusions about the frequency of law enforcement requests, the type of information requested, the frequency with which companies complied with law enforcement requests, and how consumer genetic testing companies’ transparency practices differ from other related industries.

The report builds on FPF’s 2018 “Privacy Best Practices for Consumer Genetic Testing Services,” which acknowledges that government agencies may have legitimate reasons to request access to consumer genetic data from time to time, but requires that companies only provide access to such information when required by law. The new report will be helpful in conveying the benefits of transparency reporting, the different avenues for law enforcement requests, and the types of law enforcement requests issued to consumer genetics companies.

“Genetic data can help law enforcement agencies solve crimes and vindicate wrongfully accused individuals, but unfettered government access to genetic information held by commercial services presents substantial privacy risks,” says FPF Vice President of Policy John Verdi. “Transparency reporting plays a key role in mitigating these risks by shedding light on how consumer DNA testing services respond to government access requests for both genetic information and non-genetic account information.”

The report articulates five main conclusions:

To learn more, read the report.

FPF & VUB Roundtable – Banking on Personal Data

Authors: Caroline Hopland, Ine van Zeeland, and Rob van Eijk


FPF & Vrije Universiteit Brussel: Banking on Personal Data

Last week, the Future of Privacy Forum (FPF) and Vrije Universiteit Brussel (VUB) hosted a roundtable on open banking. Twenty subject-matter experts discussed some of the main challenges at the intersection of banking and data protection law: consent and its various interpretations, (re)use and sharing of personal data, and interoperability. The European Data Protection Board (EDPB) has begun to address some of these unresolved questions, and aspects of these issues were highlighted in the roundtable.

In terms of next steps, FPF and VUB will turn to engaging with and understanding how the EDPB has set out the path forward, with another round of discussions in mid-September. The next question is: to what extent can insightful use cases from other parts of the world serve as an inspiration to European banks, regulators, and watchdog groups?

 

Open Banking: EDPB guidance

On July 17, 2020, the EDPB released its Guidelines 06/2020 on the interplay of the Second Payment Services Directive (PSD2) and the General Data Protection Regulation (GDPR) – a version for public consultation. While transaction information is increasingly digitized and banks are announcing data sharing agreements with ever more external parties, the European Commission plans to announce a new digital finance strategy in a continued effort to ‘open up’ banking. Experts in the industry, however, warn that there are still many unresolved questions when it comes to banks sharing data with third parties.

To consumers, the situation appears complex. They tend to hold banks accountable whenever something goes wrong, as these are the familiar parties to them. Roundtable participants from South Africa and Israel confirmed this was the case in their countries as well: criticism for malpractices at fintechs is still directed at banks. Government authorities or regulators could perhaps play a role in improving trust among the wider public. For instance, in the UK there is a dispute management system for disagreements between banks and third parties; this could be offered to consumers as well. As it is, consumers whose rights have been breached can only go to court, which is a high threshold for many. When it comes to data breaches, it would also be in the interest of fintechs to have clarity on liabilities in order to take out adequate insurance.

 

The difference between consent under PSD2 & the GDPR

The first presenter discussed the difference between consent under PSD2, the European directive regarding digital payment services, and the GDPR, in various banking situations.

They explained that PSD2 sets out rules concerning:

– Strict security requirements for electronic payments and the protection of the financial data of Payment Service Users (PSUs), guaranteeing safe authentication and reducing the risk of fraud;

– The transparency of conditions and information requirements for payment services; and

– The rights and obligations of PSUs and providers of payment services.

They stated that the GDPR, however, generally focuses on strengthening the rights of data subjects and sanctioning any violation of those rights. Both PSD2 and the GDPR set standards with respect to the safekeeping of personal data and the provision of information to PSUs (who are data subjects under the GDPR when the PSUs are individuals). A Third Party Provider (TPP) must comply with both PSD2 and the GDPR, but where one regime is silent and the other restricts the processing of personal data, the protection of personal data shall prevail (e.g., for the processing of special categories of data, Article 9 of the GDPR applies). Additionally, the processing of sensitive personal data, like health data, is prohibited unless one of the exemptions set out in Article 9(2) of the GDPR applies.

As for consent, the ‘explicit consent’ prongs within Article 94 of the PSD2 should be seen as an additional safeguard to ensure the consumer fully consents and understands the contract. However, PSD2 consent for the parameters of the interaction with the financial institution and GDPR consent for personal data processing should not be conflated. The presenter also touched upon Germany’s pragmatic approach, where consent of a PSU under Article 64 of the PSD2 (‘authorization’) also includes the explicit consent for data processing.

The presenter described the roles and responsibilities of a TPP and an Account Servicing Payment Service Provider (ASPSP) under PSD2 and the GDPR with respect to safeguarding personal data during data transfer. The ASPSP must comply with Articles 66(1) and (4) and 67(1) and (3) of the PSD2, and the transfer of client data is justified under Article 6(1)(c) of the GDPR (compliance with a legal obligation). Once the TPP obtains access to a consumer’s data, it assumes its own responsibility with respect to processing personal data. If a bank’s interface complies with the Regulatory Technical Standards (RTS) on Strong Customer Authentication and Common and Secure Communication, then the data flow will be secured. The TPP must then explore technical solutions to exclude categories of data that are not needed; this is an open issue and has not yet been solved.
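As a purely illustrative sketch of what such a technical solution could look like, the snippet below reduces an account information payload to the fields a provider needs and drops transactions whose descriptions hint at special-category data. The field names and keyword list are assumptions made for illustration; neither PSD2, the RTS nor the Guidelines prescribe this mechanism.

```python
SPECIAL_CATEGORY_HINTS = {"pharmacy", "clinic", "trade union", "church"}

def minimise_payload(transactions, needed_fields=("amount", "date", "balance")):
    """Keep only the fields needed for the account information service and
    drop transactions whose descriptions hint at special-category data."""
    filtered = []
    for tx in transactions:
        description = tx.get("description", "").lower()
        if any(hint in description for hint in SPECIAL_CATEGORY_HINTS):
            continue  # exclude the potentially special-category transaction entirely
        filtered.append({field: tx.get(field) for field in needed_fields})
    return filtered

raw = [
    {"amount": -25.0, "date": "2020-07-01", "balance": 974.0, "description": "City Pharmacy"},
    {"amount": -60.0, "date": "2020-07-02", "balance": 914.0, "description": "Grocery store"},
]

# Only the grocery transaction survives, stripped down to the needed fields.
print(minimise_payload(raw))
```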

The first presenter mentioned: ‘Consent requires that information is clearly provided to the data subject. The data subject should know where the data flow is going and to whom the data is being disclosed. This should be as precise as possible. This is why we are trying to precisely describe the purpose, parties, and recipients of the data flow. Bottom line: consent under PSD2 might be totally valid, but not valid under the GDPR.’

 

Data (Re)use

The second presenter further touched upon these issues and spoke about what they mean in practice. The second presenter said that ‘the idea behind PSD2 is that it should allow for competitiveness in the market.’

They discussed the purpose limitation requirements under PSD2 and the GDPR. Articles 66 and 67 of the PSD2 specify the purposes for which the data may be processed, but the GDPR, through the rights afforded to data subjects, creates a degree of disparity. This is reflected in the EDPB’s Guidelines, which therefore take a dual approach depending on whether the PSU is a legal entity or not. Article 5(1)(b) of the GDPR and Articles 66(3)(g) and 67(2)(f) of the PSD2 set out the specified purposes for processing and are almost identical. PSD2 thus supplies the first element of the GDPR’s ‘purpose limitation’ principle by specifying the purposes for which the data is processed.

They stated that the specified purposes in Article 66(3)(g) of the PSD2 limit the Payment Initiation Service Provider’s (PISP) processing to payment initiation services, namely to:

– Initiate the payment order

– Instruction from PSU + ‘consent’

– Instruction to Account Servicing Payment Service Provider (ASPSP)

– Receive authorization from ASPSP;

While Article 67(2)(f) of the PSD2 limits the Account Information Service Provider’s (AISP) processing to account information services, namely to:

– Instruction from PSU + ‘consent’

– Access payment accounts held by PSU

– Receive authorization from the ASPSP

– Receive account information from ASPSP

– Display account information back to PSU.

The presenter explained that payment transaction data is not considered personal data if it relates to a corporate PSU, whereas such data would amount to personal data for individual PSUs.

Moreover, they described the EDPB’s Guidelines on PSD2 relating to further processing, and explained that:

– Compatibility is unavailable under Article 6(4) of the GDPR (paragraph 22 of the Guidelines). The relevant lawful basis for processing under the PSD2 is performance of a contract, for which the ‘necessary’ requirement has been restrictively interpreted (see EDPB Guidelines 2/2019).

– Further processing of payments data can only be based on EU or Member State law or the data subject’s consent. The Guidelines note that relevant Member State law includes counter-terrorism and anti-money laundering legislation (paragraph 23 of the Guidelines) but leave a question mark as to whether fraud prevention and detection activities are captured (paragraph 70 of the Guidelines alludes to this, in combination with Article 94(1) of the PSD2, but this is not made explicit in the Guidelines).

In practice, this means that: 1) PSD2 is not lex specialis; the two regimes operate together; 2) there is a dual regime under PSD2, and for a corporate PSU the only permitted processing appears to be that allowed by Articles 66 and 67 of the PSD2, with no recourse to obtaining consent under the GDPR for corporate PSUs; 3) there are no compatible purposes under Article 6(4) of the GDPR; 4) we must be mindful of EU and Member State law; and 5) consent, especially consent that meets the requirements of the GDPR, plays a very important role in determining how ancillary services can be provided under PSD2 to individual PSUs (i.e., data subjects).

The second presenter stated: ‘If the fulfilment of the account information or payment initiation service falls under performance of the contract, then the TPP ought to also get the consent of the data subject to see if they can share the data with third parties so that they can offer them services, unless this can be included in the contract; however, this will need careful consideration in line with the EDPB’s Guidelines 2/2019.’

The second presenter expressed that: ‘If, for example, PSD2 deals with consolidation of various accounts for purposes of analytics as an account information service, then the account information service ends because the consolidation has taken place. If the purpose is different and not previously covered in the contract – assuming it can be covered in the contract and that such additional services are necessary – the TPP must obtain consent from the individual PSU to share the PSU’s data with recipients for the further purposes of X, Y, Z. However, an ecosystem based on consent is unlikely to be viable in practice. Further clarity from the EDPB is required owing to the inherent conflict between what the Guidelines suggest and what PSD2 was designed to achieve.’

 

Interoperability

The third presenter focused on the Financial Data Exchange (FDX) model. FDX has the potential to be an alternative to the open banking model that has been gaining momentum in, e.g., the UK and Australia. FDX is user-centric by design and creates a consent receipt with which it can trace users’ data through a data transferring ecosystem (e.g., from the bank to the permissioned party). FDX focuses on control, access, transparency, traceability, security, and data minimization, and ensures that consumers have full control of their data. By creating a fully traceable ecosystem with consent receipts, users will be able to identify the liable party if there is a breach of their data.
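To make the consent-receipt idea more concrete, here is a minimal sketch of the kind of record a traceable data-sharing ecosystem might log at each hop so that a breach can later be attributed; the field names are assumptions for illustration and are not taken from the FDX specification.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional
import uuid

@dataclass
class ConsentReceipt:
    user_id: str
    data_source: str              # e.g. the bank holding the account
    data_recipient: str           # e.g. the fintech the user has permissioned
    purposes: List[str]           # purposes the user agreed to
    data_categories: List[str]    # categories of data covered by the consent
    receipt_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    granted_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())
    revoked_at: Optional[str] = None  # set when the user withdraws consent

# A receipt like this, logged at each transfer, records which party held which
# data under which consent, supporting traceability across the ecosystem.
receipt = ConsentReceipt(
    user_id="user-123",
    data_source="Example Bank",
    data_recipient="Example Budgeting App",
    purposes=["account aggregation"],
    data_categories=["balances", "transactions"],
)
print(receipt.receipt_id, receipt.granted_at)
```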

 

To learn more about FPF in Europe, please visit fpf.org/eu.

 

11th Annual Privacy Papers for Policymakers — Call for Nominations

The Future of Privacy Forum (FPF) invites privacy scholars and authors with an interest in privacy issues to nominate finished papers for consideration for FPF’s annual Privacy Papers for Policymakers Award.

PURPOSE

DEADLINE

SUBMISSION REQUIREMENTS

REVIEW PROCESS

AWARDS EVENT

The Future of Privacy Forum will invite winning authors to present their work at an annual event with top policymakers and privacy leaders in the Spring of 2021 (date TBD). FPF will also publish a digest of the summaries of the winning papers for distribution to policymakers in the U.S. and abroad.

California SB 980 Would Codify Many of FPF’s Best Practices for Consumer Genetic Testing Services, but Key Differences Remain

Authors: John Verdi (Vice President of Policy) and Katelyn Ringrose (Christopher Wolf Diversity Law Fellow) 


In July 2018, the Future of Privacy Forum released Privacy Best Practices for Consumer Genetic Testing Services. FPF developed the Best Practices following consultation with technical experts, regulators, leading consumer genetic and personal genomic testing companies, and civil society. The FPF Best Practices include strict standards for the use and sharing of genetic information generated in the consumer context. Companies that pledged to follow the guidelines, including Ancestry, 23andMe, and Helix, promised to:

California lawmakers are currently considering SB 980 (the “Genetic Information Privacy Act”). SB 980 would establish obligations for direct-to-consumer genetic testing companies and others that collect or process genetic information. If passed by the legislature and approved by the Governor, the bill would become effective on January 1, 2021. 

Many of SB 980’s provisions align closely with FPF’s Best Practices, including the bill’s emphasis on consumers’ rights to notice, choice, and transparency. Leading direct-to-consumer genetic testing companies are already obliged to follow the Best Practices, as they have made public commitments that are enforceable by the Federal Trade Commission and state Attorneys General. SB 980’s provisions would extend these requirements to all covered entities that do business in California. 

Some of SB 980’s provisions diverge from the FPF Best Practices. For example, FPF’s Best Practices and SB 980 both require companies to obtain opt-in consent before they use DNA test results for marketing, but proposed amendments to SB 980 would further require companies to provide consumers with an opportunity to opt out of contextual marketing – ads placed on web pages and apps based on page content rather than sensitive personal information. SB 980’s treatment of contextual advertising is also inconsistent with the California Privacy Rights Act of 2020 (CPRA) – the comprehensive privacy ballot initiative that would govern the use of much sensitive health data and would not require companies to provide an opt-out for non-personalized, contextual advertising. In addition, SB 980 diverges from FPF’s Best Practices regarding government access to DNA information, with SB 980 preserving an option for companies to voluntarily provide genetic data to law enforcement in the absence of a court order or consumer consent; FPF’s Best Practices would prohibit such disclosures in most cases. 

Below, we analyze SB 980’s approach to: (1) consent; (2) marketing; (3) privacy policies; (4) research; and (5) penalties and enforcement. We also examine (6) several other federal and state laws that currently regulate genetic privacy. 

  1. Consent for Genetic Data //

FPF’s Best Practices and SB 980 take similar approaches – requiring different methods of consent (express opt-in vs. opt-out), depending on the sensitivity and uses of the data. Both SB 980 and the Best Practices emphasize express, affirmative consent as a baseline requirement for collecting genetic information. They each require that companies provide opt-out consent mechanisms for consumers regarding use of non-genetic information, such as purchase histories or web browsing information.

FPF’s Best Practices require initial express consent for genetic information collection, as well as separate express consent for the use of genetic material outside of the initial scope of collection. Secondary express consent is also required before a company engages in the onward transfer of individual-level information or the use of genetic information for incompatible or materially different secondary uses. Companies are also required to provide additional consent measures for consumers or organizations that submit genetic information on behalf of others. In a similar vein, SB 980 would require prior authorization from consumers for the initial collection of their genetic information and separate authorization for each subsequent disclosure.

FPF’s Best Practices define express consent as a consumer’s statement or clear affirmative action in response to a clear, meaningful, and prominent notice, while encouraging companies to use flexible consent mechanisms that are effective within the context of the service, in-app or in-browser experience, and relationship between the company and individual. 

  2. Marketing //

The FPF Best Practices and SB 980 differ in their approach to consent for marketing and advertising purposes, including marketing on the basis of non-genetic information. The Best Practices prohibit companies from marketing to consumers on the basis of their genetic information, unless the consumer provides separate express consent for such marketing or marketing is clearly described in the initial express consent as a primary function of the product or service. Marketing to a consumer on the basis of their purchase history is permitted if the consumer is provided the option to opt-out of such marketing. Marketing to anyone under the age of 18 is prohibited. 

The Best Practices do not require companies to obtain opt-in consent or provide an opt-out for “customized content or offers by the company on its own websites and services.” This provision is intended to permit 1) contextual advertising (i.e., advertising that is tailored to the other content on a particular page on a website, rather than targeted to a particular user); and 2) first-party offers displayed to users on the basis of information within the same platform, such as when a logged in user receives an email offer based on information they viewed on the company’s own website while logged in. This approach aligns with leading privacy norms, including the approach taken by the Department of Health and Human Services in interpreting the Health Insurance Portability and Accountability Act (HIPAA), which exempts certain first-party communications related to treatment and health-related products from its definition of “marketing.” It is also consistent with the California Privacy Rights Act of 2020 (CPRA), the privacy ballot initiative that would establish rights to opt out of the sale and uses of sensitive health data and would codify a narrow exemption for non-personalized, contextual advertising.

Like FPF’s Best Practices, SB 980 also requires companies to obtain opt-in consent before marketing based on a consumer’s genetic data. A recent amendment would align SB 980’s and FPF’s approaches to marketing based on purchase history, requiring provision of an opt-out. However, a related SB 980 amendment would require companies to provide users with mechanisms to opt out of contextual advertising. This approach would be inconsistent with most leading norms, including HIPAA and the California Privacy Rights Act. This is because, in contrast to targeted or behavioral advertising, contextual advertising is not typically viewed as implicating significant privacy risks. Indeed, privacy advocates have cited contextual advertising as a privacy-protective model that displays marketing messages on web pages based on the content of the page, not information about an individual. 

  3. Privacy Policies //

FPF’s Best Practices require companies to furnish privacy notices that are prominent, publicly accessible, and easy to read. The Best Practices require companies to include certain information within their policies, including standards regarding: data collection, consent, use, onward transfer, access, security, and retention/deletion practices. Furthermore, the Best Practices note that “a high-level overview of the key principles should be provided preceding the full privacy policy.” The overview should take the form of a short document or statement that provides basic, essential information, including whether the privacy policy for genetic information is different than that of other data (e.g. registration data, browsing (cookies or website) tracking, and/or personal information). 

Similarly, SB 980 would require all direct-to-consumer genetic or illness testing services companies to provide consumers with “clear and complete information regarding the company’s policies and procedures for the collection, use, and disclosure, of genetic data” through “a summary of its privacy practices, written in plain language” and “a prominent and easily accessible privacy notice.”  

  4. Research //

FPF’s Best Practices encourage the socially beneficial use of genetic information in research while providing strong privacy protections. This nuanced approach strikes a careful balance between the societal benefits of genetic research and individuals’ privacy interests. The Best Practices require companies to obtain informed consent before using identifiable data for research, and promote research on strongly deidentified datasets. The Best Practices require companies to engage in consumer education and make resources available regarding the implications and consequences of research.

The consumer genetic and personal genomic testing industry produces an unprecedented amount of genetic information, which in turn provides the research community the ability to analyze large and diverse genetic datasets. Genetic research enables scientists to better understand the role of genetic variation in our ancestry, health, well-being, and more. Recognizing the role of big data in corporate research and the difficulty of obtaining individual consent (see Omer Tene and Jules Polonetsky’s Beyond IRBs: Ethical Guidelines for Data Research, identifying the regulatory gaps between federally funded human subject research and corporate research), the Best Practices acknowledge the important role of Institutional Review Boards (IRBs) and ethical review processes.

FPF’s Best Practices also provide incentives for researchers and others to deidentify genetic data when practical. Deidentification of genetic information is an incredibly complex issue (see FPF and Privacy Analytics’s “A Practical Path Toward Genetic Privacy”), and the risk of reidentification of genetic data can be limited by rigorous technical, legal, and organizational controls.

SB 980 also requires informed consent before using data for research, “in compliance with the federal policy for the protection of human research subjects” — effectively the same standard as the FPF Best Practices. Similarly, SB 980 also promotes strong deidentification of data, meaning data that “cannot be used to infer information about, or otherwise be linked to, a particular identifiable individual,” provided it is also subject to public commitments and contractual obligations to not make attempts to reidentify the data.

  5. Penalties and Enforcement //

Companies that have publicly committed to comply with FPF’s Best Practices are subject to enforcement by the Federal Trade Commission (FTC) under the agency’s Section 5 authority to prohibit deceptive trade practices. State Attorneys General and other authorities have similar powers to bring enforcement actions against companies that violate broadly applicable consumer protection laws. 

SB 980 includes a tiered penalty structure, with negligent violations of the act subject to civil penalties not to exceed one thousand dollars ($1,000) and willful violations subject to penalties between $1,000 and $10,000 plus court costs. Penalties for willful violations would be paid to the individual to whom the genetic information pertains. Penalties could add up quickly, as they are calculated on a per-violation, per-consumer basis; for example, a willful violation affecting 1,000 consumers could, in principle, expose a company to as much as $10 million in penalties. Earlier versions of SB 980 included criminal penalties; the bill sponsors recently removed criminal liability in favor of a higher civil penalty, raising the maximum fine from $5,000 to $10,000.

  6. Other Federal and State Laws //

In the United States, a growing number of sectoral laws are applicable to companies that process genetic information. The federal Genetic Information Nondiscrimination Act (GINA) prevents genetic discrimination in health insurance and employment, but GINA does not prohibit discrimination in life insurance, disability insurance, or long-term care insurance, nor does it provide general privacy protections or limits on law enforcement uses. In an attempt to close regulatory gaps, several states have enacted legislation around law enforcement access to genetic information and discriminatory practices by life insurance organizations.

Key state laws governing genetic information include:

Conclusion //

Genetic and personal genomic tests increase consumers’ access to and control of their genetic information; empower consumers to learn more about their biology and take a proactive role in their health, wellness, ancestry, and lifestyle; and enhance biomedical research efforts. The consumer genetic and personal genomic testing industry is producing an unprecedented amount of genetic information, which provides the research community the ability to analyze a significantly larger and more diverse range of genetic data to observe and discover new patterns and connections. Access to genetic information enables researchers to gain a better understanding of the role of genetic variation in our ancestry, health, well-being, and much more. While genetic information offers incredible benefits, it is also sensitive information that warrants a high standard of privacy protection.

FPF’s Best Practices provide a model for strong privacy safeguards with detailed provisions that support clinical research and public health. Key portions of California SB 980 are consistent with the Best Practices, and would require all companies to provide consumers with important transparency, choice, and security safeguards. Several SB 980 amendments and provisions diverge from the Best Practices in important ways, including how the bill would treat contextual advertising and government access to data. 

Change Could be Soon Coming to the FTC, the Lead U.S. Agency on Privacy

The U.S. Presidential election is almost upon us, and it could have a big impact on the future of the Federal Trade Commission (FTC), the de facto national privacy regulator and law enforcer. The FTC lineup has been steady since 2018 but that could soon change – no matter who wins the election.

Prior to the appointment of the five current Commissioners, the FTC had only two serving. This happened because new Commissioners were not appointed by the President and confirmed by the Senate as sitting Commissioners finished their terms and departed. Though all five current FTC Commissioners were appointed in 2018, their terms in office end years apart.

FTC Commissioners’ Terms

Commissioners serve seven-year terms, with appointment and expiration dates set on a staggered schedule. The FTC Act has been interpreted to mean that Commissioners’ seven-year terms run “with the seat,” so that the term expires on the scheduled date, regardless of when the Commissioner was appointed, confirmed, and sworn in. If a Commissioner’s replacement is not appointed at the end of their term, they may stay on until their replacement is seated. This is currently the case with Commissioner Chopra, whose term ended in September 2019.

Commissioners can be re-appointed. Sometimes they leave before the end of their term.

Commission Chairs often leave when there is a change in Presidential Administration. If FTC Chairman Joe Simons chose to step down, his vacancy could be filled by a new Chair, or a non-Chair Commissioner could be appointed and a sitting Commissioner elevated to Chair.

Nominating New Commissioners

Two of the five Commissioners must not be from the President’s political party. It’s typical for the Administration and Senate leaders to agree to “pair” appointees from each party when there is more than one vacancy in order to ease Senate confirmation. That would be unlikely in a new Democratic administration because there would not be a Republican vacancy unless more than one sitting Republican vacated their seats. But “pairing” can happen across agencies, with Senate leaders of each party agreeing to move the nominations they support as part of complicated bi-partisan agreements.

Although the current Commissioners reflect a range of ideological perspectives, the agency has generally been fortunate to be led by appointees recognized for their professionalism, integrity, policy smarts, and ability to collaborate across party lines – traits that will be valued in their eventual successors as well.

The European Commission Considers Amending the General Data Protection Regulation to Make Digital Age of Consent Consistent

The European Commission published a Communication on its mandated two-year evaluation of the General Data Protection Regulation (GDPR) on June 24, 2020 in which it discusses as a future policy development “the possible harmonisation of the age of children consent in relation to information society services.” Notably, harmonizing the age of consent for children across the European Union is one of only two areas in the GDPR that the Commission is considering amending after further review of practice and case-law. Currently, the GDPR allows individual Member States some flexibility in determining the national age of digital consent for children between the ages of 13 and 16. However, upon the two-year review, the Commission expressed concerns that the variation in ages across the EU results in a level of uncertainty for information society services–any economic activities taking place online–and may hamper “cross-border business, innovation, in particular as regards new technological developments and cybersecurity solutions.”

“For the effective functioning of the internal market and to avoid unnecessary burden on companies, it is also essential that national legislation does not go beyond the margins set by the GDPR or introduces additional requirements when there is no margin,” stated the Commission in its report. Some believe stringent child privacy requirements can push companies to abandon the development of online services for children to avoid legal risks and technical burdens, which creates a void for companies from countries with lax child privacy protections. In addition to the GDPR’s varying ages of digital consent, there are also differing interpretations of the obligations on information society services regarding children. For example, the United Kingdom’s proposed Age Appropriate Design Code defines a child as a person under the age of 18 and lays out additional requirements for information society services to build in privacy by design to better protect children online.

Prior to the GDPR, European data protection law did not include special protections for children, instead providing the same privacy protections across all age groups. The GDPR recognized that children are particularly vulnerable to harm and exploitation online and included provisions extending a higher level of protection to children. However, a universal consensus on the age of a child does not exist, and the flexibility provided by the GDPR creates a fragmented landscape of ages requiring parental consent across the EU. While complying with different ages of consent is relatively straightforward in the physical world, where activities are generally limited within national boundaries, the lack of consistency in ages is a significant barrier for online services operating across borders. Information society service providers are obliged to verify the age of a user and their nationality, and to confirm the age of consent for children in that Member State, prior to allowing access to their services. This burden may pose a competitive disadvantage for companies operating in the EU or result in measures depriving children and teens of the benefits of using these services, as companies choose either to invest significant resources in age verification and parental consent mechanisms or to abandon the market for children and age gate their services instead.
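The per-Member-State check this imposes on a cross-border service can be sketched roughly as follows. This is an illustrative sketch only: the lookup table and default are assumptions, the ages shown are examples within the 13-16 range the GDPR permits, and a real implementation would need to verify current national law.

```python
# National digital ages of consent vary within the 13-16 range the GDPR allows.
# The values below are examples only; verify current national law before use.
DIGITAL_AGE_OF_CONSENT = {
    "FR": 15,  # France
    "DE": 16,  # Germany
    "IE": 16,  # Ireland
    "BE": 13,  # Belgium
}

def needs_parental_consent(user_age: int, member_state: str) -> bool:
    """Return True if the user is below the national digital age of consent,
    so the service must obtain verifiable parental consent before processing."""
    threshold = DIGITAL_AGE_OF_CONSENT.get(member_state, 16)  # default to the GDPR maximum
    return user_age < threshold

print(needs_parental_consent(14, "FR"))  # True: below France's age of 15
print(needs_parental_consent(14, "BE"))  # False: Belgium set the age at 13
```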

The Commission also initiated a pilot project to create an infrastructure for implementing child rights and protection mechanisms online, which is scheduled to commence on January 1, 2021. The project aims to map existing age-verification and parental consent mechanisms both in the EU and abroad and assess the comprehensive mapping results to create “an interoperable infrastructure for child online protection including in particular age-verification and obtaining parental consent of users of video-sharing platforms or other online services.”

Currently, Member States require or recommend varying age verification and parental consent mechanisms. In addition to the UK’s Age Appropriate Design Code, the German youth protection law requires businesses to use scheduling restrictions to ensure that content harmful to children is not available during the day when children are online; to use technical methods to keep children from accessing inappropriate content, such as sending adults a PIN after age verification; or to use age labeling that youth protection software, downloaded by parents on their children’s devices, can read. However, the efficacy of these methods is unclear and unproven. As such, a sweeping review of existing methods may reveal best practices to be widely adopted within the EU and serve as a model for other countries, including the United States.

FPF & BrightHive Release Playbook to Create Responsible Contact Tracing Initiatives, Address Privacy & Ethics Concerns

A new playbook from the Future of Privacy Forum (FPF) and BrightHive, Responsible Data Use Playbook for Digital Contact Tracing, provides a series of considerations to assist stakeholders in setting up a digital contact tracing initiative to track and manage the spread of COVID-19, while addressing privacy concerns raised by these technologies in an ethical, responsible manner.

“Digital contact tracing technologies will play an instrumental role in localities’ responses to the COVID-19 pandemic, but these technologies – if designed, developed, and deployed without thoughtful planning – can raise privacy concerns and elevate disparities,” said FPF CEO Jules Polonetsky. “Contact tracing initiatives should take a measured approach to location tracking, data sharing, purpose limitations and proportionality. If deployed hastily, these technologies risk exacerbating existing societal inequalities, including racial, socioeconomic, and digital divides.”

As COVID-19 continues to spread through communities across the United States and abroad, public health officials are turning to digital contact tracing technologies (DCTT) as a means of tracking cases, identifying sources of transmission, and informing people who may have been exposed in order to prevent further transmission. For contact tracing to be effective, however, people must share sensitive personal information on their whereabouts or with whom they have been in close proximity so that their connections or locations can be mapped and tracked, raising ethical and privacy concerns.

“This playbook is intended to support coalitions, including public health agencies and application developers, in designing and implementing a digital contact tracing initiative,” said Natalie Evans Harris, Head of Strategic Initiatives at BrightHive. “The playbook provides a series of considerations that purposefully address privacy concerns and support the development of ethical and responsible digital contact tracing protocols.”

The playbook walks stakeholders interested in setting up a digital contact tracing initiative through a checklist of actions, from coalition-building in support of the initiative to implementation across the lifecycle of the initiative. To learn more, read the playbook.

 

About FPF

The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting fpf.org.

About BrightHive

BrightHive helps organizations, networks and communities securely and responsibly link their data to enhance their impact, empower individual and collective decision making, and increase equity of opportunity. Learn more by visiting brighthive.io.