FPF Presents @ RightsCon 2020: “Frontiers in health data privacy: navigating blurred expectations across the patient-consumer spectrum”

The patient-consumer spectrum describes how healthcare is rapidly transitioning from a periodic activity in fixed, traditional health care settings to an around-the-clock activity that involves generating, using, and integrating data reflecting many aspects of individuals’ lives and behaviors. Accompanying this spectrum are blurred distinctions between traditional and consumer-generated health information, and differing expectations of how health information across this spectrum should be protected or treated.

On July 27, 2020, during the RightsCon 2020 virtual conference, the Future of Privacy Forum’s (FPF’s) Health Policy Counsel and Lead, Dr. Rachele Hendricks-Sturrup, sat down with three health data governance and policy expert panelists to explore the privacy and policy implications across the broadening patient-consumer spectrum:

An audience poll gauged attendees’ perspectives on the privacy of consumer-generated versus traditional health care data. Just over half (52%) of the audience members who participated felt that consumer-generated health data should be treated the same as traditional health care data.


These split results highlight the need to discuss data privacy and rights across the growing patient-consumer spectrum. The panelists took on this challenge and offered the following key takeaways:

Dr. Hendricks-Sturrup and the panelists concluded that, in order to successfully navigate blurred expectations of privacy across this spectrum and make progress toward establishing meaningful legal and policy frameworks and best practices, diverse stakeholders from industry, academia, and civil society must be engaged and barriers to their collaboration must be addressed.

Read the post-panel white paper here.

Watch the RightsCon panel below:

https://www.youtube.com/watch?v=fDjw4BVLFeo

To learn more about the FPF Health Initiative, contact Dr. Rachele Hendricks-Sturrup at [email protected].


Call for Position Statements on Responsible Uses of Technology and Health Data During Times of Crisis

Event Overview

The Future of Privacy Forum, in collaboration with the National Science Foundation, Duke Sanford School of Public Policy, SFI ADAPT Research Centre, Dublin City University, and Intel Corporation, presents Privacy & Pandemics: Responsible Uses of Technology and Health Data During Times of Crisis — An International Tech and Data Conference, including a two-day virtual workshop on October 27-28, 2020, to explore the value and limits of data and technology in the context of a global crisis. Ten months into the COVID-19 pandemic, what role have technology and data played in combating the crisis, what have we learned about the limitations of legal, policy, and technical tools, and what areas need reform and additional research?

REGISTER HERE


Call for Position Statements

We are soliciting position statements from leading technologists, scientists, policymakers, data experts, companies, and regulators to assess early conclusions about how data and technology have each played a role in efforts to study, control the spread of, and track COVID-19. 

This call invites experts with a perspective on areas such as:

We invite you to submit a 500-1,000-word position statement to be considered for inclusion in the upcoming workshop. Authors of accepted submissions will be offered a $1,500 (US) stipend to participate in a relevant workshop session or be invited to present a “firestarter” at the virtual event on October 27-28, 2020. Accepted submissions will be distributed to workshop attendees in advance for review, assessment, and discussion. The Planning Committee will also organize a number of invited presentations.

A workshop report will be prepared and used by the National Science Foundation to help set direction for the Convergence Accelerator 2021 Workshops, speeding the transition of convergence research into practice to address grand challenges of national importance. 

This event is sponsored in part by the NSF Convergence Accelerator and the NSF Secure and Trustworthy Cyberspace programs, Intel Corporation, the Privacy Tech Alliance, and OneTrust, LLC.

Position Statement Submission 

We are inviting submissions of position statements to catalyze conversations around the future of privacy and technology during times of crisis. We are seeking original, provocative, well-argued statements of approximately 500-1000 words.

We are interested in statements addressing specific, practical, identified challenges faced by academic researchers, public health experts and agencies, technologists, policymakers, industry and others who have played a role in responding to the COVID-19 crisis.

Works by undergraduate students, graduate students, and unaffiliated scholars, as well as by individuals with academic and/or corporate affiliations, are all welcome. Works by interdisciplinary teams, particularly those representing the convergence of fields such as engineering, biology, the social sciences, and computer science, are encouraged.

Submission Deadline

Submission of draft position statements: September 30, 2020. **DEADLINE CLOSED**

Submission Format

Reviewer Team

Review Process

Position statements will be reviewed by at least three reviewers, including a subject matter expert. Reviewers will determine if a position statement will move forward for final decision on inclusion in the workshop.

Additional Information

For more information on this effort, including submission instructions, event details, or other questions, please contact Christy Harris at [email protected].

California’s SB 980 Would Codify Strong Protections for Genetic Data

Author: John Verdi (Vice President of Policy)

This week, SB 980 (the “Genetic Information Privacy Act”) passed the California State Assembly and State Senate with near-unanimous support (54-10 and 39-0). If signed by the Governor before the Sept. 30 deadline, the law would become the first comprehensive genetic privacy law in the United States, establishing significant new protections for consumers of genetic services.

As we previously wrote and testified, the Genetic Information Privacy Act incorporates many of the protections in FPF’s 2018 Privacy Best Practices for Consumer Genetic Testing Services. Those Best Practices were drafted and published over the course of 2018 in consultation with a multi-stakeholder group of technical experts, scientists, civil society advocates, leading consumer genetic and personal genomic testing companies, and with input from regulators including the Federal Trade Commission (FTC) and the Department of Health and Human Services (HHS). 

Leading genetic testing companies have adopted the Best Practices, making them enforceable by the FTC and state AGs; SB 980 would extend safeguards to users of other genetics companies, protecting consumers and building trust in the industry.

Below, we describe 1) the process and results of FPF’s 2018 efforts; and 2) the significance of SB 980 as compared to existing laws and the voluntarily adopted Best Practices.

FPF’s 2018 Stakeholder Process //

In 2018, Future of Privacy Forum conferred with leading genetics services and other experts to explore ways to address consumer privacy concerns related to genetics services. At the time, concerns were emerging in response to the rapid growth of the consumer genetics industry, and highly publicized cases of law enforcement access to genetic data, including the Golden State Killer investigation.

As a non-profit dedicated to convening divergent stakeholders to create workable best practices for emerging technologies, we solicited and received the input of scientists, consumer privacy advocates, government stakeholders, and other experts. FPF published the resulting Best Practices at the end of July 2018.

Since that time, some, but not all, of the direct-to-consumer genetics companies have voluntarily adopted FPF’s Best Practices. Some companies have chosen not to adopt the Best Practices, or to adopt only certain provisions, while others are supportive but have chosen not to formally incorporate the provisions of the Best Practices into their policies. Privacy policies and other voluntary legal commitments can be enforced by the Federal Trade Commission and State Attorneys General.

Why SB 980 is Significant //

If signed by the Governor, SB 980 would be a landmark law for genetic privacy, going beyond existing federal and state laws as well as self-regulation. Although the federal Genetic Information Nondiscrimination Act (GINA) prohibits certain types of discrimination based on genetic information, it does not provide comprehensive privacy protections for the collection of such data or the many ways that it can be used, sold, or shared (including for advertising or law enforcement purposes).

Similarly, the handful of states that have heretofore addressed genetic information privacy have not established comprehensive protections. For example, some states have enacted “mini-GINAs” (including California), or extended GINA-like protections to discrimination in life insurance, disability insurance, or long-term care (Florida). Some states have limited law enforcement access (Nevada), and at least one has attempted to take a more comprehensive approach while recognizing genetic information as the property of the consumer (Alaska).

In contrast, SB 980 would establish broad, comprehensive consumer protections for genetic information. The protections go significantly beyond those that exist for other types of personal information in California under the California Consumer Privacy Act (CCPA), an approach that is justified given the unique sensitivity of genetic information. In particular, genetic information has the ability to reveal intimate information about health and familial connections, and is challenging to de-identify. The bill also contains certain aspects that are unique to the consumer genetics industry, such as the requirement that biological samples be destroyed upon request.

Similarly, SB 980 would go beyond FPF’s Best Practices by directly regulating the entire sector, rather than only the companies that have voluntarily chosen to adopt the Best Practices. Furthermore, although voluntary commitments can be enforced by the Federal Trade Commission (FTC) and others, such enforcement is necessarily limited to unfair and deceptive trade practices, and does not always allow for financial penalties. In contrast, SB 980 would establish civil penalties of up to $1,000 (for negligent violations) or $10,000 (for willful violations). Penalties could add up quickly, as they are calculated on a per-violation, per-consumer basis.

Conclusion // 

Genetic information carries the potential to empower consumers interested in learning about their health and heritage, and to fuel unparalleled discoveries in personalized medicine and genetic research. Given the Future of Privacy Forum’s mission to convene divergent stakeholders towards workable privacy practices for emerging technologies, it continues to be rewarding to play a role in shaping the leading practices for consumer genetic information. We are optimistic that SB 980 represents a major step forward for consumer rights.

Additional FPF Resources // 

  1. California SB 980 Would Codify Many of FPF’s Best Practices for Consumer Genetic Testing Services, but Key Differences Remain (2020); 
  2. FPF Releases Follow-Up Report on Consumer Genetics Companies and Practice of Transparency Reporting (2020);
  3. A Closer Look at Genetic Data Privacy and Nondiscrimination in 2020 (2020);
  4. FPF and Privacy Analytics Identify “A Practical Path Toward Genetic Privacy” (2020);
  5. Consumer Genetic Testing: A Q&A with Carson Martinez (2019).

How the Student Privacy Pledge Bolsters Legal Requirements and Supports Better Privacy in Education

The Student Privacy Pledge is a public and legally enforceable statement by edtech companies to safeguard student privacy, built around a dozen privacy commitments regarding the collection, maintenance, use, and sharing of student personal information. Since it was introduced in 2014 by the Future of Privacy Forum (FPF) and the Software and Information Industry Association, more than 400 edtech companies have signed the Pledge. In 2015, the White House endorsed the Pledge, and it has influenced company practices, school policies, and lawmakers’ approaches to regulating student privacy. Many school districts use the Pledge as they review prospective vendors, and it is aligned with—and has broader coverage than—the most widely adopted state student privacy law.*

We are proud that 436 companies have signed the Pledge, with well over 1,000 applications since June 2016. FPF reviews each applicant’s privacy policy and terms of service to ensure that signatories’ public statements align with the Pledge commitments. If an applicant’s policies do not align, we work with the company to bring them into line with the Pledge commitments. We also work with applicants to ensure that they understand the commitments they are making when they become a Pledge signatory. This process may result in the applicant bringing internal compliance and legal resources to bear in ways they previously did not—an effect that increases accountability. Nearly every company that applies ends up altering its privacy practices and/or policies to become a Pledge signatory.

The Student Privacy Pledge is a voluntary promise, not a law that applies to everyone. But once companies sign the Pledge, the Federal Trade Commission (FTC) and state Attorneys General (AG) have legal authority to ensure they keep their promises. The FTC and state AGs have a track record of using public commitments like the Student Privacy Pledge as tools to enforce companies’ privacy promises. These legal claims arise from the intersection between Pledge promises and Consumer Protection Unfair & Deceptive Acts & Practices (UDAP) statutes—without the Pledge commitments, it would be much more difficult for the FTC and AGs to bring enforcement actions under state and federal UDAP laws. In the absence of a comprehensive federal consumer privacy law, the Pledge provides an important and unique means for privacy enforcement; it complements state and federal student privacy laws that directly regulate companies and schools. 

In addition to enforcement at the federal and state levels, many schools require vendors to adhere to contracts that are modeled after or heavily mirror the Pledge, and schools have contractual rights to enforce these promises. If companies are found to break their commitments, schools can force a vendor to change practices, terminate contracts with the company, and sue for damages. 

When FPF learns of a complaint about a Pledge signatory, we analyze the issue and reach out to the signatory to understand the complaint, the signatory’s policies and practices, and other relevant information. We typically work with the company to resolve any pledge-covered practices that do not align with the Pledge. We seek to bring signatories into compliance with the Pledge rather than remove them as signatories – an action that could result in fewer privacy protections for users, as a former signatory would not be bound by the Pledge’s promises for future activities.

One of the most common misunderstandings about the Pledge is the assumption that the Pledge applies to all products offered by a signatory or used by a student. However, the Student Privacy Pledge applies to “school service providers”—companies that design and market their services and devices for use in schools. When a company offering services to both school audiences and general audiences becomes a Pledge signatory, the Pledge commitments only apply to the services they provide to schools. Companies selling tools or providing services to the general public are not obligated to redesign these products because they sign the Pledge. This is consistent with most state student privacy laws and proposed federal bills. 

The Pledge is by no means a replacement for pragmatic updates to existing student privacy laws and regulations, or for a comprehensive federal privacy law that would cover all consumers, which FPF supports. The Pledge is a set of commitments intended to build transparency and trust by obligating signatories to make baseline commitments about student privacy that can be enforced by the Federal Trade Commission and state attorneys general. The Pledge is not intended to be a comprehensive privacy policy nor to be inclusive of all the many requirements necessary for compliance with applicable federal and state laws. With that said, most signatories take the Pledge because they wish to be thoughtful and conscientious about privacy.

In 2019, we decided to analyze the Pledge in light of the evolving education ecosystem, incorporating what we’ve learned from reviewing thousands of edtech privacy policies, engaging directly with stakeholders, and reviewing the 130 state student privacy laws that have passed since the Pledge was created. We are excited to apply this knowledge with Student Privacy Pledge 2020, being released this fall.

* The Student Privacy Pledge’s commitments are echoed in the most commonly passed student privacy law aimed at edtech service providers, California’s Student Online Personal Information Protection Act (Cal. Bus. & Prof. Code § 22584). A version of this law has been enacted by several states, including: Arizona (Ariz. Rev. Stat. § 15-1046), Arkansas (Ark. Code Ann. § 6-18-109), Connecticut (Conn. Gen. Stat. §§ 10-234bb-234dd), Delaware (Del. Code Ann. tit. 14, § 8101), Georgia (Ga. Stat. § 20-2-660), Hawaii (Hi. Rev. Stat. §§ 302A-499-500), Iowa (Iowa Code § 279.70), Kansas (K.S.A. 72-6312), Maine (Me. Rev. St. Ann. 20-A § 951), Maryland (Md. Educ. Code § 4-131), Michigan (Mich. Comp. Laws § 388.1295), Nebraska (Neb. Rev. Stat. § 79-2,153), New Hampshire (NH. St. § 189:68-a), New Jersey (Assembly Bill 4978, signed into law Jan. 2020), North Carolina (N.C. Gen. Stat. § 115C-401.2), Oregon (Or. Rev. Stat. Ann § 336.184-187), Texas (Tex. Educ. Code § 32.151), Virginia (Va. Code Ann. § 22.1-289.01), and Washington State (Wash. Rev. Code § 28A.604).

FPF Presents Expert Analysis to Washington State Lawmakers as Multiple States Weigh COVID-19 Privacy and Contact Tracing Legislation


In response to the ongoing public health emergency, over the past few months state legislatures in the United States have diverted their resources towards establishing state and local reopening plans, allocating federal aid, and promoting public trust and public participation by addressing concerns over privacy and civil liberties. 

Many states have introduced bills that would govern the collection, use, and sharing of COVID-19 data by a range of entities, including government actors and commercial entities. Unless appropriate guardrails are put in place, data collected by governments through contact tracing could be used in unexpected, inappropriate, or even harmful ways. This risk has frequently been cited as a factor that makes over-policed and undocumented individuals less likely to participate in contact tracing, and it is also one of the reasons why the Google-Apple Exposure Notification API is only available for decentralized apps.

Below, we discuss FPF’s participation in a July 28th COVID-19 Public Work Session hosted by the Washington State Senate Committee on Environment, Energy & Technology, and the wide range of active COVID-19 legislation in state legislatures, including New York, New Jersey, and California.

Washington Public Work Session (July 28) 

On July 28, 2020, the Washington State Senate Committee on Environment, Energy & Technology held a Public Work Session to discuss government uses of data and contact tracing technologies. The Committee invited guest experts to give presentations, including FPF’s Senior Counsel Kelsey Finch, Consumer Reports’ Justin Brookman, and the Washington State Department of Health. 

In FPF’s presentation, we recommended that policymakers and technology providers follow the lead of public health experts, and we outlined key considerations for designing and implementing digital contact tracing tools. Important trade-offs exist among public health, privacy, accuracy, effectiveness, equity, and trust, and best practices are emerging around: (1) transparency about data collection and sharing; (2) purpose and retention limitations; (3) privacy impact assessments; (4) prioritization of accessibility; (5) SDK caution; (6) interoperability; and (7) security. Recognizing widespread consensus that apps ought to be voluntary, Ms. Finch also emphasized the need to find ways to promote and maintain public trust.

These recommendations align with FPF’s recent report on promoting responsible data use, as well as FPF’s April 2020 testimony at the hearing on “Enlisting Big Data in the Fight Against Coronavirus” convened by the U.S. Senate Committee on Commerce, Science, and Transportation.

Legislative Trends in the States 

State governments, private employers, and schools are increasingly turning to new technologies and digital solutions to help address the ongoing public health emergency. Over 20 states are considering, developing, or implementing decentralized Bluetooth-based apps built on the Google-Apple Exposure Notification API, an effort supported nationally by the Association of Public Health Laboratories so that individuals can receive exposure alerts even when they travel across state borders.

Meanwhile, most state legislatures have suspended their sessions for the year, with only some states remaining in regular session, and others convening special sessions to address the pandemic. Contact tracing efforts (both manual and digital) rely not only on fast and reliable testing, but also on public participation and trust — a key public health consideration that is leading many states to consider how they can bolster public promises with strong privacy and data protection laws.

As a result, in some states, COVID-19 privacy bills have already been signed into law, including a few notable new state laws:

In other states, COVID-19 privacy legislation has been introduced and remains active — with some bills appearing likely to pass in upcoming weeks. Most notably, active bills in New York, New Jersey, and California, if passed, would create a range of new requirements for both private sector companies and government entities with respect to COVID-19 related health information.

New York

In New York, which will remain in session through the end of 2020, several COVID-19 privacy bills have gained traction in recent months, including:

New Jersey

In New Jersey, the legislative session runs through the end of 2020. In January, a number of geolocation data bills were introduced that remain technically under consideration (e.g., A 193 and A 5259), in addition to general comprehensive privacy bills (e.g., S269 and A2188). However, New Jersey legislators have since prioritized urgent pandemic response bills, including:

California 

In California, the legislature has generally prioritized pandemic related bills over other pieces of legislation such as amendments to the California Consumer Privacy Act (e.g., AB 3119 and AB 3212). However, some related privacy bills remain under consideration, such as other CCPA amendments (AB 1281 and AB 713), and a consumer genetics privacy bill (SB 980). In California, the final day for bills to be passed by the House or the Senate is August 31, 2020. 

Active COVID-19 privacy bills in California include:

On August 20, 2020, two additional noteworthy bills, which would have regulated data related to established methods of contact tracing (AB 660) and digital contact tracing tools (AB 1782), narrowly failed to progress out of the Senate Appropriations Committee.

Overall Trends 

While most states, and the federal government, do not have a comprehensive baseline consumer privacy law that applies to all commercial uses of data, many existing federal and state laws do already apply to contact tracing efforts or to certain types of data (such as location data collected by cell phone carriers). For example, all states have unfair and deceptive acts and practices (UDAP) laws and laws governing healthcare entities (supplementing HIPAA). Many states also have strong laws governing the confidentiality of state-held records, such as the California Confidentiality of Medical Information Act (Cal. Civil Code §§ 56-56.37) and the Uniform Health Care Information Act (National Conference of Commissioners on Uniform State Laws).

However, as states increasingly contract with private entities to provide digital tools in response to the pandemic, COVID-19 policy frameworks are developing to regulate new data flows across public and private sectors, often involving sensitive location and health information. Both the application of existing state privacy laws and the introduction of new laws to address the pandemic are likely to influence federal and state privacy debates for years to come.


Christy Harris Discusses Trends in Ad Tech

We’re talking to FPF senior policy experts about their work on important privacy issues. Today, Christy Harris, CIPP/US, Director of Technology and Privacy Research, is sharing her perspective on ad tech and privacy.

Prior to joining the FPF team, Christy spent almost 20 years at AOL, where she helped navigate novel consumer privacy issues in the development of internet staples such as AOL Mail, Advertising.com, MapQuest, and The Huffington Post. She also served as Privacy Program Manager at the cybersecurity company FireEye, Inc., where she implemented a vendor management program in preparation for the GDPR and worked to streamline the company’s global data practices.


Can you walk us through your career and how you became interested in privacy?

I worked at AOL for nearly 20 years, starting out by providing tech support in one of their call centers before moving up to AOL’s corporate headquarters. When I started working at AOL, the confidentiality of customer information was a core value, ingrained in everything we did, but online privacy as we know it today was still only an emerging field. Eventually, I moved to a position at AOL specifically focused on consumer advocacy, working with the Chief Trust Officer who oversaw a variety of consumer advocacy issues including anything related to privacy policies and the company’s data uses.

Eventually, AOL’s consumer advocacy team evolved into its global privacy team, led by its first official Chief Privacy Officer, Jules [Polonetsky], and the team responsible for protecting user privacy grew rapidly, taking on broader responsibilities and gaining the authority to structure and encourage responsible company practices around user data. While Jules left AOL to launch FPF in 2009, AOL remained an FPF supporter, involved in various working groups and other FPF efforts over the years.

In 2017, I left AOL and spent some time working as an independent consultant for several tech companies. With the EU’s GDPR going into effect in 2018, many companies were scrambling to ensure they would be compliant on Day 1 of its enforcement, and CCPA (California’s newest privacy law) was swiftly on its heels, creating significant uncertainty for companies trying to determine their compliance obligations and the most efficient approaches while avoiding costly re-architecture of established systems and processes. I eventually joined FireEye full-time, working across teams to implement a vendor management process that included mechanisms for ensuring global compliance in light of the GDPR. Like many companies managing global operations, FireEye had a strong desire to streamline processes and practices to provide consistency for customers as well as internal operations.

During my time as a consultant, I also worked on an ad tech-related project for FPF, which eventually led to my current role as the Director of Technology and Privacy Research. 

What projects are you working on at FPF related to ad tech and mobile platforms? 

On a daily basis, I keep a close eye on how companies operate in the online advertising and ad tech space, drawing on my experiences at AOL and an understanding of the operations and needs of advertisers, publishers, and platforms. I also approach ad tech from a more technical perspective: evaluating how ad tech providers build and implement their technology, understanding how their systems operate, tracking the flow of consumer data, and recognizing the needs and demands of publishers and advertisers leveraging the vast amount of data available in conjunction with the offerings and services of ad tech providers. All of this is part of an overall effort to reconcile how advertisers want to use consumer data with consumer expectations around the use of their data.

A key focus of my work is training – helping policymakers, brands, and privacy officers understand the details and mechanics of online data use so they can each be most effective in their roles.  You can see some of our master class sessions online, and I and my colleagues are available for more tailored group sessions.

Earlier this year, we launched the International Digital Accountability Council (IDAC). After identifying the need for a third-party enforcement and accountability entity to address the gap between legislation and mobile platforms’ rules and requirements, we incubated the IDAC under the FPF umbrella. Today, the IDAC is an independent watchdog organization dedicated to ensuring a fair and trustworthy digital and application marketplace for consumers, encouraging companies to engage in responsible practices. I’m very proud of that effort and look forward to watching them continue to grow into a widely influential organization.

Balancing user expectations with industry standards is an interesting challenge. Consumers typically use an app because it will provide a specific service or allow them to achieve a specific goal — whether that’s managing a calendar, ordering dinner, or passing the time playing a fun game. App companies need to be clear about what it is they are providing to users, how they use and treat the data they collect, and ensure that any secondary or downstream uses of data are not unexpected or discriminatory (even if such uses ensure an app is free to use). 

What do you see happening over the next few years in ad-tech?

Over the past few years, we’ve seen the GDPR have a very significant, global impact — despite ostensibly being a European law, the GDPR has influenced companies’ behaviors with respect to consumer data worldwide. We’re seeing other countries follow the example of the GDPR, working to establish privacy regulations informed by European law and its interpretations. I’ve found it fascinating to see how different cultural norms and expectations with respect to privacy have impacted national and state privacy laws, and it will be interesting to see how they continue to evolve. For example, where Europe recognizes privacy as a fundamental human right and approaches default data practices from that perspective, U.S. companies often rely on a system of notice and choice, requiring users to opt out of certain practices as the default. These differing perspectives are often reflected in how companies collect, use, and share consumer data, and companies must embrace and adapt to them to accommodate a globally accessible and targeted internet.

In the United States, we’ve seen California enact regulations embracing approaches similar to the GDPR from a perspective that reflects U.S. cultural norms. I expect California to serve as a driver of additional privacy legislation and the evolution of default approaches in the United States, but I also expect to see changes coming from the ad tech providers themselves as well as the companies leveraging their services. Companies with the power to determine what can be done within their respective environments, for example Google and Apple in the mobile platform context, often drive a significant portion of the policy standards and discussions today. When a company controls the platforms and technologies on it that may be used to interact with consumers, a seemingly minor change on that platform can cause a ripple effect felt across the ecosystem.

Ultimately, I think the push-pull between advertisers and the platforms on which they reach consumers will continue. The brands and advertisers themselves may not always be technical experts, but these organizations excel at finding creative ways to reach their goals. This is where we end up in a “whack-a-mole” situation – platforms’ goals may not always align with those of the advertisers on them, creating a constant balancing act. FPF’s role and perspective as data optimists allow us to bring together a variety of stakeholders and experts to help achieve those varied goals, using data in new and innovative ways, while always respecting the users at the core of the conversation.

Congrats to National Student Clearinghouse

National Student Clearinghouse’s StudentTracker for High Schools Earns iKeepSafe FERPA Badge

July 22, 2015 – iKeepSafe.org, a leading digital safety and privacy nonprofit, announced today that it has awarded its first privacy protection badge to StudentTracker℠ for High Schools from the National Student Clearinghouse, the largest provider of electronic student record exchanges in the U.S. Its selection as the first recipient of the new badge reflects the ongoing efforts of the Clearinghouse, which performs more than one billion secure electronic student data transactions each year, to protect student data privacy.

A nonprofit organization founded by the higher education community in 1993, the Clearinghouse provides educational reporting, verification, and research services to more than 3,600 colleges and universities and more than 9,000 high schools. Its services are also used by school districts and state education offices nationwide.

Earlier this year, iKeepSafe launched the first independent assessment program for the Family Educational Rights and Privacy Act (FERPA) to help educators and parents identify edtech services and tools that protect student data privacy.

“The National Student Clearinghouse is as committed to K12 learners as we are to those pursuing postsecondary education, and that also means we’re committed to protecting their data and educational records,” said Ricardo Torres, President and CEO of the Clearinghouse. “So many aspects of education are moving into the digital realm, and we’re focused on providing students with the privacy and protection they deserve in a rapidly changing digital environment.”

The Clearinghouse became the first organization to receive the iKeepSafe FERPA badge by completing a rigorous assessment of its StudentTracker℠ for High Schools product, privacy policy and practices. “As the first company to earn the iKeepSafe FERPA badge, the National Student Clearinghouse has demonstrated its dedication to K12 students and their families, and to the privacy and security of their data,” said iKeepSafe CEO Marsali Hancock.

Products participating in the iKeepSafe FERPA assessment must undergo annual re-evaluation to continue displaying the iKeepSafe FERPA badge. For the evaluation, an independent privacy expert reviewed the StudentTracker℠ for High Schools product, its privacy policy and practices, as well as its data security practices.

For more information, please visit http://ikeepsafe.org/educational-issues/clearinghouses/

India: Proposed Data Regulation Overhaul Includes New Draft Rules for Processing Non-Personal Data

Author: Sameer Avasarala

---

Disclaimers

This guest post is by Sameer Avasarala, a Data Protection and Technology Lawyer in Bengaluru. The opinions expressed are exclusively those of the author and do not reflect the views of Cyril Amarchand Mangaldas or any other firm or organization that the author is associated with. He can be contacted at [email protected]

Data protection and informational privacy have been gaining mainstream momentum in India, with significant movement around the Aadhaar project, the forthcoming comprehensive regime for the protection of personal data, and evolving data market trends. Legislators are now also considering regulating the processing of non-personal data, as shown in a new Report released by a Committee of experts put together for this purpose by the Ministry of Electronics and Information Technology. This contribution first sets out the general background of data-related regulatory efforts in India (1), then looks closely at the proposed rules for processing non-personal data: (2) its definition and classification, (3) the data localization requirement for sensitive and critical non-personal data, (4) guidance on anonymization, and (5) proposed data sharing obligations for organizations.

1) Setting the scene: a fundamental right to privacy and a growing data market

The recognition of a fundamental right to privacy by the Supreme Court[1] (Puttaswamy), as well as devising the triple test[2] as a basis to evaluate laws which may restrict the right to privacy and its application to the Aadhaar project[3], have been instrumental in triggering mainstream discourse around privacy in India.

At the same time, the data market in India is growing exponentially, with some studies estimating it will become a USD 16 billion industry by 2025 at a staggering 26% compound annual growth rate. The government recognizes the need for a data governance framework to act as a catalyst for the growth of the data economy in India.

Based on the normative foundation in Puttaswamy, the Ministry of Electronics and Information Technology (‘MEITY’) has constituted a Committee of Experts for a data protection framework, whose report and the resulting draft bill led to the introduction of the Personal Data Protection Bill in 2019 (‘PDP Bill’) in the Parliament. The PDP Bill is currently being reviewed by a joint parliamentary committee which is expected to present its report before the Parliament in the upcoming monsoon session, which may in turn be rescheduled owing to the COVID-19 pandemic.

Separately, the MEITY also constituted a committee of experts to deliberate on a data governance framework for India (‘Committee’), tasked with studying various issues relating to non-personal data and making specific suggestions on its regulation. The Committee released its report on July 12, 2020 (‘Report’), which makes substantive recommendations on the scope, classification, ownership and other issues related to non-personal data. It also makes a clarion call for a comprehensive non-personal data regulation in India to complement the future law dedicated to personal data. In addition, the Committee recommends the establishment of an overarching, cross-sectoral Non-Personal Data Authority (‘NPDA’).

2) Proposed definition and classification of non-personal data

The Report identifies existing issues such as entry barriers for startups and new businesses owing to the first-mover advantage of market leaders and data monopolies, to name a few. Business, innovation and research are identified as cornerstones for furthering an inclusive framework for India’s data economy. The Report is also in line with the Draft National e-Commerce Policy in identifying data of Indian residents as an important ‘national resource’.

‘Non-personal data’ has been defined in the Report as any data that is not personal data[4], or is without any personally identifiable information. This includes personal data that has been anonymized[5] and aggregated data in which individual specific events are no longer identifiable, apart from data that was never personally identifiable. The Report classifies non-personal data into:

The Report recognizes natural persons, entities and communities to whom non-personal data (prior to anonymization or aggregation) relates as ‘data principals’ and entities which undertake collection, storage and processing of non-personal data as ‘data custodians’. It also enables communities or groups of data principals to exercise their rights through ‘data trustees’.

3) Data localization requirements for sensitive and critical non-personal data

The Report classifies the individuals to whom the data relates, before it is anonymized, as the ‘owners’ of private non-personal data, and it recommends obtaining the data principal’s consent (at the time of collection) for anonymization and use thereafter.

Private non-personal data is also further sub-classified based on a sensitivity spectrum, taking into account considerations of national security, collective harm, invasion of collective privacy, business sensitive information and anonymized data. Private non-personal data is, thus, categorized into ‘sensitive non-personal data’ and ‘critical non-personal data’. Sensitive personal data[6] and critical personal data[7] which have been anonymized will be considered to be ‘sensitive non-personal data’ and ‘critical non-personal data’ respectively. The Report recommends localization of sensitive non-personal data and critical non-personal data, in line with the requirements applicable to localization[8] of sensitive personal data and critical personal data under the PDP Bill.

4) Guidance on anonymization

Though an offshoot of the regulation of non-personal data, the Report provides new insight into the regulatory perspective on anonymization of personal data in India. From the lack of an anonymization standard under the current information technology law[9] to an indicative list of de-identifiers for ‘totally anonymized data’ applicable to health records, the regulatory viewpoint on anonymization has been vastly inconsistent. Recently, a protocol released by the MEITY in relation to Aarogya Setu, a contact tracing mobile application, indicated a high anonymization standard, based on the ‘means likely to be used to identify’ individuals, generally similar to that of the General Data Protection Regulation.

Against this background, the Report recognizes the residual risk of re-identification associated with anonymized information and considers anonymized sensitive personal data and critical personal data as sensitive NPD and critical NPD respectively. While the Report suggests techniques and tools for anonymization as part of an anonymization primer, the introduction of data localization requirements and the classification of non-personal data in a manner similar to personal data may, in practice, deter the use of anonymized information.
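To make the residual re-identification risk concrete, consider generalization, one of the standard anonymization techniques of the kind the Report’s primer covers. The following Python sketch is purely illustrative (it is not drawn from the Report or any Indian regulatory standard, and the sample records are invented): it coarsens quasi-identifiers and then checks a simple k-anonymity condition, showing how a record with a unique combination of attributes remains linkable even after coarsening.

```python
# Illustrative sketch: generalization of quasi-identifiers and a k-anonymity check.
from collections import Counter

def generalize(record):
    """Coarsen quasi-identifiers: age into 10-year bands, PIN code to a 3-digit prefix."""
    decade = (record["age"] // 10) * 10
    age_band = f"{decade}-{decade + 9}"
    pin_prefix = record["pincode"][:3] + "XXX"
    return (age_band, pin_prefix)

def is_k_anonymous(records, k):
    """True if every combination of generalized quasi-identifiers occurs at least k times."""
    counts = Counter(generalize(r) for r in records)
    return all(c >= k for c in counts.values())

# Hypothetical sample: three 30-something Bengaluru records and one Delhi record.
records = [
    {"age": 34, "pincode": "560001"},
    {"age": 37, "pincode": "560034"},
    {"age": 31, "pincode": "560095"},
    {"age": 52, "pincode": "110001"},
]

print(is_k_anonymous(records, 2))  # False: the Delhi record is unique, hence re-identifiable
print(is_k_anonymous(records, 1))  # True: trivially satisfied
```

Even here, the lone ("50-59", "110XXX") record forms an equivalence class of size one, so an adversary with auxiliary knowledge could single it out — the same residual risk that leads the Report to treat anonymized sensitive and critical personal data as sensitive and critical NPD.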

5) Data sharing and registration obligations

The Report recognizes ‘data businesses’ as a horizontal category of businesses involved in data collection and processing. Based on specific threshold requirements, the Report proposes a compliance regime to govern such data businesses, including registration and mandatory disclosure of specific information to the NPDA. Interestingly, a similar requirement for ‘data fiduciaries’ is included in the PDP Bill[10]: data fiduciaries would need to submit to the proposed Data Protection Authority any anonymized personal data or other non-personal data to enable better targeting of service delivery or the formulation of evidence-based policies by the Government.

The Report also proposes that data custodians may be required to share non-personal metadata about users and communities, to be stored digitally in metadata directories in India and made available on an open-access basis to encourage the development of novel products and services.

The Report contemplates three broad purposes for data sharing:

  a) Non-personal data shared for sovereign purposes may be used by the Government, regulators and law enforcement authorities, inter alia, for cyber security, crime investigation, public health and sectoral developments.
  b) Non-personal data shared for core public interest purposes may be used for general and community use, research and innovation, delivery of public services, policy development, etc.
  c) Non-personal data shared for economic purposes may be used by business entities for research, innovation and doing business. It may also be leveraged as training data for AI/ML systems.

A ‘checks-and-balances’ system is proposed for ensuring compliance with data sharing and other requirements based on measures such as expert probing for vulnerabilities. The Report also recommends establishments of data spaces, data trusts and cloud innovation labs and research centers which may act as physical environments to test and implement digital solutions and promote intensive data-based research. It also includes guiding principles for a technology architecture to digitally implement rules for data sharing, ranging from mechanisms for accessing data through data trusts, standardized data exchange processes, techniques to prevent re-identification of anonymized information and distributed storage for data security.

The Report recommends a three-tiered system architecture comprising legal safeguards, technology and compliance to enable data sharing, in addition to a ‘policy switch’ enabling a single digital clearing house for the regulatory management of non-personal data.

Finally, the Report proposes classification of high value or special public interest data sets, for instance, geospatial, telecommunications and health data. However, it does not specifically indicate any implications of such classification.

Compliance with non-personal data processing requirements would be ensured by the newly created NPDA. The Report recognizes the need to harmonize guidance issued by the NPDA with sectoral regulations. The NPDA is sketched out to have an enabling role (to ensure a level playing field) in addition to an enforcement role (to address market failures).

6) Conclusion: More to come

The Committee is currently inviting public comments[11] and is likely to hold public consultations on the policy options proposed. While there is no clear timeline around framing and enacting a data governance framework for non-personal data, it is likely that the PDP Bill would be enacted by the Parliament prior to it. The PDP Bill may also be relevant in setting context for the forthcoming non-personal data framework, given the ability of the Government to solicit non-personal and anonymized personal data.

While the Report is helpful in setting context for the forthcoming regulations for non-personal data and in proposing a data governance regime, the Government is likely to evaluate its content, hold wider consultations and consider other policy aspects prior to formulating a comprehensive data framework governing non-personal data in India.

[1] Justice K. S. Puttaswamy v. Union of India, (2017) 10 SCC 1

[2] Modern Dental College & Research Centre & Ors v. State of Madhya Pradesh & Ors, AIR 2016 SC 2601

[3] Justice K. S. Puttaswamy v. Union of India, (2019) 1 SCC 1

[4] Rule 2(1)(i), Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011; Section 3(28), Personal Data Protection Bill, 2019

[5] Section 3(3), Personal Data Protection Bill, 2019

[6] Section 3(36), Personal Data Protection Bill, 2019

[7] Section 33, Personal Data Protection Bill, 2019

[8] Section 34, Personal Data Protection Bill, 2019

[9] Rule 2(1)(i), The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (SPDI Rules)

[10] Section 91, Personal Data Protection Bill, 2019

[11] MyGov ‘Share your Inputs on the Draft Non-Personal Data Governance Framework’, available at https://www.mygov.in/task/share-your-inputs-draft-non-personal-data-governance-framework/

Student Privacy, Special Education, and Online Learning: Navigating COVID-19 and Beyond

COVID-19 continues to disrupt education and has forced many schools to pivot to virtual, or online, learning for the fall semester. Virtual learning poses unique student privacy challenges, particularly for students with disabilities, raising questions such as:

Can a student’s family members and parents be present during a live virtual class?

Can a class be recorded for a student to view later?

Can a student receive one-on-one services or teletherapy via video conferencing?

How do you know if a video conferencing platform is safe to use?

To help educators navigate some of these common challenges and concerns, the Future of Privacy Forum (FPF) and the National Center for Learning Disabilities (NCLD) partnered to develop a new resource, Student Privacy and Special Education: An Educator’s Guide During and After COVID-19. The Educator’s Guide covers how federal laws including the Family Educational Rights and Privacy Act (FERPA), Individuals with Disabilities Education Act (IDEA), Children’s Online Privacy Protection Act (COPPA), and Health Insurance Portability and Accountability Act (HIPAA) apply to virtual learning.

To help address a common source of frustration and confusion for many educators, students, and their families, the Educator’s Guide outlines best practices for educators around the use of video classrooms, including:

As part of their reopening plans, many schools plan to collect new levels of sensitive health information from staff, students, and their families to assess and mitigate health risks. This information will be collected in a variety of ways, including self-reported surveys and screenings such as temperature checks. Students with disabilities or special health care considerations who may be more vulnerable to COVID-19 may be at risk of discrimination based on their health or disability status. For more insight on the student privacy implications of this data collection, read FPF’s issue brief on this topic here.

Educators seeking further support may be interested in FPF’s new series of “Student Privacy and Pandemics” professional development trainings for educators, which includes a module focused on video classrooms. FPF is also developing an ongoing series of issue briefs on the student privacy implications of various reopening strategies schools are considering, including the uses of temperature checks and thermal scans, as well as wearable technologies, to help identify potential COVID-19 cases.