FPF Presents Expert Analysis to Washington State Lawmakers as Multiple States Weigh COVID-19 Privacy and Contact Tracing Legislation

 

In response to the ongoing public health emergency, over the past few months state legislatures in the United States have diverted their resources towards establishing state and local reopening plans, allocating federal aid, and promoting public trust and public participation by addressing concerns over privacy and civil liberties. 

Many states have introduced bills that would govern the collection, use, and sharing of COVID-19 data by a range of entities, including government actors and commercial entities. Unless appropriate guardrails are put in place, data collected by governments through contact tracing could be used in unexpected, inappropriate, or even harmful ways. This risk has frequently been cited as a factor that undermines the willingness of over-policed and undocumented individuals to participate in contact tracing, and it is also one of the reasons the Google-Apple Exposure Notification API is available only to decentralized apps. 

Below, we discuss FPF’s participation in a July 28th COVID-19 Public Work Session hosted by the Washington State Senate Committee on Environment, Energy & Technology, and the wide range of active COVID-19 legislation in state legislatures, including New York, New Jersey, and California.

Washington Public Work Session (July 28) 

On July 28, 2020, the Washington State Senate Committee on Environment, Energy & Technology held a Public Work Session to discuss government uses of data and contact tracing technologies. The Committee invited guest experts to give presentations, including FPF’s Senior Counsel Kelsey Finch, Consumer Reports’ Justin Brookman, and the Washington State Department of Health. 

In FPF’s presentation, we recommended that policymakers and technology providers follow the lead of public health experts, and we outlined key considerations for deciding how to design and implement digital contact tracing tools. Important trade-offs exist among public health, privacy, accuracy, effectiveness, equity, and trust, and best practices are emerging around: (1) transparency about data collection and sharing; (2) purpose and retention limitations; (3) privacy impact assessments; (4) prioritization of accessibility; (5) caution around SDKs; (6) interoperability; and (7) security. Recognizing the widespread consensus that apps ought to be voluntary, Ms. Finch also emphasized the need to find ways to promote and maintain public trust. 

These recommendations align with FPF’s recent report promoting responsible data use, as well as FPF’s April 2020 testimony for the hearing on “Enlisting Big Data in the Fight Against Coronavirus” convened by the U.S. Senate Committee on Commerce, Science, and Transportation.

Legislative Trends in the States 

State governments, private employers, and schools are increasingly turning to new technologies and digital solutions to help address the ongoing public health emergency. Over 20 states are considering, developing, or implementing decentralized Bluetooth-based apps built on the Google-Apple Exposure Notification API, an effort supported nationally by the Association of Public Health Laboratories so that individuals can receive exposure alerts even when they travel across state lines. 
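The privacy significance of the “decentralized” design is that exposure matching happens entirely on the user’s device: a central server only ever sees the anonymous daily keys of users who test positive, never a contact graph. The sketch below is a deliberately simplified illustration of that data flow, not the actual protocol (which derives identifiers with AES/HKDF rather than the HMAC stand-in used here); all names in it are hypothetical.

```python
import os, hmac, hashlib

# Simplified, illustrative model of decentralized exposure notification.
# The real Google-Apple protocol uses HKDF/AES key derivation; the HMAC
# below is a stand-in to show the data flow, not the actual cryptography.

def daily_key():
    """Random per-day key generated on the phone; never tied to identity."""
    return os.urandom(16)

def rolling_ids(key, intervals=96):
    """Short-lived identifiers broadcast over Bluetooth. They rotate
    (roughly every 15 minutes in practice) so passive observers cannot
    link broadcasts back to a single device."""
    return [hmac.new(key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
            for i in range(intervals)]

# Phone A broadcasts its rolling IDs; phone B records the ones it hears.
key_a = daily_key()
heard_by_b = set(rolling_ids(key_a)[:3])

# If A tests positive, only A's daily keys are published to the server.
published_keys = [key_a]

# B downloads the published keys and re-derives the IDs locally. The
# match happens on B's device, so no contact graph ever leaves the phone.
exposed = any(rid in heard_by_b
              for key in published_keys
              for rid in rolling_ids(key))
print(exposed)  # True: B was near a user who later tested positive
```

This on-device matching is what distinguishes the decentralized model from centralized designs, in which phones upload their encounter logs to a government or health authority server.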

Meanwhile, most state legislatures have suspended their sessions for the year, with only some states remaining in regular session, and others convening special sessions to address the pandemic. Contact tracing efforts (both manual and digital) rely not only on fast and reliable testing, but also on public participation and trust — a key public health consideration that is leading many states to consider how they can bolster public promises with strong privacy and data protection laws.

As a result, in some states, COVID-19 privacy bills have already been signed into law, including a few notable new state laws:

In other states, COVID-19 privacy legislation has been introduced and remains active — with some bills appearing likely to pass in upcoming weeks. Most notably, active bills in New York, New Jersey, and California, if passed, would create a range of new requirements for both private sector companies and government entities with respect to COVID-19 related health information.

New York

In New York, which will remain in session through the end of 2020, several COVID-19 privacy bills have gained traction in recent months, including:

New Jersey

In New Jersey, the legislative session runs through the end of 2020. In January, a number of geolocation data bills were introduced that remain technically under consideration (e.g., A 193 and A 5259), in addition to general comprehensive privacy bills (e.g., S269 and A2188). However, New Jersey legislators have since prioritized urgent pandemic response bills, including:

California 

In California, the legislature has generally prioritized pandemic-related bills over other pieces of legislation, such as amendments to the California Consumer Privacy Act (e.g., AB 3119 and AB 3212). However, some related privacy bills remain under consideration, such as other CCPA amendments (AB 1281 and AB 713) and a consumer genetics privacy bill (SB 980). In California, the final day for bills to be passed by the Assembly or the Senate is August 31, 2020. 

Active COVID-19 privacy bills in California include:

On August 20, 2020, two additional noteworthy bills, which would have regulated data related to established methods of contact tracing (AB 660) and digital contact tracing tools (AB 1782), narrowly failed to progress out of the Senate Appropriations Committee.

Overall Trends 

While most states, and the federal government, do not have a comprehensive baseline consumer privacy law that applies to all commercial uses of data, many existing federal and state laws do already apply to contact tracing efforts or to certain types of data (such as location data collected by cell phone carriers). For example, all states have unfair and deceptive practices (UDAP) laws and laws governing healthcare entities (supplementing HIPAA). Many states also have strong laws governing the confidentiality of medical records and health information, such as the California Confidentiality of Medical Information Act (Cal. Civ. Code §§ 56–56.37) and the Uniform Health Care Information Act drafted by the National Conference of Commissioners on Uniform State Laws. 

However, as states increasingly contract with private entities to provide digital tools in response to the pandemic, COVID-19 policy frameworks are developing to regulate new data flows across public and private sectors, often involving sensitive location and health information. Both the application of existing state privacy laws and the introduction of new laws to address the pandemic are likely to influence federal and state privacy debates for years to come.

Christy Harris Discusses Trends in Ad Tech

We’re talking to FPF senior policy experts about their work on important privacy issues. Today, Christy Harris, CIPP/US, Director of Technology and Privacy Research, is sharing her perspective on ad tech and privacy.

Prior to joining the FPF team, Christy spent almost 20 years at AOL, where she helped navigate novel consumer privacy issues in the development of internet staples such as AOL Mail, Advertising.com, MapQuest, and The Huffington Post. She also served as Privacy Program Manager at the cybersecurity company FireEye, Inc., where she implemented a vendor management program in preparation for the GDPR and worked to streamline the company’s global data practices.

 

Can you walk us through your career and how you became interested in privacy?

I worked at AOL for nearly 20 years, starting out by providing tech support in one of their call centers before moving up to AOL’s corporate headquarters. When I started working at AOL, the confidentiality of customer information was a core value, ingrained in everything we did, but online privacy as we know it today was still a nascent field. Eventually, I moved to a position at AOL specifically focused on consumer advocacy, working with the Chief Trust Officer, who oversaw a variety of consumer advocacy issues, including anything related to privacy policies and the company’s data uses. 

Eventually, AOL’s consumer advocacy team evolved into its global privacy team, led by its first official Chief Privacy Officer, Jules [Polonetsky], and the team responsible for protecting user privacy grew rapidly, with broader responsibilities and the authority to structure and encourage responsible company practices around user data. While Jules left AOL to launch FPF in 2009, AOL remained an FPF supporter, involved in various working groups and other FPF efforts over the years. 

In 2017, I left AOL and spent some time working as an independent consultant for several tech companies. With the EU’s GDPR going into effect in 2018, many companies were scrambling to ensure they would be compliant on day one of its enforcement, and the CCPA (California’s newest privacy law) followed swiftly on its heels. This created considerable uncertainty for companies trying to determine their compliance obligations and the most efficient approaches while avoiding costly re-architecture of established systems and processes. I eventually joined FireEye full-time, working across teams to implement a vendor management process that included mechanisms for ensuring global compliance in light of the GDPR. Like many companies managing global operations, FireEye wanted to streamline processes and practices to provide consistency for customers as well as for internal operations.

During my time as a consultant, I also worked on an ad tech-related project for FPF, which eventually led to my current role as the Director of Technology and Privacy Research. 

What projects are you working on at FPF related to ad tech and mobile platforms? 

On a daily basis, I keep a close eye on how companies operate in the online advertising and ad tech space, drawing on my experiences at AOL and an understanding of the operations and needs of advertisers, publishers, and platforms. I also approach ad tech from a more technical perspective: evaluating how ad tech providers build and implement their technology, understanding how their systems operate, tracking the flow of consumer data, and recognizing the needs and demands of publishers and advertisers leveraging the vast amount of data available in conjunction with the offerings and services of ad tech providers. All of this is part of an overall effort to reconcile how advertisers want to use consumer data with consumer expectations around the use of their data.

A key focus of my work is training – helping policymakers, brands, and privacy officers understand the details and mechanics of online data use so they can each be most effective in their roles. You can see some of our master class sessions online, and my colleagues and I are available for more tailored group sessions.

Earlier this year, we launched the International Digital Accountability Council (IDAC). After identifying the need for a third-party enforcement and accountability entity to address the gap between legislation and mobile platforms’ rules and requirements, we incubated the IDAC under the FPF umbrella. Today, the IDAC is an independent watchdog organization dedicated to ensuring a fair and trustworthy digital and application marketplace for consumers, encouraging companies to engage in responsible practices. I’m very proud of that effort and look forward to watching them continue to grow into a widely influential organization.

Balancing user expectations with industry standards is an interesting challenge. Consumers typically use an app because it will provide a specific service or allow them to achieve a specific goal — whether that’s managing a calendar, ordering dinner, or passing the time playing a fun game. App companies need to be clear about what it is they are providing to users, how they use and treat the data they collect, and ensure that any secondary or downstream uses of data are not unexpected or discriminatory (even if such uses ensure an app is free to use). 

What do you see happening over the next few years in ad-tech?

Over the past few years, we’ve seen the GDPR have a very significant, global impact — despite ostensibly being a European law, the GDPR has influenced companies’ behaviors with respect to consumer data worldwide. We’re seeing other countries follow the example of the GDPR, working to establish privacy regulations informed by European law and its interpretations. I’ve found it fascinating to see how different cultural norms and expectations with respect to privacy have impacted national and state privacy laws, and it will be interesting to see how they continue to evolve. For example, where Europe recognizes privacy as a fundamental human right and approaches default data practices from that perspective, U.S. companies often rely on a system of notice and choice, requiring users to opt out of certain practices as the default. These differing perspectives are often reflected in how companies collect, use, and share consumer data, and have to be embraced and adapted to accommodate a globally accessible and targeted internet. 

In the United States, we’ve seen California enact regulations embracing approaches similar to the GDPR from a perspective that reflects U.S. cultural norms. I expect California to serve as a driver of additional privacy legislation and the evolution of default approaches in the United States, but I also expect to see changes coming from the ad tech providers themselves as well as the companies leveraging their services. Companies with the power to determine what can be done within their respective environments, for example Google and Apple in the mobile platform context, often drive a significant portion of the policy standards and discussions today. When a company controls a platform and the technologies used on it to interact with consumers, a seemingly minor change to that platform can cause a ripple effect felt across the ecosystem.

Ultimately, I think the push-pull between advertisers and the platforms on which they reach consumers will continue. The brands and advertisers themselves may not always be technical experts, but these organizations excel at finding creative ways to reach their goals. This is where we end up in a “whack-a-mole” situation – platforms’ goals may not align with those of the advertisers on their platforms, creating a constant balancing act. FPF’s role and perspective as data optimists allows us to bring together a variety of stakeholders and experts to help achieve the various goals, using data in new and innovative ways, while always respecting the users at the core of the conversation.

 

Congrats to National Student Clearinghouse

National Student Clearinghouse’s StudentTracker for High Schools Earns iKeepSafe FERPA Badge

July 22, 2015: iKeepSafe.org, a leading digital safety and privacy nonprofit, announced today that it has awarded its first privacy protection badge to StudentTracker℠ for High Schools from the National Student Clearinghouse, the largest provider of electronic student record exchanges in the U.S. Its selection as the first recipient of the new badge reflects the ongoing efforts of the Clearinghouse, which performs more than one billion secure electronic student data transactions each year, to protect student data privacy.

A nonprofit organization founded by the higher education community in 1993, the Clearinghouse provides educational reporting, verification, and research services to more than 3,600 colleges and universities and more than 9,000 high schools. Its services are also used by school districts and state education offices nationwide.

Earlier this year, iKeepSafe launched the first independent assessment program for the Family Educational Rights and Privacy Act (FERPA) to help educators and parents identify edtech services and tools that protect student data privacy.

“The National Student Clearinghouse is as committed to K12 learners as we are to those pursuing postsecondary education, and that also means we’re committed to protecting their data and educational records,” said Ricardo Torres, President and CEO of the Clearinghouse. “So many aspects of education are moving into the digital realm, and we’re focused on providing students with the privacy and protection they deserve in a rapidly changing digital environment.”

The Clearinghouse became the first organization to receive the iKeepSafe FERPA badge by completing a rigorous assessment of its StudentTracker℠ for High Schools product, privacy policy, and practices. “As the first company to earn the iKeepSafe FERPA badge, the National Student Clearinghouse has demonstrated its dedication to K12 students and their families, and to the privacy and security of their data,” said iKeepSafe CEO Marsali Hancock.

Products participating in the iKeepSafe FERPA assessment must undergo annual re-evaluation to continue displaying the iKeepSafe FERPA badge. For the evaluation, an independent privacy expert reviewed the StudentTracker℠ for High Schools product, its privacy policy and practices, as well as its data security practices.

For more information, please visit http://ikeepsafe.org/educational-issues/clearinghouses/



India: Proposed Data Regulation Overhaul Includes New Draft Rules for Processing Non-Personal Data

Author: Sameer Avasarala

——-

Disclaimer

This guest post is by Sameer Avasarala, a data protection and technology lawyer in Bengaluru. The opinions expressed are exclusively those of the author and do not express the views of Cyril Amarchand Mangaldas or any other firm or organization with which the author is associated. He can be contacted at [email protected]

Data protection and informational privacy have been gaining mainstream momentum in India, with significant movement around the Aadhaar project, the forthcoming comprehensive regime for the protection of personal data, and evolving data market trends. Legislators are now also considering regulating the processing of non-personal data, as shown in a new Report released by a Committee of experts put together for this purpose by the Ministry of Electronics and Information Technology. This contribution will set out the general background of data-related regulatory efforts in India (1), and then it will look closely at the proposed rules for processing non-personal data: (2) its definition and classification, (3) the data localization requirement for sensitive and critical non-personal data, (4) guidance on anonymization, and (5) proposed data sharing obligations for organizations.

1) Setting the scene: a fundamental right to privacy and a growing data market

The recognition of a fundamental right to privacy by the Supreme Court[1] (Puttaswamy), the devising of the triple test[2] as a basis to evaluate laws that may restrict the right to privacy, and the test’s application to the Aadhaar project[3] have been instrumental in triggering mainstream discourse around privacy in India.

At the same time, the data market in India is growing rapidly, with some studies estimating it will be a USD 16 billion industry by 2025, at a staggering 26% compound annual growth rate. The government recognizes the need for a data governance framework to act as a catalyst for the growth of the data economy in India.

Based on the normative foundation in Puttaswamy, the Ministry of Electronics and Information Technology (‘MEITY’) constituted a Committee of Experts for a data protection framework, whose report and the resulting draft bill led to the introduction of the Personal Data Protection Bill in 2019 (‘PDP Bill’) in the Parliament. The PDP Bill is currently being reviewed by a joint parliamentary committee, which is expected to present its report to the Parliament in the upcoming monsoon session (a session that may itself be rescheduled owing to the COVID-19 pandemic).

Separately, the MEITY also constituted a committee of experts to deliberate on a data governance framework for India (‘Committee’) with a view to study various issues relating to non-personal data and make specific suggestions on its regulation. The Committee released its report on July 12, 2020 (‘Report’) and makes substantive recommendations on the scope, classification, ownership and other issues related to non-personal data. It also makes a clarion call for a comprehensive non-personal data regulation in India, to complement the future law dedicated to personal data. In addition, the Committee recommends the establishment of an overarching, cross-sectoral Non-Personal Data Authority (‘NPDA’).

2) Proposed definition and classification of non-personal data

The Report identifies existing issues such as entry barriers for startups and new businesses owing to the first-mover advantage of market leaders and data monopolies. Business, innovation, and research are identified as cornerstones for furthering an inclusive framework for India’s data economy. The Report is also in line with the Draft National e-Commerce Policy in identifying data of Indian residents as an important ‘national resource’.

‘Non-personal data’ is defined in the Report as any data that is not personal data[4], or data without any personally identifiable information. This includes personal data that has been anonymized[5] and aggregated data in which individual-specific events are no longer identifiable, apart from data that was never personally identifiable. The Report classifies non-personal data into:

The Report recognizes natural persons, entities and communities to whom non-personal data (prior to anonymization or aggregation) relates as ‘data principals’ and entities which undertake collection, storage and processing of non-personal data as ‘data custodians’. It also enables communities or groups of data principals to exercise their rights through ‘data trustees’.

3) Data localization requirements for sensitive and critical non-personal data

The Report classifies the individuals to whom the data related before anonymization as the ‘owners’ of private non-personal data, and it recommends obtaining the data principal’s consent (at the time of collection) for anonymization and subsequent use.

Private non-personal data is further sub-classified along a sensitivity spectrum, taking into account considerations of national security, collective harm, invasion of collective privacy, business-sensitive information, and anonymized data. Private non-personal data is thus categorized into ‘sensitive non-personal data’ and ‘critical non-personal data’. Sensitive personal data[6] and critical personal data[7] which have been anonymized will be considered ‘sensitive non-personal data’ and ‘critical non-personal data’ respectively. The Report recommends localization of sensitive non-personal data and critical non-personal data, in line with the localization requirements[8] applicable to sensitive personal data and critical personal data under the PDP Bill.

4) Guidance on anonymization

Though an offshoot of the regulation of non-personal data, the Report provides new insight into the regulatory perspective on anonymization of personal data in India. From the lack of an anonymization standard under the current information technology law[9] to an indicative list of de-identifiers for ‘totally anonymized data’ applicable to health records, the regulatory viewpoint on anonymization has been vastly inconsistent. Recently, a protocol released by the MEITY in relation to Aarogya Setu, a contact tracing mobile application, indicated a high anonymization standard based on the ‘means likely to be used to identify’ individuals, generally similar to that of the General Data Protection Regulation.

Against this background, the Report recognizes the residual risk of re-identification associated with anonymized information and considers anonymized sensitive personal data and critical personal data to be sensitive NPD and critical NPD respectively. While the Report suggests techniques and tools for anonymization as part of an anonymization primer, the introduction of data localization requirements and the classification of non-personal data in a manner similar to personal data may, in practice, deter the use of anonymized information.
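To make the re-identification risk concrete, one common baseline standard (purely illustrative here, and not drawn from the Report or any Indian regulatory text) is k-anonymity: a dataset is k-anonymous if every combination of quasi-identifiers, attributes like an age band or postal-code prefix that could be cross-referenced with other datasets, appears at least k times. The field names and threshold below are assumptions.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if every combination of quasi-identifier values
    appears at least k times in the dataset (k-anonymity)."""
    combos = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in combos.values())

# Hypothetical anonymized health records: age bands and PIN-code prefixes
# remain as quasi-identifiers even after direct identifiers are removed.
records = [
    {"age_band": "30-39", "pin_prefix": "560", "diagnosis": "A"},
    {"age_band": "30-39", "pin_prefix": "560", "diagnosis": "B"},
    {"age_band": "40-49", "pin_prefix": "110", "diagnosis": "A"},
]

# The third record's combination is unique, so anyone who knows that
# person's age band and locality could re-identify them.
print(is_k_anonymous(records, ["age_band", "pin_prefix"], k=2))  # False
```

A dataset failing such a check illustrates exactly the residual risk the Report flags when it treats anonymized sensitive personal data as sensitive NPD.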

5) Data sharing and registration obligations

The Report recognizes ‘data businesses’ as a horizontal category of businesses involved in data collection and processing. Based on specific threshold requirements, the Report proposes a compliance regime to govern such data businesses, including registration and mandatory disclosure of specific information to the NPDA. Interestingly, a similar requirement for ‘data fiduciaries’ is included in the PDP Bill[10]: they could be directed to provide the proposed Data Protection Authority with anonymized personal data or other non-personal data to enable better targeting of service delivery or the formulation of evidence-based policies by the Government.

The Report also proposes that data custodians may be required to share non-personal metadata about users and communities, to be stored digitally in metadata directories in India and made available on an open-access basis to encourage the development of novel products and services.

The Report contemplates three broad purposes for data sharing:

  a) Non-personal data shared for sovereign purposes may be used by the Government, regulators, and law enforcement authorities, inter alia, for cyber security, crime and investigation, public health, and sectoral developments.
  b) Non-personal data shared for core public interest purposes may be used for general and community use, research and innovation, delivery of public services, policy development, etc.
  c) Non-personal data shared for economic purposes may be used by business entities for research, innovation, and doing business. It may also be leveraged as training data for AI/ML systems.

A ‘checks-and-balances’ system is proposed for ensuring compliance with data sharing and other requirements, based on measures such as expert probing for vulnerabilities. The Report also recommends the establishment of data spaces, data trusts, and cloud innovation labs and research centers, which may act as physical environments to test and implement digital solutions and promote intensive data-based research. It also includes guiding principles for a technology architecture to digitally implement rules for data sharing, ranging from mechanisms for accessing data through data trusts and standardized data exchange processes to techniques for preventing re-identification of anonymized information and distributed storage for data security.

The Report recommends a three-tiered system architecture including legal safeguards, technology and compliance to enable data sharing, in addition to a policy switch which enables a single digital clearing house for regulatory management of non-personal data.

Finally, the Report proposes classification of high value or special public interest data sets, for instance, geospatial, telecommunications and health data. However, it does not specifically indicate any implications of such classification.

Compliance with the requirements for processing non-personal data would be ensured by the newly created NPDA. The Report recognizes the need to harmonize guidance issued by the NPDA with sectoral regulations. The NPDA is sketched out to have an enabling role (to ensure a level playing field) in addition to an enforcement role (to address market failures).

6) Conclusion: More to come

The Committee is currently inviting public comments[11] and is likely to hold public consultations on the policy options proposed. While there is no clear timeline around framing and enacting a data governance framework for non-personal data, it is likely that the PDP Bill would be enacted by the Parliament prior to it. The PDP Bill may also be relevant in setting context for the forthcoming non-personal data framework, given the ability of the Government to solicit non-personal and anonymized personal data.

While the Report is helpful in setting context for the forthcoming regulations for non-personal data and in proposing a data governance regime, the Government is likely to evaluate its content, hold wider consultations and consider other policy aspects prior to formulating a comprehensive data framework governing non-personal data in India.

[1] Justice K. S. Puttaswamy v. Union of India, (2017) 10 SCC 1

[2] Modern Dental College & Research Centre & Ors v. State of Madhya Pradesh & Ors, AIR 2016 SC 2601

[3] Justice K. S. Puttaswamy v. Union of India, (2019) 1 SCC 1

[4] Rule 2(1)(i), Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011; Section 3(28), Personal Data Protection Bill, 2019

[5] Section 3(3), Personal Data Protection Bill, 2019

[6] Section 3(36), Personal Data Protection Bill, 2019

[7] Section 33, Personal Data Protection Bill, 2019

[8] Section 34, Personal Data Protection Bill, 2019

[9] Rule 2(1)(i), The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (SPDI Rules)

[10] Section 91, Personal Data Protection Bill, 2019

[11] MyGov ‘Share your Inputs on the Draft Non-Personal Data Governance Framework’, available at https://www.mygov.in/task/share-your-inputs-draft-non-personal-data-governance-framework/

Student Privacy, Special Education, and Online Learning: Navigating COVID-19 and Beyond

COVID-19 continues to disrupt education and has forced many schools to pivot to virtual, or online, learning for the fall semester. And virtual learning poses unique student privacy challenges, particularly for students with disabilities, such as:

Can a student’s family members and parents be present during a live virtual class?

Can a class be recorded for a student to view later?

Can a student receive one-on-one services or teletherapy via video conferencing?

How do you know if a video conferencing platform is safe to use?

To help educators navigate some of these common challenges and concerns, the Future of Privacy Forum (FPF) and the National Center for Learning Disabilities (NCLD) partnered to develop a new resource, Student Privacy and Special Education: An Educator’s Guide During and After COVID-19. The Educator’s Guide covers how federal laws, including the Family Educational Rights and Privacy Act (FERPA), the Individuals with Disabilities Education Act (IDEA), the Children’s Online Privacy Protection Act (COPPA), and the Health Insurance Portability and Accountability Act (HIPAA), apply to virtual learning.

To help address a common source of frustration and confusion for many educators, students, and their families, the Educator’s Guide outlines best practices for educators around the use of video classrooms, including:

As part of their reopening plans, many schools plan on collecting new levels of sensitive health information from staff, students, and their families to assess and mitigate health risks. This information will be collected in a variety of ways, including self-reported surveys or screenings such as temperature checks. Students with disabilities or special health care considerations who may be more vulnerable to COVID-19 may be at risk of discrimination based on their health or disability status. For more insight on the student privacy implications of this data collection, read FPF’s issue brief on this topic here.

Educators seeking further support may be interested in FPF’s new series of “Student Privacy and Pandemics” professional development trainings for educators, which includes a module focused on video classrooms. FPF is also developing an ongoing series of issue briefs on the student privacy implications of various reopening strategies schools are considering, including the uses of temperature checks and thermal scans, as well as wearable technologies, to help identify potential COVID-19 cases.

Future of Privacy Forum, National Center for Learning Disabilities Release New Student Privacy and Virtual Learning Guide

The Future of Privacy Forum (FPF) and the National Center for Learning Disabilities (NCLD) today released a new resource designed to help educators navigate the unique student privacy challenges raised by COVID-19 and the shift to virtual learning, particularly for students with disabilities. Student Privacy and Special Education: An Educator’s Guide During and After COVID-19 is available for download and use at this link.

“As educators teach virtually even more, there will be many barriers to instructing every student, but concerns about data privacy shouldn’t be one,” said Lindsay Jones, President and CEO of the National Center for Learning Disabilities. “It’s important for educators to have clear information about how privacy laws impact the delivery of virtual instruction, because now more than ever we must use ed tech in a way that helps us effectively teach all students in a virtual world.”

As part of their reopening plans, many schools are also collecting new levels of sensitive health information from staff, students, and their families, through self-reported surveys or screenings such as temperature checks, to assess and mitigate health risks. Students with disabilities or special health care considerations who may be more vulnerable to COVID-19 may be at risk of discrimination based on their health or disability status.

“Reopening plans must balance protecting health and protecting student privacy and educational rights,” said Amelia Vance, FPF’s Director of Youth and Education Privacy. “It’s a difficult – but incredibly important – balance. Schools and districts should have a clear plan in place for how they will collect, use, and store health data to ensure it is not ultimately used to limit educational access or opportunities for vulnerable students.”

Student Privacy and Special Education: An Educator’s Guide During and After COVID-19 covers how key privacy laws like the Family Educational Rights and Privacy Act (FERPA), the Individuals with Disabilities Education Act (IDEA), the Children’s Online Privacy Protection Act (COPPA), and the Health Insurance Portability and Accountability Act (HIPAA) apply to distance learning, and answers common questions about virtual learning and special education services, including:

Educators seeking further support can visit FPF’s student privacy-focused website, Student Privacy Compass, which houses additional resources for educators such as The Educator’s Guide to Student Privacy and a new “Privacy and Pandemics” resource collection including a series of professional development trainings for educators.

NCLD’s website has additional Resources & Tools on COVID-19 for educators and parents, including an Educator’s Guide to Virtual Learning, a brief on Relevant Laws & Best Practices for Online Learning and a guide to Inclusive Technology During the COVID-19 Crisis.

To learn more about the Future of Privacy Forum, visit www.fpf.org. To learn more about the National Center for Learning Disabilities, visit www.ncld.org.

 

About FPF

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts and includes an advisory board comprised of leading figures from industry, academia, law, and advocacy groups. For more information, visit www.fpf.org.

About NCLD

The National Center for Learning Disabilities (NCLD) is a Washington, DC-based national policy, advocacy, and research organization that works to improve the lives of the 1 in 5 children and adults nationwide with learning and attention issues — by empowering parents and young adults, transforming schools, and advocating for equal rights and opportunities. www.ncld.org.