End of Safe Harbor? Understanding the CJEU’s Decision and Its Implications

Legal background

In Europe, the processing of personal data is governed by Directive 95/46/EC (the “Directive”), which sets forth rules under which data may be lawfully processed.1 The transfer of personal data to third countries is restricted to countries that ensure an adequate level of protection.2 While the Directive itself does not define the concept of an adequate level of protection, Article 25.2 states that “the adequacy of the level of protection afforded by a third country shall be assessed in the light of all the circumstances surrounding a data transfer operation or set of data transfer operations” and lists non-exhaustive criteria to consider when making such a determination.3 Pursuant to Article 25.6 of the Directive, the Commission may find (…) that a third country ensures an adequate level of protection (…) by reason of its domestic law or of the international commitments it has entered into (…) for the protection of the private lives and basic freedoms and rights of individuals.

On this basis, the US and EU negotiated and established a framework ensuring an adequate level of protection for data transfers from the EU to organizations established in the US (the “Safe Harbor”). In 2000, the Commission adopted a decision (Decision 2000/520) declaring that the Safe Harbor framework ensured an adequate level of protection, thereby allowing the lawful transfer of data from the EU to the US.4 However, the status of the US-EU Safe Harbor was called into question in November 2013, after Edward Snowden’s revelations regarding US mass surveillance programs.5

Background of the case

In 2012, Maximillian Schrems, an Austrian law student who had been a Facebook user since 2008, asked the company to provide him with a copy of all the data it held about him. He later filed a complaint with the Irish Data Protection Commissioner (“DPC”) asking the DPC to exercise its powers by prohibiting Facebook Ireland from transferring his personal data to the US.6 Mr. Schrems contended in his complaint that the law and practice in force in the US did not ensure adequate protection of the personal data held in its territory against the surveillance activities engaged in by public authorities. The DPC rejected his complaint, considering that there was no actual evidence that his data had been accessed by the NSA, and added that the allegations raised by Mr. Schrems could not, in any case, be put forward, as the Commission had already found in its Decision 2000/520 that the United States ensured an adequate level of protection. Mr. Schrems then went before the Irish High Court, which stayed the proceedings and referred questions to the Court of Justice of the European Union (“CJEU”).

Questions before the CJEU

The CJEU addressed two questions in its judgment:

  1. Is the Data Protection Commissioner bound by the Commission’s decision finding that the US ensures an adequate level of protection?
  2. Is Decision 2000/520 valid?

Ruling

The Court ruled that:

  1. National DPAs are not bound by a decision adopted by the Commission on the basis of Article 25.6 of the Directive. They therefore must investigate a claim lodged by an individual concerning the protection of his rights with regard to personal data relating to him which has been transferred from the EU to a third country, when that person contends that the third country does not actually provide an adequate level of protection.

A decision adopted pursuant to Article 25.6 of the Directive (such as the Commission Decision 2000/520/EC of 26 July 2000), by which the European Commission finds that a third country ensures an adequate level of protection, “does not prevent a supervisory authority [data protection authority] of a Member State from examining the claim of a person concerning the protection of his rights and freedoms in regard to the processing of personal data relating to him which has been transferred from a Member State to that third country when that person contends that the law and practices in force in the third country do not ensure an adequate level of protection.”

  2. Decision 2000/520 is invalid.

 

CJEU judgment

Is a national DPA bound by a Commission decision?

In its judgment, the Court highlights the importance of interpreting the Directive “in the light of the fundamental rights guaranteed by the Charter”, notably the right to respect for private life. Consequently, the independence of national Data Protection Authorities (“DPAs”) must be regarded as an essential component of the protection of individuals with regard to the processing of personal data. The Court recognized that DPAs’ powers are limited to processing of personal data carried out in their own territory. However, it considers that “the operation consisting in having personal data transferred from a Member State to a third country constitutes, in itself, processing of personal data carried out in a Member State.” Each DPA is therefore vested with the power to check whether a transfer of personal data from its own Member State to a third country complies with the requirements laid down by the Directive.7

Commission decisions adopted on the basis of Article 25.6 of the Directive are binding on all Member States and all their organs.8 The CJEU alone has jurisdiction to declare such a decision invalid,9 in order to guarantee legal certainty by ensuring that EU law is applied uniformly. However, a Commission decision cannot deprive individuals of their right to lodge a claim with their national DPA.10 Likewise, such a decision “cannot eliminate or reduce the powers expressly accorded to the national supervisory authorities” [DPAs] by both the Charter and the Directive.11

Is the Commission Decision 2000/520 valid?

According to the Court, although the Directive does not contain any definition of an adequate level of protection, the phrase must be understood as requiring the third country to ensure, in fact, a level of protection of fundamental rights and freedoms that is essentially equivalent to that guaranteed within the EU.12 It is incumbent upon the Commission to check periodically whether the finding relating to the adequacy of the level of protection ensured by the third country in question is still factually and legally justified. Such a check is required, in any event, when evidence gives rise to a doubt in that regard13 (referring to the Snowden revelations). The Court judged that it was not even necessary to examine the content of the Safe Harbor principles, as Article 1 of the Commission Decision does not comply with Article 25.6 of the Directive. Derogations and limitations in relation to the protection of personal data may apply “only in so far as is strictly necessary,”14 and the Court considered that the bulk collection of data carried out by the NSA “without any differentiation, limitation or exception being made in the light of the objective pursued and without an objective criterion being laid down by which to determine the limits of the access of the public authorities to the data, and of its subsequent use, for purposes which are specific, strictly restricted and capable of justifying the interference which both access to that data and its use entail”15 violated the principles of the Directive.

Secondly, the Safe Harbor deprived individuals of legal remedies enabling them to access personal data relating to them, or to obtain the rectification or erasure of such data, which does not respect the essence of the fundamental right to effective judicial protection, as enshrined in the Charter.16

Thirdly, by limiting DPAs’ power to investigate, the Commission, in its Decision 2000/520, exceeded the power conferred upon it by Article 25.6 of the Directive.17

Key points to remember

Consequences of the CJEU decision

Some believe the CJEU reached this decision not because of companies’ behavior but because the US government was compelling companies to provide information under Section 702. This historic decision should give Congress momentum to amend Section 702 and undertake reforms, which could include a greater ability to challenge surveillance, greater transparency, allowing surveillance only for limited purposes, prohibiting upstream acquisition, and applying better limits on the dissemination and retention of data.

Data transfers will not stop overnight, and considering DPAs’ workload, there is likely to be a grace period that companies should use to find a way forward. The bigger enforcement risk probably comes from EU citizens, who have the right to file complaints with their local DPA.

What are some other ways to transfer data from the EU to the US?

Alternative mechanisms to consider include Binding Corporate Rules,18 model contract clauses19 and, in limited cases, derogations such as the data subject’s unambiguous consent to the transfer.20

Are the other Commission decisions determining an adequate level of protection for other countries still valid?

The decision does not apply to the other adequacy decisions, which remain binding on all Member States until they are actually declared invalid. As explained above, such decisions can only be declared invalid by the CJEU, so overturning any of them would require going through another lengthy process of complaints to a DPA, national courts and the CJEU. For now, the other adequacy decisions and other transfer mechanisms still stand.

What should companies do?

The best step for companies at this point is likely to wait for the guidance that should be issued by the European DPAs in the very near future.

Indeed, as announced by the Article 29 Working Party, data protection authorities from Member States are to meet soon in order to issue guidance and provide a coordinated response to the Court’s decision21.

Information Commissioner Christopher Graham advised in London on 8 October: “Don’t panic.” The ICO will not be “knee-jerking into sudden enforcement of a new arrangement. We are coordinating our thinking very much with the other data protection authorities across the EU.”22 This view was confirmed by the Dutch and French DPAs in their respective declarations23 addressing the CJEU decision. Each of them referred to the emergency meeting announced by the Article 29 Working Party.24

Some companies are already taking steps to implement alternative mechanisms such as model contracts.25 Since model contracts have no mechanism to override national surveillance or law enforcement access to data, there is no logical reason for them to be an acceptable legal alternative to the Safe Harbor. However, they technically are, and they will remain valid until Mr. Schrems or another critic brings a similar legal action before the courts. Perhaps this crisis will lead to a relatively quick political solution that provides an alternative to the Safe Harbor. If not, companies can assess the available options at that time.

 -Bénédicte Dambrine, Legal Fellow

1 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:31995L0046&qid=1444227336670&from=EN)

2 Article 25.1 of the Directive

3 Articles 25.2 and 25.4 of the Directive

4 http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32000D0520:EN:HTML

5 Communication from the Commission to the European Parliament and the Council entitled ‘Rebuilding Trust in EU-US Data Flows’ (COM(2013) 846 final, 27 November 2013) and Communication from the Commission to the European Parliament and the Council on the Functioning of the Safe Harbour from the Perspective of EU Citizens and Companies Established in the EU (COM(2013) 847 final, 27 November 2013).

6 CJEU case C-362/14 of 6 October 2015 (http://curia.europa.eu/juris/document/document.jsf?text=&docid=169195&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=293038)

7 Paragraph 47 of the CJEU judgment

8 Paragraph 51 of the CJEU judgment

9 Paragraph 61 of the CJEU judgment

10 Paragraph 53 of the CJEU judgment

11 Paragraph 53 of the CJEU judgment

12 Paragraph 73 of the CJEU judgment

13 Paragraph 76 of the CJEU judgment

14 Paragraph 92 of the CJEU judgment

15 Paragraph 93 of the CJEU judgment

16 Paragraph 95 of the CJEU judgment

17 Paragraph 103 of the CJEU judgment

18 http://ec.europa.eu/justice/data-protection/international-transfers/binding-corporate-rules/index_en.htm

19 http://ec.europa.eu/justice/data-protection/international-transfers/transfer/index_en.htm

20 Article 26.1(a) of the Directive

21 http://ec.europa.eu/justice/data-protection/article-29/press-material/press-release/art29_press_material/2015/20151006_wp29_press_release_on_safe_harbor.pdf

22 https://iapp.org/news/a/icos-graham-dont-panic

23 http://www.cnil.fr/linstitution/actualite/article/article/invalidation-du-safe-harbor-par-la-cour-de-justice-de-lunion-europeenne-une-decision-cl/

24 http://ec.europa.eu/justice/data-protection/article-29/press-material/press-release/art29_press_material/2015/20151006_wp29_press_release_on_safe_harbor.pdf

25 https://www.linkedin.com/pulse/safe-harbor-dead-official-statements-marius

FPF Founder Christopher Wolf wins Vanguard Award

The entire FPF team is thrilled to congratulate Chris Wolf, FPF Founder and Co-chair and cherished mentor, on winning the IAPP Vanguard Award. The following remarks were delivered today at IAPP by Brenda Leong, FPF Senior Counsel and Director of Operations.

Welcome everyone and thank you for joining us this morning. I am honored to present the IAPP Privacy Vanguard Award. This annual award recognizes outstanding leadership, knowledge, and creativity in the field of privacy and data protection.

The recipient of the IAPP Privacy Vanguard Award is someone who has positively impacted the privacy industry through personal and communal achievements throughout their career.

This year’s award winner is known as the “Dean of the Industry” or, as The Washingtonian Magazine deemed him, the “Tech Titan”. He is not only a well-respected lawyer in privacy and data protection, and an accomplished and long-time respected leader in the field, but also a treasured friend, colleague, mentor or role model for those who have had the privilege to know and work with him.

He helped break the path for privacy law in the early days of its modern application, when the Internet and related technologies made clear that existing laws were no longer sufficient. He has had a hand in advising and shaping thinking on many leading-edge issues, including Internet free speech, Internet hate speech and the parameters of government access to stored information. In doing so, he led the development of a top-ranked privacy law practice at Hogan Lovells US, where he now helps lead a team of 27 full-time privacy lawyers.

His influence on the global privacy agenda led to the co-founding of the Future of Privacy Forum, a DC-based think tank dedicated to advancing responsible data use in commercial and consumer privacy. Working with great vision as part of the FPF team, he helped build it into a thriving community of business, academic and advocacy thought leaders that has been influential in shaping public policy on many privacy issues.

As a “pioneer in privacy law”, he originated and edited the first privacy law treatise published by the prestigious Practising Law Institute and has written and lectured widely on the subject of privacy law. He is the co-editor of the PLI book “A Practical Guide to the Red Flag Rules”, covering the identity-theft-prevention regulations issued by the FTC and financial regulators. He has testified before Congress and the Privacy and Civil Liberties Oversight Board, participated in the 2014 White House Big Data Workshops, and served as a panelist at numerous FTC privacy-related workshops.

While playing this industry-leading role in the development of privacy and tech policy, he also managed to dedicate a major portion of his time and focus to such worthy charitable and philanthropic causes as the Anti-Defamation League – where he serves as the National Civil Rights Chair – and Food and Friends, a Washington-based nonprofit that provides home-delivered meals and nutrition counseling for people with life-challenging illnesses, and several other such organizations.

Finally, he has played all these roles while prioritizing time for family and friends – he manages to be there as a friend and colleague for an incredible swath of people – with grace and warmth – with diplomacy when required – and always with great class.

Jules Polonetsky – his co-founder at FPF – was very disappointed he couldn’t be here today, but he told me that there is a classic Yiddish word that sums up today’s winner – he is a “mensch”…and for those like me whose Yiddish might be a bit rusty – this simply means: a “fine human being”.

It gives me great pleasure to announce, and welcome to the stage, the winner of this year’s Privacy Vanguard Award, Christopher Wolf.

FPF Releases Survey: “Beyond the Fear Factor”

Few topics in education have generated as much discussion as the potential for data and technology to transform teaching and learning. While the public discourse has been dominated by advocates and critics alike, we’ve learned little about how most parents of school-aged children view the risks and opportunities of using data and technology in the classroom. In FPF’s report “Beyond the Fear Factor: Parental Support for Technology and Data Use in Schools,” we investigate prevailing attitudes towards the emerging use of technology and data in education and children’s deepening online presence.  Analysis of these data allowed FPF to create a series of recommendations to both ed tech developers and schools.

You can read the full report HERE.

Evan Selinger & Brenda Leong on Student Data Privacy

On September 17, Passcode published “The Case for Safeguarding Students’ Digital Privacy”, an opinion piece discussing the results of a recent FPF-sponsored survey on parents’ attitudes towards the collection and use of student data.  Coauthored by FPF’s own Evan Selinger and Brenda Leong, the piece identifies the risks to students who maintain an active online presence without a working knowledge of sensitive data and digital privacy, and what schools ought to do to address this emerging problem.

From the article:

We personally believe elementary schools that integrate tablets, laptops, and desktops into their curricula should be assuming more responsibility for confronting the long-term implications of online identities created by students.

When students are led through the process of creating accounts and setting up passwords, schools become responsible for providing a broader explanation for why passwords matter and what happens once information is stored in databases and others can analyze it. Even if the ideals reflected in the survey became safeguarded by perfect procedures…students still would remain vulnerable to all kinds of extra-curricular data collection and scrutiny.

You can read the full article by clicking here.

Call for Papers: Beyond IRBs

 

CALL FOR PAPERS

Beyond IRBs: Designing Ethical Review Processes for Big Data Research

In the age of Big Data, innovative uses of information are continuously emerging in a wide variety of contexts. Increasingly, researchers at companies, not-for-profit organizations and academic institutions use individuals’ personal data as raw material for analysis and research. For research on data subject to the Common Rule, institutional review boards (IRBs) provide an essential ethical check on experimentation. Still, even academic researchers lack standards around the collection and use of online data sources, and data held by companies or in non-federally funded organizations is not subject to such procedures. Research standards for data can vary widely as a result. Companies and non-profits have become subject to public criticism and may elect to keep research results confidential to avoid public scrutiny or potential legal liability.

To prevent unethical data research or experimentation, experts have proposed a range of solutions, including the creation of “consumer subject review boards,”[1] formal privacy review boards,[2] private IRBs,[3] and other ethical processes implemented by individual companies.[4] Organizations and researchers are increasingly encouraged to pursue internal or external review mechanisms to vet, approve and monitor data experimentation and research. However, many questions remain concerning the desirable structure of such review bodies as well as the content of ethical frameworks governing data use. In addition, considerable debate lingers around the proper role of consent in data research and analysis, particularly in an online context; and it is unclear how to apply basic principles of fairness to selective populations that are subject to research.

To address these challenges, the Future of Privacy Forum (FPF) is hosting an academic workshop supported by the National Science Foundation, which will discuss ethical, legal, and technical guidance for organizations conducting research on personal information. Authors are invited to submit papers for presentation at a full-day program to take place on December 10, 2015. Successful submissions may address the following issues:

Papers for presentation will be selected by an academic advisory board and published in the online edition of the Washington and Lee Law Review. Four papers will be selected to serve as “firestarters” for the December workshop, with each author receiving a $1,000 stipend.

Submissions must be 2,500 to 3,500 words, with minimal footnotes and in a readable style accessible to a wide audience.

Submissions must be made no later than October 25, 2015, at 11:59 PM ET, to [email protected]. Publication decisions and workshop invitations will be sent in November.


 

[1] Ryan Calo, Consumer Subject Review Boards: A Thought Experiment, 66 Stan. L. Rev. Online 97 (2013).

[2] White House Consumer Privacy Bill of Rights Discussion Draft, Section 103(c) (2015).

[3] Jules Polonetsky, Omer Tene, & Joseph Jerome, Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings, 13 Colo. Tech. L. J. 333 (2015).

[4] Mike Schroepfer, CTO, Research at Facebook (Oct. 2, 2014), http://newsroom.fb.com/news/2014/10/research-at-facebook/.

2015 National Student Privacy Symposium

FUTURE OF PRIVACY FORUM, DATA QUALITY CAMPAIGN TO HOST 2015 NATIONAL STUDENT PRIVACY SYMPOSIUM

IN WASHINGTON, D.C. ON SEPTEMBER 21, 2015

 Leading Education & Privacy Experts to Discuss, Explore the Benefits and Risks of Student Data and Technology in Schools

WASHINGTON, D.C. – August 18, 2015 – As students, parents, teachers and school administrators gear up for the start of another academic year, the Future of Privacy Forum and the Data Quality Campaign today announced plans to host a forum to explore the impact of student data in K-12 education.  National leaders will convene to discuss a wide range of data and privacy issues, including the challenges and opportunities of using data for education research, civil rights, personalized learning, parental engagement and more.

The event will also feature the unveiling of a new, major national survey of parent attitudes about student data and privacy issues.

The 2015 National Student Privacy Symposium will take place from 8 a.m. – 6:30 p.m. on Monday, September 21, 2015 at The Mayflower Renaissance Hotel in Washington, D.C. The event is free.

The symposium will engage research experts, education leaders, privacy and security professionals, advocacy groups, parents and government leaders in an open and honest debate about how to best serve our youth.

In addition to core student data privacy issues, education, privacy, security, and civil rights leaders will also discuss the benefits and risks of data use for underserved student populations and its impact on inequality and discrimination.

The Keynote speaker is Kati Haycock, President of the Education Trust, a leading national non-profit education advocacy organization. Haycock is a civil rights champion and one of the nation’s top advocates for high academic achievement for all students, especially low-income students and students of color.

Support and funding for the event has been provided by: the Bill & Melinda Gates Foundation; the Digital Trust Foundation; the National Association of State Boards of Education; the Consortium for School Networking (CoSN); the Houston Independent School District; iKeepSafe; and AASA, The School Superintendents Association.

For the most current agenda, please visit: www.studentprivacysymposium.org.

Space is limited, so attendees are asked to register by September 16.

For more information on the event, contact Kobie Pruitt, Education Policy Manager, Future of Privacy Forum, [email protected] or Jon-Michael Basile, Data Quality Campaign, [email protected].

About FPF

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board of leading figures from industry, academia, law and advocacy groups. For more information, visit fpf.org.

About DQC

The Data Quality Campaign is a national, nonprofit organization leading the effort to bring every part of the education community together to empower educators, parents, and policymakers with quality information to make decisions that ensure students achieve their best. For more information, go to www.dataqualitycampaign.org and follow us on Twitter @EdDataCampaign.

 

Media Contacts

Nicholas Graham for FPF

[email protected]

571-291-2967

 

Jon-Michael Basile for DQC

[email protected]

202-787-5718

Beyond the Common Rule: IRBs for Big Data and Beyond?

In the wake of last year’s news about the Facebook “emotional contagion” study and the subsequent public debate about the role of A/B testing and ethical concerns around the use of Big Data, FPF Senior Fellow Omer Tene participated in a December symposium on corporate consumer research hosted by Silicon Flatirons. This past month, the Colorado Technology Law Journal published a series of papers that emerged out of the symposium, including “Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings.”

“Beyond the Common Rule,” by Jules Polonetsky, Omer Tene, and Joseph Jerome, continues the Future of Privacy Forum’s effort to build on the notion of consumer subject review boards first advocated by Ryan Calo at FPF’s 2013 Big Data symposium. It explores how researchers, increasingly in corporate settings, are analyzing data and testing theories using often sensitive personal information. Many of these new uses of PII are simply natural extensions of current practices, and are either within the expectations of individuals or the bounds of the FIPPs. Yet many of these projects could involve surprising applications or uses of data that exceed user expectations, and offering notice and obtaining consent may not be feasible.

This article expands on ideas and suggestions put forward around the recent discussion draft of the White House Consumer Privacy Bill of Rights, which espouses “Privacy Review Boards” as a safety valve for non-contextual data uses. It explores how existing institutional review boards in academia and human subjects research could offer lessons for guiding principles, providing accountability and enhancing consumer trust, and offers suggestions for how companies and researchers can pursue both knowledge and data innovation responsibly and ethically.

The Future of Privacy Forum intends to continue the conversation about Big Data review boards. Joseph Jerome will lead a panel discussion on the topic at the IAPP’s fall Privacy Academy, and FPF will host an invitation-only workshop this winter with leading researchers, ethicists, and corporate policymakers to address how to build an ethical framework for Big Data research.

Click here to read “Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings.”

Student Data and De-Identification

FPF has released its newest paper, Student Data and De-Identification: Understanding De-Identification of Education Records and Related Requirements of FERPA.  Prepared in partnership with Reg Leichty of Foresight Law + Policy, this paper provides an overview of the different tools used to de-identify data to various degrees, based on the type of information involved, and the determined risk of unintended disclosure of individual identity. Proper data de-identification requires technical knowledge and expertise as well as knowledge of, and adherence to, industry best practice.

“Data de-identification represents one privacy protection strategy that should be in every student data holder’s playbook. Integrated with other robust privacy and security protections, appropriate de-identification – choosing the best de-identification technique based on a given data disclosure purpose and risk level – provides a pathway for protecting student privacy without compromising data’s value. This paper provides a high level introduction to: (1) education records de-identification techniques; and (2) explores the Family Educational Rights and Privacy Act’s (FERPA) application to de-identified education records. The paper also explores how advances in mathematical and statistical techniques, computational power, and Internet connectivity may be making de-identification of student data more challenging and thus raising potential questions about FERPA’s long-standing permissive structure for sharing non-personally identifiable information.”

De-Identification and other issues relating to student privacy will be discussed at our upcoming Student Privacy Symposium on Sept 21.  Learn more.

Security Quick Tips for Vendors

As part of our ongoing support for vendors, especially start-ups and small business providers in the ed tech market, we have recently published our “Quick Security Tips for Vendors.” This tool, a companion to our “Quick Privacy Tips for Vendors,” is designed to provide a simple baseline of security principles and practices as an ed tech business grows its products and services. Of course, this list of tips does not constitute a complete security policy, but following it will ensure that vendors have taken the best first steps toward responsible protection of student data, as these tips flag many of the common key concerns. A company that implements student data privacy and security policies and procedures in line with these two checklists will have a strong foundation moving forward.

Student Privacy Symposium

Student Data Privacy Symposium

with the support of the Bill & Melinda Gates Foundation

Date: September 21, 2015

Location: The Mayflower Renaissance Hotel, Washington, DC

Description: The 2015 Student Data Privacy Symposium will present a thoughtful consideration by leading education and privacy experts on how student data should be collected and used. A series of panels will review the overarching value of technology and data use by educational institutions as demonstrated by current research, as well as the related concerns and risks of such use. Education, privacy, security, and civil rights leaders will discuss the benefits and risks of data use for underserved populations and consider possible strategies for the future. The 2015 Symposium is designed to engage education leaders, privacy and security professionals, advocacy groups, media, foundations, parent leaders, and companies to connect and collaborate for open and honest debate about how to best serve our youth.

Register

Proposed Agenda:

(Panelists will be identified as they are confirmed)

8:00 am – Registration Opens (Coffee Service)

9:00 am – Opening Remarks

9:10 am – Welcome Keynote

9:45-10:45 am – Panel 1: Student Data and Research

Leading academic researchers have studied and analyzed student data for many years. This panel gathers leading researchers to describe the results of their studies, with particular focus on studies that examine the use of technology, improving teaching and learning outcomes, and understanding school performance.

10:45-11:00 am – Break

11:00 am-12:00 pm – Panel 2: The Potential Risks of Student Data Collection and Use

Advocates and experts examine the possible dangers and pitfalls in collecting and using student data, and discuss ways to address their concerns relating to uses of technology.

12:15-1:30 pm – Lunch

Parents’ Views on Technology and Data Use in Education: Survey Results and Discussion

1:30-1:45 pm – Break

1:45-2:45 pm – Panel 3: What Is the Future of Technology in the Classroom?

Stakeholders including teachers, school technology specialists, and administrators demonstrate the impact of personalized learning, digital backpacks, student profiling, and other tools and opportunities offered by increased technology. Panelists will also discuss the controls required for parents and students to effectively manage their data.

2:45-3:00 pm – Break

3:00-4:00 pm – Panel 4: The Role of Technology and Data Use for Student Rights

Speakers explore the appropriate use of opt-in and opt-out programs; evaluate data “ownership” by schools, students, and parents, and the role of choice in data sharing and analysis; and consider the ways data can identify inequalities, or when it may provide the potential for increased discrimination.

4:00-4:15 pm – Break

4:15-5:15 pm – Panel 5: The Path Ahead: Areas for Discussion and Solutions

With audience participation, this closing panel explores the potential impact of proposed responses: legislative action; changes to contract requirements; increased security standards; the development of industry technical standards; and seal or rating programs for schools and/or vendors. Panelists will also compare best practices from other industries around transparency and communication.

5:15-5:30 pm – Closing Comments

5:30-7:00 pm – Reception (China Room)