SXSW Edu – FPF and Colleagues Take on "Trust" Question

I am heading to SXSW Edu to join US Department of Education Chief Privacy Officer Kathleen Styles, Common Sense Media’s Bill Fitzgerald, Data Quality Campaign’s Aimee Guidera, and several other distinguished education and privacy speakers for a session on student data privacy. Sponsored by CoSN, this is not just a panel but a full three-hour session digging into the top issues, questions, and challenges of how to maximize the benefits of ed tech for our schoolchildren while protecting their data, their interests, and their future. That can only happen if we find a way to inspire and maintain trust among and between all the players. As advertised, this session will cover:

“Concerns around the privacy of student data have been rising and educators are increasingly on the defensive as parents believe too much data is collected about their children, it is not secure and too often inappropriately used for commercial gain. Trust is at the heart of the privacy debate. Last year the Student Data Principles were endorsed by 40 education associations, and over 200 companies have signed a Student Privacy Pledge. Learn how we can build trust through better educator training, improved review of education apps, clearer vendor agreements and building a “seal” validating trusted learning environments. Participate in an interactive forum on what else is needed.”
The key word there is “participate”! This session is about listening, learning, and considering – all points of view, all ideas and suggestions. If you can’t be there, reach out to one of the speakers in advance with your thoughts and input. We’ll take it all!
Here are some of my thoughts on the background of the discussion I hope that we’ll be having:
At the point of impact, there are two primary actors: the school and the vendor. Schools and vendors generally want the same thing – good outcomes for students – and the DQC-sponsored Student Data Principles (values from the educator perspective) and the FPF/SIIA-led Student Privacy Pledge (commitments by ed tech companies) reflect that priority. Ideally, they work in partnership – the vendor providing a useful product or service which then allows the school to offer strong educational opportunities. But as with any relationship, “it’s complicated.”
Which vendors should schools choose to incorporate – and how do they know whom to trust? Screening platforms, seals, pledges – all these and more are available, but from the school’s perspective there is no silver bullet, no easy way through the wickets of writing a contract or selecting an app while being completely sure you’re doing the right thing.
Even before that point, though, schools are making choices. What counts as a vendor, and which products should schools be able to incorporate – an app the teacher tells everyone to download? Only an account the school requires the student to create? What role should school IT offices play – and what about the many, many districts where there simply aren’t resources for this type of oversight? What is the burden on teachers – let’s try not to conclude that they have to be IT specialists now too!
Even when schools and vendors work together efficiently toward a productive result, their job isn’t done. They operate as the agents, but the true “stakeholders” are the parents and students for whom this process takes place. They must be made aware of the decisions being made, the controls in place for them to exercise, and their own responsibilities.
Of course, we can never skip over the legal implications. Some laws put the burden on schools; some on vendors. In either case, how do we control the risks without limiting the ability of either to function most effectively? New laws emerge out of the public debate about how much limitation on student data use is the right amount, who should have access and when, and what controls parents should have.
I’m sure you’ve noticed more questions than answers here – and I know, despite my immense respect for my co-speakers, that we will not solve them all in one session! But if trust is part of the answer – and I think we all believe it must be – then we have to start somewhere. This session is a “next step” in the ongoing process of implementing ed tech smartly, responsibly, and thoughtfully – minimizing risk and preventing harm so we can enjoy the tremendous benefits data and technology offer to our schools and our students.
Hope to see you there!

Chris Wolf at Data Privacy Day

At Thursday’s Data Privacy Day event in Washington, Passcode joined privacy and security experts to explore US consumers’ evolving attitudes about digital privacy.

“Consumers will not do business with companies that don’t respect their privacy, companies they don’t trust,” said Chris Wolf, co-chair of the Future of Privacy Forum. Mr. Wolf spoke on a panel Thursday at the National Cyber Security Alliance’s “Data Privacy Day” event, of which Passcode was a media partner.

Full article here.

FPF Welcomes New Senior Fellow – Ira Rubinstein

FPF is proud to welcome its newest Senior Fellow, Ira Rubinstein. Ira will be working with FPF staff, fellows and members on a number of cross-Atlantic privacy issues and will be collaborating with EU academics and institutions on projects focused on de-identification, ethics, big data, and other issues.

Ira Rubinstein is a Senior Fellow at the Information Law Institute (ILI) of the New York University School of Law. His research interests include Internet privacy, electronic surveillance law, big data, and voters’ privacy. Rubinstein lectures and publishes widely on issues of privacy and security and has testified before Congress on these topics on several occasions. Recent papers include a study of Voter Privacy in the Age of Big Data; a research report on Systematic Government Access to Personal Data: A Comparative Analysis, prepared for the Center for Democracy and Technology and co-authored with Ron Lee and Greg Nojeim; Big Data: The End of Privacy or a New Beginning?, published in International Data Privacy Law in 2013 and presented at the 2013 Computer Privacy and Data Protection conference in Brussels; and Privacy by Design: A Counterfactual Analysis of Google and Facebook Privacy Incidents, co-authored with Nathan Good, which won the IAPP Privacy Law Scholars Award at the 5th Annual Privacy Law Scholars Conference in 2012.

Prior to joining the ILI, Rubinstein spent 17 years in Microsoft’s Legal and Corporate Affairs department, most recently as Associate General Counsel in charge of the Regulatory Affairs and Public Policy group. Before coming to Microsoft, he was in private practice in Seattle, specializing in immigration law. From 2010 to 2016, he served on the Board of Directors of the Center for Democracy and Technology. He also serves on the Board of Advisers of the American Law Institute for the Restatement Third, Information Privacy Principles, and on the Organizing Committee of the Privacy by Design Workshops sponsored by the Computing Research Association; he served as Rapporteur for the EU-US Privacy Bridges Project, which was presented at the 2015 International Conference of Privacy and Data Protection Commissioners in Amsterdam. Rubinstein graduated from Yale Law School in 1985.

We are excited and proud to have Ira on the FPF team!

Full press release here.

Chris Wolf Moderates Panel at CES 2016

Innovating Privacy: New Frameworks for Changing Technology

Chris led the discussion with this excellent panel at this year’s CES. The full panel discussion could previously be viewed here (link expired).
Consumers are enjoying the benefits of connected devices while navigating (grappling with!) new privacy issues. Industry and regulators alike are working to understand consumer preferences while preserving creativity and flexibility to innovate with data. How can we adapt existing frameworks to respond to consumer concerns?


This year was a record-breaking CES, with over 170,000 attendees and 3,800+ exhibitors across 2.47 million net square feet of exhibit space. Around 50,000 attendees were international, representing over 150 countries.


Algorithmic transparency: Examining from within and without

As the volume of consumer data grows, an increasing number of decisions previously made by humans are now made by algorithms. Many thought leaders have called for algorithmic transparency to ensure that these decisions aren’t leading to unfair or discriminatory outcomes, but algorithmic transparency is tricky to implement. Last December, FTC Commissioner Julie Brill acknowledged the challenge in creating public-facing algorithmic transparency, calling on companies to proactively look internally to identify unfair, unethical, or discriminatory effects of their data use. Read more in the post on IAPP.

States and the District of Columbia Introduce ACLU Sponsored Legislation to Address Student Privacy

Recently, the ACLU, in partnership with the Tenth Amendment Center, created model legislation for states to “take control of their privacy in a digital age.” On January 20th, 2016, the ACLU coordinated with legislators in 16 states and the District of Columbia to roll out a variety of privacy bills simultaneously, many of which addressed the topic of educational data directly. Since the passage of California’s SOPIPA, many states have proposed legislation directly targeting ed tech vendors with new responsibilities for the handling of student data. The ACLU’s model bill, however, is a hybrid that specifies new requirements for both schools and vendors to ensure protection of and responsibility for student data privacy.

The ACLU’s model legislation, which was proposed in various forms in the participating states, includes several excellent ideas. The most important provision allows parents to specifically authorize that their child’s data be sent to educational service providers they choose for additional educational purposes. This lets parents supplement their child’s education – by allowing a tutor to access data, or by exporting data to a tool that provides an enhanced or personalized curriculum. A student could download data to an educational game, or to an app that would let them take data with them for use in college or a workplace.

At a time when learning increasingly takes place 24/7, this allows parents to help students take their education records with them for the additional educational purposes they choose. Unfortunately, many bills passed in other states only allow parents to enable the sharing of data for college applications or scholarships, making it illegal for schools or vendors to use a program that would enable other onward transfers. Parents in those states have to request access to their child’s data from a school authority, hope it is provided digitally, and then transfer it to third parties themselves.

Why are those states imposing such limits? Some of them worry that school service providers will convince parents to share data for marketing purposes. The ACLU bill addresses that concern by ensuring the required consent is explicit and specific and can only cover educational purposes. (In fact, the consent requirements are so strict that they may not be feasible for most services to achieve; they will need some wordsmithing to be both protective and practical.)

Another key provision in the ACLU bill is the training requirement. Training for teachers to ensure that they have an adequate understanding of student privacy and how to comply with the law is critical. Restrictions placed on school data would be useless if school staff do not fully understand the rules or are not trained to follow them. The CoSN Trusted Learning Environment program should provide a useful training resource for many schools, as should the extensive materials at FERPA|Sherpa.

Fortunately, this model legislation avoids one of the unintended pitfalls of some recent state bills which have such strict language that they can inadvertently restrict schools from providing needed information to school photographers, yearbook publishers and spiritwear providers.

Another proposal in the model bill allows schools to bar parental access to confidential student data if allowing access would risk the safety of the student. For example, if a student confidentially disclosed to a guidance counselor that they were gay or lesbian but feared for their safety should a parent request access to this data, the school could respect their need for confidentiality, even if the data was included in a student record. However, this provision can only be effective if implemented at the federal level, since FERPA currently does not contain such an exception and otherwise requires that parents have access to the entire student educational record. Such a “zone of confidentiality” should be an integral part of the student privacy conversation moving forward.

However, there are many other provisions in the bill that create significant problems for schools.  These would need revisions to be effective.

The bill defines personally identifiable information (PII) as “Any aggregate or de-identified student data that is capable of being de-aggregated or reconstructed to the point that individual students can be identified.” In some places the bill requires that data be both aggregated and de-identified, further raising the bar.

Since no de-identification method is 100% perfect, this definition would restrict the use of many very effectively de-identified data sets. Without de-identified longitudinal data sets, which require record-level data, states can’t measure how well schools perform. To identify discriminatory practices, among other useful applications of de-identified but non-aggregated data, it is necessary to include levels of detail about race, geography, grades, discipline, and other factors that can leave a data set well de-identified but not impossible to re-identify. In fact, FERPA specifically allows sharing of data that may include a limited identifier, as long as the use is limited to research. A better standard for de-identification is the “reasonable” standard used in the Student Privacy Pledge. Alternatively, so that schools would not have to deal with multiple de-identification standards, the bill could be made consistent with FERPA’s de-identification standard, which we analyzed in our recent whitepaper.

The bill treats college students the same as kindergarten students. This is too broad a brush. College students are legal adults and can safely be treated differently than a student who has yet to reach the age of majority. While certain protections are still appropriate, a college student has the ability to make informed decisions based on their educational needs to a much greater extent than a young child.

The model bill makes student data available only to school employees. This could bar parents who volunteer for school activities and part-time coaches from having access to student data. FERPA allows sharing with all school officials, although the school must maintain direct control over anyone with whom data is shared. This bill would require a special contract for coaches, parents, and other school representatives who are not designated as “school employees.”

The bill defines a Student Service System as any “software application and/or cloud based service that allows an educational institution to input, maintain, manage, and/or retrieve student data and/or personally identifiable student information, including applications that track and/or share personally identifiable student information in real time.” This provision covers any service that can be used by a student – in effect, every business in the world is covered by this bill, even one that has no way to identify and screen out students, or that doesn’t know a student is using its product. This is likely unconstitutional and obviously impractical. Most state laws, and the Student Privacy Pledge, cover products that are designed and marketed for schools.

Of particular concern, the proposed language would allow privacy litigation against teachers or parent volunteers. Schools hold a heavy responsibility to train, provide resources, and manage the use of ed tech in the educational process, but the way to ensure accountability for this is not to put teachers at personal legal risk.  Schools must have reasonable review and implementation processes in place for using technology and protecting data.  Teachers who violate school rules in general or put students at risk should be subject to appropriate management actions or discipline. But a teacher who misreads a privacy policy shouldn’t face litigation.

We applaud the work of the ACLU and their partners in these states to clearly address some of the challenges of student data privacy as ed tech applications continue to be implemented in schools, and look forward to working on these same issues with them, and with policymakers at the state and federal level.

Announcing the Launch of ResearchChoices.org

“Research Choices is an important step forward for research companies and will help consumers better understand how data is used to make decisions by a wide range of organizations.” – Jules Polonetsky

From the ESOMAR Press Release

Amsterdam, January 28, 2016: Top market research agencies comScore, GfK, Kantar, and Nielsen announced today the launch of a joint initiative to boost transparency and choice for online audience measurement research.

This joint initiative and the associated portal are being facilitated by ESOMAR, the World Association for Market, Social, and Opinion Research at the behest of the founding Research Choices participants.

“I welcome the launch of Research Choices, as the first world-wide and industry-wide initiative designed to reiterate our profession’s undertakings, and its self-regulatory strength. Initially focusing on the audience measurement sector, it is the sector’s hope and intention to progressively broaden the service to cover all digital research activities,” said Laurent Flores, ESOMAR President.

When completed, the web-based portal, accessible online at http://researchchoices.org, will provide the general public with educational content, initially demonstrating how online audience measurement research – and online market research generally – is conducted, as well as highlighting participating companies’ privacy policies and tools to exercise opt-out and choice.

See full press announcement here. And visit Research Choices to learn more about it!

Student Privacy Boot Camp for Ed Tech Vendors

FPF is continuing its series of Boot Camp training sessions for ed tech start-ups and small to medium companies, with the next event scheduled for March 3 in San Francisco. Slots are limited, so apply to attend now!

Need a fast and furious intro to what it takes to do privacy – and security – right as a player in the ed tech market? Curious about which laws apply to schools, which laws apply to vendors, and what other regulations matter too? Need access to the greatest one-stop shop for resources on all student privacy questions, geared toward parents, policymakers, schools, and of course, vendors?

Then this event is for you!

Speakers include the Chief Privacy Officer from the US Department of Education, security specialists from companies like Clever, and education and student data privacy experts from iKeepSafe, PlayWell LLC, Data Quality Campaign, and many others.

In addition, privacy reps from FPF member companies with long experience in this market will be on site to run “unconference” sessions and answer all your specific questions.

For a full description, complete agenda, and to register or become a sponsor, click here.

Essentially Equivalent:

A Comparison of the Legal Orders for Privacy and Data Protection – EU & US

From our friends at Sidley Austin:

“In a milestone decision on transatlantic data protection, the Court of Justice of the European Union (CJEU) issued its judgment in the Schrems case, declaring the Commission decision on the EU-U.S. Safe Harbor agreement invalid. The CJEU declared that such a decision requires a finding that the level of protection of fundamental rights and freedoms in the laws and practices of the third country is “essentially equivalent” to that guaranteed within the EU. Given the CJEU’s decision, the Commission and data protection authorities are now called upon to examine the legal order in the U.S. and compare its level of protection to that within the EU.

“This report provides a roadmap and resource for this comparison. Following the analysis laid out by the CJEU in Schrems, it shows how privacy values deeply embedded in U.S. law and practice have resulted in a system of protection of fundamental rights and freedoms that meets the test of essential equivalency.”

FPF Senior Fellow Peter Swire Debates Max Schrems

Privacy in the EU and the US

Scheduled for Jan 26, 2016 – 12:30 Eastern

FPF is pleased to share the following announcement:

The Brussels Privacy Hub is pleased to announce a pre-CPDP launch event: “Privacy in the EU and US: A debate between Max Schrems and Peter Swire”

* Peter Swire – Huang Professor of Law and Ethics at the Georgia Tech Scheller College of Business and a member of President Obama’s Review Group on Intelligence and Communications Technology

* Max Schrems – PhD student, privacy activist, and the successful plaintiff in the recent EU Court of Justice judgment Schrems v. Data Protection Commissioner

* Annie Machon – writer, media commentator, and political campaigner

The discussion will be moderated by Prof. Paul de Hert, Brussels Privacy Hub Co-director, and will include a Q&A session, giving participants an opportunity to engage with prominent figures in the data protection and privacy communities.

Recording of full debate here.