FPF Commends New America's Report on Predictive Analytics in Higher Education

New America released a report today that addresses the use of data in higher ed analytics – predicting student outcomes and managing university academic programs based on prior data. The growing ability to gather and analyze this data allows colleges to intervene with struggling students, put in place mentoring programs, create support structures addressing “whole student” welfare, and ultimately improve academic outcomes and graduation rates.

This is an excellent report showing the great potential of this use of data. In particular, we commend its sensitivity in flagging the privacy and security issues. It will be important to ensure those privacy and security issues are prioritized and well addressed throughout the development and expanded use of these analytics tools.

FPF runs a Higher Education Working Group focused on exactly these issues and on the work being done to address them responsibly. We work with many of the higher education advocacy groups, along with colleges and universities, to represent privacy interests in these discussions, in support of their ultimate goals of better student support and learning outcomes.

Companies or organizations interested in this work are invited to join these discussions; contact [email protected] for more information.

Future of Privacy Forum and Carnegie Mellon University Research Leads to New Tool from California Attorney General

Last week, California’s Attorney General, Kamala D. Harris, announced the release of a new form that allows consumers to report potential violations of the California Online Privacy Protection Act (CalOPPA) by websites and online services.

Attorney General Harris’ announcement explained that FPF’s 2011 research into app privacy policies had prompted an earlier agreement between her office and prominent mobile app platforms to encourage apps to post privacy policies. Now, a new FPF study commissioned by Attorney General Harris has revealed the need for further work, leading to the release of the new complaint form.

The FPF Mobile Apps Study revealed that while the number of apps that provide privacy policies continues its upward trend from FPF’s previous surveys in 2011 and 2012, health and fitness apps – which may access sensitive, physiological data collected by sensors on a mobile phone, wearable, or other device – do worse than average at providing privacy policies. Only 70% of top health and fitness apps had a privacy policy (6% lower than overall top apps), and only 61% linked to it from the app platform listing page (10% lower than overall top apps).

The App Study also looked specifically at period tracking and sleep aid apps. Only 63% of period tracking apps provided a link to the privacy policy from the app platform listing page. More disappointingly, only 54% of sleep aid apps provided a link to the privacy policy from the app platform listing page.

Attorney General Harris has also worked with Carnegie Mellon University privacy researchers to review apps for compliance with the law and is collaborating with the Usable Privacy Policy Project at CMU to develop a tool that will identify mobile apps that may be in violation of CalOPPA.

FPF applauds Attorney General Harris for her longstanding commitment to protecting consumer privacy and encourages consumers to utilize the new form to report suspected violations of CalOPPA.

Student Privacy Pledge Loopholes? Nope. We Did Our Homework.

The Student Privacy Pledge was introduced over two years ago by the Future of Privacy Forum and the Software and Information Industry Association. Endorsed by the White House, it was published at the forefront of the movement to clarify responsible practices in the collection, protection, and use of student data as the presence of technology in schools expanded. The Pledge has since been signed by more than 300 ed tech companies as a way to help demonstrate their commitment to student privacy.

The Electronic Frontier Foundation yesterday published a confusing analysis of the Pledge. EFF generally praises the Pledge commitments, but claims that the Pledge includes some fine-print definitions that undercut its protections. The Pledge defines “Student personal information” as “personally identifiable information as well as other information when it is both collected and maintained on an individual level and is linked to personally identifiable information.” EFF claims that the Pledge “is surely meant to be narrowly interpreted” and would “seem to permit signatories to collect sensitive and potentially identifying data such as search history, so long as not tied to a student’s name.”

We don’t agree. We have written extensively on the definition of personal information, in general and under FERPA. FERPA, SOPIPA and other statutes define student personal information broadly and, in our view, any reasonable analysis of the definition of Personally Identifiable Information would cover direct or indirect information that could be reasonably used to identify an individual student. To conclude, as EFF has done, that we “surely meant” some narrower definition is not consistent with either the plain meaning of the Pledge language, our published discussions about it, or the use of this definition in subsequent state student privacy laws where the Pledge has been a basis for the legislative language. There is no logic in creating or implying a meaning that would be in violation of FERPA and state laws, and we would have been happy to explain this to EFF had they reached out to us.

But whatever our view, the FTC has the authority to enforce the Pledge and interpret what the Commission thinks the language means – and we would be very surprised if it were to adopt EFF’s position that the language of the Pledge should be read in a narrow and limited way.

EFF also takes issue with the fact that the Pledge covers only “school service providers” – that is, services designed and marketed to schools. However, the Pledge definition tracks consistently with the definitions in state laws and in previously proposed federal bills. SOPIPA, for example, covers programs and services that are primarily used for K-12 school purposes “and (were) designed and marketed” for such purposes. It would be quite confusing to schools and to vendors if the Pledge were interpreted to be out of sync with the standard definitions that have become widely adopted.

Why do state laws and the Pledge cover only services designed and marketed to schools? As we have discussed previously, vendors who sell general-purpose products shouldn’t be required to revamp their services simply because a school is using them. In many cases, the vendor may not even know that a school is using its products. Services that are designed and marketed to schools, however, are covered – a distinction consistently drawn by the Pledge and state laws. We disagree that this is a “loophole” – in fact, it is an important legal distinction that policymakers have supported.

The Future of Privacy Forum has worked on student privacy with just about every major privacy group and education organization involved with student data. Getting this right is important to us. Ensuring responsible use of student data requires close collaboration between school leaders, teachers, parents, vendors, and students. We have huge respect for EFF’s smart advocacy across a range of tech policy issues, and hope they will take the time to work with us and others who are working to ensure responsible uses of technology for student learning.

A National Challenge: Advancing Privacy While Preserving the Utility of Data

A Call for Papers

Addressing “privacy” increasingly involves discussions of ethics, philosophy, and psychology along with law, economics, and technology. Finding an approach to future privacy concerns that supports the benefits of technology without compromising individual rights is an increasingly complex challenge. Not only is technology continuously advancing, but individual attitudes, expectations, and participation vary greatly. New ideas and approaches to privacy must be identified and developed at the same pace and with the same focus as the technologies they address.

To contribute to this important discussion, the Future of Privacy Forum (FPF), Washington & Lee University School of Law (W&L), and the International Association of Privacy Professionals (IAPP) are collaborating to produce an online Roundtable Issue of the Washington & Lee Law Review during the 2016–2017 academic year. This Issue will focus on data and privacy topics relating to the National Privacy Research Strategy (NPRS), published by the National Science and Technology Council’s Networking and Information Technology Research and Development Program in June 2016.

FPF, W&L, and IAPP are sponsoring this Call for Papers and hosting a Symposium on Privacy Research Prioritization. Authors from multiple disciplines including law, computer science, statistics, engineering, social science, ethics and business are invited to submit papers for presentation at a fullday program to take place in Washington, D.C. in April 2017. This Call requests in particular topics that address or support issues within three of the main priorities outlined in the NPRS:

• Increase the transparency of data collection, sharing, use, and retention (Priority 3.4)

• Assure that information flows and use are consistent with privacy rules (Priority 3.5)

• Reduce privacy risks of analytical algorithms (Priority 3.7)

Optimally, papers will contain between 5,000 and 8,000 words, but must not exceed 10,000 words. Papers must be submitted to [email protected] by February 24, 2017.

READ MORE

Use of Limit Ad Tracking Rises in the US

A new study about consumer use of Limit Ad Tracking indicates that the rate rose to 20% in the US. According to Adjust, the previous rate of iOS users who opted out of ad tracking was 16-18%. This data is consistent with TUNE’s reporting, which found that 17% of users enabled the Limit Ad Tracking setting.

Globally, however, rates remain roughly the same at around 18%. Adjust points to the release of Apple’s iOS 10 as contributing to the spike of 2 million US users activating the Limit Ad Tracking feature for the first time. Learn more about the Limit Ad Tracking feature in our blog post.
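For readers less familiar with what the setting does in practice, the sketch below is a minimal, hypothetical illustration – not FPF’s or Apple’s official guidance – of how an iOS app of that era (pre-iOS 14, using the AdSupport framework) could check the Limit Ad Tracking flag before using the advertising identifier (IDFA). The function name is our own.

```swift
import AdSupport

// Minimal sketch: return the IDFA only if the user has NOT enabled
// Limit Ad Tracking. When the setting is on, the flag below is false
// and the identifier should not be used for targeted advertising.
func advertisingIdentifierIfPermitted() -> String? {
    let manager = ASIdentifierManager.shared()
    guard manager.isAdvertisingTrackingEnabled else {
        return nil
    }
    return manager.advertisingIdentifier.uuidString
}
```

In other words, the opt-out rates reported above reflect how many users have flipped a single system-level switch that apps are expected to honor before tracking them for advertising purposes.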

A Discussion of "Owned: How the Internet of Things Took Our Property and Privacy"

On October 6, 2016, Professor Joshua Fairfield from the Washington and Lee University School of Law joined us to discuss a chapter from his upcoming book, “Owned: How the Internet of Things Took Our Property and Privacy.” The chapter highlights the importance of ownership in preserving privacy – and what society may lose with the erosion of ownership in the digital age. The chapter focuses on how more “concrete” property rights can serve as a backstop for buttressing harder-to-define privacy rights. A diverse group of attendees from business, government, advocacy, and academia joined to debate and discuss the chapter.

VIEW PHOTOS

Jules Polonetsky Featured on Modern Workplace


On October 11, 2016, FPF’s CEO, Jules Polonetsky, was featured on Modern Workplace for “The Privacy Balance: Staying secure and ethical with your data.” The webinar focused on how to navigate common blind spots in data security and avoid privacy pitfalls. Jules discussed tips for creating an organization that is profitable and ethical. Watch a clip below.

 

Source: Modern Workplace. To see the full episode, register at modernworkplace.com

October 27th Event: EU Law, Institutions and Policymaking

FPF and Goethe-Institut are pleased to present an intensive education program titled:

Understanding EU Law, Institutions and Policymaking:

An Advanced Legal Colloquium for Privacy Leaders

Special Focus: Germany, France and Benelux

The goal of this program is to provide privacy experts with a deeper understanding of the broader legal environment in Europe. Who can bring an action, which courts are involved, and how do the various national and Europe-wide systems interact? We will seek to understand the issues and challenges of privacy law in the EU context. The program will not focus on the specifics of the GDPR, but rather on the broader context of EU law, politics, and policy that will help privacy and policy leaders of US multinationals better understand our partners across the Atlantic.

REGISTER

To view written materials for each session, click here: Materials

Agenda


8:30 – 9:00 a.m.

Registration and Continental Breakfast


9:00 – 9:05 a.m.

Welcome


9:05 – 10:00 a.m.

Session 1: European Law and Institutions

The structure, relationships, jurisprudence, role, and jurisdiction of European institutions versus member states.

Professor Nico van Eijk, University of Amsterdam


10:00 – 11:00 a.m.

Session 2: Political Decision Making

The role of civil society, political parties and civil movements, lobbying and issue advocacy in the political decision making process.

Andrea Glorioso, Counselor (ICT & Digital Economy), Delegation of the European Union to the USA


11:00 – 11:15 a.m.

Coffee Break


11:15 a.m. – 12:00 p.m.

Session 3: The Benelux Region

A discussion on legal institutions and political culture

Professor Nico van Eijk, University of Amsterdam


12:00 – 12:10 p.m.

Highlight: Focus on Spain, Catalonia and Basque Country

Understanding Courts and Data Protection Enforcement within Spain

Laura Vivet, Collaborator Professor, Autonomous University of Barcelona


12:10 – 1:00 p.m.

Networking Lunch


1:00 – 2:00 p.m.

Session 4: Focus on France and Germany

Understanding Courts, Policy and Legal Culture


2:00 – 3:00 p.m.

Session 5: Focus on Germany

Data Protection Enforcement within the German Legal System

Felix Wittern, Fieldfisher, Silicon Valley Office


3:00 – 4:00 p.m.

Session 6: Path to the Future

Recent cases in the European Courts and liability and appeals under the GDPR

Moderator: Omer Tene, Senior Fellow, Future of Privacy Forum