The Student Privacy Pledge is a Binding Legal Commitment and G Suite for Education Makes the Grade

The Student Privacy Pledge is a public and legally enforceable statement by ed tech companies to safeguard student privacy, built around a dozen commitments regarding the collection, maintenance, and use of student personal information.  Since it was introduced in 2014 by the Future of Privacy Forum and the Software and Information Industry Association, more than 300 ed tech companies have become signatories, and it was endorsed by the White House in 2015.

Yesterday, the Mississippi Attorney General filed a Complaint against Google, alleging that the company violated its promises as a signatory of the Student Privacy Pledge (“Pledge”). After reviewing the Complaint and Google’s policy, the Future of Privacy Forum has determined that Google’s practices are consistent with its obligations under the Pledge.

First, it is important to understand that only Google’s educational products, known collectively as G Suite for Education, are subject to the Pledge. As we have written before, the Pledge covers only “school service providers” – that is, companies operating in their capacity as providers of a service that is both designed and marketed for use in schools. This is consistent with most state student privacy laws and proposed federal bills. Why? When vendors sell their general market products, they should not be required to change those products simply because they are sold to a school. The Pledge, and all state student privacy laws, are written to cover services like G Suite for Education that are specifically designed and marketed to schools.

The Complaint alleges that Google is advertising to students based on “data mining” the student’s behavioral activity while logged into G Suite. However, Google has consistently stated that there are no ads in the G Suite for Education Core Services, which include Gmail, Calendar, Classroom, Drive, Docs, Sheets, Slides, Contacts, Groups, Vault, and Hangouts. Beyond those core services, school administrators must opt-in to allow students to use their school account with other Google services, some of which are ad supported, such as YouTube, Maps, or Blogger. Students can likewise be on a personal device, and “sync” the device or use other websites with their G Suite account if a school administrator allows them to do so.  When using these commercial sites – not covered by the Pledge – students will see ads, but no student personal information from within the G Suite for Education Core Service products is ever used to target that advertising.

Targeting advertisements to students using data collected from G Suite for Education student accounts is not allowed under the Pledge, and Google does not do this. Therefore, we don’t believe the Complaint raises any valid issues about student data use by Google for Mississippi students. Google’s practices meet the commitments it has publicly made, as expressed in the Student Privacy Pledge.

Mobile Apps Study Underscores Necessity of Strong Best Practices for Health and Wellness Data

Kelsey Finch, FPF Policy Counsel, presented FPF’s 2016 Mobile Apps Study at the Federal Trade Commission’s annual PrivacyCon on January 12, 2017. Kelsey exhibited a visual representation of the App Study designed by FPF Fellow Carolina Alonso. See the visual here and below.

The 2016 Mobile Apps Study underscores the necessity of strong Best Practices for health and wellness data. The App Study revealed that while the number of apps that provide privacy policies continues its upward trend from our previous surveys in 2011 and 2012, health and fitness apps – which often control and link to wearable devices, and which can collect sensitive health and wellness data – do worse than average at providing privacy policies. Only 70% of top health and fitness apps had a privacy policy (6% lower than overall top apps), and only 61% linked to it from the app store listing page (10% lower than overall top apps).

READ STUDY

Recognizing the need for strong Best Practices, FPF released Best Practices for Consumer Wearables and Wellness Apps and Devices, a detailed set of guidelines that responsible companies can follow to ensure they provide practical privacy protections for consumer-generated health and wellness data. The document was produced with support from the Robert Wood Johnson Foundation and incorporates input from a wide range of stakeholders including companies, advocates, and regulators.

READ BEST PRACTICES

FPF Welcomes New Fellows

FPF is pleased to welcome Gabriela Zanfir-Fortuna, PhD, as a non-resident fellow. Gabriela worked for more than two years (March 2014-June 2016) for the European Data Protection Supervisor in Brussels, in both the ‘Supervision and Enforcement’ and ‘Policy and Consultation’ Units. Notably, she represented the EDPS in various subgroups of the Article 29 Working Party, serving as a member of the WP29 drafting team that assessed the EU-US Privacy Shield. Gabriela worked on international data transfers, international relations, EU large-scale IT systems (particularly the Schengen Information System), and case-law overviews, and she was a member of the Court team.

At FPF, she will be responsible for tracking global privacy research scholarship for the FPF academic-industry Research Coordination Network (RCN). She will also help author blog posts, provide counsel on EU privacy law, and monitor EU activities related to FPF’s work. Please join us in welcoming Gabriela to the team!


FPF is delighted to welcome Leslie Harris as a senior fellow. Leslie is an Internet and technology policy lawyer who has been closely involved in the development of seminal Internet policy and regulation in Congress, Executive Branch agencies, the European Union, and global governance bodies including the OECD, UNESCO, and ICANN. From 2005 to 2014, she served as President/CEO of the Center for Democracy & Technology. Leslie is currently President of the Harris Strategy Group, a senior-level consultancy providing strategic advice on policy and strategy related to new technology, civil liberties, and human rights.

At FPF, she will lead our efforts to explore and examine legal, ethical, technological, administrative and practical roadblocks and challenges to sharing administrative data between businesses and researchers. Please join us in welcoming Leslie to the team!

Video Archive: 2016 Privacy Papers for Policymakers

On January 11, 2017, FPF and Honorary Co-Hosts Senator Edward J. Markey and the Co-Chairs of the Congressional Bi-Partisan Privacy Caucus, Congressman Joe Barton and Congresswoman Diana DeGette, held the 7th Annual Privacy Papers for Policymakers at the Dirksen Senate Office Building. The videos are below.

FPF Supports the Email Privacy Act – H.R. 387

Yesterday, Congress introduced the Email Privacy Act (H.R. 387), which would update protections in the Electronic Communications Privacy Act (ECPA) to take account of citizens’ evolving use of technology and better align the law with consumers’ reasonable expectations of privacy in the contents of their email communications. Offered by Representatives Kevin Yoder (R-KS) and Jared Polis (D-CO), this bi-partisan bill simplifies the law and codifies practices currently employed by law enforcement agencies and companies; in most circumstances, the bill requires the government to obtain a warrant in order to access email content. The bill would reduce confusion for police, companies, and users, while bringing statutory protections for electronic communications into the modern era.

ECPA, originally passed in 1986, created standards for government access to the content of communications sent over telecommunications systems – it is the primary federal law governing law enforcement access to Internet traffic. Although ECPA was forward-thinking for its time, the developments of technology and communications in the 30 years since have greatly surpassed its scope and the effectiveness of its policy direction.

The Email Privacy Act makes several important updates. Under ECPA, the content of communications (including email) could be obtained without a warrant after 180 days. This provision may have been reasonable when online storage was expensive, email use was limited, and few Americans engaged in sensitive communications online. However, in light of the current use and storage of email communications as a typical and standard means of individual and organizational correspondence, there is no reason to reduce protections for those communications after six months. This update recognizes the central role of email messages in modern society, and ensures that individuals and organizations can maintain their communications in reasonable confidence – requiring law enforcement to obtain a warrant based on probable cause for access. The “probable cause” standard for requesting or accessing the content of such communications is consistent with other protections from arbitrary search; eliminating this “180-day rule” is an excellent and necessary improvement to existing law.

Likewise, a previous Department of Justice interpretation of ECPA established a standard that “opening” an email removed it from warrant protection, even within the 180-day period. This interpretation does not align with users’ current expectations given the common use of email for communication by and between individuals and organizations. The contents of email, like the contents of traditional hard-copy official correspondence, should always enjoy Fourth Amendment protections. The Email Privacy Act appropriately reflects that standard, requiring the government to demonstrate probable cause before accessing emails – even when those messages have been opened by the recipient.

While the bill doesn’t include every improvement or reform that many advocates would like to see, it includes key requirements that represent significant steps forward in protecting the contents of electronic communications. Nothing in the bill affects existing requirements under the Wiretap Act, FISA, or any other current law. FPF joins numerous other privacy and advocacy organizations in urging immediate passage of the bill as introduced.

The Privacy Policy Snapshot Challenge – $20,000 First Prize

The Privacy Policy Snapshot Challenge calls upon developers, designers, health data privacy experts, and creative, out-of-the-box thinkers to use the Model Privacy Notice template from the US Department of Health and Human Services’ Office of the National Coordinator for Health IT (ONC) to create an online tool that can generate a user-friendly “snapshot” of a product’s privacy practices. ONC will award a total of $35,000 in prizes through this challenge. Enter your submissions now! The deadline for submission is April 10, 2017, with winners expected to be announced in mid-2017. For more information, view the Federal Register Notice.

ONC is also hosting an informational webinar on Thursday, January 12, 2017 from 2:00-3:00pm ET. Register for the webinar.

As the ONC team explains, “More and more individuals are obtaining access to their electronic health information and using consumer health technology to manage this information. As retail products that collect digital health data directly from consumers are used, such as exercise trackers, it is increasingly important for consumers to be aware of companies’ privacy and security policies and information sharing practices. Health technology developers can use the Model Privacy Notice to easily enter their information practices and produce a notice to allow consumers to quickly learn and understand privacy policies, compare company policies, and make informed decisions.”

As FPF showed in our recent FPF Mobile Apps Study, the number of apps that provide privacy policies continues its upward trend from our previous surveys in 2011 and 2012. But health and fitness apps – which may access sensitive, physiological data collected by sensors on a mobile phone, wearable, or other device – do worse than average at providing privacy policies. Only 70% of top health and fitness apps had a privacy policy (6% lower than overall top apps), and only 61% linked to it from the app platform listing page (10% lower than overall top apps).

The App Study also looked specifically at period tracking and sleep aid apps. Only 63% of period tracking apps provided a link to the privacy policy from the app platform listing page. More disappointingly, only 54% of sleep aid apps provided a link to the privacy policy from the app platform listing page.

FPF also released a set of best practices that responsible companies can follow to ensure they provide practical privacy protections for consumer-generated health and wellness data. The document was produced with support from the Robert Wood Johnson Foundation and incorporates input from a wide range of stakeholders including companies, advocates, and regulators.

Fitness and wellness data from apps and wearables provide significant benefits for users, but it is essential that companies incorporate Fair Information Practice Principles to safeguard this data.

FPF Testifies at NYC Taxi and Limousine Commission Hearing

Yesterday, Lauren Smith, FPF Policy Counsel, testified at the NYC Taxi and Limousine Commission’s (TLC) hearing about its proposed rules that add new trip reporting requirements for for-hire vehicle (FHV) bases.

Lauren explained that the proposed rules would create significant privacy risks by mandating that FHV bases transmit passenger drop-off time and location data. This can be highly sensitive information. These additional data points pose particular risks in light of the TLC’s existing data collection, given that FHV bases must already report the date, time, and location of passenger pick-ups. With the addition of drop-off data as proposed by the rule, the TLC’s data set would provide the TLC and the public with a comprehensive view of the movements of individual New Yorkers.

Lauren asserted that at minimum, the TLC should explore ways to: 1) tailor the data collection more narrowly to the stated purpose by focusing on trip duration rather than the location of passengers’ trips; 2) collect less precise, more general geographic information; and 3) enact policies and procedures that detail the privacy and security protections for such sensitive data.

Read the full testimony.

Conference Proceedings – Beyond IRBs: Designing Ethical Review Processes for Big Data Research

Today, FPF is pleased to make available the Conference Proceedings from our Beyond IRBs: Designing Ethical Review Processes for Big Data Research workshop. The workshop, co-hosted by the Washington & Lee School of Law and supported by the National Science Foundation and the Alfred P. Sloan Foundation, aimed to identify processes and commonly accepted ethical principles for data research in academia, government and industry.

The workshop brought together over 60 researchers, including lawyers, computer scientists, ethicists and philosophers, as well as policymakers from government, industry and civil society, to discuss a blueprint for infusing ethical considerations into organizational processes in a data rich environment. To learn more about the event, its participants, and its organizers, please visit bigdata.fpf.org.

As part of the Beyond IRBs workshop, FPF and the Washington & Lee School of Law issued a call for papers addressing ethical, legal, and technical guidance for organizations conducting research on personal information. The papers were published in Spring 2016 in the Washington & Lee Online Law Review.

Building on the discussions at Beyond IRBs, FPF also co-hosted a Roundtable on Ethics, Privacy, and Research in June 2016 with the Ohio State University’s Program on Data and Governance. This timely event, which followed the White House’s call to develop strong data ethics frameworks, convened corporate and academic leaders to discuss how to integrate ethical and privacy considerations into innovative data projects and research. To learn more about the event, see our post here.

FPF was also recently awarded additional grants by the National Science Foundation and the Alfred P. Sloan Foundation in our pursuit of thought-provoking discussions around ethical, legal, and technical guidance for organizations conducting research on personal information.

Read the Conference Proceedings.

NYC Taxi & Limousine Commission Proposal Requiring Drop-Off Location Data Raises Privacy Concerns

On Monday, the Future of Privacy Forum joined with the Center for Democracy & Technology, the Electronic Frontier Foundation, The Constitution Project, and TechFreedom to write to the NYC Taxi and Limousine Commission (TLC) about its proposed rules that add new trip reporting requirements for for-hire vehicle (FHV) bases.

The proposed rule would create significant privacy risks by mandating that FHV bases collect and transmit passenger drop-off time and location data, which can be highly sensitive information. The proposed rule poses particular risks in light of the TLC’s current data collection—FHV bases must already report the date, time, and location of passenger pick-ups—and the history of similar passenger data held by TLC becoming publicly available in response to Freedom of Information requests.  With the addition of drop-off data, the TLC’s data set would provide the TLC and the public with a comprehensive view of the movements of individual New Yorkers.

We understand that the Commission has proposed this rule change in order to reduce the risks associated with fatigued driving. However, it is unclear how the collection of precise location information—information that includes details of the day-to-day activities, lifestyles, and habits of millions of individuals—will achieve this end.  Driver fatigue results from long periods of time on the road, which is information the TLC could ascertain from collecting trip duration rather than pick-up and drop-off location information of individual passengers. At minimum, the TLC should explore ways to: 1) tailor the data collection more narrowly to the stated purpose by focusing on trip duration rather than the location of passengers’ trips; 2) collect less precise, more general geographic information; and 3) enact policies and procedures that detail the privacy and security protections for such sensitive data.

Read the letter.

FPF Statement on Privacy and Wearables

A new report released today by the Center for Digital Democracy and the School of Communications at American University focuses on privacy and wearables. As a recent HHS report made clear, the data collected by most wearables is not regulated to the same degree as information you provide to your doctor. But several mechanisms have ensured that many health and fitness apps respect users’ data – the leading app platforms impose strong privacy requirements, barring the sale of sensitive data and requiring enhanced notice. Companies can also look to the guidelines established by FPF in our Best Practices for Consumer Wearables and Wellness Apps and Devices. And of course, the Federal Trade Commission has the authority to investigate and fine companies that do not keep their promises or act unfairly.

“Some data collected by wearables may be trivial, but other information can be highly sensitive,” said Kelsey Finch, FPF Policy Counsel.  “Companies must take affirmative steps to build consumer trust – especially when they are using intimate, identifiable data.”