Details on the EU-US privacy engineering workshop were published in the European Data Protection Supervisor’s latest newsletter. The workshop was organized by the Internet Privacy Engineering Network (IPEN), the Future of Privacy Forum, KU Leuven, and Carnegie Mellon University on November 10 in Leuven.
“The organisers and participants will document the outcome of the workshop in research reports and policy recommendations, which should be available from early next year.”
FPF Comments on the FTC and Department of Education Student Privacy and Ed Tech Workshop
On Friday, November 17th, 2017, the Future of Privacy Forum filed comments with the Federal Trade Commission and the Department of Education in conjunction with their upcoming workshop, to be held on December 1st. The workshop will examine the privacy issues inherent to the use of educational technology in schools, and consider the intersection of the Federal Educational Rights and Privacy Act (FERPA) and the Children’s Online Privacy Protection Act (COPPA). FPF’s comments are focused on two areas that merit additional clarity: 1) when schools’ consent to ed tech providers for data collection about students under thirteen years old is sufficient under COPPA; and 2) whether the rights and safeguards typically provided to parents under COPPA accrue to schools when administrators consent to data collection from young students. In our comments, we argue that schools should be able to provide consent for the use of ed tech tools when they will be used exclusively for educational purposes; and that when schools provide consent for the use of ed tech for educational purposes, COPPA rights and safeguards should accrue to the school.
While FERPA’s requirements for schools, parents, and ed tech providers are fairly clear, COPPA is ambiguous as to how schools may provide consent to the use of ed tech products in schools for children under the age of 13. Section M of the FTC’s FAQ on COPPA states that “schools may act as the parent’s agent and can consent to the collection of kids’ information on the parent’s behalf.” But this statement could be interpreted either as similar to FERPA’s school official exception, or as requiring that, since schools are acting as “the parent’s agent,” they must actively seek out parental consent before they can consent to the use of an ed tech tool. In some cases, ed tech providers have approached that ambiguity by attempting to shift the liability to the schools via contract, which often places a larger burden on the schools than is appropriate.
In many circumstances, it is appropriate that schools have the authority to provide consent for the use of ed tech products. Certain basic functions that require consent under COPPA would be thrown into disarray were schools not able to provide that consent. Some schools could be unable to perform basic functions that rely on outside parties, such as the operation of administrative information systems, and teachers could be forced to design lesson plans around some students but not others. Parental consent to share student information is likely still appropriate for less integral functions, like using student information in the yearbook or announcing the honor roll. When a school is permitted under COPPA to consent to an ed tech product being used, the rights that COPPA confers should also apply to the school, including any rights to control, access, review, and delete data. In some cases, providing parents with those rights, while only providing the school with the right to consent on students’ behalf, could raise the specter of parents changing their children’s academic results; this would undermine the administrability and integrity of some ed tech tools.
Increased clarity on FERPA and COPPA responsibilities for schools will allow for the responsible use of valuable and innovative ed tech products to assist students, and we look forward to discussing these issues further at the Student Privacy and Ed Tech workshop on December 1st.
A Conversation with Giovanni Buttarelli about The Future of Data Protection: setting the stage for an EU Digital Regulator
The nature of the digital economy is such that it will force the creation of multi-competent supervisory authorities sooner rather than later. What if the European Data Protection Board were to become, in the next 10 to 15 years, an EU Digital Regulator, looking at matters concerning data protection, consumer protection, and competition law, with “personal data” as the common thread? This is the vision Giovanni Buttarelli, the European Data Protection Supervisor, laid out last week in a conversation we had at the IAPP Data Protection Congress in Brussels.
The conversation was a one-hour session in front of an overcrowded room in The Arc, a cozy amphitheater-like venue conducive to bold ideas being expressed in a stimulating exchange.
To begin with, I reminded the Supervisor that at the very beginning of his mandate, in early 2015, he published the 5-year strategy of the EDPS. At that time the GDPR had not yet been adopted and the Internet of Things was taking off. Big Data had been a big thing for a while, and questions were popping up about the feasibility and effectiveness of a legal regime centered on each data item that can be traced back to an individual. The Supervisor wrote in his Strategy that the benefits brought by new technologies should not come at the expense of the fundamental rights of individuals and their dignity in the digital society.
“Big data will need equally big data protection”, he wrote then, thus suggesting that the answer to Big Data is not less data protection, but enhanced data protection.
I asked the Supervisor if he thinks that the GDPR is the “big data protection” he was expecting or whether we need something more than what the GDPR provides for. And the answer was that “the GDPR is only one piece of the puzzle”. Another piece of the puzzle will be the ePrivacy reform, and another one will be the reform of the regulation that provides data protection rules for the EU institutions and that creates the legal basis for the functioning of the EDPS. I also understood from our exchange that a big part of the puzzle will be effective enforcement of these rules.
The curious fate of the European Data Protection Board
One centerpiece of enforcement is the future European Data Protection Board, which is currently being set up in Brussels so as to be functional on 25 May 2018, when the GDPR becomes applicable. The European Data Protection Board will be a unique EU body: it will have a European nature, being funded by the EU budget, but it will be composed of commissioners from national data protection authorities, who will adopt its decisions, and it will rely for its day-to-day activity on a European Secretariat. The Secretariat of the Board will be provided by dedicated staff of the European Data Protection Supervisor.
The Supervisor told the audience that he has either already hired or plans to hire a total of “17 geeks” to add to his staff, most of whom will be part of the European Data Protection Board Secretariat. The EDPB will be functional from Day 1 and, apparently, there are plans for some sort of inauguration of the EDPB to be celebrated at midnight on the 24th to the 25th of May next year.
These are my thoughts here: the nature of the EDPB is as unique as the nature of the EU (those of you who studied EU law certainly remember from law school how we were told that the EU is a sui generis type of economic and political organisation). In fact, the EDPB may very well serve as a test model for ensuring supervision and enforcement in other EU policy areas. The European Commission could test the waters to see whether such a mixed national/European enforcement mechanism is feasible.
There is a lot of pressure on effective enforcement when it comes to the GDPR. We dwelled on enforcement, and one question that inevitably came up concerned the trend beginning to take shape in Europe of competition authorities and consumer protection authorities engaging in investigations together with, or in parallel with, data protection authorities (see here, here and here).
“It’s time for a big change, and time for the EU to have a global approach”, the Supervisor said, and a change that will require some legislative action. “I’m not saying we will need a European FTC (the US Federal Trade Commission – ed.), but we will need a Digital EU Regulator”, he added. This Digital Regulator would have the powers to look into competition and consumer protection issues raised by the processing of personal data, in addition to data protection issues. Acknowledging that these days there is legislative fatigue in Brussels surrounding privacy and data protection, the Supervisor said he will not bring this idea to the attention of the EU legislator right now. But he certainly plans to do so, maybe even as soon as next year. The Supervisor thinks that the EDPB could morph into this kind of Digital Regulator sometime in the future.
Another question that had to be asked on enforcement was whether we should expect more concentrated and coordinated action of privacy commissioners on a global scale, in GPEN-like structures. The Supervisor revealed that the privacy commissioners that meet for the annual International Conference are “trying to complete an exercise about our future”. They are currently analyzing the idea of creating an entity with legal personality that will look into global enforcement cases.
Ethics comes on top of legal compliance
Another topic the conversation went to was “ethics”. The EDPS has been on the forefront of including the ethics approach in privacy and data protection law debates, by creating the Ethics Advisory Group at the beginning of 2016. I asked the Supervisor whether there is a danger that, by bringing such a volatile concept into the realm of data protection, companies would look at this as an opportunity to circumvent strict compliance and rely on sufficient self-assessments that their uses of data are ethical.
“Ethics comes on top of data protection law implementation”, the Supervisor explained. As I understand it, ethics enters the data protection realm only after a controller or processor is already compliant with the law; when they face decisions that are equally lawful, they should rely on ethics to make the right one.
We discussed other things during this session as well, including the 2018 International Conference of Privacy Commissioners that will take place in Brussels, and the Supervisor received some interesting questions from the audience at the end, including about the Privacy Shield. But a blog post can only be so long.
Note: The Supervisor’s quotes are so short in this blog because, as the moderator, I did my best to follow the discussion and steer it rather than take notes. So the quotes come from the brief notes I managed to take during this conversation.
MetroLab Network is a group of more than 35 city-university partnerships focused on bringing data, analytics, and innovation to city government. Its members include 38 cities, 4 counties, and 51 universities.
Who
Future of Privacy Forum
What
Privacy and Open Data
The Smart Cities and Open Data movements promise to use data to spark civic innovation and engagement, promote inclusivity, and transform modern communities. At the same time, advances in sensor technology, re-identification science, and Big Data analytics have challenged cities and their partners to construct effective safeguards for the collection, use, sharing, and disposal of personal information. In this breakout session, we will discuss privacy risks in open data programs and how cities like Seattle are promoting transparency while protecting individual rights.
Moderator
Kelsey Finch, Policy Counsel, FPF
Panelists
Michael Mattmiller, Chief Technology Officer, City of Seattle
Jesse Woo, Lawyer and Research Faculty, Georgia Tech
Privacy and Urban Instrumentation
As cities harness more data than ever, how can we assess the risks and opportunities of new technologies and data flows while preserving public trust and individual privacy? In this breakout session, come hear from Cities, CIOs, academic leaders, and industry experts as we examine the opportunities and challenges of new urban instrumentation and how we can come together to address privacy challenges in smart cities.
Moderator
Annie Antón, Professor, College of Computing, Georgia Tech
Panelists
Nigel Jacob, City of Boston, Co-Founder, Mayor’s Office of New Urban Mechanics
Scott R. Shipman, General Counsel & Chief Privacy Officer, Verizon
The Top 10: Student Privacy News (October-November 2017)
The Future of Privacy Forum tracks student privacy news very closely, and shares relevant news stories with our newsletter subscribers.* Approximately every month, we post “The Top 10,” a blog with our top student privacy stories. This blog is cross-posted at studentprivacycompass.org.
GDPR kicks in on May 25th, 2018, and U.S. schools have begun to focus on how it applies to them. Every higher ed institution – and some K-12 institutions – as well as most ed tech companies with users in the EU will be impacted (see my Storify of live-tweets from the panel). Novatia has some recent, potentially useful articles on GDPR and schools as well.
John Verdi Talks Connected Devices with Fox 2 St. Louis
On November 13, 2017, FPF’s Vice President of Policy, John Verdi, discussed the privacy implications of connected devices with Mike Colombo of Fox 2 St. Louis. John explained:
“What data is being transmitted and what data is being used really depends on the device,” Verdi said. “They can offload that information from the device to servers on the internet that are either controlled by the companies or third parties and there’s some processing that can happen there.”
“I think it’s really time for folks at the federal level to be thinking about comprehensive, baseline, common sense privacy law,” he said.
Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers
Data has become the currency of the modern economy. A recent study projects the global volume of data to grow from about 0.8 zettabytes (ZB) in 2009 to more than 35 ZB in 2020, most of it generated within the last two years and held by the corporate sector.
As the cost of data collection and storage falls and computing power increases, so does the value of data to the corporate bottom line. Powerful data science techniques, including machine learning and deep learning, make it possible to search, extract, and analyze enormous sets of data from many sources in order to uncover novel insights and engage in predictive analysis. Breakthrough computational techniques allow complex analysis of encrypted data, making it possible for researchers to protect individual privacy while extracting valuable insights.
At the same time, these newfound data sources hold significant promise for advancing scholarship, supporting evidence-based policymaking and more robust government statistics, and shaping more impactful social interventions. But because most of this data is held by the private sector, it is rarely available for these purposes, posing what many have argued is a serious impediment to scientific progress.
A variety of reasons have been posited for the reluctance of the corporate sector to share data for academic research. Some have suggested that the private sector does not realize the value of its data for broader social and scientific advancement. Others suggest that companies have no “chief mission” or public obligation to share. But most observers describe the challenge as complex and multifaceted. Companies face a variety of commercial, legal, ethical, and reputational risks that serve as disincentives to sharing data for academic research, with privacy – particularly the risk of re-identification – an intractable concern. For companies, striking the right balance between the commercial and societal value of their data, the privacy interests of their customers, and the interests of academics presents a formidable dilemma.
To be sure, there is evidence that some companies are beginning to share for academic research. For example, a number of pharmaceutical companies are now sharing clinical trial data with researchers, and a number of individual companies have taken steps to make data available as well. What is more, companies are also increasingly providing open or shared data for other important “public good” activities, including international development, humanitarian assistance and better public decision-making. Some are contributing to data collaboratives that pool data from different sources to address societal concerns. Yet, it is still not clear whether and to what extent this “new era of data openness” will accelerate data sharing for academic research.
Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, we aim to contribute to the literature by seeking the “ground truth” from the corporate sector about the challenges they encounter when they consider making data available for academic research. We hope that the impressions and insights gained from this first look at the issue will help formulate further research questions, inform the dialogue between key stakeholders, and identify constructive next steps and areas for further action and investment.
FPF gratefully acknowledges the support of the Alfred P. Sloan Foundation for this project.
New Study: Companies are Increasingly Making Data Accessible to Academic Researchers, but Opportunities Exist for Greater Collaboration
FOR IMMEDIATE RELEASE
November 14, 2017
Contact: Melanie Bates, Director of Communications, [email protected]
New Study: Companies are Increasingly Making Data Accessible to Academic Researchers, but Opportunities Exist for Greater Collaboration
Washington, DC – Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, FPF reveals findings from research and interviews with experts in the academic and industry communities. Three main areas are discussed: 1) The extent to which leading companies make data available to support published research that contributes to public knowledge; 2) Why and how companies share data for academic research; and 3) The risks companies perceive to be associated with such sharing, as well as their strategies for mitigating those risks.
“More widespread access to corporate data sets would support new scholarship and allow researchers to consider questions that cannot fully be answered from publicly available data alone,” said Leslie Harris, FPF Senior Fellow and Understanding Data Sharing Decisions’ Principal Researcher. “In this exploratory study, we aim to contribute to the literature by seeking the ‘ground truth’ from the corporate sector about the challenges they encounter when they consider making data available for academic research.”
Of the companies interviewed, 70% report making at least some data available to academic researchers. Half of the sharing companies began making data available to external researchers within the last five years. Close to half of the interviewed companies said that the main reason for sharing data for research was to obtain insights that would help the company “better execute” or “better understand” their mission. A number of companies also said that sharing data for research helped to build their brands, strengthen relationships with academics, and attract talent to the company. The study also found that companies are concerned about privacy, particularly the risk of re-identification. Companies are equally concerned that sharing data for research might diminish or destroy the intellectual property value of their data.
FPF identified several opportunities to promote data-driven research: 1) to enhance the positive public profile of company/academic data sharing; 2) to help mitigate perceived risks, particularly privacy and re-identification risks; 3) to develop and share tools for public outreach and community engagement; 4) to encourage peer-to-peer knowledge sharing; and 5) to create a clearinghouse identifying data types desired by academics.
“We hope that the impressions and insights gained from this first look at the issue will help formulate further research questions, inform the dialogue between key stakeholders, and identify constructive next steps and areas for further action and investment,” said Jules Polonetsky, FPF’s CEO.
FPF released Understanding Data Sharing Decisions today at the ADRF Network Inaugural Conference during the session on Expanding Private Sector Administrative Data Access. The focus of the discussion centered around why and how companies share data for academic research, strategies for mitigating risks and building trust, and recommendations for encouraging company-academic data sharing.
FPF would like to thank Leslie Harris (FPF Senior Fellow), the Principal Researcher of this report, and Chinmayi Sharma (University of Virginia School of Law), Research Assistant. FPF gratefully acknowledges the support of the Alfred P. Sloan Foundation for this project.
###
The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.
FPF Advisory Board Member Amie Stepanovich Discusses 'Why Inclusion Matters'
Pictured Above: Amie Stepanovich (Access Now)
Yesterday, Future of Privacy Forum Advisory Board Member, Amie Stepanovich, U.S. Policy Manager at Access Now, published an article explaining the importance of ensuring marginalized communities have greater influence on how emerging technologies are developed. Amie says:
“Ultimately, we won’t be able to change things in any significant way so long as we create and facilitate an environment that is hostile toward diversity. Instead of digging our heads into the sand when scandals erupt, those of us working in tech should embrace change and invest in identifying and promoting smart, diverse voices. That means developing institutional and operational systems and processes that respect the range of backgrounds and experiences that diversity brings; creating public policies that are not developed or dictated by a single point of view; and providing platforms for discussion such as panels or events that highlight the voices and perspectives of under-represented people and organizations that are breaking through societal roadblocks and developing valuable expertise, often at great personal cost.”