EDPS Highlights EU-US Privacy Engineering Workshop

Details on the EU-US privacy engineering workshop were published in the European Data Protection Supervisor’s latest newsletter. The workshop, held on November 10 in Leuven, was organized by the Internet Privacy Engineering Network (IPEN), the Future of Privacy Forum, KU Leuven, and Carnegie Mellon University.

“The organisers and participants will document the outcome of the workshop in research reports and policy recommendations, which should be available from early next year.”

READ NEWSLETTER

FPF Comments on the FTC and Department of Education Student Privacy and Ed Tech Workshop

On Friday, November 17th, 2017, the Future of Privacy Forum filed comments with the Federal Trade Commission and the Department of Education in conjunction with their upcoming workshop, to be held on December 1st. The workshop will examine the privacy issues inherent in the use of educational technology in schools and consider the intersection of the Family Educational Rights and Privacy Act (FERPA) and the Children’s Online Privacy Protection Act (COPPA). FPF’s comments focus on two areas that merit additional clarity: 1) when schools’ consent to ed tech providers’ collection of data about students under thirteen years old is sufficient under COPPA; and 2) whether the rights and safeguards typically provided to parents under COPPA accrue to schools when administrators consent to data collection from young students. In our comments, we argue that schools should be able to provide consent for the use of ed tech tools when those tools will be used exclusively for educational purposes, and that when schools provide such consent, COPPA rights and safeguards should accrue to the school.

While FERPA’s requirements for schools, parents, and ed tech providers are fairly clear, COPPA is ambiguous as to how schools may consent to the use of ed tech products in schools for children under the age of 13. Section M of the FTC’s FAQ on COPPA states that “schools may act as the parent’s agent and can consent to the collection of kids’ information on the parent’s behalf.” But this statement could be interpreted either as similar to FERPA’s school official exception, or as requiring that, because schools are acting as “the parent’s agent,” they must actively seek out parental consent before they can consent to the use of an ed tech tool. In some cases, ed tech providers have approached that ambiguity by attempting to shift liability to the schools via contract, which often places a larger burden on the schools than is appropriate.

In many circumstances, it is appropriate for schools to have the authority to consent to the use of ed tech products. Certain basic functions that require consent under COPPA would be thrown into disarray if schools could not provide that consent: some schools would be unable to perform basic functions that rely on outside parties, such as operating administrative information systems, and teachers could be forced to design lesson plans around some students but not others. Parental consent to share student information likely remains appropriate for less integral functions, like using student information in the yearbook or announcing the honor roll. When a school is permitted under COPPA to consent to the use of an ed tech product, the rights that COPPA confers should also apply to the school, including any rights to control, access, review, and delete data. In some cases, granting parents those rights, while giving the school only the right to consent on students’ behalf, could raise the specter of parents changing their children’s academic results; this would undermine the administrability and integrity of some ed tech tools.

Increased clarity on FERPA and COPPA responsibilities for schools will allow for the responsible use of valuable and innovative ed tech products to assist students, and we look forward to discussing these issues further at the Student Privacy and Ed Tech workshop on December 1st.

READ COMMENTS

A Conversation with Giovanni Buttarelli about The Future of Data Protection: setting the stage for an EU Digital Regulator

The nature of the digital economy is such that it will force the creation of multi-competent supervisory authorities sooner rather than later. What if, in the next 10 to 15 years, the European Data Protection Board were to become an EU Digital Regulator, looking at matters of data protection, consumer protection, and competition law, with “personal data” as the common thread? This is the vision Giovanni Buttarelli, the European Data Protection Supervisor, laid out last week in a conversation we had at the IAPP Data Protection Congress in Brussels.

The conversation was a one-hour session in front of an over-crowded room in The Arc, a cozy amphitheater-like venue that invites bold ideas and stimulating exchange.

To begin, I reminded the Supervisor that at the very start of his mandate, in early 2015, he published the EDPS’s five-year strategy. At that time the GDPR had not yet been adopted and the Internet of Things was taking off. Big Data had been a big thing for a while, and questions were surfacing about the feasibility and effectiveness of a legal regime centered on each data item that can be traced back to an individual. The Supervisor wrote in his Strategy that the benefits brought by new technologies should not come at the expense of individuals’ fundamental rights and their dignity in the digital society.

Big data will need equally big data protection, he wrote then, suggesting that the answer to Big Data is not less data protection, but enhanced data protection.

I asked the Supervisor if he thinks that the GDPR is the “big data protection” he was expecting or whether we need something more than what the GDPR provides for. And the answer was that “the GDPR is only one piece of the puzzle”. Another piece of the puzzle will be the ePrivacy reform, and another one will be the reform of the regulation that provides data protection rules for the EU institutions and that creates the legal basis for the functioning of the EDPS. I also understood from our exchange that a big part of the puzzle will be effective enforcement of these rules.

The curious fate of the European Data Protection Board

One centerpiece of enforcement is the future European Data Protection Board, which is currently being set up in Brussels so as to be functional on 25 May 2018, when the GDPR becomes applicable. The EDPB will be a unique EU body: European in nature and funded by the EU budget, yet composed of commissioners from national data protection authorities, who will adopt its decisions while relying on a European Secretariat for day-to-day activity. That Secretariat will be provided by dedicated staff of the European Data Protection Supervisor.

The Supervisor told the audience that he has either already hired or plans to hire a total of “17 geeks” to add to his staff, most of whom will be part of the European Data Protection Board Secretariat. The EDPB will be functional from Day 1 and, apparently, there are plans for some sort of inauguration celebrated at midnight on the night of 24 to 25 May next year.

These are my own thoughts here: the nature of the EDPB is as unique as the nature of the EU (those of you who studied EU law certainly remember being told in law school that the EU is a sui generis type of economic and political organisation). In fact, the EDPB may very well serve as a test model for supervision and enforcement in other EU policy areas. The European Commission could test the waters to see whether such a mixed national/European enforcement mechanism is feasible.

There is a lot of pressure on effective enforcement when it comes to the GDPR. We dwelled on enforcement, and one question that inevitably arose concerned the trend taking shape in Europe of competition authorities and consumer protection authorities engaging in investigations together with, or in parallel with, data protection authorities (see here, here, and here).

It’s time for a big change, and time for the EU to take a global approach, the Supervisor said, and it is a change that will require some legislative action. “I’m not saying we will need a European FTC (US Federal Trade Commission), but we will need a Digital EU Regulator,” he added. This Digital Regulator would have the power to also look into competition and consumer protection issues raised by the processing of personal data (that is, in addition to data protection issues). Acknowledging that there is legislative fatigue in Brussels these days surrounding privacy and data protection, the Supervisor said he will not bring this idea to the attention of the EU legislator right now. But he certainly plans to do so, maybe even as soon as next year. The Supervisor thinks the EDPB could morph into this kind of Digital Regulator sometime in the future.

The interplay among these three fields of law has been on the Supervisor’s mind for some time now. The EDPS has already issued four Opinions that set the stage for this proposal: the Preliminary Opinion on “Privacy and competitiveness in the age of Big Data: the interplay between data protection, competition law and consumer protection in the digital economy”, Opinion 4/2015 “Towards a new digital ethics”, Opinion 7/2015 “Meeting the Challenges of Big Data”, and finally Opinion 8/2016 on “coherent enforcement of fundamental rights in the age of Big Data”. So this is certainly something the data protection bubble should keep its eyes on.

Enhanced global enforcement initiatives

Another question that had to be asked about enforcement was whether we should expect more concentrated and coordinated action by privacy commissioners on a global scale, through GPEN-like structures. The Supervisor revealed that the privacy commissioners who meet for the annual International Conference are “trying to complete an exercise about our future”. They are currently analyzing the idea of creating an entity with legal personality that will look into global enforcement cases.

Ethics comes on top of legal compliance

Another topic the conversation turned to was “ethics”. The EDPS has been at the forefront of bringing an ethics approach into privacy and data protection law debates, creating the Ethics Advisory Group at the beginning of 2016. I asked the Supervisor whether there is a danger that, by bringing such a volatile concept into the realm of data protection, companies would see an opportunity to circumvent strict compliance and instead rely on self-assessments that their uses of data are ethical.

“Ethics comes on top of data protection law implementation”, the Supervisor explained. As I understood it, ethics enters the data protection realm only after a controller or processor is already compliant with the law; when faced with decisions that are equally lawful, they should rely on ethics to make the right one.

We did discuss other things during this session, including the 2018 International Conference of Privacy Commissioners that will take place in Brussels, and the Supervisor received some interesting questions from the audience at the end, including about the Privacy Shield. But a blog post can only be so long.

Note: The Supervisor’s quotes in this blog are short because, as the moderator, I did my best to follow and steer the discussion rather than take notes. The quotes come from the brief notes I managed to take during the conversation.

This article originally appeared on pdpEcho.

Roundtable Discussion: Smart Cities and Open Data (2017 MetroLab Network Annual Summit)

You are invited to join the Future of Privacy Forum at the 2017 MetroLab Network Annual Summit for a roundtable discussion about smart cities and open data.

MetroLab Network is a group of more than 35 city-university partnerships focused on bringing data, analytics, and innovation to city government. Its members include 38 cities, 4 counties, and 51 universities.

Who

Future of Privacy Forum

What

Privacy and Open Data

The Smart Cities and Open Data movements promise to use data to spark civic innovation and engagement, promote inclusivity, and transform modern communities. At the same time, advances in sensor technology, re-identification science, and Big Data analytics have challenged cities and their partners to construct effective safeguards for the collection, use, sharing, and disposal of personal information. In this breakout session, we will discuss privacy risks in open data programs and how cities like Seattle are promoting transparency while protecting individual rights.
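To make the re-identification concern concrete, here is a minimal sketch (in Python, with invented column names and data; it reflects no particular city’s practice) of one common pre-publication safeguard: checking the k-anonymity of an open dataset, i.e., the size of the smallest group of records that share the same quasi-identifiers.

```python
# Minimal sketch of a pre-publication open-data check: k-anonymity.
# Column names and values are hypothetical, not from any real city dataset.
import pandas as pd

def k_anonymity(df: pd.DataFrame, quasi_identifiers: list) -> int:
    """Size of the smallest group of records sharing the same
    quasi-identifier values; the dataset is k-anonymous for this k."""
    return int(df.groupby(quasi_identifiers).size().min())

records = pd.DataFrame({
    "zip_code": ["98101", "98101", "98101", "98102", "98102"],
    "age_band": ["30-39", "30-39", "30-39", "40-49", "40-49"],
    "trips":    [4, 2, 7, 1, 3],
})

# k = 2 here: one (zip_code, age_band) group contains only two people,
# so a publisher might generalize or suppress values before release.
print(k_anonymity(records, ["zip_code", "age_band"]))  # 2
```

Low k values signal that combining a few published attributes could single out individuals, which is the core re-identification risk open data programs must manage.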

Moderator

Panelists

Privacy and Urban Instrumentation

As cities harness more data than ever, how can we assess the risks and opportunities of new technologies and data flows while preserving public trust and individual privacy? In this breakout session, come hear from cities, CIOs, academic leaders, and industry experts as we examine the opportunities and challenges of new urban instrumentation and how we can come together to address privacy challenges in smart cities.

Moderator

Panelists

Where

2017 MetroLab Network Annual Summit

Georgia Tech

Atlanta, Georgia

When

December 14, 2017

2:30 PM & 4:00 PM

REGISTER HERE (link expired)

Learn more about the Summit by visiting metrolabnetwork.org/annual-summit.

The Top 10: Student Privacy News (October-November 2017)

The Future of Privacy Forum tracks student privacy news very closely and shares relevant news stories with our newsletter subscribers. Approximately every month, we post “The Top 10,” a blog with our top student privacy stories. This blog is cross-posted at studentprivacycompass.org.

Over the past month and a half, student privacy issues have proliferated in the news. Among other big events coming up, the comments on the FTC/USED workshop on COPPA in schools are due this Friday (the workshop is December 1st), and FPF is holding a free student privacy bootcamp for ed tech companies (register here) in DC on December 8th.

The Top 10

  1. As reported in my last newsletter, four districts (and possibly two universities) were targeted in September and October by hackers who threatened to harm students and disclose their sensitive information. After the U.S. Department of Education warned districts about the potential security threat, the story hit CNN, the Wall Street Journal, NPR, Mother Jones, NBC News, CNBC, and the Washington Post, among many other outlets. It is worth rereading EdWeek’s report on ransomware in schools from earlier this year and this blog from PogoWasRight on the possible consequences of this breach. Policymakers at the state and federal level are planning to act – and hopefully they will provide the money and resources necessary to help districts build up their security and train educators on how to avoid cyber threats and protect privacy. However, it is noteworthy that many of the hackers’ actions are already against the law. My worry? Copycat hackers.
  2. GDPR kicks in on May 25th, 2018, and U.S. schools have begun to focus on how it applies to them. Every higher ed institution – and some K-12 institutions – as well as most ed tech companies with users in the EU will be impacted (see my Storify of live-tweets from the panel). Novatia also has some recent, potentially useful articles on GDPR and schools.
  3. Personalized learning – and the data that drives it – continues to inspire articles discussing its effectiveness and potential impacts on privacy. The latest is Ben Herold’s article on “The Case(s) Against Personalized Learning,” part of an EdWeek special report on “Personalized Learning: Vision vs. Reality.” This month, we also saw the RAND Corporation’s newest research study on personalized learning, and Ben Williamson noted that “It is important for education research to engage with how some of its central concerns—learning, training, experience, behaviour, curriculum selection, teaching, instruction and pedagogy—are being reworked and applied within the tech sector.”
  4. A progressive political group “filed FOIA requests seeking the publicly available student directories to get student cell phone numbers [available through FERPA’s directory information exception] at every one of Virginia’s 39 public colleges. Of those, 18 schools, including Tech and Radford, complied.” This is expected to lead to legislation next spring banning directory information disclosures for this purpose. In related news, a great blog on how “Cell Phone Numbers are the New SSNs.”
  5. Over 800 “School websites [were] hacked to show pro-Islamic State message” (more info in this Fox News article) due to a vulnerability at the company that maintains the websites. In response, Rep. Donald Payne Jr. said he is “working on federal legislation to address cybersecurity threats to schools.” In related news, student security expert Doug Levin is halfway through a blog series about state education agency and district websites; so far, he’s covered whether they offer secure browsing and whether they include ad tracking.
  6. EdWeek reported on the education implications behind a New York City Council bill that would require that “all city agencies publish the source code behind algorithms they use to target services to city residents.” ProPublica also released a report after “A federal judge this week unsealed the source code for a software program developed by New York City’s crime lab, exposing to public scrutiny a disputed technique for analyzing complex DNA evidence,” which could have implications for schools.
  7. Policymakers, pundits, and the public continue to express more skepticism about tech companies than ever before, and ed tech companies are no exception. The New York Times continued its series about how ed tech is changing education with “How Silicon Valley Plans to Conquer the Classroom,” which has already resulted in a Maryland “Legislator Target[ing] Tech Perks in Baltimore County District” and a Baltimore Sun investigation finding that “Baltimore County school leaders…were paid by tech industry group” without disclosing the payments. Mother Jones also published “Inside Silicon Valley’s Big-Money Push to Remake American Education”; Bloomberg Technology ran “Silicon Valley Tried to Reinvent Schools. Now It’s Rebooting”; and EdSurge reported on AltSchool shutting down two of its campuses, asking “Where Does Silicon Valley’s Philanthropy End and Profits Begin?”
  8. Student privacy went viral this month! Read about the saga of Taiwan Jones in BuzzFeed. Even though the story was probably fake, it raised questions for educators as to whether grading in public violates FERPA. The answer: it depends. It would likely be a FERPA violation if the person sitting next to the teacher can see the name and grade of the student (one more reason, some suggest, to use ID numbers instead of names on homework assignments), but FERPA could also be read as not violated because the grade has not yet been entered into the gradebook.
  9. The House Committee on Oversight and Government Reform passed H.R. 4174, which implements recommendations from the Commission on Evidence-Based Policymaking (see their report that was backed by EPIC and my write-up of its impact on education here). Some groups are opposing the bill on privacy grounds.
  10. The annual Global Privacy Enforcement Network Privacy Sweep found that some educational apps “fall short on privacy,” via Business Insider. Part of the problem? “[W]ebsite privacy notices are too vague and generally inadequate.”

Image: “image_106” by Brad Flickinger is licensed under CC BY 2.0.

John Verdi Talks Connected Devices with Fox 2 St. Louis

On November 13, 2017, FPF’s Vice President of Policy, John Verdi, discussed the privacy implications of connected devices with Mike Colombo of Fox 2 St. Louis. John explained:

“What data is being transmitted and what data is being used really depends on the device,” Verdi said. “They can offload that information from the device to servers on the internet that are either controlled by the companies or third parties and there’s some processing that can happen there.”

“I think it’s really time for folks at the federal level to be thinking about comprehensive, baseline, common sense privacy law,” he said.

You can watch the full interview below.

Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers

Data has become the currency of the modern economy. A recent study projects the global volume of data to grow from about 0.8 zettabytes (ZB) in 2009 to more than 35 ZB in 2020, most of it generated within the last two years and held by the corporate sector.

As the cost of data collection and storage falls and computing power increases, the value of data to the corporate bottom line grows. Powerful data science techniques, including machine learning and deep learning, make it possible to search, extract, and analyze enormous sets of data from many sources in order to uncover novel insights and engage in predictive analysis. Breakthrough computational techniques allow complex analysis of encrypted data, making it possible for researchers to protect individual privacy while extracting valuable insights.
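As a concrete illustration of that last point, the sketch below shows privacy-preserving aggregation with additive (Paillier) homomorphic encryption, using the open-source python-paillier (`phe`) package. It is a minimal example of the general technique, not a method attributed to any company in the report, and the data values are invented.

```python
# Minimal sketch of analysis over encrypted data: additive (Paillier)
# homomorphic encryption lets an analyst sum values it cannot read.
# Assumes the open-source `phe` package (pip install phe); illustrative only.
from phe import paillier

# The researcher generates a keypair; only the private key can decrypt.
public_key, private_key = paillier.generate_paillier_keypair()

# A data holder encrypts individual records (values invented here) so that
# no raw data point is ever exposed to the analyst.
raw_values = [12, 7, 30, 5]
encrypted_values = [public_key.encrypt(v) for v in raw_values]

# Paillier ciphertexts can be added without decryption, so the analyst
# computes an encrypted total while learning nothing about the inputs.
encrypted_total = sum(encrypted_values[1:], encrypted_values[0])

# Only the key holder sees the aggregate, never the individual records.
print(private_key.decrypt(encrypted_total))  # 54
```

The same additive property supports counts, averages, and similar aggregate statistics, which is why this family of techniques is attractive when individual-level disclosure is the central concern.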

At the same time, these newfound data sources hold significant promise for advancing scholarship, supporting evidence-based policymaking and more robust government statistics, and shaping more impactful social interventions. But because most of this data is held by the private sector, it is rarely available for these purposes, posing what many have argued is a serious impediment to scientific progress.

A variety of reasons have been posited for the corporate sector’s reluctance to share data for academic research. Some have suggested that the private sector doesn’t realize the value of its data for broader social and scientific advancement. Others suggest that companies have no “chief mission” or public obligation to share. But most observers describe the challenge as complex and multifaceted. Companies face a variety of commercial, legal, ethical, and reputational risks that serve as disincentives to sharing data for academic research, with privacy – particularly the risk of re-identification – an intractable concern. For companies, striking the right balance between the commercial and societal value of their data, the privacy interests of their customers, and the interests of academics presents a formidable dilemma.

To be sure, there is evidence that some companies are beginning to share data for academic research. For example, a number of pharmaceutical companies are now sharing clinical trial data with researchers, and a number of individual companies have taken steps to make data available as well. What is more, companies are increasingly providing open or shared data for other important “public good” activities, including international development, humanitarian assistance, and better public decision-making. Some are contributing to data collaboratives that pool data from different sources to address societal concerns. Yet it is still not clear whether and to what extent this “new era of data openness” will accelerate data sharing for academic research.

Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, we aim to contribute to the literature by seeking the “ground truth” from the corporate sector about the challenges they encounter when they consider making data available for academic research. We hope that the impressions and insights gained from this first look at the issue will help formulate further research questions, inform the dialogue between key stakeholders, and identify constructive next steps and areas for further action and investment.

READ REPORT

FPF gratefully acknowledges the support of the Alfred P. Sloan Foundation for this project.

New Study: Companies are Increasingly Making Data Accessible to Academic Researchers, but Opportunities Exist for Greater Collaboration

FOR IMMEDIATE RELEASE

November 14, 2017

Contact: Melanie Bates, Director of Communications, [email protected] 

New Study: Companies are Increasingly Making Data Accessible to Academic Researchers, but Opportunities Exist for Greater Collaboration

Washington, DC – Today, the Future of Privacy Forum released a new study, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. In this report, FPF reveals findings from research and interviews with experts in the academic and industry communities. Three main areas are discussed: 1) The extent to which leading companies make data available to support published research that contributes to public knowledge; 2) Why and how companies share data for academic research; and 3) The risks companies perceive to be associated with such sharing, as well as their strategies for mitigating those risks.

“More widespread access to corporate data sets would support new scholarship and allow researchers to consider questions that cannot fully be answered from publicly available data alone,” said Leslie Harris, FPF Senior Fellow and Understanding Data Sharing Decisions’ Principal Researcher. “In this exploratory study, we aim to contribute to the literature by seeking the ‘ground truth’ from the corporate sector about the challenges they encounter when they consider making data available for academic research.”

Of the companies interviewed, 70% report making at least some data available to academic researchers. Half of the sharing companies began making data available to external researchers within the last five years. Close to half of the interviewed companies said that the main reason for sharing data for research was to obtain insights that would help the company “better execute” or “better understand” their mission. A number of companies also said that sharing data for research helped to build their brands, strengthen relationships with academics, and attract talent to the company. The study also found that companies are concerned about privacy, particularly the risk of re-identification. Companies are equally concerned that sharing data for research might diminish or destroy the intellectual property value of their data.

FPF identified several opportunities to promote data-driven research: 1) to enhance the positive public profile of company/academic data sharing; 2) to help mitigate perceived risks, particularly privacy and re-identification risks; 3) to develop and share tools for public outreach and community engagement; 4) to encourage peer-to-peer knowledge sharing; and 5) to create a clearinghouse identifying data types desired by academics.

“We hope that the impressions and insights gained from this first look at the issue will help formulate further research questions, inform the dialogue between key stakeholders, and identify constructive next steps and areas for further action and investment,” said Jules Polonetsky, FPF’s CEO.

FPF released Understanding Data Sharing Decisions today at the ADRF Network Inaugural Conference during the session on Expanding Private Sector Administrative Data Access. The discussion centered on why and how companies share data for academic research, strategies for mitigating risks and building trust, and recommendations for encouraging company-academic data sharing.

FPF would like to thank Leslie Harris (FPF Senior Fellow), the Principal Researcher of this report, and Chinmayi Sharma (University of Virginia School of Law), Research Assistant. FPF gratefully acknowledges the support of the Alfred P. Sloan Foundation for this project.

###

The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.

FPF Advisory Board Member Amie Stepanovich Discusses 'Why Inclusion Matters'

Pictured Above: Amie Stepanovich (Access Now)

Yesterday, Future of Privacy Forum Advisory Board Member Amie Stepanovich, U.S. Policy Manager at Access Now, published an article explaining the importance of ensuring that marginalized communities have greater influence over how emerging technologies are developed. Amie says:

“Ultimately, we won’t be able to change things in any significant way so long as we create and facilitate an environment that is hostile toward diversity. Instead of digging our heads into the sand when scandals erupt, those of us working in tech should embrace change and invest in identifying and promoting smart, diverse voices. That means developing institutional and operational systems and processes that respect the range of backgrounds and experiences that diversity brings; creating public policies that are not developed or dictated by a single point of view; and providing platforms for discussion such as panels or events that highlight the voices and perspectives of under-represented people and organizations that are breaking through societal roadblocks and developing valuable expertise, often at great personal cost.”

READ ARTICLE