Study: EU-US Privacy Shield Essential to Leading European Companies

New FPF Study Documents More Than 100 European Companies Participating in the Privacy Shield Program

From Major Employers such as Ingersoll-Rand and Lidl Stiftung to Leading Technology Firms like Telefónica, RELX, and TE Connectivity, European Companies Depend on the EU-US Agreement

EU Firms are Signing up for Privacy Shield at a High Rate – the One-Year-Old Privacy Shield Program Includes a Larger Percentage of EU Companies than the Predecessor Safe Harbor Program

Termination of the Privacy Shield Program Could Inhibit European Employment – Nearly One Third of Privacy Shield Companies Rely on the Framework to Transfer HR Information of European Staff

The Future of Privacy Forum conducted a study of the companies enrolled in the EU-US Privacy Shield program and determined that 114 European-headquartered companies are active Privacy Shield participants. These European companies rely on the program to transfer data to their US subsidiaries or to essential vendors that support their business needs.

FPF staff also found that EU companies comprise 5.2% of all Privacy Shield companies, an increase over the 3.5% of companies that were based in Europe under the predecessor EU-US Safe Harbor Program in 2014. Though the program is only a year old, Privacy Shield participation has already surpassed 2,000 companies and typically grows by several members each week.

Leading EU companies that rely on Privacy Shield include:

ABB, Swiss electrical equipment company

CNH Industrial America, Dutch capital goods company

Ingersoll-Rand, Irish globally diversified industrial company

Lidl Stiftung, German grocery market chain

NCS Pearson, British education assessment and publishing company

Reckitt Benckiser, British consumer goods company

RELX, British and Dutch information and analytics company

TE Connectivity, Swiss connectivity and sensor solutions company

Telefónica, Spanish mobile network provider

With the first annual review of the Privacy Shield framework underway, it is important for parties on both sides of the Atlantic to recognize the program’s benefits to the US and to Europe. Although no system is perfect, there is substantial value for many stakeholders, including leading European companies, in maintaining Privacy Shield protections for companies and consumers in both the United States and Europe.

FPF research also determined that over 700 companies, nearly a third of the total number analyzed, use Privacy Shield to process their human resources data. Inhibiting the flow of HR data between the US and EU could mean delays for EU citizens receiving their paychecks, or a decline in global hiring by US companies. Therefore, employees stand to lose if the Privacy Shield were terminated or materially altered.

The research identified 114 Privacy Shield companies headquartered or co-headquartered in Europe. This is a conservative estimate of companies that would be impacted by cancelation of the Privacy Shield framework – FPF staff did not include global companies that have major European offices but are headquartered elsewhere. The 114 companies include some of Europe’s largest and most innovative employers, doing business across a wide range of industries and countries. EU-headquartered firms and major EU offices of global firms depend on the Privacy Shield program so that their related US entities can effectively exchange data for research, to improve products, to pay employees and to serve customers. These companies would be severely burdened and disadvantaged by termination of the Privacy Shield program. Given the importance of this mechanism to companies and consumers on both sides of the Atlantic, FPF recommends that the Privacy Shield arrangement be preserved.

Methodology:

• FPF staff recorded a list of 2,188 active Privacy Shield companies as of July 2017 from https://www.privacyshield.gov.

• FPF staff performed a web search for each current company by name, checking the location of the company’s headquarters on a combination of public databases such as LinkedIn, CrunchBase, Bloomberg, and the company’s own website.

• A company that listed its headquarters in an EU member state or in Switzerland was counted as a match; companies that merely had a prominent EU office or were founded in an EU member state were not counted.

• 114 total European-headquartered companies were identified using this method.
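The final tally reduces to a simple filter over the hand-verified records. The sketch below illustrates that counting step in Python; the participant records and the abbreviated country set are hypothetical examples, not FPF's actual dataset or code.

```python
# Illustrative sketch of the final counting step: each participant
# record carries a manually verified headquarters country, and a
# company counts as "European" only if that country is an EU member
# state or Switzerland. Records and country list are hypothetical.

EUROPEAN_COUNTRIES = {
    # All EU member states would be listed here; a few are shown for
    # brevity, plus Switzerland, which the methodology also counts.
    "Germany", "Ireland", "Spain", "Netherlands", "Switzerland",
}

participants = [
    {"name": "Lidl Stiftung", "hq_country": "Germany"},
    {"name": "Telefonica", "hq_country": "Spain"},
    {"name": "Example US Corp", "hq_country": "United States"},
]

def count_european(records):
    """Count records whose verified HQ is in an EU state or Switzerland."""
    return sum(1 for r in records if r["hq_country"] in EUROPEAN_COUNTRIES)

european = count_european(participants)
print(european)                                # European-headquartered count
print(f"{european / len(participants):.1%}")   # their share of the sample
```

Applied to the real list, the same ratio (114 of 2,188) yields the 5.2% figure cited above.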

For the full list of European companies in the Privacy Shield program, or to schedule an interview with John Verdi or Jules Polonetsky, email [email protected].

Upcoming Future of Privacy Forum Events

In the coming months, FPF will be hosting or co-hosting a number of events. We welcome your attendance and participation. Please contact us at [email protected] for further information.

Events

Sustainable Innovation with Effective Data Protection in the Pacific Rim

presented by Information Accountability Foundation and FPF

September 25, 2017

Hong Kong

Refining Privacy to Improve Health Outcomes

presented by Triangle Privacy Research Hub, FPF, Intel Corporation, North Carolina Journal of Law & Technology (NC JOLT), Information Accountability Foundation (IAF), Duke University, and Indiana University 

October 26-27, 2017

Chapel Hill

Bridging Industry and Academia to Tackle Responsible Research and Privacy Practices

Organized by Xinru Page, Bentley University; Pamela Wisniewski, University of Central Florida; Margaret Honda, Future of Privacy Forum; Jen Romano-Bergstrom, Instagram and President of the User Experience Professionals Association; Sona Makker, Facebook; Norberto Andrade, Facebook

November 2-3, 2017 (Call For Participation deadline is September 22, 2017)

New York

Privacy Localism: A New Research Agenda

presented by Information Law Institute and FPF

November 3, 2017

NYU School of Law

Brussels Privacy Symposium: AI Ethics

presented by the Brussels Privacy Hub of the Vrije Universiteit Brussel (VUB) and FPF

November 6, 2017

Brussels

Privacy Engineering Research and the GDPR: A Trans-Atlantic Initiative

presented by Internet Privacy Engineering Network, University of Leuven, Carnegie Mellon University, and FPF

November 10, 2017 (Call For Papers deadline is September 24, 2017)

Leuven, Belgium

Roundtable Discussion: Smart Cities and Open Data 

presented by FPF at the 2017 MetroLab Network Annual Summit

December 14, 2017

Atlanta, Georgia

2017 Privacy Papers for Policymakers

presented by FPF and the Congressional Bi-Partisan Privacy Caucus 

February 27, 2018 (Nomination deadline is September 26, 2017)

Capitol Hill


Sustainable Innovation with Effective Data Protection in the Pacific Rim

The Pacific Rim is a hotbed of innovation in observational technologies and in their practical application, driven by advanced analytics and artificial intelligence. For these applications to be sustainable, policy mechanisms are needed that respect the diverse ethical and legal cultures of the region and are interoperable with other regions. This session will explore the application of information policy governance to new technologies emerging in the region.

Organizers: Information Accountability Foundation and FPF

When: September 25, 2017; 3:30pm – 5pm

Where: 

Kowloon Shangri-La

Kowloon Room II (M/F)

Hong Kong

This is an official side event of the 39th International Conference of Data Protection and Privacy Commissioners. Please click the link below for details. 

READ MORE


FPF’s Stacey Gray will be giving an exclusive presentation on data security and privacy guidelines for voice-activated devices like the Amazon Echo, Google Home, and many more. Breakfast will be served, followed by networking with fellow IoT executives and entrepreneurs.

Agenda:

8AM: Arrival & Breakfast

8:30AM: ‘The Future of IoT Privacy & Security’ Presentation by Stacey Gray, Future of Privacy Forum

9:00AM: Q&A and Curated Discussion

9:30AM: Executive Networking

To receive an invitation, please contact [email protected]


Refining Privacy to Improve Health Outcomes

The Triangle Privacy Research Hub’s symposium will bring together leading experts in the field of privacy, health, and data to discuss how new technologies and data sources can improve health outcomes, while protecting individual privacy. The goal of the event is to propose specific law, policy and practice changes to promote the more effective use of data for health.

The symposium is sponsored by the Triangle Privacy Research Hub (TPRH), Intel Corporation, Future of Privacy Forum (FPF), the North Carolina Journal of Law & Technology (NC JOLT), the Information Accountability Foundation (IAF), Duke University Science and Society Initiative, Duke University Masters of Management in Clinical Informatics (MMCi), and Indiana University Center for Law, Ethics, and Applied Research (CLEAR) in Health Information.

Agenda:

Thursday, October 26 – Sponsored by the North Carolina Journal of Law & Technology (NC JOLT)

Location: Rizzo Center: 150 Dubose Home Lane, Chapel Hill, NC

1:00 p.m. Keynote Remarks

1:45 p.m. – 5:00 p.m. Panel Discussions with Audience Q&A

Friday, October 27 

Location: Duke Innovation & Entrepreneurship Initiative at The Bullpen: 215 Morris St #300, Durham, NC

9:00 a.m. Keynote Remarks

9:45 a.m. – 1:00 p.m. Panel Discussions with Audience Q&A

Lunch will be provided following the program

Please note this event is invitation only. For more information, please contact [email protected].


Bridging Industry and Academia to Tackle Responsible Research and Privacy Practices

As more human interactions move online and the amount and variety of information shared digitally continues to grow, decisions regarding the collection, sharing, and use of this data must take into account both ethical and privacy considerations. It is important that industry and academia come together to find joint solutions for making these difficult decisions regarding privacy and ethics to maximize the benefits of data-driven research and practices, while ensuring that harms and negative outcomes are prevented. To bridge communication between these communities, we are organizing a workshop for thought leaders from academia, industry and civil society to identify common goals, establish a long-term vision, and initiate working teams for tangible projects focused on responsible research ethics and privacy practices around user data.

When: November 2-3, 2017 (CFP deadline is September 22, 2017)

Location: Facebook Offices, 770 Broadway, New York, NY

To encourage a diverse set of attendees, we ask those who are interested to provide their resume or CV and a 1 to 2-page submission. Please click the link below for details. 

READ MORE


Privacy Localism: A New Research Agenda

Academic experts on administrative law, privacy, federalism, and local governance will be joined by policymakers, industry representatives and privacy advocates to present and discuss a variety of perspectives on the legal, empirical and policy implications of this trend toward “privacy localism.”

November 3, 2017

NYU School of Law

D’Agostino Hall, Lipton Hall, 108 West 3rd Street

Organizers:  Katherine J. Strandburg (ILI Faculty), Ira Rubinstein (ILI Senior Fellow), Bilyana Petkova (ILI Fellow, Assistant Professor, Maastricht University)

Co-sponsors: Information Law Institute and Future of Privacy Forum

RSVP: Email Bilyana Petkova


AI Ethics: The Privacy Challenge

Please join us for the 2nd Annual Brussels Privacy Symposium. The 2017 Symposium is titled AI Ethics: The Privacy Challenge and is jointly presented by FPF and the Brussels Privacy Hub of the Vrije Universiteit Brussel (VUB). This all-day workshop, hosted on 6 November 2017 in Brussels, Belgium, will give attendees the opportunity to examine issues surrounding Artificial Intelligence and will feature papers and speakers from around the globe.

Selected interdisciplinary works in law and policy, computer science and engineering, social studies, and economics will be published in a special issue of IEEE Security & Privacy Magazine.

When:

6 November, 2017

10:00 AM – 5:00 PM

Where:

Vrije Universiteit Brussel,

Institute of European Studies,

Pleinlaan 5, 1050

Brussels, Belgium

The event is complimentary, but space is limited, so please register if you plan to attend. For more event information, please click here to be directed to the Brussels Privacy Symposium webpage. Special thanks to our 2017 Brussels Privacy Symposium sponsors and supporters.

REGISTER HERE

Privacy Engineering Research and the GDPR: A Trans-Atlantic Initiative

When the EU’s GDPR becomes fully applicable on 25 May 2018, many data protection requirements will be seen in a new perspective. With this event, we aim to determine the relevant state of the art in privacy engineering; in particular, we will focus on those areas where the “art” needs to be developed further. The goal of this trans-Atlantic initiative is to identify open research and development tasks, which are needed to make the full achievement of the GDPR’s ambitions possible.

Event Format:

  • Review the state of the art – including current solutions
  • Discuss current research in the field
  • Break-out Sessions – participants will focus on “key challenges” and identify challenges for research and development
  • All participants will discuss the outcomes of the break-out sessions and suggest next steps at the conclusion of the Workshop.

When: November 10, 2017 (CFP deadline is September 15, 2017)

Location: Leuven, Belgium

To encourage a diverse set of attendees, we ask those who are interested in attending the workshop to provide a one-page submission. Please click the link below for details. 

READ MORE


Roundtable Discussion: Smart Cities and Open Data (2017 MetroLab Network Annual Summit)

WHAT

Privacy and Open Data

The Smart Cities and Open Data movements promise to use data to spark civic innovation and engagement, promote inclusivity, and transform modern communities. At the same time, advances in sensor technology, re-identification science, and Big Data analytics have challenged cities and their partners to construct effective safeguards for the collection, use, sharing, and disposal of personal information. In this breakout session, we will discuss privacy risks in open data programs and how cities like Seattle are promoting transparency while protecting individual rights.

Privacy and Urban Instrumentation

As cities harness more data than ever, how can we assess the risks and opportunities of new technologies and data flows while preserving public trust and individual privacy? In this breakout session, come hear from Cities, CIOs, academic leaders, and industry experts as we examine the opportunities and challenges of new urban instrumentation and how we can come together to address privacy challenges in smart cities.

WHERE

2017 MetroLab Network Annual Summit

Georgia Tech

Atlanta, Georgia

WHEN

December 14, 2017

2:30 PM & 4:00 PM

READ MORE


First Take: Privacy in Updated Federal Guidance for Automated Driving

Yesterday, the Department of Transportation and the National Highway Traffic Safety Administration issued updated guidance for autonomous vehicles, streamlining last year’s guidance, incorporating public comments, and stripping privacy from its recommendations.

While Transportation Secretary Elaine Chao’s opening message highlights the importance of protecting consumer privacy, privacy was among three elements removed from the updated guidance, along with “Registration and Certification” and “Ethical Considerations.” This means these topics are not to be included in the Safety Assessment Letters, which the guidance stresses are voluntary, despite recent legislative proposals that would make them mandatory. Readers seeking information on NHTSA’s updated approach to privacy had to look to the footnotes, which point to a new NHTSA website, where answers can be found by clicking through and expanding an extended series of Q&As.

A close read of these Q&As reveals that the removed sections of the guidance “remain important and are areas for further discussion and research,” and that the updated guidance instead focuses on topics readier for near-term implementation. The site emphasizes the Federal Trade Commission’s role as the chief agency for protecting consumer privacy in this space, in part because “privacy is not directly relevant to motor vehicle safety.” This is one of the clearest instances to date of NHTSA clarifying the limits of its jurisdiction over privacy relative to the FTC.

A call for data sharing between entities in this space was also largely removed from the guidance, with NHTSA citing on its website industry concern about sharing proprietary information and a lack of clarity around safety metrics. While much of the promise of advanced driving systems will eventually lie in their ability to enhance safety through data sharing, the nascent state of the technology makes it early for such mandates between non-governmental entities, and NHTSA focuses first on building standardized mechanisms for reporting crash reconstruction data to government. In the meantime, it does encourage voluntary collaboration among industry.

NHTSA’s removal of a privacy focus stands in contrast to some recent approaches to this issue. The SELF DRIVE Act that passed the House of Representatives last week includes a comprehensive section on consumer privacy: it mandates that manufacturers create and share a “privacy plan,” creates an advisory group that will review privacy among its topics, and calls on the FTC to undertake a study of the topic (our full recap of the House bill is here). In June, NHTSA and the FTC co-hosted a workshop focused on privacy in connected cars, emphasizing the two agencies’ commitment to the issue, though the FTC took a much more vocal role throughout the workshop. Also of note, a recently shared Staff Discussion Draft of the Senate version of self-driving legislation does not include a privacy section, and its data recording section focuses on crash data.

For those of us who think that proactively protecting consumer privacy in the connected car is crucial to the adoption of this safety-enhancing technology, its absence from the NHTSA guidance is notable. But some may find that this move achieves the clarity that many of us, including the Government Accountability Office in a recent study, have called for regarding NHTSA and FTC roles—by explaining that nearly all issues related to privacy in connected cars fall squarely in the FTC’s camp.

RELATED FPF RESOURCES

See our recap of the privacy elements of the first Federal Automated Vehicles Policy here

See our comments on the first Federal Automated Vehicles Policy here

See our comments on the NHTSA/FTC Workshop here

See our infographic mapping data flows in the connected car here

Bridging Industry and Academia to Tackle Responsible Research and Privacy Practices

Workshop Description

As more human interactions move online and the amount and variety of information shared digitally continues to grow, decisions regarding the collection, sharing, and use of this data must take into account both ethical and privacy considerations. It is important that industry and academia come together to find joint solutions for making these difficult decisions regarding privacy and ethics to maximize the benefits of data-driven research and practices, while ensuring that harms and negative outcomes are prevented. To bridge communication between these communities, we are organizing a workshop for thought leaders from academia, industry and civil society to identify common goals, establish a long-term vision, and initiate working teams for tangible projects focused on responsible research ethics and privacy practices around user data. The workshop will kick-off with a keynote from Facebook’s Deputy Chief Privacy Officer, Rob Sherman, and a panel discussion from our advisory board members. 

Workshop Organizers: Xinru Page, Bentley University; Pamela Wisniewski, University of Central Florida; Margaret Honda, Future of Privacy Forum; Jen Romano-Bergstrom, Instagram and President of the UXPA (User Experience Professionals Association); Sona Makker, Facebook; Norberto Andrade, Facebook


Advisory Board Members: Chris Clifton, Purdue University; Lorrie Cranor, Carnegie Mellon University, Director CyLab Usable Privacy and Security Laboratory; Lauri Kanerva, Facebook, Research Management Lead; Helen Nissenbaum, Cornell Tech and New York University; Jules Polonetsky, Future of Privacy Forum, Chief Executive Officer (full board to be finalized soon)

When: November 2 – 3, 2017

Location: Facebook Offices, 770 Broadway, New York, NY

To encourage a diverse set of attendees, we ask those who are interested to provide their resume or CV and a 1 to 2-page submission. Please click the link below for details. 

READ MORE

Bipartisan Report: Feds Should Connect Student Data While Protecting Privacy

Today, the Commission on Evidence-Based Policymaking released its final report. The Commission was created through bipartisan legislation in 2016 to “consider how to strengthen government’s evidence-building and policymaking efforts” (page 16). One of the key issues the Commission heard about from advocates on all sides was whether to overturn the current federal ban on connecting education data collected by the federal government, in order to provide students, postsecondary institutions, and the public with information that could be used to improve policies or better target federal funding. Education organizations that support overturning the ban were very excited by the report: among other recommendations, the Commission advocated that Congress consider repealing current bans on the collection and use of data for evidence building. However, the concerns of privacy advocates that appeared in many public comments to the Commission were not overlooked: the word “privacy” appears in the 114-page report 408 times, and the Commission recommended numerous privacy and security protections.

Why Overturn the Ban?

The federal government collects a lot of data in order to do its job. For example, data is collected about students and their parents in order to provide them with financial aid. Some leading education organizations argue that this data is generally not being used efficiently or effectively.

The federal government collects student-level data from higher education students on demographics, income, assets, and educational attainment, as well as federal aid amounts and history, through the FSA National Student Loan Data System and Central Processing System. Yet college students are not given the critical information they need about how students with similar characteristics perform at certain institutions and the expected employment value relative to the educational cost. In addition to the FSA data systems, the National Center for Education Statistics collects institution-level data on graduation rates and pricing, but this information does not include transfer and part-time students, who comprise a significant percentage of the student population.

Many education advocates have urged an end to the ban, arguing that this would enable new data uses: letting students compare potential colleges, allowing postsecondary institutions to report certain student data once instead of multiple times to various federal agencies, and letting taxpayers know whether money spent on higher education is being used effectively.

Groups opposed to overturning the ban have primarily raised potential privacy and security concerns, such as the danger of the information being repurposed and used in law enforcement or immigration proceedings, and the recent history of problematic data breaches and inadequate security in some federal agencies.

What Did the Report Say?

The report advocates that Congress “consider repealing current bans and limiting future bans on the collection and use of data for evidence building” (recommendation 2-5). Despite receiving more comments on overturning the federal ban on sharing federally held education data than on any other issue (page 30), the Commission explicitly rejected the presumption that “increasing access to confidential data…significantly increas[es] privacy risk” (page 1). The report states that “there are steps that can be taken to improve data security and privacy protections beyond what exists today, while increasing the production of evidence” (page 1). The Commission used the Fair Information Practice Principles (FIPPs) as its framework to suggest additional ways to protect information already held by the federal government, and suggested balancing privacy against public right-to-know interests by weighing:

  1. The potential public benefits of the research project;
  2. The sensitivity of the data that would be accessed in that project; and
  3. “Any risk that allowing access could pose to confidentiality” (page 24).

Most significantly for privacy advocates concerned that data held at the federal level could be used for unintended purposes (such as a law enforcement action against, for example, a college student whose information was in the NSLDS), the Commission recommended using a pre-existing legal framework to restrict the use of the data to “statistical purposes” only. Violations would be subject to major penalties under the Confidential Information Protection and Statistical Efficiency Act (CIPSEA), which could include fines and/or jail time.

Practically, this means that data would only be used by certain vetted governmental or non-governmental researchers to show the efficacy of a federal program or federal funding. For example, this would allow the government – and therefore the public – to know how students with similar characteristics perform at certain institutions and the expected employment value relative to the educational cost.

The Commission also discussed how to keep data safe, suggesting several methods to share data and disclose deidentified or aggregated data to the public while minimizing the likelihood of any privacy harms occurring.

What’s Next?

Speaker of the House Paul Ryan and Senator Patty Murray said at the press conference for today’s report release that they will be introducing bills in the House and Senate to codify the Commission’s recommendations into law. The report may also improve the chances of passing the College Transparency Act, a bi-partisan bill introduced earlier this year that would overturn the ban while improving the privacy and security of student data held by the federal government. Regardless of the legislative vehicle, the debate about federal collection and use of student data is sure to be an active one in the coming months.

 

This blog is cross-posted at https://studentprivacycompass.org/cepreport.

The Top 10: Student Privacy News (July – August 2017)

The Future of Privacy Forum tracks student privacy news very closely and shares relevant news stories with our newsletter subscribers. Approximately every month, we post “The Top 10,” a blog with our top student privacy stories. This blog is cross-posted at studentprivacycompass.org.

The Top 10

  1. With the announced elimination of the DACA program yesterday, immigration data and education are on everyone’s minds again.
  2. Research from Leah Figueroa, a data analyst who has worked in higher education for 13 years, showed that some higher ed institutions are inappropriately (but not illegally) releasing vast amounts of directory information (see her talk at Infosec Southwest titled “FERPA: Only Your Grades Are Safe – OSINT in Higher Education,” but note some of her FERPA/USED info is incorrect).
  3. Ben Herold of EdWeek released a great primer for schools on COPPA (as well as 5 new takeaways from the FTC on COPPA).
  4. Personalized learning continues to be a hot topic (as always, worth noting Monica Bulger’s awesome article on the topic and its privacy implications).
  5. Discussions of predictive analytics, algorithms, and AI showed up in a lot of articles this month as well.
  6. Edtech, research, and privacy were also a trend in the past month.
  7. GDPR has been coming up in education news. In K-12, this included “Preparing for the General Data Protection Regulation (GDPR) – 10 Steps for Schools [in the UK]” via Harrison Clark Rickerbys Solicitors; and “There Will be Blood – GDPR and EdTech,” via Eylan Ezekiel. EDUCAUSE covered it for higher ed in “GDPR: A Data Regulation to Watch.”
  8. “The International Working Group on Data Protection in Telecommunications has adopted new recommendations to improve privacy and security standards for e-learning platforms,” via EPIC.
  9. Many articles raised (or should have raised) important student surveillance questions: “Flagler School District Enters Brave New World of Student Computer Controls and Surveillance,” via FlaglerLive; “NY schools to use vaping, bullying detectors,” via Fox5NY; “Student suicide prevention body in talks with Google and Facebook to help at-risk Hongkongers,” via South China Morning Post; “How schools are tracking students using their mobile phones,” via The Age; and “Alief ISD rolls out GPS technology to keeps students safe,” via Click2Houston. It may be worth reading my previous report on surveillance and privacy for those interested in the costs and benefits of surveillance technologies in schools (as well as practical steps to regulate them).
  10. Security issues continued to come up this past month.

Image: “Drawing competition for school kids” by Simply CVR  is licensed under CC BY 2.0.

What does privacy mean to you?

Future of Privacy Forum’s CEO, Jules Polonetsky, spoke with Goethe Institut about what privacy means to him. Jules discussed his experience leading FPF and the various circumstances he encounters. You can watch the full clip below.

Video

The Goethe-Institut is the Federal Republic of Germany’s cultural institute, active worldwide. It promotes the study of German abroad and encourages international cultural exchange. Learn more by visiting www.goethe.de.

Privacy Engineering Research and the GDPR: A Trans-Atlantic Initiative

Workshop Description

As stated in the US National Privacy Research Strategy, a multidisciplinary research approach to understanding privacy is needed, and deploying new privacy-aware approaches may require changes in existing technology systems, business processes, regulations, and laws. When the EU’s GDPR becomes fully applicable on 25 May 2018, many data protection requirements will be seen in a new perspective. Among other aspects, “data protection by design and by default” will become an explicit legal obligation. Organizations processing personal data will have to apply privacy engineering so that their systems implement data protection principles and integrate the necessary safeguards.

With this event, we aim to determine the relevant state of the art in privacy engineering; in particular, we will focus on those areas where the “art” needs to be developed further. The goal of this trans-Atlantic initiative is to identify open research and development tasks, which are needed to make the full achievement of the GDPR’s ambitions possible. See full agenda here.

When: Friday, November 10, 2017 from 9:30 – 17:00

Where: University of Leuven, Parthenonzaal (Mgr. Sencie Instituut MSI 1 03.18) Erasmusplein 2, 3000 Leuven, Belgium

Click here to find more detailed information about how to get to the University of Leuven and other practical matters.

Featured Speakers and Panelists

  • Giovanni Buttarelli, European Data Protection Supervisor
  • Wojciech Wiewiorowski, Assistant European Data Protection Supervisor
  • Norman Sadeh, Professor of Computer Science and Co-Director, Privacy Engineering Program, Carnegie Mellon University (CMU)
  • Claudia Diaz, Professor at the COSIC research group of the Department of Electrical Engineering (ESAT), KU Leuven
  • Josep Domingo-Ferrer, Professor of Computer Science, Chairholder of the UNESCO Chair in Data Privacy, and ICREA-Acadèmia Researcher, Universitat Rovira i Virgili
  • Jaap-Henk Hoepman, Professor in the Digital Security group at the Institute for Computing and Information Sciences and Scientific Director of the Privacy & Identity Lab, Radboud University Nijmegen
  • Naomi Lefkovitz, Senior Privacy Policy Advisor in the Information Technology Lab at the National Institute of Standards and Technology, U.S. Department of Commerce
  • Simon Hania, Vice President Privacy & Security / Corporate Privacy Officer, TomTom

Event Agenda:

This full-day session will include:

Who should attend:

Organizers


Call for Participation

This workshop aims to bring together those working at the forefront of research on adapting data processing strategies and developing privacy engineering, as well as practitioners applying these technologies. The workshop will include a keynote presentation and a panel discussion reviewing the current landscape, followed by breakout sessions addressing relevant themes, including but not limited to those listed below. To encourage a diverse set of attendees, we ask those who are interested in attending the workshop to provide a one-page submission that includes the following information (no specific format required):

Biosketch. One to two paragraphs introducing who you are and your background, including whether you come from industry, the public sector, academia, or other (please specify).

Theme. The theme that most strongly resonates with you, why, and your stated position on this theme. It may be related to a project you are already working on, a project you recently concluded, or a project you intend to start. Please explain in what sense you are well-positioned to help address this theme.

Influence. Please describe how you would be influential within your community, for example by disseminating the workshop outcomes after the event or implementing resulting measures.

Please send submissions to [email protected]. Submissions are now due 5 October 2017. Submissions will be reviewed by the workshop program organizers — Jules Polonetsky, CEO, Future of Privacy Forum; Achim Klabunde, IPEN/European Data Protection Supervisor; Bettina Berendt, Professor in the Computer Science Department at the University of Leuven/DTAI; and Norman Sadeh, Professor of Computer Science and Co-Director, Privacy Engineering Program, Carnegie Mellon University (CMU) — and attendance will be granted with the goals of ensuring a diverse set of attendees and relevance to the workshop themes. Participation is free of charge.

Possible Workshop Themes to be discussed include, but are not limited to:

You can find information about accommodation at http://www.visitleuven.be/en/stayingover. For other questions regarding the venue, please contact the local organiser, Bettina Berendt, at [email protected].

This workshop is supported in part by the National Science Foundation under Grant No. 1654085.

FPF Statement on GAO Release of Vehicle Data Privacy Report

A new report released today by the United States Government Accountability Office reviews consumer privacy issues related to connected vehicles. The report examines the use, types, and sharing of vehicle data; surveys automakers to understand how their privacy policies align with privacy best practices; consults experts in the field to understand the issues at play in this space; and examines related Federal efforts. The report paints a picture of an industry on the brink of significant change, with industry and Federal practices actively adapting to these new realities.

The report’s core recommendation is that the National Highway Traffic Safety Administration better define its roles and responsibilities related to connected vehicle data privacy. The Department of Transportation agreed to provide a detailed response within 60 days.

In a key conclusion, GAO explains that most of the automakers surveyed (representing 90 percent of the U.S. auto market) do limit data collection, use, and sharing in accordance with privacy best practices. Despite these positive practices, their privacy notices are written in legalese that makes them difficult for consumers to understand, do not specify data sharing and use practices, and offer limited individualized controls to consumers.

“We applaud GAO for their thoughtful survey of this space, and FPF was pleased to have served as a selected subject matter expert for the report,” said Lauren Smith, FPF Policy Counsel. She continued, “GAO did a laudable job describing the landscape of connected cars and privacy, tracking progress and practices in the industry, highlighting areas for improvement, and calling for clarity in Federal agency roles.”

The report highlights several FPF projects, including our consumer guide to data in the connected car, and our paper describing the spectrum of de-identification.

Just as the GAO report supports greater understanding of data on connected vehicles, our recently released infographic “Data and the Connected Car – Version 1.0,” similarly maps data elements in the connected car.

Smart Cities Need Smart Privacy Protections: FPF seeks public comments on proposed Open Data Risk Assessment for the City of Seattle

Action: Proposed draft report on conducting an Open Data Risk Assessment for the City of Seattle

Published: August 18, 2017

Comments due: October 2, 2017

The Future of Privacy Forum (FPF) requests feedback from the public on the proposed City of Seattle Open Data Risk Assessment. In 2016, the City of Seattle declared in its Open Data Policy that the city’s data would be “open by preference,” except when doing so may affect individual privacy. To ensure its Open Data program effectively protects individuals, Seattle committed to performing an annual risk assessment and tasked FPF with creating and deploying an initial privacy risk assessment methodology for open data.

This Draft Report provides tools and guidance to the City of Seattle and other municipalities navigating the complex policy, operational, technical, organizational, and ethical standards that support privacy-protective open data programs. In the spirit of openness and collaboration, FPF invites public comments from the Seattle community, privacy and open data experts, and all other interested individuals and stakeholders regarding this proposed framework and methodology for assessing the privacy risks of a municipal open data program.

Following this period of public comment, a Final Report will assess the City of Seattle as a model municipality and provide detailed recommendations to enable the Seattle Open Data program to identify and address key privacy, ethical, and equity risks, in light of the city’s current policies and practices.

How to Comment:

Please note that the comment period has closed as of Oct. 2, 2017

All timely and responsive public comments will be considered and will be made available on a publicly accessible FPF or City of Seattle website after the final report is published. Because comments will be made public to the extent practical, they should not include any sensitive or confidential information.

Interested parties may provide feedback in any of the following ways:

READ THE DRAFT REPORT