WashingTech Tech Policy Podcast’s Episode 107

Jules Polonetsky, Future of Privacy Forum’s CEO, was featured on Episode 107 of the WashingTech Tech Policy Podcast. The show is hosted by Joe Miller and focuses on the top tech law and policy debates driving the tech and communications sectors. Episode 107 centered on online privacy issues and where U.S. privacy law and policy now stand in light of recent data breaches. Jules also discussed what consumers can do to protect their data. Given the recent Equifax data breach, he explained the importance of freezing your credit to prevent bad actors from opening accounts in your name. Listen to the full interview at the link below.

LISTEN

Law Enforcement Access to Student Records: A Guide for School Administrators & Ed Tech Service Providers

FOR IMMEDIATE RELEASE

September 26, 2017

Contact:

Amelia Vance, Policy Counsel, [email protected]

Melanie Bates, Director of Communications, [email protected]

Law Enforcement Access to Student Records:

A Guide for School Administrators & Ed Tech Service Providers

Washington, DC – Today, the Future of Privacy Forum released a new paper, Law Enforcement Access to Student Records: A Guide for School Administrators & Ed Tech Service Providers. With the repeal of the Deferred Action for Childhood Arrivals (DACA) program last month, it is important that schools – and the companies that serve them – understand their legal options and when they may be required to disclose student personal information to law enforcement.

“The Family Educational Rights and Privacy Act (FERPA) broadly prohibits schools from disclosing student records without the written consent of the parent or student,” said Amelia Vance, FPF Policy Counsel. “In this Guide, we highlight two key best practices when responding to federal requests for student data: 1) consult legal counsel to determine your obligations; and 2) carefully align the amount and types of data you collect about students with the programs and services you provide.”

The Guide notes that some schools collect student immigration status or other data that can be used to infer immigration status. “If schools collect student immigration status data, it is considered part of the student record and is protected by FERPA,” Vance said. The Guide explains that schools may only disclose this information with consent or in response to a valid court order or subpoena.

In addition to the Guide, FPF has released an accompanying blog post with a list of supplemental resources and articles.

###

The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.

Law Enforcement Access to Student Records: What Is the Law?

By Mia Little and Amelia Vance

In the current political climate, schools have expressed a great deal of concern about government agencies – including law enforcement – requesting student data in order to identify and deport undocumented students. With the repeal of the Deferred Action for Childhood Arrivals (DACA) program last month, it is important that schools – and the companies that serve them – understand their legal options and when they may be required to disclose student personal information to law enforcement.

Today, the Future of Privacy Forum (FPF) released “Law Enforcement Access to Student Records: A Guide for School Administrators & Ed Tech Service Providers,” written by Amelia Vance and Sarah Williamson. This guide helps to answer some of the basic questions that we have heard from key stakeholders about law enforcement access to data over the past nine months.

The publication emphasizes issues that schools and third-party service providers must consider before disclosing student data in response to law enforcement requests. First, schools and service providers should proactively review the information they collect and align the amount and types of data with the programs and services they provide. Schools should only collect the information they need to help students, and, if the disclosure of that information could cause greater harm than benefit to students, schools should consider deleting that information if they are not legally obligated to retain it. Second, schools and service providers should consult legal counsel to determine their legal obligations to students and law enforcement when presented with a request for student data. Federal civil rights laws and the Supreme Court decision Plyler v. Doe require states to provide equal access to public education for undocumented children, and schools cannot use any information collected about race, ethnicity, national origin, or English proficiency to discriminate against students.

Under the Family Educational Rights and Privacy Act (FERPA), schools and service providers may be required to disclose certain student data if presented with a valid and narrowly tailored warrant, court order, or subpoena. However, schools cannot disclose information they do not have. For example, if your school does not need to collect immigration status data, then don’t collect it. If your school must collect sensitive data to better serve students, then please note that a student’s immigration status is likely considered part of their education record and is therefore protected under FERPA.

With that said, if your school does collect sensitive data that could be used to determine a student’s immigration status, you should know that schools and service providers have legal obligations to both students and law enforcement. Under FERPA, schools that receive funding from the U.S. Department of Education are obligated to protect student privacy. FERPA protects student education records, but some exceptions may apply to disclosures to law enforcement. It is very important for schools to know, however, that if they are compelled to turn over student records to law enforcement, FERPA typically requires that schools notify the student or parent prior to disclosure unless a court has ruled otherwise.

Service providers should always insist on appropriate legal process before disclosing FERPA-protected data to law enforcement. Under FERPA, service providers are only allowed to (re)disclose student data on behalf of the school, and if they engage in any unauthorized disclosure, they may also risk penalties under the Electronic Communications Privacy Act (ECPA), a federal law designed to prevent unauthorized access to private electronic communications. In any case, it’s important for schools to verify that their contracts with service providers require the school to be notified of any record request from law enforcement.

Every expert FPF talked to recommended that schools and service providers consult legal counsel before disclosing student records to law enforcement without consent. You can read our new publication here for more information, and we list additional helpful resources below. If this all sounds overwhelming, remember the number one best practice: minimize legal risk on the back end by limiting the amount and types of data you collect about students on the front end.

***

Mia Little is a 3L at the American University Washington College of Law and an intern at FPF. Amelia Vance is the Education Policy Counsel for FPF. 

Resources

American Immigration Council, “Public Education for Immigrant Students: Understanding Plyler v. Doe,” Oct. 24, 2016.

Benjamin Herold, “Trump’s Anti-Immigration Rhetoric Fuels Data Concerns,” EdWeek, Jan. 13, 2017.

Leah Plunkett, “How the New Immigration Agenda Violates the Promise of Plyler v. Doe & What School Decision-Makers Can Do to Protect Their Students & the Constitution,” Berkman-Klein Center, March 6, 2017.

Program to collect information relating to nonimmigrant foreign students and other exchange program participants, 8 U.S. Code § 1372.

Retention and Reporting of Information for F, J, and M Nonimmigrants; Student and Exchange Visitor Information System (SEVIS), 67 FR 76255.

U.S. Department of Education, K-12 School Officials, last accessed Sept. 25, 2017.

U.S. Department of Education, Resource Guide: Supporting Undocumented Youth, Oct. 20, 2015.

U.S. Immigration and Customs Enforcement, Memorandum on Enforcement Actions in Sensitive Locations, Oct. 24, 2011.

Related News Articles

Betsy Woodruff, “The Trump Administration Now Has Tons Of DACA Data And Is Poised To Weaponize It,” The Daily Beast, Sept. 5, 2017.

Mark Keierleber, “Trump order could give immigration agents a foothold in US schools,” The 74 and The Guardian, Aug. 22, 2017.

Mark Keierleber, “As Immigrant Students Worry About a New School Year, Districts & Educators Unveil Plans to Protect Their Safety (and Privacy),” The 74 and The Guardian, Aug. 21, 2017.

Sean Teehan, “Mass. Schools Emphasize Student Privacy When Dealing With Immigration Officials,” New England Public Radio, Aug. 14, 2017.

Massachusetts Attorney General Maura Healey, “AG Healey Issues Guidance to Health Care Providers and Public Schools on Immigration Enforcement Requests,” May 18, 2017.

Greg Childress, “Durham school board to look at student privacy policy amid deportation fears,” The News & Observer, April 2, 2017.

Jill Tucker, “California pressed to stop collecting students’ citizenship data,” SFGate, March 27, 2017.

Elizabeth A. Harris, “Educators Prepare for Immigration Agents at the Schoolhouse,” The New York Times, March 7, 2017.

Corey Mitchell, “What can schools do to protect undocumented students, and other FAQs,” Education Week, March 6, 2017.

Monica Disare, “As anxiety grows after Trump’s executive orders, what protections do immigrant students have in NYC schools?” Chalkbeat, Feb. 6, 2017.

Image: “Resolution #4 – become more organized” by Victoria Pickering is licensed under CC BY-NC-ND 2.0. The original picture was cropped to 1200×545 for this blog.

Artificial Intelligence, Machine Learning, and Ethical Applications

FPF and IAF to Host Event at 39th International Conference of Data Protection and Privacy Commissioners Discussing Key Technologies and Impacts for Privacy, Data Protection, and Responsible Information Practices

On September 25, 2017, the Future of Privacy Forum and the Information Accountability Foundation will co-host an official side event at the 39th International Conference of Data Protection and Privacy Commissioners in Hong Kong. The event follows IAF’s publication of Artificial Intelligence, Ethics and Enhanced Data Stewardship and an associated blog post, and FPF’s curation of leading research highlighting the privacy challenges posed by artificial intelligence. The presentations and discussion are an excellent opportunity to learn how AI works and why it matters for data protection frameworks, and to discuss the implications of algorithmic systems that interact with people, learn with little or no human intervention, and make decisions that matter to individuals.

Technologists have long used algorithms to manipulate data. Programmers can create software that analyzes information based on rules and logic, performing tasks that range from ranking websites for a search engine to identifying which photos include images of the same individual. Typically, software performs this analysis based on criteria selected and prioritized by human engineers and data scientists. Recent advances in machine learning and artificial intelligence support the creation of algorithmic software that can, with limited or no human intervention, internally modify its processing and criteria based on data.
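To make the contrast concrete, here is a minimal, hypothetical sketch (in Python, using the widely available scikit-learn library; it is not drawn from the FPF or IAF materials) of the difference between criteria chosen by an engineer and criteria a model derives from data:

```python
# Illustrative sketch: a hand-coded rule versus a model that learns its
# own decision criteria from data. All names and data here are toy examples.
from sklearn.linear_model import LogisticRegression

# Rule-based approach: an engineer chooses and prioritizes the criteria.
def flag_transaction_rule(amount, country_risk):
    return amount > 10_000 or country_risk > 0.8

# Machine-learning approach: the criteria (model weights) are inferred
# from labeled examples rather than specified by a human.
X = [[500, 0.1], [12_000, 0.2], [300, 0.9], [15_000, 0.95]]  # toy features
y = [0, 0, 1, 1]                                             # toy labels
model = LogisticRegression().fit(X, y)

# The learned weights, not a human-written rule, now drive decisions --
# and they change whenever the model is retrained on new data.
print(model.coef_, model.predict([[8_000, 0.5]]))
```

The rule-based function behaves identically until a human rewrites it; the learned model's behavior shifts with its training data, which is precisely what makes its criteria harder to audit.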

Machine learning techniques can help hone algorithmic analysis and improve results.  However, reduced human direction means that AI can do unexpected things.  It also means that data protection safeguards should ensure that algorithmic decisions are lawful and ethical – a challenge when specific algorithmic criteria may be opaque or not practical to analyze.  Increasingly, technologists and policymakers are grappling with hard questions about how machine learning works, how AI technologies can ethically interact with individuals, and how human biases might be reduced or amplified by algorithms that employ logic but lack human intuition.

On September 25, 2017, FPF and IAF will bring together technologists, policymakers, and privacy experts to discuss:

Presenters include:

The event will be held from 3:30pm – 5:00pm (15:30 – 17:00) in Kowloon Room II (M/F) of the conference venue in Hong Kong.  Registration is not required.  For more information, please contact John Verdi at [email protected] or Peter Cullen at [email protected]. Please also look out for other side events from our colleagues at IAPP, Nymity, and OneTrust.

FPF Welcomes New Team Members

The Future of Privacy Forum is delighted to welcome several new members to our team!

Carson Martinez

Carson Martinez is our new Policy Fellow. Carson joined FPF in early August to work on emerging areas of privacy and data protection. She is working on consumer genetics and organizing the upcoming symposium, “Refining Privacy to Improve Health Outcomes.” Specifically, she will work on issues surrounding health data, particularly data not covered by HIPAA, including data held by consumer-facing genetics companies and wearables, medical “big data,” and medical device surveillance. She will also assist with the operation of the Genetics Working Group.

Carson was previously an Intern at Intel with the Government and Policy Group, working on health, technology, and policy. Before joining Intel, Carson was an intern for the International Neuroethics Society, and a Research Assistant for both Data-Pop Alliance and New York University. Carson graduated from Duke University with a Master’s Degree in Bioethics and Science Policy with a concentration in Technology and Data Policy. She earned her Bachelor’s Degree in Neuroscience with minors in Philosophy and Psychology from New York University. Carson is also a Certified Information Privacy Professional/United States (CIPP/US).


Chanda Marlowe

Chanda Marlowe is our inaugural Christopher Wolf Diversity Law Fellow. FPF established the Fellowship in honor of the vision and commitment of Christopher Wolf, founder of the Future of Privacy Forum. Chris has worked throughout his life to fight discrimination, bigotry, and bias. At FPF, Chris has led our work on understanding the uses of Big Data to fight discrimination, representing FPF at the 2013 White House Big Data Workshops and collaborating with the Anti-Defamation League on a Big Data project and workshop. The Fellowship is designed to equip recent law school graduates with the skills to succeed in the privacy profession and, at its conclusion, to provide private and public employers with privacy professionals of diverse backgrounds.

Chanda is returning to FPF after working as our Law and Policy Intern last summer. She has also spent the past two years as the Roy H. Park M.S. Fellow at the UNC School of Media and Journalism. Chanda will focus on consumer and commercial privacy issues, including general data management, de-identification, privacy ethics, algorithms, the Internet of Things, and connected cars. She will be working on proposed regulatory actions, research and analysis of European privacy issues, and tracking consumer privacy legislation. Previously, Chanda was a Litigation Intern for the ACLU of Northern California, a Judicial Extern for the Honorable Judge Wanda Bryant, and an Internal Communications Intern for the Volvo Group.

Chanda received her Juris Doctor from the UNC School of Law. She obtained her Master’s Degree in Communication and Journalism from the UNC School of Media and Journalism. Chanda earned her M.A. in Teaching from UNC at Chapel Hill, where she also earned her B.A. in English Language and Literature.


Lindsey Barrett

Lindsey Barrett is our new Georgetown Policy Fellow. She will be working closely with FPF’s Student Data Privacy Project and will also be leading the 2017 Privacy Papers for Policymakers annual awards event. Lindsey worked for FPF as a Legal Intern in 2016.

In law school, Lindsey worked as a Privacy and Public Policy Extern at Facebook and as a Research Assistant for the Georgetown Center on Privacy and Technology, and was co-founder and Managing Editor of the Georgetown Law Technology Review. Lindsey also has experience working as a Legal Intern for the Senior Adviser for Privacy at the Office of Management and Budget, and for the Office of International Affairs at the U.S. Department of Justice. Lindsey currently serves as the Updates Editor for the American Bar Association Internet of Things Committee.


Matthew Green Jr.

Matthew Green Jr. is our new Communications Intern. Matthew studies Public and Mass Communications at The College of New Jersey, with minors in Marketing and Journalism. He previously worked in marketing and sales for The AroundCampus Group as an Outside Sales Representative and for the Shiloh Community Development Corporation as a Communications Intern. Matthew will be working closely with Melanie Bates, the Director of Communications, on public outreach through various media platforms. He will assist with social media development, monthly newsletter production, FPF website maintenance, and various other marketing and public relations projects.


Maria Little

Maria Little is our new Legal Intern. Maria is pursuing her Juris Doctor at the American University Washington College of Law, where she is an active member of the Intellectual Property Law Society and a Junior Editor for the National Security Law Brief. She earned her Master’s Degree in Intelligence Analysis from Johns Hopkins University and a Bachelor of Arts in International Studies from Middlebury College.

Maria worked this past summer as a Google Public Policy Fellow for the Open Technology Institute. Before that, she worked at the Office of the Director of National Intelligence General Counsel as a Legal Intern and for the U.S. Department of Defense as an Intelligence Oversight Officer/Data Scientist.


Amy Oliver

Amy Oliver is a volunteer attorney with the policy team, focusing on a wide variety of issues related to consumer privacy and emerging technology. Prior to joining FPF, Amy served as a career attorney for the U.S. Department of Justice for 16 years. She holds a law degree from the American University Washington College of Law and has a wealth of experience and expertise in law and government policy. At American University, she participated in the Appellate Advocacy Clinic and served as an editor of the Administrative Law Journal. Amy received a Master’s Degree in International Law from the University of New South Wales in Australia and a Bachelor’s Degree in History from the University of Delaware. Amy is also a Certified Information Privacy Professional/United States (CIPP/US).

Study: EU-US Privacy Shield Essential to Leading European Companies

New FPF Study Documents More Than 100 European Companies Participating in the Privacy Shield Program

From Major Employers such as Ingersoll-Rand and Lidl Stiftung to Leading Technology Firms like Telefónica, RELX, and TE Connectivity, European Companies Depend on the EU-US Agreement

EU Firms are Signing up for Privacy Shield at a High Rate – the One-Year-Old Privacy Shield Program Includes a Larger Percentage of EU Companies than the Predecessor Safe Harbor Program

Termination of the Privacy Shield Program Could Inhibit European Employment – Nearly One Third of Privacy Shield Companies Rely on the Framework to Transfer HR Information of European Staff

The Future of Privacy Forum conducted a study of the companies enrolled in the EU-US Privacy Shield program and determined that 114 European-headquartered companies are active Privacy Shield participants. These European companies rely on the program to transfer data to their US subsidiaries or to essential vendors that support their business needs.

FPF staff also found that EU companies comprise 5.2% of all Privacy Shield companies, an increase over the 3.5% of all companies that were based in Europe under the previous EU-US Safe Harbor program in 2014. Only a year old, the Privacy Shield program already has more than 2,000 participants and typically grows by several members each week.

Leading EU companies that rely on Privacy Shield include:

ABB, Swiss electrical equipment company

CNH Industrial America, Dutch capital goods company

Ingersoll-Rand, Irish globally diversified industrial company

Lidl Stiftung, German grocery market chain

NCS Pearson, British education assessment and publishing company

Reckitt Benckiser, British consumer goods company

RELX, British and Dutch information and analytics company

TE Connectivity, Swiss connector and sensor manufacturer

Telefónica, Spanish mobile network provider

With the first annual review of the Privacy Shield framework underway, it is important for parties on both sides of the Atlantic to recognize the program’s benefits to the US and to Europe. Although no system is perfect, there is substantial value for many stakeholders, including leading European companies, in maintaining Privacy Shield protections for companies and consumers in both the United States and Europe.

FPF research also determined that over 700 companies, nearly a third of the total number analyzed, use Privacy Shield to process their human resources data. Inhibiting the flow of HR data between the US and EU could mean delays for EU citizens receiving their paychecks, or a decline in global hiring by US companies. Employees therefore stand to lose if Privacy Shield were terminated or materially altered.

The research identified 114 Privacy Shield companies headquartered or co-headquartered in Europe. This is a conservative estimate of the companies that would be affected by cancellation of the Privacy Shield framework – FPF staff did not include global companies that have major European offices but are headquartered elsewhere. The 114 companies include some of Europe’s largest and most innovative employers, doing business across a wide range of industries and countries. EU-headquartered firms and major EU offices of global firms depend on the Privacy Shield program so that their related US entities can effectively exchange data for research, to improve products, to pay employees, and to serve customers. These companies would be severely burdened and disadvantaged by termination of the Privacy Shield program. Given the importance of this mechanism to companies and consumers on both sides of the Atlantic, FPF recommends that the Privacy Shield arrangement be preserved.

Methodology:

• FPF staff recorded a list of 2,188 active Privacy Shield companies as of July 2017 from https://www.privacyshield.gov.

• FPF staff performed a web search for each current company by name, checking the location of the company’s headquarters on a combination of public databases such as LinkedIn, CrunchBase, Bloomberg, and the company’s own website.

• A company that listed its headquarters in an EU member state or in Switzerland was counted as a match; companies that merely had a prominent EU office or were founded in an EU member state were not counted.

• 114 total European-headquartered companies were identified using this method.
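As a quick sanity check, the headline percentages follow directly from the counts reported above; here is a minimal sketch reproducing them (the variable names are ours, not FPF’s):

```python
# Reproducing the study's headline figures from the reported counts:
# 2,188 active Privacy Shield participants, 114 European-headquartered,
# and over 700 using the framework to transfer HR data.
total_participants = 2188
european_headquartered = 114
hr_data_companies = 700

print(f"European share: {european_headquartered / total_participants:.1%}")  # ~5.2%
print(f"HR data share: {hr_data_companies / total_participants:.1%}")        # ~32.0%, "nearly a third"
```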

For the full list of European companies in the Privacy Shield program, or to schedule an interview with John Verdi or Jules Polonetsky, email [email protected].

Upcoming Future of Privacy Forum Events

In the coming months, FPF will be hosting or co-hosting a number of events. We welcome your attendance and participation. Please contact us at [email protected] for further information.

Events

Sustainable Innovation with Effective Data Protection in the Pacific Rim

presented by Information Accountability Foundation and FPF

September 25, 2017

Hong Kong

Refining Privacy to Improve Health Outcomes

presented by Triangle Privacy Research Hub, FPF, Intel Corporation, North Carolina Journal of Law & Technology (NC JOLT), Information Accountability Foundation (IAF), Duke University, and Indiana University 

October 26-27, 2017

Chapel Hill

Bridging Industry and Academia to Tackle Responsible Research and Privacy Practices

Organized by Xinru Page, Bentley University; Pamela Wisniewski, University of Central Florida; Margaret Honda, Future of Privacy Forum; Jen Romano-Bergstrom, Instagram and President of the User Experience Professionals Association; Sona Makker, Facebook; Norberto Andrade, Facebook

November 2-3, 2017 (Call For Participation deadline is September 22, 2017)

New York

Privacy Localism: A New Research Agenda

presented by the Information Law Institute and FPF

November 3, 2017

NYU School of Law

Brussels Privacy Symposium: AI Ethics

presented by the Brussels Privacy Hub of the Vrije Universiteit Brussel (VUB) and FPF

November 6, 2017

Brussels

Privacy Engineering Research and the GDPR: A Trans-Atlantic Initiative

presented by Internet Privacy Engineering Network, University of Leuven, Carnegie Mellon University, and FPF

November 10, 2017 (Call For Papers deadline is September 24, 2017)

Leuven, Belgium

Roundtable Discussion: Smart Cities and Open Data 

presented by FPF at the 2017 MetroLab Network  Annual Summit

December 14, 2017

Atlanta, Georgia

2017 Privacy Papers for Policymakers

presented by FPF and the Congressional Bipartisan Privacy Caucus

February 27, 2018 (Nomination deadline is September 26, 2017)

Capitol Hill


Sustainable Innovation with Effective Data Protection in the Pacific Rim

The Pacific Rim is a hotbed of innovation in observational technologies and their practical applications, driven by advanced analytics and artificial intelligence. For these applications to be sustainable, there need to be policy mechanisms that are respectful of the diverse ethical and legal cultures in the region and interoperable with other regions. This session will explore the application of information policy governance to new technologies emerging in the region.

Organizers: Information Accountability Foundation and FPF

When: September 25, 2017; 3:30pm – 5pm

Where: 

Kowloon Shangri-La

Kowloon Room II (M/F)

Hong Kong

This is an official side event of the 39th International Conference of Data Protection and Privacy Commissioners. Please click the link below for details. 

READ MORE


FPF’s Stacey Gray will be giving an exclusive presentation on data security and privacy guidelines for voice-activated devices like the Amazon Echo, Google Home, and many more. Breakfast will be served, followed by networking with fellow IoT executives and entrepreneurs.

Agenda:

8AM: Arrival & Breakfast

8:30AM: ‘The Future of IoT Privacy & Security’ Presentation by Stacey Gray, Future of Privacy Forum

9:00AM: Q&A and Curated Discussion

9:30AM: Executive Networking

To receive an invitation, please contact [email protected]


Refining Privacy to Improve Health Outcomes

The Triangle Privacy Research Hub’s symposium will bring together leading experts in the field of privacy, health, and data to discuss how new technologies and data sources can improve health outcomes, while protecting individual privacy. The goal of the event is to propose specific law, policy and practice changes to promote the more effective use of data for health.

The symposium is sponsored by the Triangle Privacy Research Hub (TPRH), Intel Corporation, the Future of Privacy Forum (FPF), the North Carolina Journal of Law & Technology (NC JOLT), the Information Accountability Foundation (IAF), the Duke University Science and Society Initiative, the Duke University Masters of Management in Clinical Informatics (MMCi), and the Indiana University Center for Law, Ethics, and Applied Research (CLEAR) in Health Information.

Agenda:

Thursday, October 26 – Sponsored by the North Carolina Journal of Law & Technology (NC JOLT)

Location: Rizzo Center: 150 Dubose Home Lane, Chapel Hill, NC

1:00 p.m. Keynote Remarks

1:45 p.m. – 5:00 p.m. Panel Discussions with Audience Q&A

Friday, October 27 

Location: Duke Innovation & Entrepreneurship Initiative at The Bullpen: 215 Morris St #300, Durham, NC

9:00 a.m. Keynote Remarks

9:45 a.m. – 1:00 p.m. Panel Discussions with Audience Q&A

Lunch will be provided following the program

Please note this event is invitation only. For more information, please contact [email protected].


Bridging Industry and Academia to Tackle Responsible Research and Privacy Practices

As more human interactions move online and the amount and variety of information shared digitally continues to grow, decisions regarding the collection, sharing, and use of this data must take into account both ethical and privacy considerations. It is important that industry and academia come together to find joint solutions for making these difficult decisions regarding privacy and ethics to maximize the benefits of data-driven research and practices, while ensuring that harms and negative outcomes are prevented. To bridge communication between these communities, we are organizing a workshop for thought leaders from academia, industry and civil society to identify common goals, establish a long-term vision, and initiate working teams for tangible projects focused on responsible research ethics and privacy practices around user data.

When: November 2-3, 2017 (CFP deadline is September 22, 2017)

Location: Facebook Offices, 770 Broadway, New York, NY

To encourage a diverse set of attendees, we ask those who are interested to provide their resume or CV and a 1 to 2-page submission. Please click the link below for details. 

READ MORE


Privacy Localism: A New Research Agenda

Academic experts on administrative law, privacy, federalism, and local governance will be joined by policymakers, industry representatives and privacy advocates to present and discuss a variety of perspectives on the legal, empirical and policy implications of this trend toward “privacy localism.”

November 3, 2017

NYU School of Law

D’Agostino Hall, Lipton Hall, 108 West 3rd Street

Organizers:  Katherine J. Strandburg (ILI Faculty), Ira Rubinstein (ILI Senior Fellow), Bilyana Petkova (ILI Fellow, Assistant Professor, Maastricht University)

Co-sponsors: Information Law Institute and Future of Privacy Forum

RSVP: Email Bilyana Petkova


AI Ethics: The Privacy Challenge

Please join us for the 2nd Annual Brussels Privacy Symposium. The 2017 Symposium is titled AI Ethics: The Privacy Challenge and is jointly presented by FPF and the Brussels Privacy Hub of the Vrije Universiteit Brussel (VUB). This all-day workshop, held on November 6, 2017 in Brussels, Belgium, will give attendees the opportunity to examine issues surrounding artificial intelligence and will feature papers and speakers from around the globe.

Selected interdisciplinary works in law and policy, computer science and engineering, social studies, and economics will be published in a special issue of IEEE Security & Privacy Magazine.

When:

6 November, 2017

10:00 AM – 5:00 PM

Where:

Vrije Universiteit Brussel,

Institute of European Studies,

Pleinlaan 5, 1050

Brussels, Belgium

The event is complimentary, but space is limited, so please register if you plan to attend. For more event information, please click here to be directed to the Brussels Privacy Symposium webpage. Special thanks to our 2017 Brussels Privacy Symposium sponsors and supporters.

REGISTER HERE

Privacy Engineering Research and the GDPR: A Trans-Atlantic Initiative

When the EU’s GDPR becomes fully applicable on 25 May 2018, many data protection requirements will be seen in a new perspective. With this event, we aim to determine the relevant state of the art in privacy engineering; in particular, we will focus on those areas where the “art” needs to be developed further. The goal of this trans-Atlantic initiative is to identify open research and development tasks, which are needed to make the full achievement of the GDPR’s ambitions possible.

Event Format:

  • Review of the state of the art, including current solutions
  • Discussion of current research in the field
  • Break-out sessions in which participants focus on “key challenges” and identify tasks for research and development
  • Plenary discussion of the break-out outcomes and suggested next steps at the conclusion of the workshop

When: November 10, 2017 (CFP deadline is September 15, 2017)

Location: Leuven, Belgium

To encourage a diverse set of attendees, we ask those who are interested in attending the workshop to provide a one-page submission. Please click the link below for details. 

READ MORE


Roundtable Discussion: Smart Cities and Open Data (2017 MetroLab Network Annual Summit)

WHAT

Privacy and Open Data

The Smart Cities and Open Data movements promise to use data to spark civic innovation and engagement, promote inclusivity, and transform modern communities. At the same time, advances in sensor technology, re-identification science, and Big Data analytics have challenged cities and their partners to construct effective safeguards for the collection, use, sharing, and disposal of personal information. In this breakout session, we will discuss privacy risks in open data programs and how cities like Seattle are promoting transparency while protecting individual rights.

Privacy and Urban Instrumentation

As cities harness more data than ever, how can we assess the risks and opportunities of new technologies and data flows while preserving public trust and individual privacy? In this breakout session, come hear from cities, CIOs, academic leaders, and industry experts as we examine the opportunities and challenges of new urban instrumentation and how we can come together to address privacy challenges in smart cities.

WHERE

2017 MetroLab Network Annual Summit

Georgia Tech

Atlanta, Georgia

WHEN

December 14, 2017

2:30 PM & 4:00 PM

READ MORE


First Take: Privacy in Updated Federal Guidance for Automated Driving

Yesterday, the Department of Transportation and the National Highway Traffic Safety Administration issued updated guidance for autonomous vehicles, streamlining last year’s guidance, incorporating public comments, and stripping privacy from its recommendations.

While Transportation Secretary Elaine Chao’s opening message highlights the importance of protecting consumer privacy, privacy was among three elements removed from the updated guidance, along with “Registration and Certification” and “Ethical Considerations.” This means that these topics are not to be included in the Safety Assessment Letters, which the guidance stresses are strictly voluntary, despite recent legislative proposals that would make them mandatory. Readers seeking information on NHTSA’s updated approach to privacy must look to the footnotes, which point to a new NHTSA website, where answers can be found by clicking through and expanding an extended series of Q&As.

A close read of these Q&As reveals that the removed sections of the guidance “remain important and are areas for further discussion and research,” and that the updated guidance instead focuses on topics readier for implementation in the near term. The site emphasizes the Federal Trade Commission’s role as the chief agency for protecting consumer privacy in this space, in part because “privacy is not directly relevant to motor vehicle safety.” This is one of the clearest instances to date of NHTSA clarifying the limits of its jurisdiction over privacy relative to the FTC.

A call for data sharing between entities in this space was also largely removed from the guidance, with NHTSA citing on its website industry concern about the sharing of proprietary information and a lack of clarity around safety metrics. While much of the promise of advanced driving systems will eventually lie in their ability to enhance safety through data sharing, the nascent state of the technology makes it a bit early for such mandates between non-governmental entities, and NHTSA focuses first on building standardized mechanisms for reporting crash reconstruction data to government. In the meantime, it does encourage voluntary collaboration among industry.

NHTSA’s removal of a privacy focus stands in contrast to some recent approaches to this issue. The SELF DRIVE Act that passed the House of Representatives last week includes a comprehensive section on consumer privacy that mandates manufacturers create and share a “privacy plan,” creates an advisory group that will review privacy among its topics, and calls on the FTC to undertake a study of this topic (our full recap of the House bill is here). In June, NHTSA and the FTC co-hosted a workshop focused on privacy around connected cars, emphasizing the two agencies’ commitment to the issue, but with the FTC taking a much more vocal role throughout the workshop. Also of note, a recently shared Staff Discussion Draft of the Senate version of self-driving legislation does not include a privacy section, and its data recording section focuses on crash data.

For those of us who think that proactively protecting consumer privacy in the connected car is crucial to the adoption of this safety-enhancing technology, its absence from the NHTSA guidance is notable. But some may find that this move achieves the clarity that many of us, including the Government Accountability Office in a recent study, have called for regarding NHTSA and FTC roles—by explaining that nearly all issues related to privacy in connected cars fall squarely in the FTC’s camp.

RELATED FPF RESOURCES

See our recap of the privacy elements of the first Federal Automated Vehicles Policy here

See our comments on the first Federal Automated Vehicles Policy here

See our comments on the NHTSA/FTC Workshop here

See our infographic mapping data flows in the connected car here

Bridging Industry and Academia to Tackle Responsible Research and Privacy Practices

Workshop Description

As more human interactions move online and the amount and variety of information shared digitally continues to grow, decisions regarding the collection, sharing, and use of this data must take into account both ethical and privacy considerations. It is important that industry and academia come together to find joint solutions for making these difficult decisions regarding privacy and ethics to maximize the benefits of data-driven research and practices, while ensuring that harms and negative outcomes are prevented. To bridge communication between these communities, we are organizing a workshop for thought leaders from academia, industry, and civil society to identify common goals, establish a long-term vision, and initiate working teams for tangible projects focused on responsible research ethics and privacy practices around user data. The workshop will kick off with a keynote from Facebook’s Deputy Chief Privacy Officer, Rob Sherman, and a panel discussion with our advisory board members.

Workshop Organizers: Xinru Page, Bentley University; Pamela Wisniewski, University of Central Florida; Margaret Honda, Future of Privacy Forum; Jen Romano-Bergstrom, Instagram and President of the UXPA (User Experience Professionals Association); Sona Makker, Facebook; Norberto Andrade, Facebook


Advisory Board Members: Chris Clifton, Purdue University; Lorrie Cranor, Carnegie Mellon University, Director CyLab Usable Privacy and Security Laboratory; Lauri Kanerva, Facebook, Research Management Lead; Helen Nissenbaum, Cornell Tech and New York University; Jules Polonetsky, Future of Privacy Forum, Chief Executive Officer (full board to be finalized soon)

When: November 2 – 3, 2017

Location: Facebook Offices, 770 Broadway, New York, NY

To encourage a diverse set of attendees, we ask those who are interested to provide their resume or CV and a 1 to 2-page submission. Please click the link below for details. 

READ MORE

Bipartisan Report: Feds Should Connect Student Data While Protecting Privacy

Today, the Commission on Evidence-Based Policymaking released its final report. The Commission was created through bipartisan legislation in 2016 to “consider how to strengthen government’s evidence-building and policymaking efforts” (page 16). One of the key issues the Commission heard about from advocates on all sides was whether to overturn the current federal ban on connecting education data collected by the federal government, a change intended to provide students, postsecondary institutions, and the public with information that could be used to improve policies or better target federal funding. Education organizations that support connecting federally held education data with other data were very excited by the report: among other recommendations, the Commission advocated that Congress consider repealing current bans on the collection and use of data for evidence building. However, the concerns of privacy advocates that appeared in many public comments to the Commission were not overlooked: the word “privacy” is mentioned in the report 408 times (in 114 pages), and the Commission recommended numerous privacy and security protections.

Why Overturn the Ban?

The federal government collects a lot of data in order to do its job. For example, data is collected about students and their parents in order to provide them with financial aid. Some leading education organizations argue that this data is generally not being used efficiently or effectively.

The federal government collects student-level data from higher education students on demographics, income, assets, and educational attainment, as well as federal aid amounts and history, through the FSA National Student Loan Data System and Central Processing System. Despite this, college students are not given the critical information they need about how students with similar characteristics perform at particular institutions and about the expected employment value relative to the educational cost. In addition to the FSA data systems, the National Center for Education Statistics collects institution-level data on graduation rates and pricing, but this information does not cover transfer and part-time students, who comprise a significant percentage of the student population.

Many education advocates have urged an end to the ban, arguing that this would enable new data uses: allowing students to compare potential colleges, allowing postsecondary institutions to report certain student data once instead of multiple times to various federal agencies, and letting taxpayers know whether money spent on higher education is being used effectively.

Groups opposed to overturning the ban have primarily raised potential privacy and security concerns, such as the danger of the information being repurposed and used in law enforcement or immigration proceedings, and the recent history of problematic data breaches and inadequate security in some federal agencies.

What Did the Report Say?

The report advocates that Congress “consider repealing current bans and limiting future bans on the collection and use of data for evidence building” (recommendation 2-5). Despite receiving more comments on overturning the federal ban on sharing federally held education data than on any other issue (page 30), the Commission explicitly rejected the presumption that “increasing access to confidential data…significantly increas[es] privacy risk” (page 1). The report states that “there are steps that can be taken to improve data security and privacy protections beyond what exists today, while increasing the production of evidence” (page 1). The Commission used the Fair Information Practice Principles (FIPPs) as its framework to suggest additional ways to protect information already held by the federal government, and suggested balancing privacy interests against the public’s right to know by weighing:

  1. The potential public benefits of the research project;
  2. The sensitivity of the data that would be accessed in that project; and
  3. “Any risk that allowing access could pose to confidentiality” (page 24).

Most significantly for privacy advocates concerned that data held at the federal level could be repurposed (for example, for a law enforcement action against a college student whose information was in the NSLDS), the Commission recommended using a pre-existing legal framework to restrict the use of the data to “statistical purposes” only; violations would be subject to major penalties under the Confidential Information Protection and Statistical Efficiency Act (CIPSEA), which could include fines and/or jail time.

Practically, this means that data would only be used by certain vetted governmental or non-governmental researchers to show the efficacy of a federal program or federal funding. For example, this would allow the government – and therefore the public – to know how students with similar characteristics perform at certain institutions and the expected employment value relative to the educational cost.

The Commission also discussed how to keep data safe, suggesting several methods for sharing data and disclosing deidentified or aggregated data to the public while minimizing the likelihood of privacy harms.
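For a concrete flavor of what aggregation-based disclosure limitation can look like, here is a minimal, hypothetical sketch of small-cell suppression, one common technique in education data releases. The report does not prescribe this specific method, and the threshold and field names below are illustrative assumptions:

```python
# Illustrative sketch of small-cell suppression: aggregate student-level
# records into group counts, then withhold any group too small to publish.
# The threshold (10) and the toy data are assumptions, not from the report.
from collections import Counter

MIN_CELL_SIZE = 10  # cells below this size are suppressed before release

# Toy student-level records: (institution, enrollment status)
records = [("Institution A", "transfer")] * 14 + [("Institution B", "part-time")] * 3

counts = Counter(records)
released = {group: (count if count >= MIN_CELL_SIZE else "suppressed")
            for group, count in counts.items()}
print(released)
# {('Institution A', 'transfer'): 14, ('Institution B', 'part-time'): 'suppressed'}
```

The design intuition is that publishing only sufficiently large aggregates makes it harder to single out any individual student while still supporting the kind of program-efficacy statistics the Commission envisions.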

What’s Next?

Speaker of the House Paul Ryan and Senator Patty Murray said at the press conference for today’s report release that they will be introducing bills in the House and Senate to codify the Commission’s recommendations into law. The report may also improve the chances of passing the College Transparency Act, a bipartisan bill introduced earlier this year that would overturn the ban while improving the privacy and security of student data held by the federal government. Regardless of the legislative vehicle, the debate about federal collection and use of student data is sure to be an active one in the coming months.

 

This blog is cross-posted at https://studentprivacycompass.org/cepreport.