Representatives Latta, Schakowsky To Highlight Data and Privacy Considerations for Connected Vehicles at MobilityTalks International Conference

Remarks Set for Morning of April 3 at Conference Produced by The Washington Auto Show

Future of Privacy Forum’s Lauren Smith to Moderate Discussion on Privacy Risks in Today’s Vehicles

WASHINGTON, March 29, 2019 — The Future of Privacy Forum (FPF) and two senior members of Congress will discuss data and privacy in connected vehicles during the morning session of MobilityTalks International® on April 3, Washington Auto Show President and CEO John O’Donnell announced today.

Reps. Bob Latta (R-Ohio) and Jan Schakowsky (D-Ill.) will participate in a fireside chat moderated by Lauren Smith, Senior Counsel at the Future of Privacy Forum. The session, at the Walter E. Washington Convention Center, will take place in front of auto industry regulators representing more than 40 countries and will cover the opportunities and privacy considerations created by the interconnected nature of modern vehicle technology. Their panel will open the two-day MobilityTalks International conference on autonomous vehicles and connected cars. The conference is produced by The Washington Auto Show and takes place immediately before the show opens to the public on April 5.

Many of the new features that enhance safety and convenience in today’s vehicles are enabled by the collection and sharing of data, putting the topic of privacy in connected cars on the agenda of industry, policymakers, and regulators.

“Connected vehicles are a hot topic in policy circles and Congress is going to consider rules of the road for privacy,” said John O’Donnell, President and CEO of The Washington Auto Show. “We’re excited to have Rep. Schakowsky and Rep. Latta provide attendees at the MobilityTalks International conference with their perspectives on how Congress may approach issues of mobility and data.”

MobilityTalks will feature presentations from policymakers representing more than a dozen countries on the regulatory and certification challenges surrounding autonomous vehicles, with a focus this year on delineating appropriate levels of data security and privacy.

The session, A Bipartisan Conversation on Data and Privacy in Connected Vehicles, will take place at 8:55 a.m. on Wednesday, April 3, in the Walter E. Washington Convention Center’s East Salon.

The MobilityTalks International® conference gathers government policymakers from around the world to exchange ideas on best practices related to the development and regulation of connected and autonomous vehicles. The conference is organized and sponsored by The Washington Auto Show, and will take place April 3 and April 4, 2019, both on Capitol Hill and at the Walter E. Washington Convention Center.

For more information on MobilityTalks and The Washington Auto Show, please visit www.washingtonautoshow.com.

Media contacts:

Tony Baker

[email protected]

310-593-3680

Nat Wood

[email protected]

410-507-7898

Municipal Leaders Joining Network to Advance Civic Data Privacy

Connected technologies and always-on data flows are helping make today’s cities and communities more livable, productive, and equitable. At the same time, these technologies raise concerns about individual privacy, autonomy, freedom of choice, and institutional discrimination.

How do we leverage the benefits of a data-rich society while giving members of our community the confidence of knowing their privacy is protected? How can we reduce traffic, fill potholes faster, and deliver services more efficiently in an equitable, privacy-conscious way?

The Future of Privacy Forum, in partnership with the MetroLab Network and with the support of the National Science Foundation, has launched the Civic Data Privacy Leaders Network, a collaborative that will provide an active, authoritative resource for municipal leaders to navigate emerging privacy issues, share practical guidance, and promote fair and transparent data practices.

Government officials from more than 30 communities across the US and Canada have already joined this new network, including representatives from Asheville, NC; Arlington County, VA; Austin, TX; Bend, OR; Bloomington, IL; Boston, MA; Boulder, CO; Chattanooga, TN; Columbus, OH; Denver, CO; Gainesville, FL; Hamilton, Can.; Kansas City, MO; King County, WA; Long Beach, CA; Los Angeles, CA; McLean County, IL; Minneapolis, MN; New York, NY; Normal, IL; Oakland, CA; Portland, ME; Portland, OR; San Francisco, CA; San Jose, CA; Santa Clara County, CA; Seattle, WA; South Bend, IN; Syracuse, NY; Toronto, Can.; Washington, DC; and Weld County, CO.

Since 2018, FPF and the Civic Data Privacy Leaders Network have advanced civic data privacy by:

1. Creating a comprehensive Privacy Risk Assessment for smart and connected community projects, providing guidance to local governments and technology providers and ensuring projects serve the common good. This work is also the foundation for a Model PIA Policy aimed at global “smart city” initiatives.
2. Expanding the network of privacy leaders for smart cities and creating peer networks, best practices, and practical tools for responsible data use.
3. Hosting workshops in conjunction with MetroLab Network, the South Big Data Hub, and others in order to share best practices and identify areas for further research and collaboration.

These efforts and others help equip local officials with the tools they need to collect and use civic data responsibly, engage with the public about data and technology, and unleash the power of data to improve our communities. By working to advance public knowledge, understanding, and engagement with privacy-related concerns, the Civic Data Privacy Leaders Network supports public trust in smart city technologies and in local government.

This collaborative includes representatives from a diverse set of stakeholders in U.S. and global smart cities and communities.

Privacy leaders from local government are invited to join our network and take part in this unique opportunity to shape our communities in a way that balances privacy protections and data use for the common good.

To learn more or to join the network, contact Kelsey Finch at [email protected].

40 Organizations Release Principles For School Safety, Privacy, and Equity

WASHINGTON, DC (March 27, 2019) – Today, FPF and 39 other education, privacy, disability rights, and civil rights organizations released ten principles to protect all students’ safety, privacy, and right to an equal education. The principles are meant to serve as a starting point for conversations with policymakers and school officials about how to keep students safe while respecting their dignity and encouraging their individual growth.

“School safety measures such as monitoring should be evidence-based and focus on real threats of harm, not speculation,” said Amelia Vance, Director of the Education Privacy Project at the Future of Privacy Forum. “Unfairly labeling students with disabilities as potential threats based on little or no evidence does nothing to keep our schools safe.”

Many recent state school safety proposals call for increased surveillance in an attempt to reduce school violence and students’ self-harming behaviors. But studies show that school surveillance can disproportionately affect students with disabilities and students of color.

For example, a report submitted in January by Florida’s Marjory Stoneman Douglas High School Public Safety Commission recommends sharing information about children’s mental health and disabilities with threat assessment teams and allowing school resource officers to access student records more easily. The proposals may be intended to protect students, but their effects are often different and can make it difficult for parents and students to discern where the school ends and the police begin. Students are still maturing and need to know schools are safe spaces where they can ask questions, think creatively, and make mistakes.

“School safety measures that target students because of their disabilities are not effective and they undermine these students’ opportunities to receive a quality education and to be treated fairly,” said Jennifer Mathis, Director of Policy and Legal Advocacy at the Bazelon Center for Mental Health Law.

The Principles for School Safety, Privacy, and Equity set out ten commitments for protecting all students’ safety, privacy, and right to an equal education.

Read FPF’s blog on the principles here.

The Future of Privacy Forum is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.

Media Contacts:

Tony Baker

[email protected]

310-593-3680

Nat Wood

[email protected]

410-507-7898

Signatories of the Principles for School Safety, Privacy, and Equity:

Fairness, Ethics, & Privacy in Tech: A Discussion with Chanda Marlowe

After beginning her career as a high school English teacher, Chanda Marlowe changed course to become FPF’s inaugural Christopher Wolf Diversity Law Fellow. She’s an expert on location and advertising technology, algorithmic fairness, and how vulnerable populations can be uniquely affected by privacy issues.



What led you to the Future of Privacy Forum?

I was a high school English teacher and I decided I wanted to be an advocate for student rights. I went back to school and earned a dual law degree and Master’s in Communications from the University of North Carolina at Chapel Hill. That’s where I found that I was really drawn to student privacy. I first came to FPF because I wanted to intern someplace that was leading in student privacy. I had cited FPF’s work in my papers and presentations. Also around that time, President Obama endorsed FPF’s Student Privacy Pledge, which was a big moment. I left FPF after my internship, and then about two years ago, I came back as the Christopher Wolf Diversity Law Fellow.

What attracts you to privacy issues?

There is a lot of room for thoughtful discussion among people who are very smart in the privacy space because there is so much grey area. It’s special to be part of these conversations when technology is emerging so rapidly.

What areas of privacy have attracted you?

I came to FPF intending to work primarily on education issues, and I ended up with opportunities to work on much more, including location and ad practices, the Internet of Things, and algorithmic fairness. So now I follow those issues closely and really get into the granular details, like what it means to be compliant when the legal landscape is constantly changing with GDPR, state laws, and federal legislation.

I’m also drawn to how privacy impacts global populations. I did my Master’s thesis on surveillance of students, and I’ve been fortunate to work on the issue of algorithmic fairness.

Increasingly, everything is being automated and the decisions that were once made by humans are now being made by machines. This can lead to challenges – using machines doesn’t get rid of all bias. There could be bias in the data set, you could have problems with the machine itself, or the programmer could be biased. So those are three ways bias can be injected into a machine learning system.
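To make the first of those failure modes concrete, here is a minimal sketch, in Python, of one common check for bias in a training data set: comparing favorable-outcome rates across groups, sometimes called a demographic-parity or “disparate impact” check. The records, group labels, and field names are hypothetical, purely for illustration.

```python
from collections import defaultdict

def favorable_rate_by_group(records, group_key, outcome_key):
    """Share of favorable outcomes for each group in a data set."""
    counts = defaultdict(lambda: [0, 0])  # group -> [favorable, total]
    for row in records:
        counts[row[group_key]][1] += 1
        if row[outcome_key]:
            counts[row[group_key]][0] += 1
    return {group: fav / total for group, (fav, total) in counts.items()}

# Hypothetical loan-approval records; groups and fields are illustrative.
data = [
    {"group": "A", "approved": True},
    {"group": "A", "approved": True},
    {"group": "A", "approved": False},
    {"group": "B", "approved": True},
    {"group": "B", "approved": False},
    {"group": "B", "approved": False},
]

rates = favorable_rate_by_group(data, "group", "approved")
# Disparate-impact ratio: lowest group rate divided by highest group rate.
# A ratio well below 1.0 (a common informal threshold is 0.8) flags the
# data set for closer review before it is used to train a model.
ratio = min(rates.values()) / max(rates.values())
print(rates, ratio)  # A: ~0.67, B: ~0.33, ratio 0.5
```

A check like this is only a starting point: as noted above, bias can also enter through the machine itself or through the people who program it.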

I’ve been working very closely with Lauren Smith on a paper that maps out the potential harms that can arise from automated decision-making, and considers whether existing laws are adequate to address them.

I also enjoyed working on an FPF project that considers how the Internet of Things can have unique privacy impacts on people with disabilities. It was amazing to be part of a project where we convened academics, consumer advocates (including disability organizations), and companies that create products for people with disabilities, ranging from startups to large platforms, to have a conversation about privacy concerns. We were able to work very closely with the American Association of People with Disabilities to draft a paper that not only explores the nuances of privacy considerations for people with disabilities using IoT devices and services, but also provides recommendations to address them, including considering what mechanisms for informing people with disabilities are making their way into products. For example, the Amazon Echo added auditory cues so people who are blind aren’t expected to react to a light. We want to encourage more thoughtful approaches like that. It was an amazing experience to collaborate with so many people in the disability community.

What do you see as up-and-coming privacy issues?

I’m happy to see that there’s more talk about what privacy means for vulnerable populations. There has been increased attention to bringing in groups that have not previously been invited to the table to talk. Privacy isn’t just in one small bucket anymore, separate from broader conversations.

It’s really important that FPF is taking deliberate efforts to make sure the next generation of privacy professionals is diverse. The creation of the Christopher Wolf Diversity Law Fellowship embodies a commitment to valuing diversity in all fields that involve privacy.

Your fellowship is scheduled to end in a few months. What’s next for you?

This has already been a wonderful experience, and I have more to accomplish before it ends. Recently, I had the amazing experience of speaking before the Congressional Black Caucus Institute about privacy legislation. That’s something I never would have had the opportunity to do if I had stayed in North Carolina.

I’m interested in staying in the privacy law and policy space. I have learned so much and have been given so many opportunities that I cannot wait to launch the next steps of my privacy career.


FPF will host our next Privacy Book Club on April 24 at 2:00 PM EDT. Join us to discuss Habeas Data: Privacy vs. the Rise of Surveillance Tech by Cyrus Farivar. Sign up for the book club here.

We hope you will join us at our 10th Anniversary Celebration on April 30. Buy your ticket here.

Future of Privacy Forum is Turning 10!

On April 30, 2019, from 6:00 PM to 8:00 PM, we will host a 10th Anniversary Celebration in Washington, D.C. — and you’re invited! We are delighted to announce that at the celebration we will present the following awards:

Distinguished Public Service: Helen Dixon, Data Protection Commissioner, Ireland

Community Builder: J. Trevor Hughes, President & Chief Executive Officer, IAPP (International Association of Privacy Professionals)

Career Achievement: Dale Skivington, Privacy Consultant and Adjunct Professor of Law, University of Colorado Law School; former Chief Privacy Officer, Dell Inc. and Eastman Kodak Company

Outstanding Academic Scholarship: Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Scheller College of Business, Georgia Institute of Technology

Additionally, FPF would like to thank the Leadership Sponsors who make this event possible, as well as our Event Sponsor.

Date and Time

Tue, April 30, 2019

6:00 PM – 8:00 PM EDT

Location

The Line

1770 Euclid St NW

Washington, DC 20009

Schedule for the Evening

6:00 p.m. Eat, Drink and Socialize

7:00 p.m. Short Program

Welcome

Christopher Wolf, President, Future of Privacy Forum Board of Directors

Andrea Jelinek, Chairwoman, European Data Protection Board; Director, Austrian Data Protection Authority

Award Ceremony

Introduction

Debra Berlyn, Treasurer, Future of Privacy Forum Board of Directors

Presentation

Rebecca Kelly Slaughter, Commissioner, Federal Trade Commission

Distinguished Public Service Award: Helen Dixon, Data Protection Commissioner, Republic of Ireland

Presentation

Sandra Hughes, Secretary, Future of Privacy Forum Board of Directors

Career Achievement Award: Dale Skivington, Chief Privacy Officer, Dell Technologies (2011–2018) and Eastman Kodak Company (1988–2011)

Introduction

Alan Raul, Future of Privacy Forum Board of Directors

Presentation

Abigail Slater, Special Assistant to the President, National Economic Council

Community Builder Award: J. Trevor Hughes, President and Chief Executive Officer, IAPP

Presentation

Danielle Keats Citron, Morton & Sophia Macht Professor of Law, University of Maryland Carey School of Law

Outstanding Academic Scholarship Award:

Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Institute of Technology; Future of Privacy Forum Senior Fellow

7:30 p.m. Thank You Toast and Celebrate

Jules Polonetsky, Chief Executive Officer, Future of Privacy Forum

Future of Privacy Forum Future Leaders

Interested in attending?

We are sold out, but would appreciate your support here.  Donations benefit FPF’s Scholarship Fund, which supports the Elise Berkower Memorial Fellowship and the Christopher Wolf Diversity Fellowship.

Interested in sponsoring?

Sponsorship opportunities are available and may be found here. For additional sponsorship opportunities for the 10th Anniversary Celebration, contact Barbara Kelly, Leadership Director, at [email protected].

The Future of Ad Tech: A Discussion with FPF's Stacey Gray

Almost everyone has had the experience of visiting a website to shop for a product and then having an advertisement for that product “follow” them around the internet. Most free content today, from social media to news, is funded by ads. To deliver those ads and measure their effectiveness, companies rely heavily on data-driven technology (“ad tech”).

Stacey Gray is FPF’s lead Policy Counsel on ad tech and location services. Prior to joining FPF as a Policy Fellow in 2015, Stacey was at the Georgetown University Law Center, primarily focused on civil rights litigation and technology. In this week’s 10th Anniversary post, she traces the evolution of ad tech from the simple cookies of the past to the personalized billboards we might see in the future.


How has advertising technology changed over the past 10 years?

Probably the biggest change in the last decade has been the adoption of smartphones. It’s cliché, but it’s hard to overstate how everyone carrying a pocket supercomputer with 20+ sensors on it has changed the way advertisers think about ad campaigns. Conversations 10 years ago were limited to the online and desktop world—how websites placed “cookies” on browsers and whether ad networks should be able to track people across different websites. Now advertising technology is focused on reaching individuals in hyper-specific contexts (the right person at the right time and place) and measuring effectiveness by finding ways to link behavior across many devices and platforms.

Another way to describe it is that there’s been a blurring of online and offline worlds. This is enhanced by the rise of the so-called “Internet of Things,” meaning that we’re surrounded by connected devices—cars, appliances, toys, TVs, speakers—that collect and share information. Location-based marketing has also grown tremendously, with companies interested in geo-fencing content (targeting ads to a particular area) as well as using location data to measure things like whether people visit stores after seeing an advertisement. So there has been an explosion in the amount of data available about people and devices. In some ways though, the more things change, the more they stay the same. Advertisers still have the same basic goals—to serve ads to people and to measure whether the ads were effective in driving sales or visits. They are just doing it across far more channels and platforms than ever before.
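To make the geo-fencing idea concrete, here is a minimal sketch, in Python, of the kind of point-in-radius test a location-based marketing platform might run before serving a targeted ad. The coordinates, radius, and function names are hypothetical, not any particular vendor’s API.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_geofence(device_lat, device_lon, fence_lat, fence_lon, radius_m):
    """True if the device's reported location falls inside the circular fence."""
    return haversine_m(device_lat, device_lon, fence_lat, fence_lon) <= radius_m

# Hypothetical 500 m fence around a store in downtown Washington, DC.
print(in_geofence(38.9047, -77.0319, 38.9072, -77.0349, 500))  # True (~380 m apart)
```

The same distance test, run against stored location histories rather than a live ad request, is essentially how store-visit measurement works as well.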

How has FPF made sense of these changes, and what practices has it recommended?

We primarily convene privacy leaders who work on ad tech to level-set and figure out best practices. The FPF location & ad tech working group has 300+ participants, and a lot of good work happens behind the scenes. When we comment publicly, it’s usually to try to bridge the gaps in understanding between industry, advocates, and policymakers, or to identify privacy tensions in new technologies.

For example, we published a report in 2015 on cross-device tracking, explaining how and why companies are so interested in linking devices belonging to the same person. We’ve analyzed the state of privacy practices in smart TVs and published narrower guidance on topics like session replay scripts, Apple software updates, and Bluetooth beacons. Advertising technology often raises issues that go beyond privacy, too. For example, advertisers have to ensure that they are not unfairly or harmfully discriminating against people on the basis of class, race, or gender. Advocates worry about “filter bubbles,” transparency in political advertising, and other potential harms related to algorithmic decision-making about online content.

What other developments in this space are currently being debated?

The biggest debate in ad tech right now is how it will be meaningfully regulated. For a long time, ad tech mostly self-regulated with codes of conduct through organizations like the Digital Advertising Alliance (DAA), Network Advertising Initiative (NAI), and others.

In the last couple of years, by contrast, we’ve seen a massive proliferation of laws and regulations that specifically impact ad tech. Most importantly, European regulators are beginning to enforce the General Data Protection Regulation (GDPR) against ad tech companies and in some cases are being asked to evaluate the legality of the ecosystem as a whole. In the United States, the California Consumer Privacy Act (CCPA), which comes into effect next year, will require companies to allow people to opt out of the “sale” of their data, which will have a major impact on typical online and mobile advertising business models. Meanwhile, many other states are following California’s lead and introducing privacy legislation to be considered in 2019, not to mention the ongoing debate in Washington, DC, over a baseline, comprehensive national privacy law. There is a great deal ahead in 2019, and the opportunity (and challenge) for FPF is to combine industry knowledge with support for strong privacy safeguards to help shape effective regulation.

What changes in advertising technology do you expect to see in the next decade?

Advertising will continue to merge the online and offline worlds. I expect digital billboards using “intelligent video analytics” to be a major trend over the next 5 to 10 years. This is already beginning to happen. Last year in New York City, for example, there was a Fashion Week ad campaign using digital signs with video cameras that applied AI to analyze people’s outfits as they walked by and compliment them for wearing particularly interesting ensembles. Walgreens has also started using facial characterization technology in its stores to offer ads based on whether the visitor is perceived to be male or female or a certain age. This kind of real-time video analytics may not always impact privacy, but may nonetheless change the way we think about public spaces.

I also think we will see a major shift in the way we understand revenue models for publishers. Ad-supported business models have generated a lot of free content, which can be good and is often seen as “democratizing,” but in many ways that free content has come at the cost of data privacy. As the regulatory landscape shifts, I expect we will see new ideas and models for things like micro-payments or other new ways for people to support quality content. That said, we want to make sure that we don’t create an income divide in who has access to things like quality news and that we’re equally addressing issues of fairness and equal access to digital products and services.

Finally, it seems clear that we’re at the beginning of a major growth in academic and private support for public interest technology. I’m proud of my alma mater, Georgetown University Law Center, for recently joining with 20 other leading universities in forming a Public Interest Technology University Network to solve problems at the intersection of law and technology. The field of privacy-enhancing technologies (“PETs”) is also really promising. As we look ahead, I hope FPF’s unique knowledge and voice in the privacy space will continue to be helpful in supporting this kind of technical research and scholarship.



FPF Welcomes Mark MacCarthy as Senior Fellow

The Future of Privacy Forum is pleased to announce the addition of Mark MacCarthy as a Senior Fellow. MacCarthy will work with FPF staff and members on data ethics, artificial intelligence and other issues.

“Mark has been a leading champion for responsible data practices and has worked to incorporate thinking from academia and civil society into policymaking and industry standards,” said FPF CEO Jules Polonetsky. “He was a key partner in working with SIIA and FPF to develop the Student Privacy Pledge, a commitment adopted by more than 350 companies. We and our cross-sector stakeholders look forward to his advice and scholarship on the challenging tech policy issues ahead.”

MacCarthy is an adjunct professor at Georgetown University, where he teaches courses in technology policy in the Communication, Culture, and Technology Program and courses on privacy and AI ethics in the Philosophy Department. He is also a Senior Fellow at the Institute for Technology Law and Policy at Georgetown Law and a Senior Policy Fellow at the Center for Business and Public Policy at Georgetown’s McDonough School of Business. Previously, he was Senior Vice President for Public Policy at the Software & Information Industry Association, where he directed initiatives and advised member companies on technology policy, privacy, AI ethics, content moderation and competition policy in tech.

FPF Comments on the California Consumer Privacy Act (CCPA)

On Friday, the Future of Privacy Forum submitted comments to the Office of the California Attorney General (AG), Xavier Becerra.

In FPF’s outreach to the AG, we commended the office for its multi-faceted solicitation of feedback from diverse stakeholders and the public in recent months, including through public forums, testimony before the California Assembly, and requests for comments.

Specifically, we wrote to:

1. Commend the State of California for addressing important data protection rights, including transparency, access, deletion, and reasonable security, for personal information. California has long been a leader in data privacy, and in the last year has served as a legislative model for other states and sparked a serious national conversation regarding a federal privacy law. While FPF supports a strong, comprehensive, baseline federal privacy law, we believe that states that do advance legislation should do so in ways that provide consumers with comprehensive protections that are in line with the Fair Information Practice Principles (FIPPs) and take into account interoperability with the EU General Data Protection Regulation (GDPR).

2. Recommend that rule-making efforts recognize that data exists on a spectrum of identifiability. While some data is firmly linked to an individual or provably non-linkable to a person, significant amounts of data exist in a gray area — obfuscated but potentially linkable to an individual under some circumstances. We recommend that the AG take account of this spectrum of identifiability and provide incentives for companies to de-identify data using technical, legal, and administrative measures (one such technical measure is sketched after this list).

3. Encourage further analysis of the impact of CCPA on socially beneficial research by non-HIPAA entities. Although CCPA excludes health data regulated by the Health Insurance Portability and Accountability Act (HIPAA) and related laws, its provisions govern private companies that may choose to conduct socially beneficial research using non-HIPAA data, including: consumer wearable manufacturers; health-related mobile apps; and genetic testing companies. While these companies should surely be subject to data privacy rules, we recommend that the AG take a close look at specific areas where beneficial research can be enabled or facilitated, or where restrictive requirements may pose particular challenges for researchers.

4. Encourage the AG to establish guidelines for data subject access requests (DSARs) that are secure, practical, and meaningful for consumers. The right to access one’s personal information is a fundamental tenet of the FIPPs, as well as a central feature of privacy laws in the United States and around the world. At the same time, there are inherent risks for some businesses in complying with DSARs, and often a direct tension between access rights and other important privacy safeguards. Ultimately, access requests should be secure, practical for businesses, and meaningful for consumers.

5. Recommend greater clarity on the intersection of CCPA and existing student privacy laws governing education technology vendors. For the benefit of schools, administrators, and education technology (“edtech”) vendors, the AG should clarify key points of CCPA that are applicable to education and student privacy, including: edtech vendors’ CCPA obligations (if any) when they act solely on behalf of public schools or districts; the circumstances under which edtech vendors may be considered “service providers” under the law; and alternately, how edtech vendors may navigate compliance obligations of CCPA in line with federal laws governing student records and California’s existing student privacy laws.
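As a concrete illustration of the technical measures mentioned in point 2, below is a minimal sketch, in Python, of keyed pseudonymization: one common way to move data along the spectrum of identifiability by replacing direct identifiers with tokens that cannot be reversed without the key. The field names, sample record, and key handling are hypothetical.

```python
import hmac
import hashlib

SECRET_KEY = b"rotate-and-store-me-securely"  # hypothetical; manage via a KMS in practice

def pseudonymize(value: str, key: bytes = SECRET_KEY) -> str:
    """Replace a direct identifier with a keyed, non-reversible token (HMAC-SHA256)."""
    return hmac.new(key, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

record = {"email": "[email protected]", "zip": "94103", "purchase": "book"}

# Drop or tokenize direct identifiers; keep only the fields needed for analysis.
deidentified = {
    "user_token": pseudonymize(record["email"]),
    "zip3": record["zip"][:3],   # coarsen quasi-identifiers (5-digit ZIP -> ZIP3)
    "purchase": record["purchase"],
}
print(deidentified)
```

A keyed hash (HMAC) rather than a plain hash matters here: without the key, tokens cannot be recomputed from guessed identifiers. Even so, re-identification risk depends on the remaining quasi-identifiers, which is why the comments recommend pairing technical measures with legal and administrative ones.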

We also attached a list of relevant resources, including FPF publications on a variety of commercial privacy topics that may be of interest to the AG. We hope that our comments and the associated resources will be helpful to the important, ongoing discussion regarding consumer privacy in the State of California.

Privacy in Higher Education: A Conversation with Sara Collins

Innovation in higher education is increasingly fueled by data. From financial aid applications, to online classes, to student success initiatives, college students provide an extraordinary amount of data to schools, companies, and the government. This data provides unprecedented insights into student behavior, and colleges are using it to shape curricula, processes, and services to meet students’ needs.

In this week’s edition of FPF at 10, Sara Collins explains how data continues to transform the higher ed landscape, and why sound privacy practices are needed to ensure a safe, enriching academic experience.

Our 10th anniversary celebration will be on April 30. RSVP here.


Sara Collins is a Policy Counsel on FPF’s Education Privacy Project focused on higher education. While K-12 student privacy is the subject of much proposed legislation and public debate, less attention is focused on higher ed, which Collins hopes to change.

What drew you to the world of higher ed privacy? Which questions were you interested in exploring?

Prior to joining the Education Privacy Project, I was an attorney at the Department of Education’s Federal Student Aid in the Enforcement Office, and prior to that I was the Legal Director at Veterans Education Success. Both of my prior jobs involved analyzing how well colleges were serving their students, and that could only be done with comprehensive and accurate data. Our higher education system is fueled by data, and one persistent question that has shaped my career is, “How can we use this data to improve students’ education experience?” FPF recognizes that using data can change higher education and make it more beneficial for students, but also wants to answer the question: “How do we meaningfully protect student data?”

Why is it important to think of college students as a distinct population when it comes to data privacy?

Colleges and universities have massive amounts of data on their students. We don’t talk about this enough, but most modern college campuses have turned into miniature smart cities. Not only are schools increasingly using Massive Open Online Courses (MOOCs), which depend on data collection, often by third-party vendors, to run efficiently, but the infrastructure of campuses is full of new systems that collect data. When used correctly, this information can assist colleges in determining what capital improvements to make, managing traffic on campus, or improving graduation rates. But school administrators and third-party contractors must be aware of the challenge of collecting that data in a responsible manner and safeguarding it once it is collected. As we move forward, FPF hopes to put out more resources to help both schools and ed tech vendors manage these systems with an eye toward privacy and security.

How is FPF helping to provide these resources, and what privacy developments in higher ed will unfold in the next 10 years?

We regularly hold student privacy bootcamps for small and startup ed tech companies to provide training on privacy laws, best practices, and advocates’ concerns. We also run a higher education privacy working group to explore privacy concerns related to predictive analytics, big data, ethics, and data infrastructure in higher education. FERPA|SHERPA has other resources for higher ed officials, companies, and policymakers.

As far as the future is concerned, I imagine we’ll see major developments with regard to Title IX proceedings. These proceedings often involve highly sensitive data, raising the question of how to balance due process, privacy, and transparency. And with the Department releasing new draft regulations, this conversation will only become more important.

Beyond Title IX, I think universities, companies, and the government will have to grapple with what privacy is and why it matters. For instance, federal initiatives like FSA Next Generation and the Pell Grant program mandate data collection. However, Federal Student Aid has not provided much transparency into its data collection, use, retention, and destruction practices. In April of 2017, FSA experienced a hack of its online FAFSA tool, which took months to resolve. The lack of communication from FSA, as well as its cavalier attitude toward the data it collects, has forced students to choose between getting money for their education and safeguarding their data.

As universities and colleges continue to grapple with how to best integrate technology in education, they will need to determine how to measure privacy harms, how to centralize administration of their sprawling data collection systems, and, above all, how to harness the potential of new technologies to enrich the lives of students under their care and instruction.