FPF Welcomes the 2019 Class of Policy Fellows

FPF is pleased to announce the selection of its 2019 Policy Fellows: Katelyn Ringrose, Charlotte Kress, and Anisha Reddy. Working at FPF for one- or two-year terms, Fellows are key members of the FPF policy team. Fellows focus on consumer and commercial privacy issues, from technology-specific areas such as drones, wearables, connected cars, and student privacy, to general data management and privacy issues related to ethics, deidentification, algorithms, and the Internet of Things. Their roles may include filing comments on proposed regulatory actions, researching and analyzing US and European privacy issues, developing industry best practices or standards, and tracking consumer privacy legislation.

The Christopher Wolf Diversity Law Fellow is Katelyn Ringrose. This two-year fellowship is distinguished by its commitment to bringing diverse perspectives to FPF’s work on contemporary privacy issues. Ringrose comes to this position as a recent graduate of the University of Notre Dame Law School, where she was the Faculty Awardee for Excellence in Advanced Legal Research and the President of the LGBT Law Forum. She has published law review articles on facial recognition and body-worn cameras, government surveillance, and data collection in schools, among other topics.

While in law school, Ringrose founded Impowerus, an online company connecting juvenile immigrants to pro bono legal aid that was chosen as one of the top eight student tech startups in the U.S. by Inc. Magazine. She interned for the U.S. Department of Justice, the Washington State Attorney General’s Office, the Public Defender’s Office of the Juvenile Justice Center in South Bend, IN, and the Hawaii State Supreme Court. Prior to attending law school, she was a teacher in Tacoma, WA.

“Katelyn brings experience as an entrepreneur, an academic, and a public servant to her fellowship at FPF,” said Christopher Wolf, FPF Founder and Board President. “We’re excited that she will be sharing her unique perspective and extraordinary talents with our staff, supporters, and partners.”


Charlotte Kress is the Elise Berkower Memorial Fellow. This Fellowship is dedicated to the memory of Elise Berkower, a leader in the consumer privacy field who was known for her ability to identify and nurture young lawyers and for her focus on consumer protection and business ethics. Kress graduated from The George Washington University Law School, where she was a member of the Federal Circuit Bar Journal and the Society for European Law Students and volunteered with the International Privacy and Security Forum. A fluent German speaker, Kress undertook post-graduate studies at Heidelberg University in Germany after her graduation from Bucknell University. Kress served as a Legal Intern at the Office of General Counsel for the District of Columbia Housing Authority and at the law firms DLA Piper and Sedgwick.

“Charlotte’s energy and enthusiasm for legal scholarship, collaboration, and ethical negotiation make her a fitting recipient of the Elise Berkower Memorial Fellowship,” said Howard Berkower, Partner at McCarter & English and brother of Elise Berkower. “Her international experience, drive, and commitment to the continued development of technology and privacy laws will be a great asset for FPF.”

FPF gratefully acknowledges the Nielsen Foundation, the Berkower family, IAPP and friends of Elise as founding sponsors of the Elise Berkower Memorial Fellowship.


Anisha Reddy is the inaugural FPF Education Privacy Fellow. This newly created position will focus on expanding FPF’s library of resources on student privacy issues, as well as tracking student- and child-specific privacy legislation at the state and federal levels. Reddy will also help process applicants to the Student Privacy Pledge. Reddy will set a high bar for this role. At Penn State’s Dickinson Law, Reddy was honored with the University’s 2017-2018 Montgomery and MacRae Award for Excellence in Administrative Law. She held the offices of Executive Editor for Digital Media of the Dickinson Law Review, President of the Asian Pacific Law Students Association, and Vice President of the Women’s Law Caucus. Reddy served as a Certified Legal Intern for the Children’s Advocacy Clinic in Carlisle, PA, where she represented children involved in civil court actions such as adoption, domestic violence, and custody matters. She previously interned at the Governor of Pennsylvania’s Office of General Counsel, at Udacity in Mountain View, CA, and at Blockchain, Inc. in New York, NY.

“We are thrilled to welcome Anisha to the FPF Education Privacy team,” said Amelia Vance, Senior Policy Counsel and Director of the FPF Education Policy Project. “The breadth and variety of her work experience and legal scholarship will be an asset when engaging with the variety of stakeholders in the education community.”

FPF is looking forward to welcoming Ringrose, Kress, and Reddy to the team in September. These three talented young lawyers will continue a strong tradition of FPF Policy Fellows who make exceptional contributions to the privacy community during their fellowships and throughout their careers.


FPF seeks to continue supporting the next generation of privacy leaders through our fellowships. Please click the button below to make your contribution in support of the Elise Berkower Memorial Fellowship and the Christopher Wolf Diversity Law Fellowship.

DONATE

FPF and IAF Release "A Taxonomy of Definitions for the Health Data Ecosystem"

Healthcare technologies are rapidly evolving, producing new data sources, data types, and data uses, and driving faster, more complex data sharing. Novel technologies, such as artificial intelligence tools and new Internet of Things (IoT) devices and services, are providing benefits to patients, doctors, and researchers. Data-driven products and services are deepening patients’ and consumers’ engagement and helping to improve health outcomes. Understanding the evolving health data ecosystem presents new challenges for policymakers and industry. There is an increasing need to better understand and document the stakeholders, the emerging data types, and their uses.

The Future of Privacy Forum (FPF) and the Information Accountability Foundation (IAF) partnered to form the FPF-IAF Joint Health Initiative in 2018. Today, the Initiative is releasing A Taxonomy of Definitions for the Health Data Ecosystem; the publication is intended to enable a more nuanced, accurate, and common understanding of the current state of the health data ecosystem. The Taxonomy outlines the established and emerging language of the health data ecosystem, including definitions for the stakeholders and data types that make up the current landscape.

This report is an educational resource that will enable a deeper understanding of the current landscape of stakeholders and data types. We hope it will be valuable as a common reference language for the evolving health ecosystem. This is particularly important as organizations take on more data governance projects, take inventory of the data flowing into and out of their organizations, and participate in complex data exchanges. We intend the Taxonomy to be used to create consistent data collection and use models across the healthcare ecosystem. Establishing common, shared terminology is particularly useful as state privacy laws and pending Congressional proposals seek to codify a comprehensive consumer privacy framework in the United States; these proposals often include provisions that would require organizations to undertake data mapping and inventory activities.

For additional information about this Taxonomy or the Joint Health Initiative work between the Future of Privacy Forum and Information Accountability Foundation, please contact Stan Crosley ([email protected]) and John Verdi ([email protected]).

READ THE TAXONOMY

As Legislators Debate Ad Tech, Browsers and Operating Systems Announce New Technical Controls

Congress continues to hold data privacy hearings, including yesterday’s “Understanding the Digital Advertising Ecosystem and the Impact of Data Privacy and Competition Policy.” The continued debate over adtech practices is reaching a crescendo, strengthening the case for quick action on a comprehensive federal privacy law that can set parameters for how personal data is collected, used, and shared, both for adtech and for the many other ways companies of every sort use data. But any law would do well to incentivize technical solutions to privacy challenges rather than rely solely on legal commitments, as FPF CEO Jules Polonetsky argued in his recent testimony before the Senate Commerce Committee. There is no question that well-crafted laws and rules can give consumers important rights and provide companies with clarity about their obligations. It is also important to recognize that technical solutions continue to play an integral role in improving consumers’ privacy. As policymakers craft new privacy protections in law, they should be mindful that both legal and technical safeguards are necessary to ensure strong consumer protections.

We have seen many examples of technological solutions bolstering or otherwise supplementing legal protections.

Recent developments by leading companies are a welcome step in this direction. Some of the most recently announced privacy updates from Google, Apple, and Mozilla include:

Google Privacy Updates

At Google’s annual I/O developer conference, the company announced updates to its products and services, including changes to the Chrome browser, that will provide users with enhanced privacy controls.

Apple Privacy Updates

Apple recently announced updates to its Intelligent Tracking Prevention (ITP) feature and proposed a new privacy-conscious technological solution to allow for ad click attribution.

We anticipate additional privacy announcements at Apple’s upcoming WWDC conference in early June.

Mozilla Privacy Updates

Mozilla has long been a leader in developing and implementing technical privacy solutions. Mozilla’s Firefox was the first browser to implement Do Not Track and one of the first browsers to block third-party cookies by default. Mozilla has released several technologies and policies this year to strengthen user privacy protections.

Conclusion

It will take a combination of solutions to address consumer privacy issues. As Congress debates federal privacy legislation, policymakers should bear in mind the importance of technical safeguards in protecting consumers’ data. Lawmakers should consider ways that a baseline, comprehensive privacy law could create incentives for organizations to develop and implement technical safeguards that align with consumers’ privacy expectations.

Understanding Artificial Intelligence and Machine Learning

The opening session of FPF’s Digital Data Flows Masterclass provided an educational overview of Artificial Intelligence and Machine Learning – featuring Dr. Swati Gupta, Assistant Professor in the H. Milton Stewart School of Industrial and Systems Engineering at Georgia Tech; and Dr. Oliver Grau, Chair of ACM’s Europe Technology Policy Committee, Intel Automated Driving Group, and University of Surrey. To learn more about the Basics of AI/ML and how Bias and Fairness impact these systems, watch the class video here.

Understanding AI and its underlying algorithmic processes presents new challenges for privacy officers and others responsible for data governance in companies ranging from retailers to cloud service providers. In the absence of targeted legal or regulatory obligations, AI poses ethical and practical challenges for companies that strive to maximize consumer benefits while preventing potential harms.

In conjunction with this class, FPF released The Privacy Expert’s Guide to AI and Machine Learning. Covering much of the course content, this guide explains the technological basics of AI and ML systems at a level of understanding useful for non-programmers, and addresses certain privacy challenges associated with the implementation of new and existing ML-based products and services.

The Digital Data Flows Masterclass is a year-long educational program designed for regulators, policymakers, and staff seeking to better understand the data-driven technologies at the forefront of data protection law & policy. The program features experts on machine learning, biometrics, connected cars, facial recognition, online advertising, encryption, and other emerging technologies.

Personal Data and the Organization: Stewardship and Strategy

Personal data – used lawfully, fairly, and transparently – is central to helping organizations achieve their missions. Today, Boards of Directors, CEOs, policymakers, and others need to understand the wide range of data inputs, the broad scope of risks and benefits, and how privacy and ethics are at the center of an organization’s ability to fulfill its leaders’ vision. Traditionally, privacy was considered a legal and compliance matter, but now it is a fundamental concern because of emerging issues such as advertising practices, content standards, global data flows, concerns about civil rights, law enforcement cooperation, ethics, community engagement, research standards, and more. With these trends in mind, FPF has created an infographic that illustrates these dynamics.

We hope you will find this useful in communicating with colleagues, students, leaders, and policymakers.

Personal Data and the Organization: Stewardship and Strategy

FPF Research Coordination Network Helps Academic Stars Connect with Private Sector Privacy Pros at IAPP

The IAPP Global Privacy Summit convened many of the best minds in privacy from industry, government, and civil society. To add to the mix, FPF brought six academic stars to the Summit to share their privacy research and insights, thanks to the Applied Privacy Research Coordination Network (RCN).

Supported by the National Science Foundation, the FPF RCN connects academic researchers with industry practitioners to provide structured networking, research, and working partnership opportunities. The goal of the RCN is to help bring the best academic privacy research into practice.

Based on the strong attendance at the academics’ presentations, the many thoughtful discussions they had with practitioners at the FPF booth, and feedback from IAPP attendees, their participation was very successful. We look forward to helping advance collaborations that bring the best privacy tools and advances from researchers to the private sector. If you would like to learn more about the work of any of the academics listed below, or if you are an academic who would like to join the RCN, please email FPF’s Elizabeth Lyttleton at [email protected].

Dr. Swati Gupta

Dr. Gupta is an Assistant Professor in the Center for Machine Learning and the School of Industrial and Systems Engineering at the Georgia Institute of Technology. In her paper, “Individual Fairness in Hindsight,” Dr. Gupta and her co-author explore how to ensure that the treatment of individuals under algorithms is demonstrably fair under reasonable notions of fairness. Ensuring similar treatment of similar individuals is impossible when a machine learning model is still trying to learn the parameters of the world. The authors introduce the notion of fairness-in-hindsight, where individual fairness is defined relative only to past algorithmic decisions, in a similar spirit as a legal precedent. Requiring fairness in hindsight might be a reasonable safeguard against unfair discrimination in algorithmic deployments, without hindering the ability to learn good decisions by machine learning models.

To read the full paper, click here.

Norman Sadeh


Dr. Sadeh is a pioneer of Privacy-Enhancing Artificial Intelligence technologies. He is a professor of Computer Science at Carnegie Mellon University, where he also co-directs the first-of-its-kind Master’s Program in Privacy Engineering. In “Privacy in the Age of the Internet of Things,” Dr. Sadeh presents a new privacy infrastructure developed to publicize the presence of Internet of Things (IoT) devices and services. Under this infrastructure, Personalized Privacy Assistants can help improve user awareness and expose relevant opt-in/opt-out functionality to users.

To see slides for Dr. Sadeh’s talk, click here.

Dr. Hassan Takabi


Dr. Takabi is an Assistant Professor in the Department of Computer Science and Engineering at the University of North Texas. In his paper “Privacy-preserving Machine Learning as a Service,” Dr. Takabi and his co-authors develop a framework based on a novel combination of homomorphic encryption techniques and machine learning that enables secure, privacy-preserving deep learning. The framework includes new techniques to adopt deep neural networks within the practical limitations of current homomorphic encryption schemes and provides solutions for applying deep neural network algorithms to encrypted data.

Read the full paper here.
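The core primitive such frameworks build on, additively homomorphic encryption, can be illustrated with a toy Paillier cryptosystem. This is a minimal sketch with deliberately tiny, insecure parameters; the paper’s actual framework layers deep neural networks on top of far more involved constructions.

```python
import math
import random

# Toy Paillier cryptosystem. The parameters are tiny and insecure on
# purpose; real deployments use primes of 1024+ bits.
p, q = 11, 13
n = p * q                     # public modulus
n2 = n * n
g = n + 1                     # standard choice of generator
lam = math.lcm(p - 1, q - 1)  # Carmichael's lambda(n)
mu = pow(lam, -1, n)          # lambda^-1 mod n (private key component)

def encrypt(m: int) -> int:
    """Encrypt m < n as c = g^m * r^n mod n^2, with r random and coprime to n."""
    r = random.choice([x for x in range(2, n) if math.gcd(x, n) == 1])
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Recover m = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts, so a
# server can aggregate encrypted values without ever decrypting them.
c1, c2 = encrypt(5), encrypt(7)
assert decrypt((c1 * c2) % n2) == 12
```

The homomorphic step is the last line: the server computes on ciphertexts only, which is what makes “machine learning as a service” possible without exposing the underlying data.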

Om Thakkar

Thakkar is a fifth-year Ph.D. candidate in the Security group and the Theoretical Computer Science group in the Department of Computer Science at Boston University. “Towards Practical Differentially Private Convex Optimization” makes two major contributions to the literature. First, it develops a novel algorithm, Approximate Minima Perturbation, that can leverage any off-the-shelf optimizer. Second, it contains an extensive empirical evaluation of the state-of-the-art algorithms for differentially private convex optimization, on a range of publicly available benchmark datasets (including some high-dimensional datasets) and real-world datasets obtained through an industrial collaboration.

Read more about the research here.
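The flavor of differentially private convex optimization can be conveyed with a generic gradient-perturbation sketch. To be clear, this is not the paper’s Approximate Minima Perturbation algorithm, and the noise scale below is illustrative rather than derived from a formal (epsilon, delta) privacy accounting.

```python
import random

random.seed(7)

# Gradient descent on a 1-D least-squares objective, with Gaussian noise
# added to each gradient before the update. The noise level is purely
# illustrative and carries no formal privacy guarantee.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 8.0]   # roughly y = 2x

def noisy_gradient_descent(steps=200, lr=0.01, sigma=0.05):
    w = 0.0
    for _ in range(steps):
        grad = sum(2 * x * (w * x - y) for x, y in zip(xs, ys)) / len(xs)
        w -= lr * (grad + random.gauss(0.0, sigma))  # perturbed step
    return w

w = noisy_gradient_descent()
print(round(w, 2))  # lands close to the least-squares slope of ~2.02
```

Because the objective is convex, the perturbed iterates still converge to a neighborhood of the true minimizer; real algorithms calibrate sigma to the gradient sensitivity and the privacy budget.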

Ashwin Machanavajjhala

Machanavajjhala is an Associate Professor and Director of Graduate Studies in the Department of Computer Science at Duke University. He co-developed Ektelo, an open-source programming framework and system that helps programmers develop differentially private programs with high utility. Ektelo can be used to author programs for a variety of statistical tasks that involve answering counting queries over a table of arbitrary dimension. Benefits of Ektelo include privacy for free, the ability to express complex privacy programs that permit high accuracy, transparency of privacy algorithms, and flexibility and modularity of the programming framework.

Learn more about Ektelo on Github.
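The basic primitive behind such counting queries is the Laplace mechanism, sketched minimally below. Ektelo itself layers operators, query plans, and accuracy optimizations on top of primitives like this one; the dataset and query here are made up for illustration.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) via the inverse CDF of a uniform draw."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    """Answer a counting query with epsilon-differential privacy.

    A count has sensitivity 1 (adding or removing one record changes it
    by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

ages = [34, 29, 41, 52, 38, 27, 45]
print(private_count(ages, lambda a: a >= 40, epsilon=1.0))  # noisy value near 3
```

Each call returns a different noisy answer; averaged over many hypothetical releases the answers center on the true count, which is the accuracy/privacy trade-off frameworks like Ektelo are designed to manage.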

Stacey Truex

Truex is a Ph.D. student in the School of Computer Science at the Georgia Institute of Technology. She co-authored “Fast, Privacy Preserving Linear Regression over Distributed Datasets based on Pre-Distributed Data.” This work proposes a protocol for performing linear regression over a dataset that is distributed over multiple parties. The parties are able to jointly and accurately compute a linear regression model without actually sharing their own private datasets. The paper also introduces and investigates the insider membership inference threat, defined as a membership inference attack launched by a member of a federated learning system against other participants in a collaborative learning environment.

Read the full paper here.
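A core building block of such multi-party protocols is additive secret sharing, sketched below for a joint sum. The paper’s actual protocol uses pre-distributed data to compute a full linear regression, which this toy example does not attempt; the field size and sample values are illustrative.

```python
import random

# Additive secret sharing over a prime field: each party splits its
# private value into random shares, and only the sum of all inputs is
# recoverable. No individual party's value is ever revealed.
P = 2**31 - 1  # a Mersenne prime; all arithmetic is mod P

def share(value: int, n_parties: int):
    """Split value into n_parties random shares that sum to value mod P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def secure_sum(private_values):
    """Party i sends its j-th share to party j; each party publishes only
    the sum of the shares it received, and those partial sums reveal the
    total without exposing any single input."""
    n = len(private_values)
    all_shares = [share(v, n) for v in private_values]
    partials = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]
    return sum(partials) % P

salaries = [52000, 61000, 47000]
print(secure_sum(salaries))  # 160000: the total, with no input disclosed
```

Secure sums of this kind are enough to assemble the aggregate statistics (such as X^T X and X^T y) that a joint linear regression needs, which is why sharing-based protocols can fit a model without pooling the raw datasets.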

Rob van Eijk

Dr. van Eijk received his Ph.D. in January 2019 from Leiden University’s Faculty of Law, where he participated as a dual PhD student in the Leiden University Dual PhD Center program; he works as a senior inspector at the Dutch Data Protection Authority. In “Web Privacy Measurement in Real-Time Bidding Systems. A Graph-Based Approach to RTB System Classification,” van Eijk shows that the privacy component of online advertisements can be measured. By applying mathematical algorithms to network traffic, he integrated data science into legal research; examining the privacy component of RTB from a network-science perspective is new. Dr. van Eijk’s attendance at the Summit was not arranged through the RCN initiative.

You can read the full paper here.

Jules Polonetsky Testifies Before the Senate Commerce, Science, and Transportation Committee

FPF CEO Jules Polonetsky recently testified during the Senate Commerce, Science, and Transportation Committee’s hearing, “Consumer Perspectives: Policy Principles for a Federal Data Privacy Framework.”

The testimony underscored the urgent need for federal legislation. Not only have recent, high-profile cases revealed the extent to which data can be misused, but the benefits of modern technology in areas like mobility, health care, and education cannot be fully realized until there is a clear, federal privacy law on the books.

Read Jules’ full testimony and appendices.