FPF Research Coordination Network Helps Academic Stars Connect with Private Sector Privacy Pros at IAPP
The IAPP Global Privacy Summit convened many of the best minds in privacy from industry, government and civil society. To add to the mix, FPF brought six academic stars to the Summit to share their privacy research and insights, thanks to the Applied Privacy Research Coordination Network (RCN).
Supported by the National Science Foundation, the FPF RCN connects academic researchers with industry practitioners to provide structured networking, research, and working partnership opportunities. The goal of the RCN is to help bring the best academic privacy research into practice.
Based on the strong attendance at the academics’ presentations, the many thoughtful discussions they had with practitioners at the FPF booth, and feedback from IAPP attendees, their participation was a clear success. We look forward to helping advance collaborations that bring the best privacy tools and advances from researchers to the private sector. If you would like to learn more about the work of any of the academics listed below, or if you are an academic who would like to join the RCN, please email FPF’s Elizabeth Lyttleton at [email protected].
Dr. Gupta is an Assistant Professor in the Center for Machine Learning and the School of Industrial and Systems Engineering at the Georgia Institute of Technology. In her paper, “Individual Fairness in Hindsight,” Dr. Gupta and her co-author explore how to ensure that algorithms treat individuals in ways that are demonstrably fair under reasonable notions of fairness. Guaranteeing similar treatment of similar individuals is impossible while a machine learning model is still learning the parameters of the world. The authors therefore introduce fairness-in-hindsight, in which individual fairness is defined relative only to past algorithmic decisions, in a spirit similar to legal precedent. Requiring fairness in hindsight can serve as a reasonable safeguard against unfair discrimination in algorithmic deployments, without hindering a model’s ability to learn good decisions.
To read the full paper, click here.
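The fairness-in-hindsight idea can be sketched in a few lines: a proposed decision is checked only against the record of past decisions, much like consulting precedent. The distance threshold, decision tolerance, and data below are illustrative assumptions, not the paper’s formal definitions:

```python
import math

def similar(x, y, d=0.1):
    """Individuals are 'similar' if their feature vectors are within distance d."""
    return math.dist(x, y) <= d

def fair_in_hindsight(history, x, proposed):
    """Check a proposed decision for individual x against past decisions only:
    similar past individuals must have received a near-identical decision."""
    for past_x, past_decision in history:
        if similar(past_x, x) and abs(past_decision - proposed) > 0.05:
            return False
    return True

# past (features, decision) pairs act as precedent
history = [((0.2, 0.4), 1.0), ((0.9, 0.1), 0.0)]
print(fair_in_hindsight(history, (0.21, 0.41), 1.0))  # True: matches the similar past case
print(fair_in_hindsight(history, (0.21, 0.41), 0.0))  # False: would contradict precedent
```

Note that the check constrains decisions only relative to what has already been decided, so an optimizer remains free to explore in regions with no similar precedent.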
Dr. Sadeh is a pioneer of Privacy-Enhancing Artificial Intelligence technologies. He is a professor of Computer Science at Carnegie Mellon University, where he also co-directs the first-of-its-kind Master’s Program in Privacy Engineering. In “Privacy in the Age of the Internet of Things,” Dr. Sadeh presents a new privacy infrastructure developed to publicize the presence of Internet of Things (IoT) devices and services. Under this infrastructure, Personalized Privacy Assistants can help improve user awareness and expose relevant opt-in/opt-out functionality to users.
To see slides for Dr. Sadeh’s talk, click here.
Dr. Takabi is an Assistant Professor in the Department of Computer Science and Engineering at the University of North Texas. In his paper “Privacy-preserving Machine Learning as a Service,” Dr. Takabi and his co-authors develop a framework based on a novel combination of homomorphic encryption techniques and machine learning that enables secure, privacy-preserving deep learning. The framework includes new techniques for adapting deep neural networks to the practical limitations of current homomorphic encryption schemes and provides solutions for applying deep neural network algorithms to encrypted data.
Read the full paper here.
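To give a flavor of computing on encrypted data, here is a deliberately toy additively homomorphic “scheme” (a one-time pad over integers mod N). It is not the construction used in the paper, and real schemes such as Paillier or CKKS are far more involved; it only illustrates that an aggregate can be computed on ciphertexts alone and decrypted afterward:

```python
import random

N = 2**32  # modulus for the toy scheme

class ToyAdditiveHE:
    """Toy additive scheme: ciphertexts are masked values, and sums of
    ciphertexts decrypt to sums of plaintexts. Illustration only."""
    def __init__(self):
        self.keys = []

    def encrypt(self, m):
        k = random.randrange(N)
        self.keys.append(k)
        return (m + k) % N

    def decrypt_sum(self, c):
        # decrypting an aggregate requires the sum of the keys used
        return (c - sum(self.keys)) % N

he = ToyAdditiveHE()
c = sum(he.encrypt(m) for m in [3, 5, 7]) % N  # addition happens on ciphertexts
print(he.decrypt_sum(c))  # 15
```

Practical schemes also support multiplication up to a limited depth, which is why deep networks must be adapted (e.g., low-degree polynomial activations) to fit within those limits.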
Thakkar is a fifth-year Ph.D. candidate in the Security group and the Theoretical Computer Science group in the Department of Computer Science at Boston University. “Towards Practical Differentially Private Convex Optimization” makes two major contributions to the literature. First, it develops a novel algorithm, Approximate Minima Perturbation, that can leverage any off-the-shelf optimizer. Second, it contains an extensive empirical evaluation of the state-of-the-art algorithms for differentially private convex optimization, on a range of publicly available benchmark datasets (including some high-dimensional datasets) and real-world datasets obtained through an industrial collaboration.
Read more about the research here.
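A simplified relative of Approximate Minima Perturbation is output perturbation: run any off-the-shelf optimizer on a strongly convex objective, then add noise scaled to the solution’s sensitivity. The objective, sensitivity bound, and noise scale below are illustrative assumptions, not the paper’s exact algorithm or calibration:

```python
import random

def gradient_descent(grad, theta=0.0, lr=0.1, steps=200):
    # any off-the-shelf first-order optimizer would do here
    for _ in range(steps):
        theta -= lr * grad(theta)
    return theta

data = [1.2, 0.8, 1.0, 1.1]
lam = 0.5  # L2 regularization keeps the objective strongly convex

# objective: (1/2n) * sum (theta - x)^2 + (lam/2) * theta^2
grad = lambda t: sum(t - x for x in data) / len(data) + lam * t

theta_hat = gradient_descent(grad)
sensitivity = 2.0 / (len(data) * lam)  # illustrative bound for strongly convex ERM
# in practice the noise is scaled by the privacy parameters (epsilon, delta)
private_theta = theta_hat + random.gauss(0.0, sensitivity)
print(round(theta_hat, 4))  # → 0.6833, the non-private minimizer
```

Because the optimizer is a black box, the same recipe works with any solver that reaches an approximate minimum, which is the practical appeal the paper develops rigorously.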
Machanavajjhala is an Associate Professor and Director of Graduate Studies in the Department of Computer Science at Duke University. He co-developed Ektelo, an open-source programming framework and system that helps programmers develop differentially private programs with high utility. Ektelo can be used to author programs for a variety of statistical tasks that involve answering counting queries over a table of arbitrary dimension. Its benefits include privacy for free, the ability to express complex privacy programs that permit high accuracy, transparency of privacy algorithms, and the flexibility and modularity of the programming framework.
Learn more about Ektelo on Github.
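Counting queries like those Ektelo targets are typically answered with the Laplace mechanism, since a count has sensitivity 1 (adding or removing one record changes it by at most 1). Below is a minimal sketch of that primitive with an illustrative epsilon; Ektelo itself composes such operators into far more accurate plans:

```python
import math
import random

def laplace_noise(scale):
    # inverse-CDF sampling of Laplace(0, scale)
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon=1.0):
    # a counting query has sensitivity 1, so noise scale is 1/epsilon
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

ages = [23, 35, 41, 29, 52, 37]
print(private_count(ages, lambda a: a >= 30, epsilon=1.0))  # noisy version of 4
```

Smaller epsilon means more noise and stronger privacy; frameworks like Ektelo manage this trade-off across many queries automatically.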
Truex is a Ph.D. student in the School of Computer Science at the Georgia Institute of Technology. She co-authored “Fast, Privacy Preserving Linear Regression over Distributed Datasets based on Pre-Distributed Data.” This work proposes a protocol for performing linear regression over a dataset that is distributed over multiple parties. The parties are able to jointly and accurately compute a linear regression model without actually sharing their own private datasets. The paper also introduces and investigates the insider membership inference threat, defined as a membership inference attack launched by a member of a federated learning system against other participants in a collaborative learning environment.
Read the full paper here.
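The core idea of jointly computing a statistic without sharing raw data can be illustrated with pairwise additive masking, where each pair of parties shares a random mask that cancels in aggregate. The paper’s protocol relies on pre-distributed correlated randomness for full linear regression; this toy shows only the mask-cancellation trick on a sum:

```python
import random

def pairwise_masks(n_parties, modulus=2**32):
    """Generate pairwise masks: party i adds r, party j subtracts r,
    so all masks cancel when the parties' contributions are summed."""
    masks = [[0] * n_parties for _ in range(n_parties)]
    for i in range(n_parties):
        for j in range(i + 1, n_parties):
            r = random.randrange(modulus)
            masks[i][j] = r
            masks[j][i] = -r
    return masks

def masked_share(value, my_masks, modulus=2**32):
    # each party reveals only its masked value, never the value itself
    return (value + sum(my_masks)) % modulus

secrets = [11, 25, 8]
masks = pairwise_masks(len(secrets))
shares = [masked_share(v, masks[i]) for i, v in enumerate(secrets)]
print(sum(shares) % 2**32)  # 44: the correct sum, with no individual value revealed
```

An insider membership inference attack, by contrast, does not break this arithmetic at all: a malicious participant instead probes the jointly trained model to learn whether a particular record was in another party’s data, which is why the paper treats it as a separate threat.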
Dr. van Eijk received his Ph.D. in January 2019 from the Faculty of Law at Leiden University. Van Eijk participated as a dual Ph.D. student in the Leiden University Dual PhD Center program and works as a senior inspector at the Dutch Data Protection Authority. In “Web Privacy Measurement in Real-Time Bidding Systems. A Graph-Based Approach to RTB System Classification,” van Eijk shows that the privacy component of online advertisements can be measured. By applying mathematical algorithms to network traffic, he brings data science methods into legal research; examining the privacy component of RTB from a network science perspective is a novel approach. Dr. van Eijk’s attendance at the Summit was not arranged through the RCN initiative.
You can read the full paper here.
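The graph-based viewpoint can be sketched as follows: observed requests between domains form a directed graph, and structural measures over that graph characterize the players in the bidding chain. The domains, edges, and in-degree heuristic below are illustrative assumptions, not van Eijk’s actual classification method:

```python
from collections import defaultdict

# hypothetical request log: (source domain, destination domain) pairs
requests = [
    ("publisher.example", "ssp.example"),
    ("ssp.example", "exchange.example"),
    ("exchange.example", "dsp-a.example"),
    ("exchange.example", "dsp-b.example"),
    ("publisher.example", "exchange.example"),
]

# in-degree: how many observed requests each domain receives,
# a crude proxy for how central it is in the RTB chain
in_degree = defaultdict(int)
for src, dst in requests:
    in_degree[dst] += 1

central = max(in_degree, key=in_degree.get)
print(central)  # exchange.example: the most-contacted node in this toy log
```

Richer graph features (edge weights, clustering, reachability) support the kind of system classification the thesis develops from real network traffic.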