FPF has convened a leading group of its members to consider priority areas where technologies and companies can address ML privacy and ethics concerns. Our AI and Machine Learning Working Group, composed of FPF member companies with an interest in AI and machine learning privacy and data management challenges, meets monthly to discuss relevant developments, hear from experts on AI in the EU and under the GDPR, examine the occurrence of and defenses against bias, and cover other timely topics.
Posts by Brenda Leong
Beyond Explainability aims to provide a template for effectively managing this risk in practice, with the goal of providing lawyers, compliance personnel, data scientists, and engineers a framework to safely create, deploy, and maintain ML, and to enable effective communication between these distinct organizational perspectives.
Today, the Partnership on AI announced a new group of key stakeholders who will work with the Partnership’s Board of Directors to define and advance a shared vision of artificial intelligence that benefits people and society. The Future of Privacy Forum is proud to join this organization and help drive this important work forward.
The Future of Privacy Forum and the Brussels Privacy Hub of the Vrije Universiteit Brussel (VUB) are partnering with IEEE Security & Privacy in a call for papers focused on AI Ethics: The Privacy Challenge. Selected papers will be featured at The Brussels Privacy Symposium, an academic program jointly presented by the Brussels Privacy Hub of the VUB and FPF’s National Science Foundation supported Research Coordination Network.
Yesterday, Congress introduced the Email Privacy Act (H.R. 387), which would update protections in the Electronic Communications Privacy Act (ECPA) to take account of citizens’ evolving use of technology and better align the law with consumers’ reasonable expectations of privacy in the contents of their email communications.
FPF has produced a checklist to assist parents and schools in considering the “basics” of security standards for new ed tech products and services they may be considering or using. In online security, there is unfortunately no “one size fits all” solution, but with so many products and services available, this checklist is designed to highlight some initial key areas that either meet a basic threshold or might serve as discussion points for further review with the company involved.
We are pleased to announce that we are publishing the FPF Guide to Student Data Protections Under SOPIPA: For K-12 School Administrators and Ed Tech Vendors. Co-written with education privacy experts Linnette Attai of PlayWell LLC, Amelia Vance of the National Association of State Boards of Education, and David B. Rubin, Esq., this document provides an in-depth analysis for ed tech companies.
New America released a report today that addresses the use of data in higher ed analytics – predicting student outcomes and managing university academic programs based on prior data. The growing ability to gather and analyze this data allows colleges to intervene when students struggle, put in place mentoring programs, and create support structures addressing “whole student” welfare, ultimately improving academic outcomes and graduation rates.
Today, Google announced new features that provide users with additional customized options and controls over personal data, as well as easy-to-follow instructions and notifications that explain users’ choices in simple terms. The new features make privacy controls quicker to find and easier to understand and operate.
Data has always been an inherent part of the educational process – a child’s age, correlated with her grade level, tracked to specific reading or math skills that align with that grade, measured by grades and tests that rank her against her peers. Today this data is ever more critical.