
Closer than Apart: Comparing Senate Commerce Committee Bills
Together with the Consumer Online Privacy Rights Act introduced by Senator Cantwell (D-WA), Senator Wicker’s Discussion Draft represents a significant step toward bipartisan negotiations in the Senate. But how do the two bills, one from leading Democrats and one from the Republican Chairman, compare? We find them closer together on most issues than they are apart: a promising sign for bipartisan negotiation.

COPPA Workshop Takeaways
On Monday, the Federal Trade Commission (FTC) held a public workshop focused on potential updates to the Children’s Online Privacy Protection Act (COPPA) rule. The workshop follows a July 25, 2019 notice of rule review and call for public comments on COPPA rule reform. The comment period remains open until December 9, 2019. Senior FTC officials […]

CCPA 2.0? A New California Ballot Initiative is Introduced
On September 13, 2019, the California State Legislature passed the final CCPA amendments of 2019. Governor Newsom is expected to sign the recently passed CCPA amendments into law in advance of his October 13, 2019 deadline. Yesterday, proponents of the original CCPA ballot initiative released the text of a new initiative (The California Privacy […]

New White Paper Explores Privacy and Security Risk to Machine Learning Systems
FPF and Immuta Examine Approaches That Can Limit Informational or Behavioral Harms
WASHINGTON, D.C. – September 20, 2019 – The Future of Privacy Forum (FPF) released a white paper, WARNING SIGNS: The Future of Privacy and Security in an Age of Machine Learning, exploring how machine learning systems can be exposed to new privacy and […]

Warning Signs: Identifying Privacy and Security Risks to Machine Learning Systems
FPF is working with Immuta and others to explain the steps machine learning creators can take to limit the risk that data could be compromised or a system manipulated.

Digital Deep Fakes
The media has recently labeled manipulated videos of people “deepfakes,” a portmanteau of “deep learning” and “fake,” on the assumption that AI-based software is behind them all. But the technology behind video manipulation is not all based on deep learning (or any form of AI), and the videos lumped together as deepfakes actually differ depending on the particular technology used. So while the example videos above were all doctored in some way, they were not all altered with the same technological tools, and the risks they pose, particularly how readily they can be identified as fake, may vary.

Ethical and Privacy Protective Academic Research and Corporate Data
Is edtech helping or hindering student education? What effect does social media have on elections? What types of user interfaces help users manage privacy settings? Can the data collected by wearables inform health care? In almost every area of science, academic researchers are seeking access to personal data held by companies to advance their work. […]

NAI’s 2020 Code of Conduct Expands Self-Regulation for Ad Tech Providers
By Christy Harris, Stacey Gray, and Meredith Richards
As debates over the shape of federal privacy legislation in the United States continue, online advertising remains a key focus of scrutiny in the US Congress, with its recent hearing on digital advertising and data privacy. Amidst these debates, the Network Advertising Initiative (NAI), the leading self-regulatory body […]

Consumer Genetic Testing: A Q&A with Carson Martinez
Carson Martinez is FPF’s Health Policy Fellow. She works on privacy challenges surrounding health data, particularly where it is not covered by HIPAA, as is the case with consumer-facing genetics companies, wearables, mobile health and wellness applications, and connected medical devices. Carson also leads the FPF Genetics Working Group and Health Working Group. How did […]