Blog

Understanding Session Replay Scripts – a Guide for Privacy Professionals

March 5, 2018 | Stacey Gray

Researchers at Princeton University’s Center for Information Technology Policy (CITP) have demonstrated that many websites are using third-party tools to track visitors’ individual browsing sessions. “Session replay scripts” can raise serious privacy concerns if implemented incorrectly, but with the right safeguards they can be part of a range of ordinary, useful web analytics tools. FPF has published a 3-page guide to assist privacy professionals in deciding whether and how to implement session replay scripts.

Taming The Golem: Challenges of Ethical Algorithmic Decision-Making

March 2, 2018 | Melanie E. Bates

This article examines the potential for bias and discrimination in automated algorithmic decision-making. As a group of commentators recently asserted, “[t]he accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology.” Yet this article rejects an approach that depicts every algorithmic process as a “black box” that is inevitably plagued by bias and potential injustice.

FPF Welcomes New Senior Fellow

February 8, 2018 | Melanie E. Bates

FPF is pleased to welcome Stanley W. Crosley as a senior fellow. Stan has over 20 years of applied experience in law, data governance, and data strategy across broad sectors of the economy, including multinational corporations, academia, large and boutique law firms, not-for-profit advocacy organizations, and governmental agencies. He is Co-Director of the Indiana University Center for Law, Ethics, and Applied Research in Health Information (CLEAR), Counsel to the law firm Drinker Biddle & Reath in Washington, DC, and Principal of Crosley Law Offices, LLC. Stan is also a Senior Strategist at the Information Accountability Foundation and a Senior Fellow at the Future of Privacy Forum, where he leads health policy efforts.

Consumer Reports Publishes Initial Findings for Privacy and Security of Smart TVs

February 7, 2018 | Stacey Gray

Today, Consumer Reports released its initial findings on the privacy and security aspects of Smart TVs. Applying its Digital Standard (developed with Ranking Digital Rights and other partner organizations), Consumer Reports identified a range of important privacy aspects and potential security vulnerabilities in Smart TVs from five leading manufacturers (Sony, Samsung, LG, TCL, and Vizio).

Seeing the Big Picture on Smart TVs and Smart Home Tech

February 1, 2018 | Stacey Gray

CES 2018 brought to light many exciting advancements in consumer technologies. Without a doubt, Smart TVs, Smart Homes, and voice assistants were dominant: LG has a TV that rolls up like a poster; Philips introduced a Google Assistant-enabled TV designed for the kitchen; and Samsung revealed its new line of refrigerators, TVs, and other home devices powered by Bixby, its intelligent voice assistant.

FPF Publishes Model Open Data Benefit-Risk Analysis

January 30, 2018 | Kelsey Finch

This Report first describes inherent privacy risks in an open data landscape, with an emphasis on potential harms related to re-identification, data quality, and fairness. To address these risks, the Report includes a Model Open Data Benefit-Risk Analysis (“Model Analysis”). The Model Analysis evaluates the types of data contained in a proposed open dataset, the potential benefits – and concomitant risks – of releasing the dataset publicly, and strategies for effective de-identification and risk mitigation.

Examining the Open Data Movement

January 25, 2018 | Kelsey Finch

The transparency goals of the open data movement serve important social, economic, and democratic functions in cities like Seattle. At the same time, some municipal datasets about the city and its citizens’ activities carry inherent risks to individual privacy when shared publicly. In 2016, the City of Seattle declared in its Open Data Policy that the city’s data would be “open by preference,” except when doing so may affect individual privacy. To ensure its Open Data Program effectively protects individuals, Seattle committed to performing an annual risk assessment and tasked the Future of Privacy Forum (FPF) with creating and deploying an initial privacy risk assessment methodology for open data.
