This article examines the potential for bias and discrimination in automated algorithmic decision-making. As a group of commentators recently asserted, “[t]he accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology.” Yet this article rejects an approach that depicts every algorithmic process as a “black box” that is inevitably plagued by bias and potential injustice.
FPF is pleased to welcome Stanley W. Crosley as a senior fellow. Stanley has over 20 years of applied experience in law, data governance, and data strategy across a broad range of sectors, including roles inside multinational corporations, academia, large law firms and boutique practices, not-for-profit advocacy organizations, and governmental agencies. He is the Co-Director of the Indiana University Center for Law, Ethics, and Applied Research in Health Information (CLEAR), Counsel to the law firm of Drinker Biddle & Reath in Washington, DC, and Principal of Crosley Law Offices, LLC. Stan is also a Senior Strategist at the Information Accountability Foundation and a Senior Fellow at the Future of Privacy Forum, where he leads health policy efforts.
Today, Consumer Reports released its initial findings on the privacy and security aspects of Smart TVs. Applying its Digital Standard (developed with Ranking Digital Rights and other partner organizations), Consumer Reports identified a range of important privacy practices and potential security vulnerabilities in Smart TVs from five leading manufacturers (Sony, Samsung, LG, TCL, and Vizio).
The Future of Privacy Forum is delighted to welcome new interns to our team!
CES 2018 brought to light many exciting advancements in consumer technologies. Without a doubt, Smart TVs, Smart Homes, and voice assistants were dominant: LG has a TV that rolls up like a poster; Philips introduced a Google Assistant-enabled TV designed for the kitchen; and Samsung revealed its new line of refrigerators, TVs, and other home devices powered by Bixby, its intelligent voice assistant.
This Report first describes inherent privacy risks in an open data landscape, with an emphasis on potential harms related to re-identification, data quality, and fairness. To address these risks, the Report includes a Model Open Data Benefit-Risk Analysis (“Model Analysis”). The Model Analysis evaluates the types of data contained in a proposed open dataset, the potential benefits – and concomitant risks – of releasing the dataset publicly, and strategies for effective de-identification and risk mitigation.
Today, the Future of Privacy Forum released its City of Seattle Open Data Risk Assessment. The Assessment provides tools and guidance to the City of Seattle and other municipalities navigating the complex policy, operational, technical, organizational, and ethical standards that support privacy-protective open data programs.
The transparency goals of the open data movement serve important social, economic, and democratic functions in cities like Seattle. At the same time, some municipal datasets about the city and its citizens’ activities carry inherent risks to individual privacy when shared publicly. In 2016, the City of Seattle declared in its Open Data Policy that the city’s data would be “open by preference,” except when doing so may affect individual privacy. To ensure its Open Data Program effectively protects individuals, Seattle committed to performing an annual risk assessment and tasked the Future of Privacy Forum (FPF) with creating and deploying an initial privacy risk assessment methodology for open data.
FPF requested feedback from the public on its proposed Draft Open Data Risk Assessment for the City of Seattle.
The Computers, Privacy and Data Protection (CPDP) conference kicks off this week in Brussels, and the theme this year is “The Internet of Bodies”. The conference will gather 400 speakers across 80 panels to set the stage for the privacy and data protection conversation in Europe for 2018. And this is such an important year for data protection: not only does the General Data Protection Regulation become applicable in May, but the text of the new ePrivacy Regulation will also likely be finalized.