A New Voice in Online Privacy

Washington Post

By Kim Hart

November 17, 2008

Page A06

Group Wants Tighter Rules for Collecting, Using Consumer Data.

A group of privacy scholars, lawyers and corporate officials is launching an advocacy group today designed to help shape standards for how companies collect, store and use consumer data for business and advertising.

The group, the Future of Privacy Forum, will be led by Jules Polonetsky, who until this month was in charge of AOL’s privacy policy, and Chris Wolf, a privacy lawyer for law firm Proskauer Rose. They say the organization, which is sponsored by AT&T, aims to develop ways to give consumers more control over how personal information is used for behavioral-targeted advertising.

Internet companies have come under fire for tracking consumers’ online habits in order to tailor ads relevant to their interests. Lawmakers have held several hearings this year to examine online privacy protections.

In the tightening economy, “advertisers are looking increasingly more to data to decide which marketing campaigns will be cut and which will survive,” Polonetsky said. “There’s a rush to make deeper decisions that will impact privacy.”

The Future of Privacy: Our Mission & Agenda

Society is approaching a turning point that could well determine the future of privacy. Policy-makers and business leaders soon will make decisions about technology practices that will either ensure that data is used for the benefit of individuals and society, or take us down a path where we are controlled by how others use our data.

Why do we say this?

Technology is advancing more quickly than ever with respect to data collection, data mining, and the correlation of data across platforms, channels, devices and over time. The use of data is becoming increasingly practical and profitable. Technical limitations to true one-to-one marketing have been overcome, and the expense of data storage is no longer a barrier. As a result, it is now possible for consumers to enjoy the full benefits of data sharing for research, convenience, and personalization. This data is also helping to support universally available and valuable content and services.

Along with these benefits come some challenges. A difficult economy is encouraging businesses to delve more deeply into data that can be used to make more efficient and effective marketing decisions. At the same time, pressure to make unethical decisions may be driven by the challenge of meeting revenue goals in a turbulent market.

The financial system meltdown has, rightly or wrongly, cast significant doubt on the concept of self-regulation and is likely to encourage government engagement in this area. Indeed, it seems that it is no longer enough to say that, “these things are complicated but experts crafted them, so they must be safe.”

A new tech savvy administration is entering office, with the likely entry of new appointees who are steeped in the privacy and tech policy debates. Joining them will be veterans of a campaign that broke new ground in maximizing online data use to connect to its audience. This intersection between privacy and a full appreciation of the value of data may provide an opportunity for policymaking that seeks to balance data use with user controls.

Data regulators in Europe have increased their scrutiny of the practices of US internet companies and have been pressing search engines and social networks to respect European data standards for their platforms. At home, the Federal Trade Commission has proposed behavioral targeting guidelines and continues to examine practices in this area.

Social networks have become ubiquitous, with users providing more personal information than ever. These networks now also serve as platforms for third-party applications that rely on the data of their user bases to provide services. Cloud computing efforts seek to store all the data of users and businesses on central servers. New, robust mobile platforms are integrating geo-location data and are beginning to be able to implement behavioral targeting and tracking.

Also supporting an opportunity for change in business practice is the philosophy of Web 2.0. Increasingly, developers have embraced the point of view that users want control of their experience, including control over their data.

These factors all combine to bring us to a uniquely opportune moment. Individual companies have taken major steps forward. AT&T has committed to an affirmative consent model for behavioral targeting and other ISPs have joined in advocating that model. Yahoo! is collaborating with eBay and Wal-Mart to label ads and expand user choices. Microsoft is adding new privacy features to Internet Explorer, and AOL has launched an educational effort around behavioral targeting. However, there is clearly much more that can be done to create a movement to put trust at the center of decisions about data use.

We believe that if dedicated technologists, policymakers, industry groups and advocates focus on advancing privacy in a manner that businesses can achieve, then privacy, profits and personalization are all possible. Join us to help improve the state of online privacy by advancing responsible data practices.

Our Agenda for Consumers & Businesses

FPF will seek to bring transparency to online data practices. Our plan is to document practices, produce multi-media educational materials, and commission reports and studies that provide consumers and policy makers the real story about how their data is used.

FPF will seek to bring true transparency and user control to behavioral targeting and will broaden the discussion of the ethics and norms that should govern the use of web browsing data.

FPF will seek to ensure that considerations around data retention, limitation, and deletion are a significant part of the consumer privacy debate.

FPF will seek to drive practices that enhance consumer controls – ensuring that data use is obvious, useful, intuitive and put to a benefit the user values and controls – no matter the type of technology used.

FPF will explore opportunities to clarify the definitions of personal data and establish baseline practices about what is accepted as anonymous. But even when data isn’t identifiable, trustworthy practices must be in place whenever data can be used to tailor a user’s experience.

FPF will seek solutions that move beyond the limitations of cookies to improve how privacy-related state is managed.

FPF will seek to highlight the privacy risks and the data protection opportunities presented by new data from technologies such as geo-location, mobile and RFID. There is a limited window to ensure that the deployment of these technologies builds in the kind of controls needed. Already we see examples of leading edge start-ups rushing forward without the needed tools in place.

FPF will help drive online privacy education for consumers and will particularly consider the impacts on teens, users with disabilities and seniors. We will work to develop civic norms applicable to both the data subject and the data user. We need our teens to think twice about the embarrassing disclosures they may make online and to understand that we live in a world where we must manage our own brand and digital persona – but equally, we must teach businesses that making secondary use of data in a manner that disturbs users may be akin to peeking at a diary just because it was left open. We cannot expect the generation that lives virtually to be in a continual state of self-censorship. Users need tools to be able to speak freely, informally and privately without having to worry that it will be used against them.

FPF will advocate for privacy advances that are business practical, but that substantially raise the bar to ensure personal autonomy for all who seek to embrace the benefits of our digital society. We will seek to work with industry, advocates and policymakers to ensure the future of privacy is one where we are not enslaved by our data, but rather where data serves the benefit of humankind.

Studies and Resources

Where does your data go … before you even click

Before You Click

Chris & Jules’s IAPP Cheers & Jeers Panel Presentation

Search and Privacy

Alissa Cooper, at the Center for Democracy and Technology, has just published a superb paper on search log files. She does a great job of walking through all the reasons search engines retain a person’s searches, flags the privacy risks of having those queries sitting around long term, and reviews potential solutions.

As search becomes an increasingly essential part of so many Internet users daily lives, the breadth and depth of information contained in query logs grows to unparalleled levels. As a body of data that can reveal the interests, preferences, search strategies, and linguistic behaviors of entire populations, query logs are a true bounty for research of all kinds, conducted internally, at the search engine companies, and externally, by academics and others.

But the great promise of query logs as a research tool is bound by the privacy risks that arise for some of the very same reasons that the logs are so useful in the first place—the richness of detail that they offer about individuals’ lives.

Achieving the right balance between protecting privacy and promoting the utility of the logs is thus difficult but necessary to ensure that Internet users can continue to rely on Web search without fear of adverse privacy consequences.

We look forward to hosting some frank discussions at the Forum about the risks and rewards of log file retention by search engines as well as ad servers. As Alissa lays out, there is much more companies can do in this area to maintain the functions users want while reducing privacy consequences.
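
To make one of those potential solutions concrete, here is a minimal sketch of our own (not drawn from Cooper’s paper) of a mitigation commonly discussed for query logs: truncating the stored IP address and replacing the raw cookie identifier with a salted hash once a retention window passes. The field names and salt handling below are illustrative assumptions, not any search engine’s actual practice.

```python
# Illustrative sketch only, not taken from Cooper's paper: one commonly
# discussed way to weaken identifiers in retained query logs. The last octet
# of the IPv4 address is zeroed and the persistent cookie ID is replaced by a
# salted one-way hash, so entries can still be grouped for research but are
# harder to tie back to an individual. Field names are assumptions.
import hashlib


def anonymize_log_entry(entry, salt):
    """Return a copy of a query-log entry with its identifiers weakened."""
    anonymized = dict(entry)

    # 192.0.2.123 -> 192.0.2.0: a coarse location signal survives, the host does not.
    octets = entry["ip"].split(".")
    anonymized["ip"] = ".".join(octets[:3] + ["0"])

    # Salted hash: queries from one browser still group together inside this
    # log, but the original cookie value cannot be recovered or joined against
    # other data sets that hold the raw cookie.
    digest = hashlib.sha256((salt + entry["cookie_id"]).encode("utf-8"))
    anonymized["cookie_id"] = digest.hexdigest()[:16]

    return anonymized


if __name__ == "__main__":
    sample = {
        "ip": "192.0.2.123",
        "cookie_id": "raw-cookie-value-abc123",
        "query": "future of privacy forum",
        "timestamp": "2008-11-17T10:00:00Z",
    }
    print(anonymize_log_entry(sample, salt="rotate-this-salt-periodically"))
```

Of course, as the 2006 AOL search data release showed, the queries themselves can identify a person, so measures like these reduce rather than eliminate the risk.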

Mobile Cookies

As people increasingly use mobile devices to access the Internet, online privacy issues arise in new ways and some of the old problems arise in new forms. One of the factors that has limited behavioral targeting across web sites viewed on a standard mobile phone is the lack of a “cookie” that could be used to track the user across the sites they visit. One of the critical privacy issues for mobile marketers is giving users control over the cookies that track their online behavior. Managing tracking tools in a way that leaves users firmly in control is a must if mobile personalization is to succeed. If companies go down the path of telling consumers “we will track you but trust us to do the right thing” without giving consumers control, the model will not work.
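
For readers unfamiliar with the mechanics, the sketch below is our own simplified illustration (not Ringleader’s Media Stamp or any real ad network’s code) of why a cookie-like identifier lets an ad server recognize the same browser across many different sites, and why an honored opt-out value has to stop profile building entirely. The AdServer class and the OPT_OUT sentinel are hypothetical.

```python
# Illustrative sketch only: our simplified model of cookie-based behavioral
# targeting, not Ringleader's Media Stamp or any real ad network's code.
# A single identifier handed back on every ad call is what turns separate
# visits into one cross-site profile; an honored opt-out must break that link.
import uuid
from collections import defaultdict

OPT_OUT = "OPT_OUT"  # hypothetical sentinel value stored in the cookie


class AdServer:
    def __init__(self):
        self.profiles = defaultdict(list)  # identifier -> pages seen

    def handle_ad_request(self, cookie, page):
        """Simulate the ad call embedded on a publisher's page.

        Returns the cookie value the browser should keep. If the browser
        carries the opt-out value, no identifier is assigned and nothing
        is recorded.
        """
        if cookie == OPT_OUT:
            return OPT_OUT                  # honor the choice: no tracking
        if cookie is None:
            cookie = uuid.uuid4().hex       # first sight of this browser
        self.profiles[cookie].append(page)  # same ID on every site = one profile
        return cookie


if __name__ == "__main__":
    server = AdServer()
    cookie = None
    for page in ["news-site.example/story", "travel-site.example/fares"]:
        cookie = server.handle_ad_request(cookie, page)
    print(dict(server.profiles))            # one profile spanning both sites

    # A browser carrying the opt-out value is never profiled.
    server.handle_ad_request(OPT_OUT, "shop-site.example/cart")
    print(dict(server.profiles))            # unchanged
```

Whatever mechanism is used to maintain state on a phone, the same questions apply: can users see the identifier, and can they turn it off?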

Now one company, Ringleader Digital, claims to have built an ad network enabling targeting using a cookie-like state identifier it calls Media Stamp. Eager to learn about the choices and controls that such an ad network will provide for users, we navigated to their site to find out something about the privacy options. How will users manage this mobile cookie? How can the tracking be turned on or off? Unfortunately, the site doesn’t seem to have a privacy policy or any other details that we could find. Not a good sign!

In some later press stories, we see Ringleader claiming that they do provide an opt-out. But if we can’t easily find it, how will the average user?

(By the way, kudos to MediaPost’s Wendy Davis, one of the savviest reporters covering the online ad industry, for immediately flagging the privacy problem the minute this mobile cookie was announced.)

Consumer Tool Kit

This Page Under Construction

*The Future of Privacy Forum Consumer Central Opt-out List

*Opting out of behavioral targeting can mean going to many different locations to disable the ad targeting or tracking done by a range of companies. For your convenience, we have listed below, in one location, the opt-out pages of many of the portals, ad networks and analytics companies; a short sketch of how these opt-out pages typically work follows the list. If there are others you suggest we add, please contact us. Thanks!

http://networkadvertising.org/managing/opt_out.asp

http://info.yahoo.com/privacy/us/yahoo/opt_out/targeting/details.html

https://choice.live.com/advertisementchoice/Default.aspx

http://www.google.com/privacy_ads.html

http://www.adtechus.com/privacy/

http://www.omniture.com/en/privacy/2o7
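
As promised above, here is a rough sketch of how these opt-out pages generally work: the network’s server sets an “opt-out” cookie in your browser, and its ad servers are supposed to check for that cookie on later requests and stop targeting you. The cookie name, value and domain below are hypothetical, not any specific network’s actual cookie.

```python
# Illustrative sketch only: the general mechanics behind the opt-out pages
# listed above. The cookie name, value and domain are hypothetical, not any
# specific network's actual cookie.
from http import cookies

# What the opt-out page's HTTP response might carry: a long-lived cookie
# whose value signals "do not target this browser".
response = cookies.SimpleCookie()
response["id"] = "OPT_OUT"
response["id"]["domain"] = ".ad-network.example"
response["id"]["path"] = "/"
response["id"]["expires"] = "Fri, 01 Jan 2038 00:00:00 GMT"
print(response.output())  # the Set-Cookie header sent to the browser

# What the network is supposed to check on every later ad request.
request = cookies.SimpleCookie("id=OPT_OUT")
opted_out = "id" in request and request["id"].value == "OPT_OUT"
print("behavioral targeting disabled" if opted_out else "targeting allowed")
```

One practical consequence of this design is that the opt-out lives in your cookies: it applies only to that browser on that computer, and clearing your cookies also clears the opt-out.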

One of the best resources for how to use the privacy settings in most browsers can be found here.

Instructions for controlling Flash cookies are here.

Additional useful resources for online privacy can be found at the TRUSTe and Federal Trade Commission sites.

http://www.truste.org/consumers/consumer_tips.php/

http://www.ftc.gov/bcp/edu/pubs/consumer/alerts/alt082.shtm

About the Forum

Our Advisory Board – 2008:

Members of the FPF Advisory Board provide input to the Forum in support of transparency, user control and the advancement of responsible data practices. By serving as advisors, they are not responsible for the content of the website, nor do they necessarily endorse the positions taken by FPF. Advisors serve in a personal capacity and their affiliation does not indicate the endorsement of their corporation or organization.

Annie I. Antón, Professor of Computer Science, College of Engineering, North Carolina State University

Dorothy Attwood, Senior Vice President, Public Policy and Chief Privacy Officer, AT&T

Elise Berkower, Associate General Counsel, Privacy, The Nielsen Company

Joan (Jodie) Z. Bernstein, Counsel, Kelley Drye & Warren, LLP and former director of the Bureau of Consumer Protection at the Federal Trade Commission

Bruce Boyden, Assistant Professor of Law, Marquette University Law School

Allen Brandt, Corporate Counsel, Data Privacy & Protection, Graduate Management Admission Council (GMAC)

Kathryn C. Brown, Senior Vice President, Public Policy Development and Corporate Responsibility, Verizon

James M. Byrne, Chief Privacy Officer, Lockheed Martin Corporation

Ryan Calo, Resident Fellow, Center for Internet & Society at the Stanford Law School

Dr. Ann Cavoukian, Ontario Privacy Commissioner

Danielle Citron, Professor of Law, University of Maryland Law School

Maureen Cooney, Chief Privacy Officer and Vice President for Public Policy, TRUSTe

Lorrie Faith Cranor, Associate Professor of Computer Science and Engineering, Carnegie Mellon University

Mary Culnan, Slade Professor of Management and Information Technology, Bentley University

Simon Davies, Director, Privacy International

Michelle Dennedy, Chief Governance Officer, Cloud Computing, Sun Microsystems

Carol DiBattiste, Senior Vice President Privacy, Security, Compliance and Government Affairs, LexisNexis

Benjamin Edelman, Assistant Professor, Harvard Business School

Scott Goss, Senior Privacy Counsel, Qualcomm

David Hoffman, Director of Security Policy and Global Privacy Officer, Intel

Marcia Hoffman, Staff Attorney, Electronic Frontier Foundation

Andy Holleman, Chief Privacy Officer, Qwest Communications

Chris Hoofnagle, Director, Berkeley Center for Law & Technology’s information privacy programs, and senior fellow at the Samuelson Law, Technology & Public Policy Clinic

Pamela Jones Harbour, Former Federal Trade Commissioner; Partner, Fulbright & Jaworski LLP

Nuala O’Connor Kelly, Senior Counsel, Information Governance & Privacy, GE

Ian Kerr, Canada Research Chair in Ethics, Law & Technology, University of Ottawa, Faculty of Law

Brian Knapp, Chief Privacy Officer and Vice President, Corporate Affairs, Loopt

Brendon Lynch, Chief Privacy Officer, Microsoft

Terry McQuay, President, Nymity

Rena Mears, Partner, Deloitte & Touche LLP, Global & U.S. Leader Privacy and Data Protection

Scott Meyer, CEO, Better Advertising

Doug Miller, Executive Director, Consumer Advocacy & Privacy, AOL

Paul Ohm, Associate Professor of Law and Telecommunications, University of Colorado Law School

Adam Palmer, Law & Policy Counsel, .ORG, The Public Interest Registry

Harriet Pearson, Chief Privacy Officer & VP Regulatory Policy, IBM

MeMe Rasmussen, Senior Director, Associate General Counsel, Adobe Systems Incorporated

Ari Schwartz, Vice President and Chief Operating Officer, Center for Democracy and Technology (CDT)

Paul Schwartz, Professor of Law, University of California-Berkeley School of Law

Scott Shipman, Senior Counsel, Global Privacy Practices, eBay

Daniel Solove, Professor of Law, George Washington University Law School

Zoe Strickland, Vice President, Chief Privacy Officer, Walmart

Omar Tawakol, CEO, BlueKai

Omer Tene, Associate Professor, College of Management School of Law, Rishon Le Zion, Israel

Anne Toth, Vice President of Policy and Head of Privacy, Yahoo!