The Future of Privacy: Our Mission & Agenda


Society is approaching a turning point that could well determine the future of privacy. Policy-makers and business leaders will soon make decisions about technology practices that will either ensure that data is used for the benefit of individuals and society, or take us down a path where we are controlled by how others use our data.

Why do we say this?

Technology is advancing more quickly than ever with respect to data collection, data mining, and the correlation of data across platforms, channels, devices, and over time. The use of data is becoming increasingly practical and profitable. Technical limitations to true one-to-one marketing have been overcome, and the expense of data storage is no longer a barrier. As a result, it is now possible for consumers to enjoy the full benefits of data sharing for research, convenience, and personalization. This data is also helping to support universally available and valuable content and services.

Along with these benefits come some challenges. A difficult economy is encouraging businesses to delve more deeply into data that can be used to make more efficient and effective marketing decisions. At the same time, pressure to make unethical decisions may be driven by the challenge of meeting revenue goals in a turbulent market.

The financial system meltdown has, rightly or wrongly, cast significant doubt on the concept of self-regulation and is likely to encourage government engagement in this area. Indeed, it seems that it is no longer enough to say that “these things are complicated but experts crafted them, so they must be safe.”

A new tech-savvy administration is entering office, with the likely entry of new appointees who are steeped in the privacy and tech-policy debates. Joining them will be veterans of a campaign that broke new ground in maximizing online data use to connect to its audience. This intersection between privacy and a full appreciation of the value of data may provide an opportunity for policymaking that balances data use with user controls.

Data regulators in Europe have increased their scrutiny of the practices of US internet companies and have been pressing search engines and social networks to respect European data standards for their platforms. At home, the Federal Trade Commission has proposed behavioral targeting guidelines and continues to examine practices in this area.

Social networks have become ubiquitous, with users providing more personal information than ever. These networks now also serve as platforms for third-party applications that rely on the data of their user bases to provide services. Cloud computing efforts seek to store all the data of users and businesses on central servers. New, robust mobile platforms are integrating geolocation data and are beginning to implement behavioral targeting and tracking.

Also supporting an opportunity for change in business practice is the philosophy of Web 2.0. Increasingly, developers have embraced the point of view that users want control of their experience, including control over their data.

These factors all combine to bring us to a uniquely opportune moment. Individual companies have taken major steps forward. AT&T has committed to an affirmative consent model for behavioral targeting and other ISPs have joined in advocating that model. Yahoo! is collaborating with eBay and Wal-Mart to label ads and expand user choices. Microsoft is adding new privacy features to Internet Explorer, and AOL has launched an educational effort around behavioral targeting. However, there is clearly much more that can be done to create a movement to put trust at the center of decisions about data use.

We believe that if dedicated technologists, policymakers, industry groups and advocates focus on advancing privacy in a manner that businesses can achieve, then privacy, profits and personalization are all possible. Join us to help improve the state of online privacy by advancing responsible data practices.

Our Agenda for Consumers & Businesses

FPF will seek to bring transparency to online data practices. Our plan is to document practices, produce multimedia educational materials, and commission reports and studies that give consumers and policymakers the real story about how their data is used.

FPF will seek to bring true transparency and user control to behavioral targeting, and will broaden the discussion of the ethical norms that should govern the use of web-browsing data.

FPF will seek to ensure that considerations around data retention, limitation, and deletion are a significant part of the consumer privacy debate.

FPF will seek to drive practices that enhance consumer controls – ensuring that data use is obvious, intuitive, and applied to benefits the user values and controls – no matter the type of technology used.

FPF will explore opportunities to clarify the definitions of personal data and establish baseline practices about what is accepted as anonymous. But even when data isn’t identifiable, trustworthy practices must be in place whenever data can be used to tailor a user’s experience.

FPF will seek solutions that move beyond the limitations of cookies to improve the privacy of browser state management.

FPF will seek to highlight the privacy risks and the data protection opportunities presented by new data from technologies such as geolocation, mobile, and RFID. There is a limited window to ensure that the deployment of these technologies builds in the kind of controls needed. Already we see examples of leading-edge start-ups rushing forward without the needed tools in place.

FPF will help drive online privacy education for consumers and will particularly consider the impacts on teens, users with disabilities, and seniors. We will work to develop civic norms applicable to both data subjects and data users. We need our teens to think twice about the embarrassing disclosures they may make online and to understand that we live in a world where we must manage our own brand and digital persona. But equally, we must teach businesses that making secondary use of data in a manner disturbing to users is akin to peeking at a diary just because it was left open. We cannot expect the generation that lives virtually to be in a continual state of self-censorship. Users need tools that let them speak freely, informally, and privately without having to worry that their words will be used against them.

FPF will advocate for privacy advances that are business practical, but that substantially raise the bar to ensure personal autonomy for all who seek to embrace the benefits of our digital society. We will seek to work with industry, advocates and policymakers to ensure the future of privacy is one where we are not enslaved by our data, but rather where data serves the benefit of humankind.