Senator Leahy (D-VT) Makes Keynote Address at CFP on Thursday (June 16, 2011)

Today Senator Leahy (D-VT) made a brief keynote address at the Computers, Freedom, and Privacy Conference (CFP) at the Georgetown Law Center. Senator Leahy urged that we must “modernize” the legal framework to keep pace with today’s technology. He discussed the numerous recent data breaches, and cited statistics indicating that hundreds of millions of records have been subject to data breaches.

Additionally, the senator addressed some of his proposed legislation. First, Senator Leahy discussed his proposed updates to the Electronic Communications Privacy Act (ECPA)—“The Electronic Communications Privacy Act Amendments of 2011.”  These amendments would be the first in 25 years, and include requirements for search warrants based on probable cause in order to access information held in electronic communications. The amendments also propose new protections for location-based information and would require search warrants before tracking people in real time using location information provided by service providers. In responding to a question, a member of the Senator’s staff noted, however, that this protection does not extend to historically collected data because the legislation intends to maintain balance (presumably for law enforcement purposes).  Additionally, the Senator addressed the data breach notification bill he introduced on June 7, 2011, known as the “Personal Data Privacy and Security Act of 2011.” He indicated that this will be his fourth attempt to pass this legislation, and that with each new introduction of the bill, the threat to security and privacy has grown.

The comments were clear and succinct, with a tone of urgency: we need to work towards modernizing the legal framework to address the privacy issues that exist in our fast-growing digital age.

(Posted by: Shreya Vora, FPF Fellow)

FPF Advisory Board Members Take the Stage at Computers, Freedom, and Privacy Conference to discuss: “Frontiers in Privacy”

On Wednesday, June 15, at Computers, Freedom, and Privacy (CFP), three Future of Privacy Forum (FPF) Advisory Board members took the stage for a discussion on “Frontiers in Privacy.” Advisory Board member Professor Annie Anton moderated a discussion between Professors Peter Swire and Daniel Solove. The discussion was fast-paced and covered six topics, each getting a lightning-round treatment that sparked lively exchanges between the panelists.

The topics included:

First, the all-or-nothing fallacy in privacy and security. The professors debated whether there is a false tradeoff between the concepts of privacy and security. Professor Solove said he believes these concepts are not all-or-nothing, but rather are “different sides of the same coin.” Professor Swire similarly commented that the debate is not really privacy versus security, but rather security versus security.

Second, encryption and globalization in India and China. Professor Swire discussed his recent trip to India and his growing concern over the country’s maximum 40-bit limit on encryption keys. Professor Solove agreed with Professor Swire’s commentary, and both find the trends abroad alarming and distressing. Specifically, both pointed to the similar debate in the United States’ recent past, and believe that the default internationally should likewise shift to “good encryption.”
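To put the panelists’ alarm in perspective, a back-of-the-envelope sketch shows just how small a 40-bit keyspace is; the key-trial rate below is an illustrative assumption, not a measured benchmark:

```python
# Back-of-the-envelope: size of a 40-bit keyspace versus a modern 128-bit one.
# The trial rate is an illustrative assumption, not a benchmark.
keyspace_40 = 2 ** 40            # ~1.1 trillion possible 40-bit keys
trials_per_second = 10 ** 9      # assume a modest 1 billion key trials/second

seconds = keyspace_40 / trials_per_second
print(f"40-bit keyspace: {keyspace_40:,} keys")
print(f"Exhaustive search at 1e9 trials/s: ~{seconds / 60:.0f} minutes")

# A 128-bit keyspace is 2**88 times larger, which is why "good encryption"
# today generally means 128-bit keys or more.
ratio = 2 ** (128 - 40)
print(f"A 128-bit keyspace is {ratio:.2e} times larger")
```

Even at far higher assumed trial rates, the picture only worsens for 40-bit keys, which is the substance of the professors’ concern.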

Third, the concept of having nothing to hide. Both professors disagree with the sentiment that if you have “nothing to hide,” there should be no concern over privacy. The “nothing to hide” concept was labeled too narrow because it does not account for those who want privacy rights such as access to view and correct information retained about them online, or even the right to prevent aggregation of profiles about them. Along the lines of aggregation and profiling, the professors voiced concerns about having to deal with judgments and inferences, often wrong, that arise from one’s online actions.

Fourth, social networks, freedom of association, and privacy. The panelists discussed the growth of social networks and the benefits and drawbacks that come with them. Professor Swire succinctly stated that we as consumers are torn between the wonderful ability to share and connect and, at the same time, fear. The professors also debated the social value derived from these networks and discussed potential regulatory limitations that could be placed on them.

Fifth, the future of the Fourth Amendment. The discussion focused on the Fourth Amendment’s reasonable-expectation-of-privacy test. Professor Solove emphasized his view that the test is drawn too narrowly today. Professor Swire pointed to a gradual shift that seems to be occurring on the federal bench, specifically citing recent cases that limit the scope of the Fourth Amendment as it relates to e-mail and computer searches.

Finally, the panel discussed data minimization versus data drench. There is a significant focus on data minimization in both the EU and the US (the FTC; the McCain/Kerry bill). However, both panelists emphasized that this concept of minimization runs contrary to the data infusion actually occurring in the real world. Professor Solove aptly compared asking entities to limit their use of data to putting a tiger in a cage with a huge amount of meat—could you really ask the tiger to eat only small chunks?

EPIC Honors Wall Street Journal for “What They Know” Series

Earlier this week, FPF’s Co-Chair Christopher Wolf had the pleasure of honoring The Wall Street Journal for its “What They Know” series, on behalf of EPIC, at a special event in Washington, D.C. His remarks from that celebration are featured below:

I am pleased to serve on EPIC’s Advisory Board and pleased to have been asked to present an award from EPIC to the Wall Street Journal.

EPIC has been explaining for some time that there is a big gap between the type of tracking that companies are engaged in on the Internet and what most people know or think is occurring. As EPIC has explained it, the general public has very little idea that every second they are on the Internet, their behavior is being tracked; that even if  the information collected is anonymous, it is used to create a “profile” which is then sold to companies on “stock-market-like” exchanges to create and deliver targeted advertising back to the individual.

In 2010, EPIC gained a new ally in spreading the word about what is going on online.

Starting last summer, The Wall Street Journal began a  year-long “What They Know” investigation into online tracking and exposed a fast-growing network of hundreds of companies that collect highly personal details about Internet users—their online activities, political views, health worries, shopping habits, financial situations and even, in some cases, their real names—to feed a $26 billion U.S. online-advertising industry.  As  the Journal described it, the nation’s top fifty websites installed an average of 64 pieces of tracking technology onto the computers of visitors, usually without warning, for a total of 3,180 tracking files. A dozen sites installed more than a hundred. Two-thirds of those files were installed by 131 companies that are in the tracking and online consumer profiling business.

Having been in the online privacy world since it began, I have never witnessed an impact like the one the Wall Street Journal’s “What They Know” series has had.

First, speaking as a privacy litigator and FTC regulatory attorney, I can report that the series has generated a lot of business for me and my colleagues at Hogan Lovells. Always happy to have the business.

But more importantly, the series has provoked debates and discussions about privacy that we have never seen before.  It quite literally has made privacy front page news. In many ways, the series of articles about online privacy that the Wall Street Journal began publishing last year has set the tone for the privacy debate nationally.

At a recent conference, one technology executive complained to Julia Angwin, one of the Journal’s principal reporters on the series, that “When you use words like ‘surveillance’ and ‘spying,’ it freaks people out.” And another participant at that conference said the series had directly influenced the comments made by Congressional representatives: “The question addressed to me [by Congress] was, ‘Look at these apps the Wall Street Journal found—so you, app developer, tell us why we shouldn’t be afraid of these.’”

Julia Angwin responded to that comment by saying, “What we’re doing is reporting the facts. The fact is, we tested a bunch of apps, and this is the data they were sending. And this is pretty revolutionary in the news business. Most often, data written about in the newspaper is provided to them, as in, ‘a Brookings Institution report says this.’ We decided to test things ourselves. It was expensive, it was difficult. And it turns out, we now have the best data available about what apps are doing. It’s hard to replicate that study. You have to hack the phones, and measure the traffic.”

She continued: “There are some loaded words in those stories, I agree. But I also think that this is actually what is happening—you are being tracked.”

[I should mention that Ashkan Soltani, an independent researcher and consultant on privacy, who is here with us tonight, assisted the Wall Street Journal with its research.]

The series triggered significant steps toward greater transparency and consumer control over the use of collected data, including a major bipartisan bill in the Senate, sponsored by John McCain and John Kerry, calling for a “privacy Bill of Rights” for Americans. The bill was a direct response to the Journal’s work, as the senators made clear in introducing it, with Senator McCain reading from a “What They Know” article at the press conference.

The series also prompted a bipartisan privacy bill in the House, introduced by Cliff Stearns (R-Fla.) and Utah Democrat Jim Matheson, that would encourage companies to offer more information to consumers about how they are being tracked. The bill also called for the data-collection industry to develop a policing program that would be approved by the Federal Trade Commission.

Representative Jackie Speier, who has introduced a Do Not Track bill in the House, is quoted as saying: “I must tell you that until I read it in the Wall Street Journal, and their 13-part series, I didn’t know that Dictionary.com was just a means by which tracking takes place. And they’re using something like the dictionary to identify you and then to track you. I was pretty outraged when I read that.”

The series also has echoed through the advertising and tech industries, with industry groups toughening privacy codes and dozens of businesses changing basic practices. Microsoft, Apple and Mozilla have all moved to install robust new privacy features in their browsers in direct response to the Journal’s report of Microsoft’s quashing of a privacy feature at the behest of advertisers.  The Future of Privacy Forum, the think tank that I founded and co-chair with Jules Polonetsky, who is here tonight, has helped to convene discussions among companies, privacy advocates, privacy scholars and regulators about how to improve privacy online for consumers, and without question the Journal series was a catalyst.  The Future of Privacy Forum hopes our efforts will further help the App world reach a consensus that the value and convenience of Apps will not be fully realized until and unless privacy is built in.

Over the weekend, I was reading a piece by Dean Starkman in the Columbia Journalism Review about the Journal’s  series and want to share his observations with you.  He said:

Reading The Wall Street Journal’s “What They Know” series on Internet (un)privacy last year, I thought, this has “Pulitzer” written all over it.

I don’t mean that in a cynical way. Unlike some people, I do care who wins the Pulitzers and other prizes because they often reward big, risky, in-depth, investigative reporting, including some of my all-time favorites. 

[And] aside from prizes, there really aren’t any other metrics for journalism quality. 

[So] I thought the series, by Julia Angwin and others, had all the hallmarks of a Pulitzer winner: it was ambitious, risky (some of the companies named had objected vociferously, I am told), well-written, and full of surprises.

Plus, it touched off government investigations and reform. Check, check, and check.

It didn’t win, and wasn’t even a finalist. I’m surprised. Staffers at my old paper were crushed when they learned, one of them tells me. I understand. (I noticed corrections appended to a couple of the stories and don’t know if that was a factor, but none seems major.) 

Well, while perhaps not yet as prestigious as a Pulitzer Prize, the EPIC prize tonight is given to the Wall Street Journal for its significant contribution to privacy education, and for shining the light on online privacy issues in a way that has led to an important national discussion.

I am told it is the custom at the Journal not to accept such awards in person, so we will send them their prize and I will report on the recognition I expect you now will give with your applause.

Thank you.

White House Smart Grid Report Includes Key Privacy Guidance

Earlier today Future of Privacy Forum co-chair Jules Polonetsky attended the Obama administration’s announcement at the White House for a number of new initiatives designed to accelerate the modernization of the Nation’s electric infrastructure, bolster electric-grid innovation, and advance a clean energy economy.

The National Science and Technology Council (NSTC) report focuses on four pillars for state and federal policy-makers: (1) enabling cost-effective smart grid investments; (2) unlocking the potential of innovation in the electricity sector; (3) empowering consumers and enabling informed decision-making; and (4) securing the grid.

Some of the issues potentially relevant to consumers’ energy usage data and privacy are as follows. Under pillar (2), the Government will work toward fostering open, uniform technology-neutral interoperability standards and will protect consumer options and prevent anticompetitive practices.  Under pillar (3), state and federal policymakers are called to “evaluate the best means of ensuring that consumers receive meaningful information and education about smart grid technologies and options” (noting that some state regulators already mandate education/outreach programs for smart grid deployments that affect consumers); ensure consumers have access to and control over their energy consumption data in machine readable formats; help foster consumer-facing devices and applications that make it easier for users to manage energy consumption; ensure utilization of Fair Information Practice Principles (FIPP) to help protect consumer information (noting that the Administration supports FIPPs for industries not subject to sector-specific regulation); and update consumer protection policies (in addition to privacy) as necessary to account for new issues. Under pillar (4), the Federal Government will continue to work towards standards and guidelines for cybersecurity through public-private cooperation, including through the promotion of a “rigorous, performance-based cybersecurity culture.”

Regarding privacy specifically, the report notes that currently “there is not in place a comprehensive and broadly-accepted application of FIPPs in the smart grid context.”  The report discusses the Administration’s support for a broad “consumer bill of rights” which could cover energy usage information.  The Administration also supports a broad array of stakeholders taking “responsibility for implementing FIPPs through privacy rules that are specific to the smart grid context.”  The report lauds FIPPs as “comprehensive, yet flexible” and envisions them facilitating the development of enforceable codes through the collaboration of industries, consumer advocates, and regulators.  The report notes that any rules or guidelines will vary depending on whether energy usage information is shared with third parties.

The report also states that: “State regulators may consider requiring utilities and other firms to provide customers clear information regarding how their data may be used, if consumers authorize such use, and guaranteeing that customers have the ability to select the purposes for which their data may be used.” The report does not advocate either a default opt-in or opt-out approach, but acknowledges that defaults “can be influential.”  It favors FIPPs because they don’t categorically require one approach or the other.

Regarding cybersecurity, the Report refers to the Administration’s proposed cybersecurity legislation and states that “the Federal Government will seek to ensure that grid operators have access to actionable threat information; support research, development, and demonstration of cybersecurity systems and develop human capital; and work with private-sector stakeholders to establish accountability for meeting standards and performance expectations.”

The administration also announced the creation of Grid 21, which is a private sector initiative to promote consumer-friendly innovations while ensuring proper privacy safeguards and consumer protections.

More information and the full text press release can be found at the website for the Office of Science and Technology Policy.

Additionally, for more smart grid resources, see the Future of Privacy Forum’s smart grid page.

FPF Supports and Participates in Recent Privacy Law Scholars Conference; Announces 2011 “Privacy Papers for Policy Makers” Submission Period

Last Thursday and Friday a large group of academic privacy experts—as well as leading government, industry and advocacy participants—gathered at the Privacy Law Scholars Conference (PLSC) in Berkeley, California to discuss and hold workshops on several new papers addressing key privacy issues. The conference was hosted by the Berkeley Center for Law and Technology and the George Washington University Law School.  The event was co-chaired by FPF Advisory Board Members, Chris Hoofnagle of Berkeley Law School and Dan Solove of GW Law School. Additionally Danny Weitzner, the Deputy Chief Technology Officer in the White House Office of Science and Technology Policy gave the keynote address. 

In his keynote speech, Mr. Weitzner discussed how to implement a privacy bill of rights, as well as his office’s goal of working toward global interoperability of privacy regimes. He said that interoperability will result in globally coordinated and cooperative enforcement.  In response to a question from FPF founder and co-chair Christopher Wolf, Mr. Weitzner said that if, as is likely, privacy legislation does not pass this year — and he was more optimistic about data security laws passing — “Plan B” is to push industry to follow the best Fair Information Practices detailed in the Commerce Green Paper.

FPF’s co-chairs, Wolf and Jules Polonetsky, also participated in the PLSC discussions of the various privacy papers.  In addition, Polonetsky presented a paper he co-authored with Omer Tene, entitled “To Track or ‘Do Not Track’: Advancing Transparency and Individual Control in Online Behavioral Advertising,” which discussed the range of policy considerations around user control of information collected about them for targeted advertising.  Chris Wolf moderated the discussion of a paper by Colette Vogele & Erica Johnstone entitled “Without My Consent,” describing a new website focused on the removal of, and remedies for, revealing photographs posted without a person’s consent.

At the PLSC dinner on Thursday night, continuing in the spirit of privacy scholarship, FPF announced its ongoing open call for papers for our annual “Privacy Papers for Policy Makers.” 

FPF invites privacy scholars and authors  to submit papers to be considered for FPF’s second edition of “Privacy Papers for Policy Makers.”  Our goal is to highlight important research and analytical work on a variety of privacy topics for policy makers.  Specifically, we want to showcase papers that analyze current and emerging privacy issues and either propose achievable short-term solutions, or propose new means of analysis that could lead to solutions.

If you have not yet submitted your paper, or know of any one who has yet to submit, please click here for more details about the review process and submission requirements. The paper submission deadline is July 15, 2011.

Privacy Papers for Policy Makers 2011


The Future of Privacy Forum (FPF) invites privacy scholars and authors with an interest in privacy issues to submit papers to be considered for FPF’s second edition of “Privacy Papers for Policy Makers.”

Special thanks to our Privacy Papers for Policy Makers sponsors:

AT&T | Microsoft

PURPOSE

•      To highlight important research and analytical work on a variety of privacy topics for policy makers

•      Specifically, to showcase papers that analyze current and emerging privacy issues and either propose achievable short-term solutions, or propose new means of analysis that could lead to solutions.

REVIEW PROCESS

•      Academics, privacy advocates and Chief Privacy Officers on FPF’s Advisory Board will review the submitted papers to determine which papers are best suited and most useful for policy makers in Congress, at federal agencies and for distribution to data protection authorities internationally.

•      Two papers selected by the chairs of the Privacy Law Scholars Conference will be included in the publication and will receive a cash award from the International Association of Privacy Professionals.

•      The Future of Privacy Forum will announce the selected papers at an event with privacy leaders in September and will provide a printed digest to policy makers in the United States and abroad.

SUBMISSION

Paper Submission Deadline: July 29, 2011

Please include: author’s full name, phone number, current postal address and e-mail address.

Send via e-mail to [email protected] with the subject line “Privacy Papers for Policy Makers 2011,” or send by mail to:

Future of Privacy Forum

919 18th Street, NW, Suite 925

Washington, D.C. 20006

The entry can provide a link to a published paper or a draft paper that has a publication date. FPF will work with the authors of the selected papers to develop a digest.

Visit fpf.org/the-privacy-papers to view the 2010 edition of Privacy Papers for Policy Makers.

This compilation is not intended to be a publication of original work.  Rather we seek to make policymakers aware of papers presented at workshops or published in journals and we provide this compilation of descriptions of these papers in order to call attention to those deemed most significant.

Future of Privacy Forum Launches App Privacy Site


Privacy resource portal provides application developer tools and guidance for responsible privacy practices

WASHINGTON – With hundreds of thousands of online and mobile applications already in use and more being developed, the Future of Privacy Forum (FPF) today launched a new website to help application developers provide users with privacy protections.  Supported by app developers, platforms and tech companies, ApplicationPrivacy.org is the only hub of its kind containing emerging standards, best practices, privacy guidelines, platform and application store requirements, as well as relevant laws and regulatory guidance.

A recent survey by FPF found that 22 of the 30 most popular mobile apps lacked even a basic privacy policy where consumers could learn what data is collected or exchanged when they download the app.  A recent study estimated that by 2016 the worldwide mobile app industry could reach 44 billion downloads, and according to Facebook, people install 20 million applications every day on its site.

Christopher Wolf, FPF’s founder and co-chair, noted the importance of educating app developers on key data protection principles. “Apps often provide valuable services using people’s contacts, location and profile information.  But unless users trust that their privacy will be protected, the use of Apps will decline and that would be unfortunate, as Apps provide innovative ways to interact over the Internet and contribute to the Internet economy.”

FPF’s director and co-chair Jules Polonetsky emphasized the need to educate more developers about the importance of responsible data practices. “App developers with limited staff or resources can end up being responsible for the data of millions of users.  Platforms and operating systems have roles to play, but app developers themselves need to be responsible for their own practices. We hope that Applicationprivacy.org  will provide a one-stop shop for the one person start-up or the large scale company.”

Facebook, AT&T and Sprint will also be promoting the site to developers to help them navigate the development process.  FPF’s leaders are urging other companies to do the same to help provide developers with this information.

App developers also recognize the value of the site. Sze Wong, CEO of Zerion Software, Inc., the company that creates the app known as iFormBuilder, said, “As a general-purpose data collection platform, iFormBuilder stores a lot of private information from our clients around the world.  When we first drafted our data privacy policy, and subsequently the safe harbor provision for Europe, we felt we were on our own. Now developers have a place to get general information and get help. I wish the project was there when we started!”

Peter Erickson, founder of MoDev, a national mobile developers network, said, “I know our developers spend a lot of time focusing on the privacy issue. Resources like the Future of Privacy Forum’s ApplicationPrivacy.org site will be critical for navigating the tricky privacy terrain.”

The site will also have an active presence on Facebook and on Twitter, using the handle @AppPrivacy. The website was built with the support of application developers, platforms and tech companies, including:  AT&T, CardStar, the Center for Democracy and Technology, Facebook, Google, Infield Health, Intel, MoDev, Savvy Apps, TRUSTe, Zerion Software, Zynga, and 3ADVANCE.  Shaun Dakin, a fellow at the Future of Privacy Forum played a lead role in the development of the site. 

The Future of Privacy Forum (FPF) is a Washington, DC based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board comprised of leading figures from industry, academia, law and advocacy groups.

                                                                               ###

FOR IMMEDIATE RELEASE:  May 26, 2011             

Media Contact:

Ted Kresse

202.777.3719

[email protected]

Statement from CDT and FPF on the Development of App Privacy Guidelines

Statement from the Center for Democracy & Technology (CDT)

and Future of Privacy Forum (FPF) on the

Development of App Privacy Guidelines

WASHINGTON, DC – Today, the Center for Democracy & Technology (CDT) and the Future of Privacy Forum (FPF) released the following statement in response to this morning’s Senate hearing on “Consumer Privacy and Protection in the Mobile Marketplace.”  CDT and FPF are working together to improve mobile and app privacy and take the opportunity of the Senate hearing to make this statement on app privacy:

Today’s hearing demonstrated that the collection of personal information through Apps operating on mobile devices raises serious privacy issues. “Apps,” a shorthand for “applications” commonly used to refer to programs on mobile devices, are booming in popularity.  Apps are also beginning to appear on Internet-linked televisions, on desktop computer operating systems and on the Web.

Apps often collect, use, share, and retain a variety of information, including location data. Sometimes this data is important to the app’s functionality. Sometimes, however, the data is not actually needed for app functionality and may be collected inadvertently. In other cases, the data is collected for targeted advertising, helping developers provide free and low-cost programs.  However, any data collection practices can pose privacy issues, especially when the user is not aware of or has not consented to the collection. For users of mobile devices, a recent survey shows that privacy is their number one concern.

Accordingly, CDT and FPF are currently engaged with major stakeholders in the mobile ecosystem—app developers, device manufacturers, and mobile platforms—to develop best practices and privacy principles for mobile devices. Once complete, we hope these principles will provide guidance to developers, platforms, and policymakers. For developers who are not familiar with the complex concerns surrounding user privacy, the CDT and FPF process will address the following fundamental issues:

1.  Privacy Policy.  Every app should have a written Privacy Policy explaining to users, in plain language, what data is collected, how it is used, how it will be displayed, shared, or transferred, and how long it will be retained.  If data is collected, even incidentally, for the financial benefit of the app developer, e.g. for advertising, this should be disclosed.   The Privacy Policy should be readily accessible.  At a minimum, a link to the Privacy Policy should be provided prominently on the app itself and the contents of the Privacy Policy should be easy for the user to read and understand. Consideration should be given to layered privacy notices that summarize and link to the more detailed contents of a Privacy Policy.  Other means of summarizing privacy practices, such as symbols or icons, should also be considered.

2.  Meaningful User Choice.  Users should be provided meaningful choices about the collection, disclosure, and use of personal or device information.  These choices should be explained in the Privacy Policy, but also presented “just-in-time” to users, when data is about to be collected.

3.  Data Minimization and Limited Retention.  Developers should only collect as much data as is necessary to perform the functions of the app and only retain this data for as long as it is needed, unless the user clearly has consented to greater collection and retention.

4.  Appropriate Data Security.  Developers should employ all reasonable physical, technical and administrative methods to protect the integrity and security of collected data.

5.  Education.  Developers should educate users about the types of data an app collects, and ways they can protect their privacy using the app.  Developers should educate themselves about the laws they are subject to and take note of possible obligations under COPPA, as well as self regulatory initiatives such as those proposed by CTIA, MMA and the GSMA.

6.  Privacy by Design.  Developers should think about privacy from the beginning of the app development process.  Developers should consider what personal or device data is needed for app functionality and design the app to collect only what is needed, share it only with those needed to perform the functions of the app, and retain it only for as long as is necessary, and only after proper notice and choice for the user has been provided.  This also means ensuring that needed physical, technical and administrative protections are in place for the data collected, and that accountability principles are employed to ensure that data is handled properly, including regular auditing and training of employees and contractors.

CDT and FPF are seeking input from platforms, carriers, device manufacturers, app developers, and others on these issues and plan on expanding the foregoing concepts in order to provide the detail and specificity necessary for them to be effectively implemented. Given the incredible growth in the number of apps and the immediate need for a basic set of rules for developers, we urge all stakeholders to participate.

Center for Democracy & Technology (CDT)  is a non-profit public interest organization working to keep the Internet open, innovative, and free. With expertise in law, technology, and policy, CDT seeks practical solutions to enhance free expression and privacy in communications technologies. CDT is dedicated to building consensus among all parties interested in the future of the Internet and other new communications media. 

The Future of Privacy Forum (FPF) is a Washington, DC based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board comprised of leading figures from industry, academia, law and advocacy groups.

Media Contacts:

Brock Meeks (CDT)

202-407-8814

[email protected]

Ted Kresse (FPF)

202-777-3719

[email protected]

FPF Finds Nearly Three-Quarters of Most Downloaded Mobile Apps Lack A Privacy Policy

Earlier this week in the US Senate, the Privacy, Technology and Law Subcommittee of the Judiciary Committee held a hearing on mobile privacy issues.  One focus of the hearing was the privacy of personal information collected and used by apps on mobile devices, and one line of questioning concerned the absence of privacy policies for the apps consumers use.  Without a privacy policy to review, consumers may not be able to understand and control how an app uses their personal data.  And although privacy policies should not be the only way companies communicate with users about data use, posting a privacy policy is the essential first step for companies to take to be accountable for their practices of collecting and using online data.

With that in mind, the Future of Privacy Forum this week analyzed the top 30 paid mobile apps across the leading operating systems (iOS, Android, and BlackBerry) and discovered that 22 of the top 30 applications (nearly three-quarters) lacked even a basic privacy policy.  A previous analysis of mobile apps by the Wall Street Journal this past December found that forty-five of the top 101 iPhone and Android apps it assessed did not provide privacy policies on their websites or inside the apps at the time of testing.


FPF’s methodology included analyzing the top paid iPhone apps in the Apple App Store on May 10, 2011, along with industry-standard reporting from Distimo.  In the assessment, FPF looked for the website of each application developer and investigated whether the developer had a privacy policy that could be associated with its app. If a privacy policy was found on a website, the application developer was credited with having a mobile application privacy policy.  FPF also downloaded a sample of the paid apps to a mobile device and determined whether, at any time during the download and installation process, a privacy policy was presented to the user of the device.  Out of the sample tested, FPF found that only one (Angry Birds iOS) had a privacy policy link from within the user interface.


FPF believes that a fundamental element of protecting the privacy of consumers using apps is the availability of a readily accessible, written privacy policy.   At a minimum, app developers should have privacy policies (with which they comply) for all apps offered to consumers.  Once a consumer reviews a privacy policy, he or she can choose whether to install or continue using the app, a fundamental part of privacy control.  FPF is working with the Center for Democracy & Technology (CDT) to suggest additional ways that app developers can improve their privacy practices to protect consumers’ personal privacy.


To see the list of 30 apps analyzed by FPF and whether or not they have a basic privacy policy in place, click here.

*Research and creation of app privacy policy matrix by Shaun Dakin and Shreya Vora, Future of Privacy Forum Fellows

FPF Summary of CPUC Smart Grid Rules

On May 6, 2011, the California Public Utilities Commission (CPUC) issued a proposed decision addressing privacy and security concerns around the Smart Grid.  The proposed decision is notable because it represents the most significant step yet in the U.S. toward a comprehensive set of smart grid privacy rules.

With that in mind, we have prepared a brief summary of the CPUC proposed decision to help navigate the terrain.  

Among the highlights: 

There are several principles targeted toward data management. Covered entities will be limited in their ability to collect data: they may collect only information that is “reasonably necessary” or “authorized by the Commission” to accomplish primary or secondary purposes.  Covered entities must have prior customer consent to collect, store, and use information, except that electrical corporations may collect and store customer data without consent if it is for a primary purpose.  Subject to certain conditions, covered entities may share information with service providers without consent.  Covered entities must also ensure the quality, integrity, and security of the data. Finally, the CPUC imposes data security and privacy audit and reporting requirements, which include providing copies of the privacy notices for customers, internal privacy and data security policies, third-party disclosure information, and secondary-use authorization forms.  The CPUC rejected suggestions that third parties should be required to register for certification to offer services that require access to customer energy consumption data.

For a more comprehensive look into the proposed decision, see the FPF summary here.

The CPUC is accepting comments regarding its proposed rules until May 26, 2011, with reply comments due five days after that deadline.  FPF will be filing its comments in the upcoming weeks.

Many thanks to our colleague Tim Tobin for his excellent and comprehensive review of the decision.