A Critical Time for the EU Data Protection Regulation

Editorial By Christopher Wolf


Policymakers around the world are re-examining the legal framework that regulates the collection, use, sharing, and storage of personal information – proposing more robust protections for such information and increased legal obligations for businesses. The new approaches respond to the dramatically different ways in which technology interacts with personal data and to the potential for that data to be exposed and misused.

Within the past year, new privacy frameworks were proposed by both the European Commission and the Obama Administration, each seeking more protection for individuals. Despite common foundations – the Fair Information Practice Principles – the privacy regimes from opposite sides of the Atlantic exhibit fundamental differences in approach and substance.

The US proposal eschews the EU’s fundamental-rights approach, focusing instead on a privacy “Bill of Rights” and a related set of enforceable, multi-stakeholder codes of conduct. At the same time, solutions are being sought to give consumers a “Do Not Track” option; the rules for children’s privacy are being tightened; mobile and app privacy are in focus; and data brokers are under scrutiny. Major issues associated with new technologies are being addressed in the US, although without the across-the-board approach to privacy protection that characterizes the EU framework.

At the start of the second Obama Administration, the timetable for changes in American privacy law is indefinite. Progress is being made, but the completion date for any one of the initiatives cannot be predicted. (The one exception relates to health privacy, as to which new regulations from the Department of Health and Human Services are expected forthwith.)

In contrast, in the EU it is widely expected that an opinion on the proposed EU Data Protection Regulation will soon be issued by the European Parliament Committee on Civil Liberties, Justice and Home Affairs – the so-called LIBE Committee. And while adjustments to various provisions are likely to be proposed (such as the time period for reporting data security breaches, set at a presumptive 24 hours in the current draft), endorsement of the Regulation to the Parliament and Council is expected. At that point, rapid consideration of the proposed Regulation in the Parliament and Council is likely.

It is the proverbial “home stretch” of the formal consideration of the Regulation introduced in January 2012 by Vice-President Viviane Reding. For that reason, it is time for sharp focus on the EU Regulation, because what happens in the EU has an impact on multinational organizations operating across borders and on the evolution of privacy frameworks around the world.

Over the past year, there has been little disagreement over the goals of the Regulation: to provide region-wide uniformity in privacy law, to reduce bureaucracy, to institutionalize “Privacy by Design,” and to establish a new framework that reflects the evolution of technology and social media and their impact on the protection of personal data.

More controversial are the proposed penalties of up to 2% of an entity’s global turnover for violations of the Regulation; the extension of the Regulation’s jurisdiction and applicability beyond the EU’s borders to companies “offering of goods or services to […] data subjects in the Union or [engaged in] the monitoring of their behavior;” the establishment of data portability rights that could create a ban on “tying” information to services that would otherwise be permissible under competition law; and the “right to be forgotten.”

There are lingering questions over the operation of the “one-stop shop,” under which a single Data Protection Authority (DPA) will have primary jurisdiction over a company based on the location of its “main establishment”; and concerns have been expressed over the impact of the Regulation on small and medium-sized enterprises (SMEs).

In November, the UK Government published its Impact Assessment of the draft European data protection regulation. When the draft regulation was first published, the European Commission estimated that harmonizing the European data protection regime would bring the EU a net administrative benefit of €2.3 billion. However, the UK Ministry of Justice has carried out its own analysis of the proposals and concluded that for the UK alone there would be an annual net cost of between £100 million and £360 million.

The UK Government takes the position that the Commission failed to take into account all of the costs that would arise from the draft regulation, and it identifies a number of aspects of the regulation that will impose additional costs on businesses.

It also points out that supervisory authorities will require substantially more resources to carry out their widened responsibilities, and that the powers the Commission has proposed to give itself to make delegated acts could also affect the costs and benefits of the new proposal. The UK Government stated that it will use the evidence set out in its Impact Assessment to “continue to push for a lasting data protection framework that is proportionate, and that minimizes the burdens on businesses and other organizations, while giving individuals real protection in how their personal data is processed.”

At the same time, also in November 2012, the European Network and Information Security Agency (ENISA) released a report on the technical aspects of the “right to be forgotten.” ENISA pointed out that any technical solution for the “right to be forgotten” would require an unambiguous definition of the personal data covered by the right, a clear notion of who can enforce it, and a mechanism for balancing it against other rights such as freedom of expression. According to the report, the text of the current European proposal leaves each of these subjects open to debate, making it difficult to implement technical mechanisms to deal with the “right to be forgotten.”

ENISA also noted that the “right to be forgotten” is virtually impossible to enforce in an open network such as the Internet. Nothing prevents users from freely copying and redistributing digital content, including photos. Subsequently trying to find and erase the distributed copies would be impossible. ENISA states that the only way to prevent such redistribution would be to use digital rights management (DRM) technology similar to that used by certain publishers of digital content such as motion pictures and music. However, most DRM technologies can easily be circumvented. ENISA points out that partial enforcement of the “right to be forgotten” could be achieved by requiring search engines subject to European jurisdiction to filter search results so that the information that is supposed to be forgotten does not show up:

“A natural way to “mostly forget” data is thus to prevent its appearance in the results of search engines, and to filter it from sharing services like Twitter. EU member states could require search engine operators and sharing services to filter references to forgotten data. As a result, forgotten data would be very difficult to find, even though copies may survive, for instance, outside the EU jurisdiction”.
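To make ENISA’s point concrete, here is a minimal sketch, in Python, of that kind of result filtering. The blocklist, the result structure, and the function name are illustrative assumptions made for this example, not any search engine’s actual mechanism.

```python
# A minimal sketch of the search-result filtering ENISA describes.
# The blocklist, result structure, and function name are illustrative
# assumptions, not any search engine's actual implementation.

# URLs subject to an erasure ("right to be forgotten") request.
FORGOTTEN_URLS = {
    "http://example.com/old-profile",
    "http://example.org/2009/photo-album",
}

def filter_results(results):
    """Drop search results that point at data marked as forgotten.

    `results` is assumed to be a list of dicts with a "url" key.
    Copies hosted beyond the filter's reach (for instance outside
    EU jurisdiction) remain untouched, which is exactly the partial
    enforcement ENISA warns about.
    """
    return [r for r in results if r["url"] not in FORGOTTEN_URLS]

if __name__ == "__main__":
    sample = [
        {"url": "http://example.com/old-profile", "title": "Old profile"},
        {"url": "http://example.net/news", "title": "Unrelated news"},
    ]
    # Only the unrelated result survives the filter.
    print(filter_results(sample))
```

Even in this toy form, the limitation ENISA identifies is visible: the filter affects only results that pass through it, while copies hosted elsewhere remain reachable by other means.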

The French data protection authority, the CNIL, recently made three critical points about the Regulation in its Annual Report. First, the CNIL expressed concern that making a single data protection authority responsible for the Europe-wide activities of an enterprise could result in a significant decrease in the level of protection of individuals. Citing the example of a social network whose main establishment is located in another European member state, the CNIL said it was inappropriate to reduce the role of the French data protection authority to that of a simple mailbox forwarding complaints to the principal DPA responsible for the social network’s activities. According to the CNIL, a French user who is harmed by the activities of an enterprise doing business in France should be able to look to the French regulator for redress.

The second point on which the CNIL diverges from the Commission is the issue of international data transfers. The CNIL believes that transfers to countries that have not been recognized as providing adequate protection should be based on contractual clauses or binding corporate rules (BCRs) that have been approved in advance by the CNIL. Under the proposed regulation, an international transfer based on standard contractual clauses will not require the prior approval of the DPA.

Finally, the CNIL made the point that the new accountability measures included in the draft regulation should not be viewed as a form of self-regulation, or as a trade-off for less regulatory supervision. Instead, the accountability measures should be viewed as a supplement to existing regulatory principles and enforcement practices.

The issues that have been raised about the proposed Regulation are real and substantial. How the reviewers in the European Parliament analyze and report on the proposal will be critical. As important as momentum may be to obtain approval of the Regulation in a timely fashion, equally important is ensuring the passage of a workable and balanced set of new rules.

Key Elements of a Code of Conduct for Mobile Apps Transparency

Guest blog post from Mary J. Culnan, Senior Fellow at the Future of Privacy Forum

In February 2012, the White House issued a report setting out a Consumer Privacy Bill of Rights. In the report, the White House proposed legislation based on the report’s privacy principles and called on the NTIA to convene stakeholders to develop enforceable codes of conduct implementing these principles for specific industries. On July 15, 2012, the NTIA convened the first meeting of a multistakeholder process with the goal of developing a code of conduct to provide transparency about how companies providing applications and interactive services for mobile devices handle personal data. FPF is an active participant in this process. To date, the NTIA process has focused primarily on developing requirements for disclosure, since effective disclosure is central to transparency and to building consumer trust. While workable disclosure standards are essential, they are not sufficient. The FTC and others have identified additional characteristics of credible self-regulation, including promoting competition and providing for effective accountability and enforcement. These issues also need to be addressed during the NTIA process. Because a large proportion of mobile apps are created by entrepreneurs or small businesses, these two issues will likely need to be considered jointly to ensure that the requirements of the final code of conduct are not anti-competitive.

Promoting Competition

On November 29, 2012, the FTC held an interesting workshop on Enforceable Codes of Conduct: Protecting Consumers Across Borders. In a keynote address, former FTC Chairman William Kovacic identified how codes can create barriers to entry by favoring incumbents and the ways they do business. For example, a code could include requirements that assume a privacy infrastructure larger firms already have in place, while requiring small firms or entrepreneurs to build that infrastructure from scratch in order to comply. Kovacic further argued that stakeholders can mitigate these problems during design and implementation by explicitly considering competition as the code is developed, and by evaluating it after the fact to ensure there are no unanticipated consequences.

Effective Accountability and Enforcement

There are common principles for effective self-regulation, independent of the context, and many of these were discussed at the November FTC workshop described above. In 2011, Dennis Hirsch and Ira Rubenstein published an article in which they outlined the design considerations for enforceable codes of conduct that implement a broader set of privacy principles, citing an earlier version of the White House report as an example of this approach. Organizations following an approved set of rules would enjoy safe harbor under an enforcement regime. Hirsch and Rubenstein argued that accountability is critical to the credibility and success of any self-regulatory regime, and proposed a mix of monitoring techniques, including self-certification and third-party audits and certification, to keep costs reasonable. They also cited the need for procedures to handle complaints and resolve disputes. Individuals should exhaust their options under these dispute resolution procedures before a complaint is referred to the FTC or a state attorney general for an enforcement action.

Implications for Mobile Apps Transparency

Many of the traditional methods for transparency, accountability, and enforcement have the potential to be anti-competitive. For example, requiring organizations to post an online privacy notice on their website works well for e-commerce, because online companies of all sizes use a website as a platform for doing business. App developers that deliver their apps through an app market have no such need for a public website, and very small developers may not have the resources to create and maintain a website their business does not require. Similarly, small app developers may not have the resources to develop programs to handle complaints or to hire a third party to process complaints on their behalf.

The recent agreement the California Attorney General negotiated with many of the largest mobile app markets provides a potentially attractive solution for promoting accountability and enforcement while simultaneously promoting competition. The agreement will also help educate app developers about privacy and their responsibilities. Under the agreement, the app markets will include in the app submission process data fields where the developer can link to the app’s privacy policy or include the text of a privacy statement for that app. The app markets will either enable the link or display the privacy statement. Further, the app markets will implement procedures for users to report complaints, and for investigating and addressing the complaints they receive. App developers should self-certify that they will comply with their privacy notices and participate in the complaint resolution process.
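As a rough illustration of the mechanism, the sketch below shows the kind of submission record the agreement contemplates. This is a hypothetical rendering; the field names and helper function are assumptions invented for the example, not any app market’s actual submission schema.

```python
# Hypothetical app-market submission record carrying the privacy
# disclosure fields described in the California AG agreement.
# Field names are illustrative, not any actual market's schema.
app_submission = {
    "app_name": "Example Flashlight",
    "developer": "Example Dev LLC",
    # The developer supplies either a link to a hosted policy...
    "privacy_policy_url": "https://example.com/privacy",
    # ...or inline privacy statement text if no URL is available.
    "privacy_policy_text": None,
    # Self-certification that the developer will comply with its notice
    # and participate in the market's complaint resolution process.
    "privacy_self_certified": True,
}

def display_privacy_disclosure(submission):
    """Show the policy link if one was provided, else the inline text."""
    return submission["privacy_policy_url"] or submission["privacy_policy_text"]
```

The design point is that the market, not the developer, carries the display burden: even a developer with no public website can satisfy the transparency requirement through the submission form.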

Finally, other stakeholders in the mobile apps ecosystem can provide additional support for app developers by creating privacy policy generators or other tools that help even the smallest developer make its information practices transparent. As app developers are likely to face a learning curve similar to the one .com firms faced in the 1990s, trade associations and other organizations in the ecosystem can help developers understand that privacy is good for business, because transparency is one key to earning and keeping the trust of consumers.


Gigya Launches SocialPrivacy™ Certification in Collaboration with FPF

New Survey Data Indicates Consumer Confusion Over Businesses’ Use of Social Data

Gigya, the leading provider of social infrastructure for business, today announced the launch of SocialPrivacy™ Certification. The new certification will enable businesses to verify that they follow approved social network guidelines and industry best practices for managing consumer social data. Businesses can become certified after a thorough audit by Gigya to determine that they engage in fair social data marketing practices. In particular, they must follow four principles to gain SocialPrivacy™ Certification: 1) they will not sell user social data; 2) they will not post to social feeds without explicit permission; 3) they will not engage in social data-based email marketing campaigns without user permission; and 4) they will not send private messages to friends without permission. Once certified, businesses can display the Gigya SocialPrivacy™ Certification Seal in the login flow on their websites to transparently inform users how their social data will be used. In testing, use of the SocialPrivacy™ Certification Seal increased social login conversion rates by 15%.

Future of Privacy Forum, recognized as a global leader in advancing responsible data practices, collaborated with Gigya to create the specific requirements for SocialPrivacy™ Certification.  FPF Director Jules Polonetsky will serve as chairman of the Gigya Privacy and Safety Advisory Board, established in part to provide expert guidance on privacy and safety best practices for Gigya and its clients.

Leading online businesses, including Martha Stewart Living Omnimedia, LUSH Cosmetics, Finish Line (Run.com), and The Globe and Mail are Gigya SocialPrivacy™ Certification launch partners, and will be implementing the SocialPrivacy™ Certification Seal in the coming months.

SocialPrivacy™ Certification was launched in conjunction with the release of new social login privacy survey results. The survey results indicate that, while social login usage is high, consumers are confused about social data collection practices. Users are also more apt to log in with their social identities if sites explain how their profile data will be collected and used. The survey, conducted by SurveyMonkey on behalf of Gigya, questioned more than 2,600 U.S. consumers aged 18 and over.

“As sites look to leverage permission-based social data to power their marketing efforts, they must ensure that they follow the strict guidelines outlined by the social networks and other social data collection best practices,” said Jules Polonetsky, Director and Co-chair of the Future of Privacy Forum. “Gigya has taken a critical step in ensuring that businesses abide by these guidelines and, in doing so, is helping create a safer, more transparent web for businesses and consumers.”

“Consumers clearly prefer to use social login in many settings because it allows them to login quickly and securely without creating new credentials,” said Patrick Salyer, CEO of Gigya. “Yet we’ve also seen that those users want transparency around how their information is being collected and used. With SocialPrivacy™ Certification, we are bridging the gap by letting consumers know which sites have proved that their data is being used responsibly.”

Businesses interested in applying for Gigya SocialPrivacy™ Certification can go to http://gigya.com/solutions/social-privacy for more information.

For additional details on Gigya’s Social Privacy Survey, go to http://wp.me/p1crgF-57V.


About Gigya:

Gigya provides websites with a complete social infrastructure that creates immersive social experiences for users and provides unparalleled customer insights for businesses.

Gigya equips businesses like ABC, Pepsi, and Verizon with a comprehensive solution to socialize their online properties. Our technology enables seamless registration with Social Login and Registration-as-a-Service, increases traffic and time spent on-site via Social Plugins and Gamification, strengthens customer relationships through SocialPrivacy™ Certification, and transforms marketing by leveraging permission-based social identity data.

Gigya works with more than 600 enterprises and touches more than one billion users per month. Our platform extracts the real value from social networks, empowering online businesses to attract, engage, and understand users like never before.


FTC's Second Kids' App Report: Disclosures Not Making the Grade

The Federal Trade Commission has just released its second report reviewing the practices of apps targeted at kids. “Apps are often free, which is great for kids and easy on their parents’ wallets,” said FPF Co-Chair and Director Jules Polonetsky, “but when free means in return for data, app developers must understand the legal obligations that come with collecting kids’ data.”

The Commission repeated the methodology of its earlier report, but went several steps further this time: staff downloaded the apps to compare whether the privacy notices accurately disclose the collection and sharing practices of app developers. The Commission found that only 20% of the 400 apps reviewed provided a link to a privacy policy prior to download. Among the apps that provided notices, the policies failed to provide critical information for parents as to how information is collected, used, and shared. Moreover, the survey shows that many apps fail to disclose that they transmit data to third parties and contain interactive features. For example, 58% of the apps contain advertising, but only 15% disclose this fact.

“Kids’ privacy is no longer an optional extra for app developers, given the heavy regulatory scrutiny that exists today,” said Christopher Wolf, FPF Founder and Co-Chair. Before downloading apps for their children, parents should be able to learn what data an app collects and how it will be used. Indeed, the report states that the Commission is launching non-public investigations to determine whether developers are violating COPPA or engaging in unfair or deceptive practices.

The FTC was particularly concerned about Device IDs being sent to ad networks, but recognized that doing so for analytics or other limited purposes wasn’t the primary privacy concern.  Rather, the Commission was worried about behavioral ads using Device IDs, a practice it has targeted in its proposed update to the Children’s Online Privacy Protection Rule.  The Commission has also launched a number of investigations stemming from the report.

The Commission called on app stores, app developers, and third parties that interact with the apps to work together to develop accurate disclosures for parents. FPF provides privacy resources for app developers, including COPPA information at www.applicationprivacy.org. FPF is also working to provide a path for app platforms to take more of a role in handling parental notice and age verification for apps, by seeking clarification of the COPPA rule and supporting the creation of “common consent mechanisms.”

It’s Not How Much Data You Have, But How You Use It

On Thursday, December 6, 2012, the FTC is hosting a panel to “explore the practices and privacy implications of comprehensive collection of data about consumers’ online activities.” Initially envisioned to grapple with the question of ISPs using data about consumers for advertising, the topic for discussion has since broadened to include a range of business models that “have the capability to collect data about computer users across the Internet, beyond direct interactions between consumers and these entities.”

To help inform the discussion, FPF has released a white paper entitled “It’s Not How Much Data You Have, But How You Use It: Assessing Privacy in the Context of Consumer Data Integration.” The paper seeks to explain the market factors driving companies to provide an increasingly wide range of integrated services to consumers. We point to consumer demand for interoperability; the need for companies to maintain their channel to the consumer across multiple platforms and devices; the need for access to social content and signals; and innovative data uses that benefit consumers.

Expansions in data collection and new integrated uses have repeatedly been the cause of privacy concerns. But rather than impose new obligations on companies solely because of factors such as comprehensiveness of data, we propose a logical extension of the concept of context, which was introduced by the White House and FTC reports earlier this year. When data is used in new contexts, corporate practices should be judged by the nature of such new contexts and the communication needed to engage consumers without creating a “privacy lurch.” Important factors to consider include the nature of the new context; the value of the new data use; and the expectations a consumer may have developed with respect to a given “brand.” An evaluation based on this “enhanced context” model may warrant a decision to rely solely on good communication without providing consumers with additional choice. Alternatively, it may call for consumer opt-out rights or even express consent, if the nature of the shift in context and supporting factors so warrant.
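Purely as an illustration, the “enhanced context” evaluation can be read as a decision function over those factors. The sketch below is a speculative rendering with invented scores and thresholds; the white paper proposes the factors, not this particular mapping.

```python
# A speculative sketch of the "enhanced context" evaluation as a
# decision function. The factor names, scoring scale, and thresholds
# are assumptions invented for illustration only.
def required_consumer_choice(context_shift: int,
                             data_use_value: int,
                             meets_brand_expectations: bool) -> str:
    """Map the factors to a level of consumer engagement.

    context_shift and data_use_value are scored 0-2 here purely for
    the example (0 = none/low, 2 = sharp/high).
    """
    if context_shift == 0 or meets_brand_expectations:
        return "good communication only"  # no additional choice needed
    if context_shift == 1 and data_use_value >= 1:
        return "opt-out right"            # meaningful but bounded shift
    return "express consent"              # sharp shift risks a "privacy lurch"

# Example: a sharp contextual shift with low consumer value and no
# established brand expectation would call for express consent.
print(required_consumer_choice(context_shift=2, data_use_value=0,
                               meets_brand_expectations=False))
```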

Attached to the white paper are two annexes prepared by the FPF staff to describe the range of devices and services offered by leading companies and selected examples of integration of data across services and choices provided.

FCC Ruling Allows “One-Time Opt-Out Confirmation Messages” to Continue

The Federal Communications Commission (FCC) has issued a declaratory ruling confirming “that sending a one-time text message confirming a consumer’s request that no further messages be sent does not violate the 1991 Telephone Consumer Protection Act (TCPA) or Commission rules.”

The FCC ruling comes in response to a petition filed earlier this year by SoundBite Communications, a company that provides text message communication services to consumers on behalf of its clients.

The FCC emphasized certain criteria to ensure that one-time opt-out messages do not violate the TCPA or Commission rules (a sketch of a compliance check built on these criteria appears after the list):

  1. The sender of the text message has “obtained prior express consent…from the consumer to be sent text messages using an automatic dialing system.”
  2. The text message confirms the consumer’s opt-out request and does not include marketing or promotional information.
  3. The text message is sent within five minutes of receiving the consumer’s opt-out request or the sender can demonstrate that any delay was reasonable.
  4. The text message is the only additional message sent to the consumer once the opt-out request is received and does not extend to a follow-up confirmation call.
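For illustration, the sketch below encodes the four criteria as a simple compliance check in Python. The message fields, the marketing-term list, and the function itself are assumptions made for this example, not part of the FCC ruling or any provider’s actual API.

```python
# A hedged sketch of a compliance check for the four FCC criteria above.
# The ConfirmationMessage fields and the marketing-term list are
# assumptions made for this example, not any provider's real API.
from dataclasses import dataclass
from datetime import datetime, timedelta

# Crude stand-in for "marketing or promotional information" (criterion 2).
MARKETING_TERMS = ("sale", "offer", "discount", "buy now")

@dataclass
class ConfirmationMessage:
    had_prior_express_consent: bool  # criterion 1
    body: str                        # checked against criterion 2
    opt_out_received: datetime       # criterion 3
    sent_at: datetime                # criterion 3
    later_messages_sent: int         # criterion 4

def is_compliant(msg: ConfirmationMessage) -> bool:
    """Return True only if all four criteria are satisfied.

    Note: the ruling also lets a sender demonstrate that a delay
    beyond five minutes was reasonable; this sketch does not model
    that exception.
    """
    no_marketing = not any(t in msg.body.lower() for t in MARKETING_TERMS)
    within_window = msg.sent_at - msg.opt_out_received <= timedelta(minutes=5)
    return (msg.had_prior_express_consent
            and no_marketing
            and within_window
            and msg.later_messages_sent == 0)
```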

The Future of Privacy Forum filed comments with the FCC encouraging it to grant the petition, explaining the importance of opt-out confirmation messages in cases where consumers are at risk of privacy invasions or identity theft. Opt-out confirmation messages can help companies verify that the individual requesting the opt-out is in fact the subscriber, provide a record of opt-out activity in case the subscriber temporarily loses physical control over the phone, and will likely prompt further inquiry in cases where the subscriber did not actually opt out.

SoundBite sought the FCC ruling to reduce the TCPA’s legal ambiguity in this area, which has resulted in numerous lawsuits against communications service providers.