1/30/2013 – Privacy by Design allows secure third party access, energy innovations to grow – Electric Light & Power

Ontario, Canada’s Information and Privacy Commissioner, Dr. Ann Cavoukian, and Jules Polonetsky examine the benefits, as well as the potential privacy risks, of third party access to customer energy usage data (CEUD). New products and services enabled by this access may support conservation and open new market opportunities.

To learn more about CEUD and the Privacy by Design approach, click here.

Privacy by Design and Third Party Access to Customer Energy Usage Data

A new white paper is available today, “Privacy by Design and Third Party Access to Customer Energy Usage Data”. Written by Ontario, Canada’s Information and Privacy Commissioner, Dr. Ann Cavoukian, and Jules Polonetsky, Co-Chair of the Future of Privacy Forum, the paper addresses best practices for third party access to customer energy usage data.

Paper Summary: The increased availability of customer energy usage data (CEUD) is one of the numerous benefits of Smart Grid improvements. CEUD allows utilities and customers to use power more efficiently and to better manage spikes in demand. Utilities can avoid having to use expensive “peaker plants,” which kick in when energy demands exceed normal levels of supply. And with greater access to their own CEUD, customers can make more informed choices about when and how much electricity to use. The paper explores the issue of third party access to CEUD and its benefits, as well as the potential privacy risks. It examines the potential new products and services created by third party access, which may support conservation and new market opportunities.

The Information and Privacy Commissioner of Ontario’s News Release is available here (pdf).

1/28/2013 – FTC Announces Data Privacy Settlement – Politico

Jules Polonetsky joined a number of other privacy advocates and officials from leading technology firms on a panel as part of Data Privacy Day. FTC Commissioner Maureen Ohlhausen delivered the keynote address at The George Washington University and was expected to touch on a number of privacy-related topics.

To read the news article on Politico and learn more about the event, click here.

PrivacySmart Seal Powered by TRUSTe Awards First Smart Grid Seal

The PowerTools app by Candi Controls is now available for customers to download on their mobile devices; it allows them to check energy use, set goals, and track usage patterns. Data is made available by San Diego Gas & Electric (SDG&E) through its Green Button Connect My Data platform. Companies that are awarded the seal are able to display it on SDG&E’s web portal, where consumers can choose to enable third party services.

Candi Controls is the first to receive certification under the Future of Privacy Forum – TRUSTed Smart Grid Privacy Program. This user-friendly tool for customers to analyze their energy use is exactly what the seal program is meant to enable. The self-regulatory program certifies that companies seeking to access energy information are doing so with the express consent of consumers.

If you are a company offering home energy management, remote home control or security, smart thermostats and other services that seek to access consumer energy data, you can find out more by contacting the Future of Privacy Forum at [email protected].

More details about the program are also available at http://www.truste.com/consumer-privacy/TRUSTed-smart-grid.

EU Roundtable Discussion & White Paper launch

The Future of Privacy Forum (FPF) is engaged in the discussion on the draft European General Data Protection Regulation. In particular, we are publishing white papers addressing three issues raised by the new legislation.

You and your colleagues are warmly invited to attend an event releasing the white papers, which will feature a discussion with leading policymakers.

Location: Renaissance Hotel in Brussels, Belgium

Rue du Parnasse 15, 1050 Ixelles, Belgium

Venue may change if demand for attendance grows beyond current expectations.

Date: Wednesday, 23 January 2013

Time: 5:00 – 7:00pm CET

Registration: http://2013fpfbrussels.eventbrite.com/ 

The discussion will be moderated by Christopher Wolf, Founder and Co-Chair of the Future of Privacy Forum, and Omer Tene, Senior Fellow at the Future of Privacy Forum.

The panelists are (in alphabetical order):

MEP Jan Philipp Albrecht, Rapporteur, General Data Protection Regulation

Ms. Françoise Le Bail, European Commission, Director General for Justice

Ms. Bojana Bellamy, Director of Data Privacy, Accenture

Ms. Julie Brill, FTC Commissioner

Mr. Peter Hustinx, European Data Protection Supervisor

MEP Seán Kelly, Rapporteur for Industry, Research and Energy (ITRE) Committee (TBC)

Mr. Jacob Kohnstamm, Chairman of the Article 29 Data Protection Working Party and the Dutch Data Protection Authority

Ms. Gabriela Krader, Corporate Data Protection Officer, Deutsche Post DHL

Mr. Peter Schaar, Federal Commissioner for Data Protection and Freedom of Information, Germany

Dr. Rainer Stentzel, Head of Project Group, Data Protection Reform, German Federal Ministry of the Interior

Dr. Wojciech Rafał Wiewiórowski, Inspector General for the Protection of Personal Data, Poland

Participation is free of charge, but space is limited; please register at http://2013fpfbrussels.eventbrite.com/.

Questions: [email protected]

Happy New Year from the Future of Privacy Forum!

 


Dear Friends,

Happy New Year from the Future of Privacy Forum!  And thank you for following our work and for your support in advancing privacy issues.  Here is our 2013 List of Ins and Outs for your enjoyment. On behalf of the entire team at FPF we wish you a fulfilling New Year.
To receive regular updates from FPF, please subscribe here.                           

Chris Wolf, Founder and Co-chair
Jules Polonetsky, Director and Co-chair

In

1.   EU Politicians
2.   California AG Kamala Harris
3.   Madame Chairperson?
4.   FBI follows Petraeus with email
5.   Mac users pay more for travel
6.   No tracking cookies for kids
7.   Challenging FTC Section 5 unfairness jurisdiction
8.   Damages in privacy cases
9.   Short notices
10. Do Not Agree

Out

1.   EU Data Commissioners
2.   FTC’s David Vladeck
3.   Chairman Leibowitz
4.   FBI follows Jones with GPS
5.   Mac users pay more for computer
6.   No personal info from kids
7.   Consent order settlements
8.   No financial harm/no case
9.   30 page mobile privacy policies
10. Do Not Track

 

A Critical Time for the EU Data Protection Regulation

Editorial By Christopher Wolf


Policymakers around the world are re-examining the legal framework that regulates the collection, use, sharing, and storing of personal information – proposing more robust protections afforded to such information, and increasing the legal obligations of business. The new approaches are in response to the dramatically different ways in which technology interacts with personal data and the potential for that data to be exposed and misused.

Within the past year, new privacy frameworks were proposed by the European Commission and by the Obama Administration, each seeking more protection for individuals. Despite common foundations – the Fair Information Practice Principles – the privacy regimes from opposite sides of the Atlantic exhibit fundamental differences in approach and substance.

The US proposal eschews the EU’s fundamental rights approach, focusing instead on a privacy “Bill of Rights” and a related set of enforceable, multi-stakeholder codes of conduct. At the same time, solutions are being sought to provide a “Do Not Track” option for consumers; the rules for children’s privacy are being tightened; mobile and app privacy are in focus; and data brokers are under scrutiny. Major issues associated with new technologies are being addressed in the US, although without the across-the-board approach to privacy protection that characterizes the EU.

At the start of the second Obama Administration, the timetable for changes in American privacy law is indefinite. Progress is being made, but the completion date for any one of the initiatives cannot be predicted. (The one exception relates to health privacy, for which new regulations from the Department of Health and Human Services are expected forthwith.)

In contrast, in the EU it is widely expected that an opinion on the proposed EU Data Protection Regulation will be coming soon from the European Parliament Committee on Civil Liberties, Justice and Home Affairs – the so-called LIBE Committee. And while some adjustments to various provisions are likely to be proposed (such as the time period for reporting data security breaches, set at a presumptive 24 hours in the current draft), endorsement of the Regulation to the Parliament and Council is expected. At that point, rapid consideration of the proposed Regulation is likely in the Parliament and Council.

It is the proverbial “home stretch” of the formal consideration of the Regulation introduced January 2012 by Vice-President Viviane Reding. And for that reason, it is time for sharp focus on the EU Regulation, because what happens in the EU has an impact on multinational organizations operating across borders, and on the evolution of privacy frameworks around the world.

Over the past year, there has been little disagreement over the goals of the Regulation to provide region-wide uniformity in privacy law, to reduce bureaucracy, to institutionalize “Privacy by Design,” and to establish a new framework that reflects the evolution of technology and social media and their impact on the protection of personal data.

More controversial are the proposed penalties of up to 2% of an entity’s global turnover for violations of the Regulation; the extension of jurisdiction and applicability of the Regulation outside the EU borders to companies “offering of goods or services to […] data subjects in the Union or [engaged in] the monitoring of their behavior;” the establishment of data portability that could create a ban on “tying” information to services that otherwise would be permissible under competition law; and the “right to be forgotten.”

There are lingering questions over the operation of the “one-stop shop,” in which one Data Protection Authority (DPA) will have primary jurisdiction over a company based on the location of its “main establishment”; and concerns have been expressed over the impact of the Regulation on small and medium-sized enterprises (SMEs).

In November, the UK Government published its Impact Assessment of the draft European data protection regulation. When the draft regulation was first published, the European Commission estimated that harmonizing the European data protection regime would bring a net administrative benefit of €2.3 billion to the EU. However, the UK Ministry of Justice has carried out its own analysis of the proposals and concluded that for the UK alone there would be an annual net cost of between £100 million and £360 million.

The UK Government takes the position that the Commission failed to take into account all of the costs that would arise from the draft regulation, and it identifies aspects of the regulation that will impose additional costs on businesses.

It also points out that supervisory authorities will require substantially more resources to carry out their widened responsibilities, and that the powers the Commission has proposed to give itself to make delegated acts could also affect the costs and benefits of the new proposal. The UK Government stated that it will use the evidence set out in its Impact Assessment to “continue to push for a lasting data protection framework that is proportionate, and that minimizes the burdens on businesses and other organizations, while giving individuals real protection in how their personal data is processed.”

At the same time, also in November 2012, Europe’s Network and Information Security Agency (ENISA) released a report on the technical aspects of the “right to be forgotten”. ENISA pointed out that any technical solution for the “right to be forgotten” would require an unambiguous definition of the personal data that is covered by the right, a clear notion of who can enforce it, and a mechanism for balancing it against other rights such as freedom of expression. According to the report, the text of the current European proposal leaves each of these subjects open to debate, making it difficult to implement technical mechanisms to deal with the “right to be forgotten”.

ENISA also noted that the “right to be forgotten” is virtually impossible to enforce in an open network such as the Internet. Nothing prevents users from freely copying and redistributing digital content, including photos, so subsequently trying to find and erase all distributed copies would be impossible. ENISA states that the only way to prevent such redistribution would be to use digital rights management (DRM) technology similar to that used by certain publishers of digital content, such as motion pictures and music. However, most DRM technologies can be easily circumvented. ENISA points out that partial enforcement of the “right to be forgotten” could be achieved by requiring search engines subject to European jurisdiction to filter search results so that the information that is supposed to be forgotten does not show up:

“A natural way to “mostly forget” data is thus to prevent its appearance in the results of search engines, and to filter it from sharing services like Twitter. EU member states could require search engine operators and sharing services to filter references to forgotten data. As a result, forgotten data would be very difficult to find, even though copies may survive, for instance, outside the EU jurisdiction”.
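ENISA’s partial-enforcement idea can be illustrated with a minimal sketch. Everything here is hypothetical – the URL list, the function name, and the result format are illustrative, not any real search engine’s API:

```python
# Hypothetical sketch of ENISA's "partial enforcement" idea: a search
# service subject to EU jurisdiction filters out references to data
# that is supposed to be forgotten. All names and data are illustrative.

# URLs that a regulator or court has ordered "forgotten" (assumed input)
forgotten_urls = {
    "http://example.com/old-photo",
    "http://example.org/outdated-record",
}

def filter_results(results):
    """Drop search results whose URL appears on the forgotten list.

    Copies of the content may still exist (e.g. outside EU jurisdiction);
    filtering only makes them hard to find, as ENISA notes.
    """
    return [r for r in results if r["url"] not in forgotten_urls]

results = [
    {"title": "Old photo", "url": "http://example.com/old-photo"},
    {"title": "News story", "url": "http://example.net/news"},
]

visible = filter_results(results)  # only the news story remains
```

As the sketch makes plain, filtering removes only the reference: the underlying copy at the forgotten URL is untouched, which is exactly why ENISA calls this “mostly forgetting” rather than erasure.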

The French data protection authority, the CNIL, recently made three critical points about the Regulation in its Annual Report. First, the CNIL expressed concern that making a single data protection authority responsible for the Europe-wide activities of an enterprise could result in a significant decrease in the level of protection of individuals. Citing the example of a social network whose main establishment is located in another European member state, the CNIL said it was inappropriate to reduce the role of the French data protection authority to a simple mailbox to forward complaints to the principal DPA responsible for the social network’s activities. According to the CNIL, a French user who is harmed by the activities of an enterprise doing business in France should be able to look to the French regulator for redress.

The second point on which the CNIL diverges from the Commission is the issue of international data transfers. The CNIL believes that transfers to countries that have not been recognized as providing adequate protection should be based on contractual clauses or binding corporate rules (BCRs) that have been approved in advance by the CNIL. Under the proposed regulation, an international transfer based on standard contractual clauses will not require the prior approval of the DPA.

Finally, the CNIL made the point that the new accountability measures included in the draft regulation should not be viewed as a form of self-regulation, or as a trade-off for less regulatory supervision. Instead, the accountability measures should be viewed as a supplement to existing regulatory principles and enforcement practices.

The issues that have been raised about the proposed Regulation are real and substantial. How the reviewers in the European Parliament analyze and report on the proposal will be critical. As important as momentum may be to obtain approval of the Regulation in a timely fashion, equally important is ensuring the passage of a workable and balanced set of new rules.

Key Elements of a Code of Conduct for Mobile Apps Transparency

Guest blog post from Mary J. Culnan, Senior Fellow at the Future of Privacy Forum

In February 2012, the White House issued a Consumer Privacy Bill of Rights. In the report, the White House proposed legislation based on the privacy principles in the report and called on NTIA to convene stakeholders to develop enforceable codes of conduct implementing these principles for specific industries. On July 15, 2012, NTIA convened the first meeting of a multistakeholder process with the goal of developing a code of conduct to provide transparency about how companies providing applications and interactive services for mobile devices handle personal data. FPF is an active participant in this process.

To date, the NTIA process has focused primarily on developing requirements for disclosure, since effective disclosure is central to transparency and to building consumer trust. While workable disclosure standards are essential, they are not sufficient. The FTC and others have identified additional characteristics of credible self-regulation, including promoting competition and providing for effective accountability and enforcement. These issues also need to be addressed during the NTIA process. Because a large proportion of mobile apps are created by entrepreneurs or small businesses, these two issues will likely need to be considered jointly to ensure that the requirements of the final code of conduct are not anti-competitive.

Promoting Competition

On November 29, 2012, the FTC held an interesting workshop on Enforceable Codes of Conduct: Protecting Consumers Across Borders. In a keynote address, former FTC Chairman William Kovacic identified how codes can create barriers to entry by favoring incumbents and the ways they do business. For example, a code could include requirements that assume a privacy infrastructure larger firms already have in place, forcing small firms or entrepreneurs to build that infrastructure from scratch in order to comply. Kovacic further argued that stakeholders can mitigate these problems through explicit consideration of competition as the code is developed, and through evaluation after the fact to ensure there are no unanticipated consequences.

Effective Accountability and Enforcement

There are common principles for effective self-regulation, independent of context, and many of these were discussed at the November FTC workshop described above. In 2011, Dennis Hirsch and Ira Rubenstein published an article in which they outlined the design considerations for enforceable codes of conduct that implement a broader set of privacy principles, citing an earlier version of the White House report as an example of this approach. Organizations following an approved set of rules would enjoy safe harbor under an enforcement regime. Hirsch and Rubenstein argued that accountability is critical to the credibility and success of any self-regulatory regime, and proposed a mix of monitoring techniques, including self-certification and third-party audits and certification, to keep costs reasonable. They also cited the need for procedures to handle complaints and resolve disputes. Individuals should exhaust their options under these dispute resolution procedures before a complaint is referred to the FTC or a state attorney general for an enforcement action.

Implications for Mobile Apps Transparency

Many of the traditional methods for transparency, accountability, and enforcement have the potential to be anti-competitive. For example, requiring organizations to post an online privacy notice on their website works well for ecommerce because online companies of all sizes use a website as a platform to do business. App developers have no such need for a public website when they deliver their apps through an app market. Very small app developers may not have the resources to create and maintain a public website that is not needed for their business. Similarly, small app developers may not have the resources to develop programs to handle complaints or to hire a third party to process complaints on their behalf.

The recent agreement the California Attorney General negotiated with many of the largest mobile app markets provides a potentially attractive solution for promoting accountability and enforcement while simultaneously promoting competition. The agreement will also help educate app developers about privacy and their responsibilities. Under this agreement, the app markets will include in the app submission process data fields where the developer can link to the app’s privacy policy or include the text of a privacy statement for that app. The app markets will either enable the link or display the privacy statement. Further, the app markets will implement procedures for users to report complaints, and for investigating and addressing the complaints they receive. App developers should self-certify that they will comply with their privacy notice and participate in the complaint resolution process.
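The submission-field mechanics described above can be sketched in a few lines. The field names and logic here are hypothetical illustrations of the idea, not any real app store’s submission API:

```python
# Hypothetical sketch of the California AG agreement's submission flow:
# the app market collects either a privacy policy link or inline text
# from the developer and surfaces whichever was provided to users.
# Field names are illustrative, not any actual app market's schema.

def render_privacy_disclosure(submission):
    """Return the disclosure the app market should display for an app."""
    if submission.get("privacy_policy_url"):
        return ("link", submission["privacy_policy_url"])
    if submission.get("privacy_policy_text"):
        return ("text", submission["privacy_policy_text"])
    return ("none", None)  # nothing provided; flag the app for review

submission = {
    "app_name": "ExampleApp",
    "privacy_policy_url": "http://example.com/privacy",
    "privacy_policy_text": None,
}

kind, value = render_privacy_disclosure(submission)
```

The design point is that the disclosure burden shifts to infrastructure the markets already operate, so even the smallest developer only has to supply a link or a block of text.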

Finally, other stakeholders in the mobile apps ecosystem can provide additional support for app developers by creating privacy policy generators or other tools that can help even the smallest app developer make their information practices transparent. As there is likely to be a similar learning curve for app developers as there was for .com firms in the 1990s, trade associations and other organizations in the ecosystem can help app developers understand that privacy is good for business, because transparency is one key to earning and keeping the trust of consumers.

 

Gigya Launches SocialPrivacy™ Certification in Collaboration with FPF

New Survey Data Indicates Consumer Confusion Over Businesses’ Use of Social Data

Gigya, the leading provider of social infrastructure for business, today announced the launch of SocialPrivacy™ Certification. The new certification will enable businesses to verify that they follow approved social network guidelines and industry best practices for managing consumer social data. Businesses can become certified after a thorough audit by Gigya to determine that they engage in fair social data marketing practices. In particular, they must follow four principles in order to gain SocialPrivacy™ Certification: 1) they will not sell user social data, 2) they will not post to social feeds without explicit permission, 3) they will not engage in social data-based email marketing campaigns without user permission, and 4) they will not send private messages to friends without permission. Once businesses are certified, they can display the Gigya SocialPrivacy™ Certification Seal in the login flow on their websites to transparently inform users how their social data will be used. In testing, use of the SocialPrivacy™ Certification Seal has shown an increase in social login conversion rates of 15%.

Future of Privacy Forum, recognized as a global leader in advancing responsible data practices, collaborated with Gigya to create the specific requirements for SocialPrivacy™ Certification.  FPF Director Jules Polonetsky will serve as chairman of the Gigya Privacy and Safety Advisory Board, established in part to provide expert guidance on privacy and safety best practices for Gigya and its clients.

Leading online businesses, including Martha Stewart Living Omnimedia, LUSH Cosmetics, Finish Line (Run.com), and The Globe and Mail are Gigya SocialPrivacy™ Certification launch partners, and will be implementing the SocialPrivacy™ Certification Seal in the coming months.

SocialPrivacy™ Certification was launched in conjunction with the release of new social login privacy survey results. The survey results indicate that, while social login usage is high, consumers are confused about social data collection practices. Users are also more apt to log in with their social identities if sites explain how their profile data will be collected and used. The survey, conducted by SurveyMonkey on behalf of Gigya, questioned more than 2,600 U.S. consumers aged 18 and over.

“As sites look to leverage permission-based social data to power their marketing efforts, they must ensure that they follow the strict guidelines outlined by the social networks and other social data collection best practices,” said Jules Polonetsky, Director and Co-chair of the Future of Privacy Forum. “Gigya has taken a critical step in ensuring that businesses abide by these guidelines and, in doing so, is helping create a safer, more transparent web for businesses and consumers.”

“Consumers clearly prefer to use social login in many settings because it allows them to login quickly and securely without creating new credentials,” said Patrick Salyer, CEO of Gigya. “Yet we’ve also seen that those users want transparency around how their information is being collected and used. With SocialPrivacy™ Certification, we are bridging the gap by letting consumers know which sites have proved that their data is being used responsibly.”

Businesses interested in applying for Gigya SocialPrivacy™ Certification can go to http://gigya.com/solutions/social-privacy for more information.

For additional details on Gigya’s Social Privacy Survey, go to http://wp.me/p1crgF-57V.

 

About Gigya:

Gigya provides websites with a complete social infrastructure that creates immersive social experiences for users and provides unparalleled customer insights for businesses.

Gigya equips businesses like ABC, Pepsi, and Verizon with a comprehensive solution to socialize their online properties. Our technology enables seamless registration with Social Login and Registration-as-a-Service, increases traffic and time spent on-site via Social Plugins and Gamification, strengthens customer relationships through SocialPrivacy™ Certification, and transforms marketing by leveraging permission-based social identity data.

Gigya works with more than 600 enterprises and touches more than one billion users per month. Our platform extracts the real value from social networks, empowering online businesses to attract, engage, and understand users like never before.

 

FTC's Second Kids' App Report: Disclosures Not Making the Grade

The Federal Trade Commission has just released its second report reviewing the practices of apps targeted at kids. “Apps are often free, which is great for kids and easy on their parents’ wallets,” said FPF Co-Chair and Director Jules Polonetsky, “but when free means in return for data, app developers must understand the legal obligations that come with collecting kids’ data.”

The Commission repeated the methodology from its earlier report, but went several steps further this time: staff downloaded the apps to check whether the privacy notices accurately disclose the collection and sharing practices of app developers. The Commission found that only 20% of the 400 apps surveyed provided a link to a privacy policy prior to download. Among the apps that did provide notices, the policies failed to give parents critical information about how data is collected, used, and shared. Moreover, the survey shows that many apps fail to disclose that they transmit data to third parties and contain interactive features. For example, 58% of apps contain advertising, but only 15% disclose this fact.
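The scale of the disclosure gap is easier to see as raw counts. The figures below are derived from the stated percentages of the 400 surveyed apps; they are back-of-the-envelope approximations, not numbers taken directly from the FTC report:

```python
# Rough counts implied by the FTC report's stated percentages for the
# 400 surveyed kids' apps. Derived figures, not numbers from the report.
total_apps = 400

with_policy_link = round(0.20 * total_apps)  # 20% linked a policy pre-download
with_advertising = round(0.58 * total_apps)  # 58% contained advertising
disclosed_ads    = round(0.15 * total_apps)  # 15% disclosed advertising

# Apps showing ads without disclosing it (assuming the 15% is of all apps)
undisclosed_ad_gap = with_advertising - disclosed_ads
```

On these assumptions, roughly 320 of the 400 apps offered no pre-download privacy policy link, and far more apps carried advertising than disclosed it.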

“Kids’ privacy is no longer an optional extra for app developers, given the heavy regulatory scrutiny that exists today,” said Christopher Wolf, FPF Founder and Co-Chair. Before downloading apps for their children, parents should be able to learn what data an app collects and how it will be used. Indeed, the report states that the Commission is launching non-public investigations to determine whether developers are violating COPPA or engaging in unfair or deceptive practices.

The FTC was particularly concerned about Device IDs being sent to ad networks, but recognized that doing so for analytics or other limited purposes wasn’t the primary privacy concern.  Rather, the Commission was worried about behavioral ads using Device IDs, a practice it has targeted in its proposed update to the Children’s Online Privacy Protection Rule.  The Commission has also launched a number of investigations stemming from the report.

The Commission called on app stores, app developers, and third parties that interact with the apps to work together to develop accurate disclosures for parents. FPF provides privacy resources for app developers, including COPPA information at www.applicationprivacy.org. FPF is also working to give app platforms a greater role in handling parental notice and age verification for apps by seeking clarification of the COPPA Rule and supporting the creation of “common consent mechanisms.”