EU Data Protection Reform: Draft Calendar

Green MEP Jan Philipp Albrecht has released a draft calendar of action points for the EU Data Protection Reform. Mr. Albrecht, the European Parliament Rapporteur assigned to the Data Protection Reform, released the draft calendar ahead of next week’s Workshop on the Proposed Data Protection Regulation, to be held by the Civil Liberties, Justice and Home Affairs Committee (LIBE).

The calendar, which will need to be approved by the other committees involved, indicates that the Parliament may not enter into a “trilogue” (an informal discussion aimed at finding agreement on package amendments) with the Council and Commission until the summer of 2013.

The Parliament’s first public consideration of the draft regulation, a workshop hosted by the LIBE committee, will take place on Tuesday, May 29th, from 15:00 to 18:30 in Brussels.

Below is a copy of Mr. Albrecht’s draft calendar:

May 23, 2012 – Study: Patriot Act Gives US Government No Special Access to Cloud Data, PC World

An often-repeated concern that the U.S. Patriot Act gives the U.S. government unequaled access to personal data stored on cloud services is incorrect, with several other nations enjoying similar access to cloud data, according to a study released Wednesday.

Swire Blog Post on TAP

Check out FPF Senior Fellow Peter Swire’s latest blog post on Technology, Politics, Academics (TAP) available here.

Swire recaps the recent Congressional Internet Caucus event “New Internet Privacy Legislation: What the White House, Federal Trade Commission and the European Commission are Recommending.” The event began with a presentation by Maneesha Mithal (FTC) and then transitioned to a panel discussion; Swire was a panelist in the event, and the panel was moderated by FPF co-chair Christopher Wolf.

The audio from the event is available here.

May 17, 2012 – Congressional Internet Caucus Talks Privacy, TAP

On May 14, the Advisory Committee to the Congressional Internet Caucus hosted a lunch discussion, entitled “New Internet Privacy Legislation: What the White House, Federal Trade Commission and the European Commission are Recommending,” before a standing-room-only crowd.

May 11, 2012 – Analyst: App Developers Need to Lead the Way in Mobile Privacy, WebProNews

The debate surrounding mobile privacy is really heating up as smartphones become more ubiquitous. Consumers are growing dependent on their mobile devices, and are taking advantage of the hundreds of thousands of apps that are available to them.

Consent and Cookies: How Will the ePrivacy Directive Change Online Business Practices?

The Online Trust Alliance hosted a webinar this week to consider how companies are preparing for the European Union’s new “ePrivacy Directive.” The 2009 amendment is set to be implemented in the United Kingdom on May 26th and will affect online companies’ ability to access and collect user information. In particular, the Directive will change information practices for companies that provide services to users within the EU by requiring that “the subscriber or user concerned has given his or her consent, having been provided with clear and comprehensive information.”

The new formulation of “consent” has led some companies to wonder whether they can rely on implied user consent or must obtain explicit user consent before collecting or accessing data on a user’s terminal equipment. Implied consent suggests that tools like browser settings, which can be set to allow for behavioral tracking, suffice to establish a user’s willingness to provide information. Explicit consent, by contrast, requires users to take an express action allowing data collection before any information can be gathered.
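The practical difference between the two consent models can be sketched in code. The regime names, fields, and helper below are hypothetical illustrations, not terms from the Directive itself:

```python
from dataclasses import dataclass

@dataclass
class UserSignals:
    browser_allows_tracking: bool   # permissive browser settings
    clicked_consent_button: bool    # an express opt-in action on the page

def may_set_tracking_cookie(regime: str, signals: UserSignals) -> bool:
    """Decide whether a tracking cookie may be set under a given consent regime."""
    if regime == "explicit":
        # Explicit consent: only an express user action suffices.
        return signals.clicked_consent_button
    if regime == "implied":
        # Implied consent: permissive browser settings are treated as consent.
        return signals.browser_allows_tracking or signals.clicked_consent_button
    raise ValueError(f"unknown regime: {regime}")
```

Under the implied model a user who never touches a consent button but runs permissive browser settings may still be tracked; under the explicit model that same user may not.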

Colin O’Malley, Chief Strategy Officer at Evidon, indicates that a company’s consent-procurement responsibilities depend on the EU Member State in which it is based. This is because the ePrivacy Directive has been, and will continue to be, implemented differently across member states. For example, France and Greece already require companies to obtain opt-in (explicit) consent, while the UK and Germany consider that consent can be established using browser settings (implied). These laws vary further when a cookie (a small file used to store information on a user’s computer) is involved, depending on the purpose of that cookie.

Differences in member state implementation have led to some operational confusion among companies.

Mr. O’Malley corrects some broad misunderstandings regarding consent requirements in the EU to clarify potential compliance issues. First, despite being nicknamed “the cookie directive”, the ePrivacy Directive does not only affect the use of website cookies. Instead, the Directive’s provision on consent applies to all collection practices that store or access data on a user’s terminal equipment.

Second, a separate ‘pop-up’ window is not necessary to gain explicit consent from users; in most cases, consent buttons can be placed directly on a webpage. Finally, the amended Directive does not explicitly state that companies must obtain consent before setting a browser cookie; this is an interpretation that has emerged because cookies are commonly used for data collection purposes.

Mr. O’Malley suggests four steps to ensure that your company is, or can become, compliant with the EU legal regime.

First, “audit your website.” This means knowing what is on your site: who is using your site to collect information, what information they are collecting, and how frequently they are collecting it. Second, “assess intrusiveness” of the technology used (cookies, Flash, etc.), whether it can be easily identified by users, and whether the data collection is actually necessary.

Third, “determine your consent strategy” by identifying the implications of data collection. This will, for example, require you to weigh the usefulness of data against the intrusiveness of collection. Some sectors or business models treat data collection as an operational imperative (e.g., ad-supported businesses), while others can suffer from overly intrusive collection practices. Finally, the amount of overhead you are willing to dedicate to maintaining your consent policy will influence your strategy. Some businesses will accept a higher risk of non-compliance to limit internal technology costs.

Fourth, “deploy your consent model.” Your model should accommodate your company’s compliance needs and available resources. While it can be developed internally, you may consider using a technology provider, who will be more familiar with the data-collection landscape. Finally, consider restricting your own data collection practices (in addition to those of third parties), because they too are subject to the Directive’s provisions.
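The “audit your website” step can begin with something as simple as enumerating which outside hosts load scripts on your pages. The sketch below is a crude, hypothetical first pass (real audits would also cover cookies, pixels, and Flash objects), using only the Python standard library:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ScriptAuditor(HTMLParser):
    """Collect the hosts of external <script> tags found on a page."""
    def __init__(self):
        super().__init__()
        self.script_hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                host = urlparse(src).netloc
                if host:
                    self.script_hosts.add(host)

def third_party_hosts(html: str, first_party: str) -> set:
    """Return script hosts that differ from the site's own domain."""
    auditor = ScriptAuditor()
    auditor.feed(html)
    return {h for h in auditor.script_hosts if h != first_party}
```

Each host this turns up is a party whose data collection on your site you would need to account for in a consent strategy.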

Mr. O’Malley asserts that there are currently no examples of consent that follow “the letter of the law” as laid out in the ePrivacy Directive. This means that companies will need to rework their consent strategies as member states continue to implement the ePrivacy Directive. Website operators will find it increasingly difficult to use ‘cookies’ and other forms of terminal-based data collection, leading to industry concerns about how the ePrivacy Directive will affect online business in the EU and globally.

Interested readers might also examine the UK Information Commissioner’s office guidance on the new “cookie regulation” and the International Chamber of Commerce UK cookie implementation guide. Also visit the DataDial blog for cookie law implementation ideas.


-Julian Flamant

May 9, 2012 – Privacy, information sharing issues loom in world of apps, KTVU News

Growing concerns over privacy in smartphone apps have gotten everyone from the courts to the federal government to industry leaders entering the debate, but a solution may still be a long time coming.

Missing the Consumer Value of Social Media

I love Consumer Reports. I rely on the magazine for top-notch reviews. Their testing of consumer products is unbiased and invaluable to anyone who takes both price and value seriously. I am currently looking to purchase a quality home treadmill and was pleased to see that a recent issue included a detailed report based on hands-on testing and detailed consumer surveys. But before I make a final decision, I will reach out to my Facebook social network friends for their input. Some of them will have personal experience researching this type of purchase; others will know me and my foibles and perhaps advise that I get one that works with the new fitness apps available, or one with a longer track so I don’t fall off while multitasking.

Facebook and other social media tools have empowered consumers more than any other development in many years, except perhaps for the creation of Consumer Reports itself.  Consumers have a megaphone to voice and spread their concerns about any business or product in a way that demands a reaction.  Companies dedicate special teams to follow opinions about their brands on social media and employ specialists to respond to individual concerns.

Certainly social media has its downsides.  Despite improvements in recent years, privacy controls are still not intuitive.  For years, posting online was public while email and chat were private.  Now, depending on our settings and which service we are using, our posts may be public, our pictures may be private and our location check-ins may be available to “friends of friends”.  It can get very complicated.

What’s the point of sharing with a friend of a friend? The other day I was complaining on Facebook about a new airplane baggage fee when a friend responded to commiserate.  But then she tagged her friend the travel agent who jumped in with great expert advice about how to avoid that fee.  Privacy concerns about sharing my travel plans?  In my neighborhood, folks on my block and friends of my kids know when we are on vacation.  If I forget to stop the newspaper delivery, they pick it up for me.  Don’t announce it to the general public, where some crook may scour public information for evil purposes.  But online or offline, sharing with your friends or community does far more to empower most of us than it does to create any new risks.

I was surprised, therefore, to see the harshly negative general view Consumer Reports took toward Facebook in the issue released yesterday. I expected criticism of the usability of privacy controls or complaints about apps that ask consumers for more information than they need. Facebook is aware of those issues and is working with our think tank and others on the best ways to continue to make progress ensuring that the thousands of developers who create their own consumer apps use data responsibly. But Consumer Reports seems to take the view that social media sharing is by definition a bad idea, even when people are sharing with their own friends. True, things posted privately can be further shared, but that is the case with much of what we do online, and it is the price of convenience we accept every day when we use email to send information around the world.

Consumer Reports notes with alarm that millions of people are publicly showing support for the battles against various illnesses by publicly “liking” pages dedicated to the diseases.  I would like to urge more consumers to “like” the fights to cure and de-stigmatize disease.  And I think it is preposterous to think that if I announce that I am attending a march to raise money to fund breast cancer research that an insurer will be able to use it against me.  Are there any companies that are mistreating people for such activity? I would like Consumer Reports to find out, so I can denounce them on Facebook for my friends to read and pass on and on.

It was also surprising to see media reports alarmed at Consumer Reports’ finding that 13 million people were unfamiliar with Facebook privacy controls. With 188 million users in the U.S. and Canada (the regions surveyed), this means that more than 90% of users say they are familiar with the controls. That is remarkably positive!

Certainly folks who post compromising pictures or comments, on Facebook or on blogs or anywhere public, should understand that friends, employers, colleges, and prospective dates will judge them. Do a search using Google and see what shows up. You have a digital identity that is being used to assess you, just as the clothes you wear every day and the people you associate with form your public identity. As more is available online, Facebook, Google and others need to help us shape that identity to put our best self forward to those who are interested in us. Which online services are doing the most to help consumers proactively shape their reputations and empower them to make smarter decisions? That is the Consumer Reports study I would like to see. But by applying a pessimistic eye toward social media in general as it conducted this survey, this month’s Consumer Reports magazine isn’t going to be a “best value” for today’s online consumer.


-Jules Polonetsky

Big Data Research

TechAmerica hosted a Congressional Briefing, Big Data: What it Means and How it Drives Innovation, this week. The event’s distinguished panel included a variety of industry experts and focused on the meaning of “Big Data”, how data is being used in the market-place, government uses of “Big Data”, and the future of “Big Data”.

The panel explained that “Big Data” as a concept has existed for several years and is also referred to as “Data Mining.” The terms refer to the processing, or mining, of large raw data sets with the goal of establishing usable information. Often-cited examples of Big Data use include identity verification, market research, and fraud protection.

Though Big Data is not in itself a new concept, evolving technological capabilities are changing the way it is used. Heightened data-processing capabilities mean that analysis can be done with larger data sets using fewer resources. Essentially, this is changing the way companies analyze data and allowing them to develop a greater number of insights through their analysis.

Bill Perlowitz, CTO of the Science, Technology & Engineering Group at Wyle, described the emerging form of data analysis as a shift from hypothesis-driven research to data-driven research. Rather than analyzing data to reach pre-determined information goals, actors can now process data to establish their goals and reach new factual conclusions. Big Data research is no longer limited to what researchers can imagine.

This shift in analytical paradigms highlights two factors that could lead to privacy concerns. First, the new model relies on large amounts of data, which incentivizes companies to collect and retain data on a larger scale and for longer periods of time. This can potentially conflict with privacy practices such as data minimization and purpose limitation.

Second, the new data-driven research model, which no longer necessarily relies on a pre-established research goal, maximizes the factual discoveries established in each analytic cycle. When the data concerns individuals, the insights gleaned can be intrusive by nature.

The increased usability of Big Data is straining the privacy practices of companies that deal with personal information. How can companies maintain privacy under these circumstances? For Nuala O’Connor, privacy lead at General Electric, industry actors should focus on good data stewardship and on establishing best practices to keep data confidential. Ms. O’Connor cited data de-identification as an example of a good privacy practice. However, while it can mitigate privacy threats, de-identification is not a privacy ‘silver bullet’: some data processors question whether de-identifying data limits its usability, and may therefore be reluctant to adopt the practice.
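A minimal sketch of one common de-identification technique, keyed-hash pseudonymization, illustrates the trade-off Ms. O’Connor describes. The function and field names below are hypothetical; note that this protects direct identifiers only and does nothing about re-identification through the remaining quasi-identifiers:

```python
import hashlib
import hmac

def pseudonymize(record: dict, secret_key: bytes, id_fields=("name", "email")) -> dict:
    """Replace direct identifiers with keyed-hash pseudonyms.

    Keyed hashing (HMAC) keeps values linkable across records, so
    analysis can still join on the pseudonym, while hiding the raw
    identifier from anyone without the key.
    """
    out = dict(record)
    for field in id_fields:
        if field in out:
            digest = hmac.new(secret_key, str(out[field]).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]
    return out
```

Because the same input always maps to the same pseudonym, records remain usable for longitudinal analysis; that same linkability is also why de-identification alone is not a ‘silver bullet’.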

As Big Data becomes increasingly accessible and usable, good privacy practices are also contingent on policy-makers and companies’ ability to balance privacy with the gains expected from data analysis. This involves making a value judgment regarding what appropriate data uses are. As indicated by Jules Polonetsky and Omer Tene, “It is doubtful that such a value choice has consciously been made”. Privacy leaders would do well to begin to consider both the tools that can promote privacy protections when data is used and the societal merits of the use of Big Data.

-Julian Flamant

Swire Presents at Privacy Working Group

On April 26, FPF Senior Fellow and Ohio State Professor Peter Swire gave a presentation to the Privacy Working Group based on his forthcoming article in the University of North Carolina Law Review, entitled “Social Networks, Privacy, and Freedom of Association: Data Empowerment vs. Data Protection.” The presentation focused on Swire’s new research exploring how social networks, such as Facebook and LinkedIn, are platforms for association, and Swire explored how to establish a legal framework regarding social networks.

Swire examined some of the complex benefits and risks of social networks. Social networks can inadvertently reveal very personal information about individuals and increase privacy concerns. On the other hand, social networks can be used for political mobilization and allow people to freely voice their opinions, such as during the Arab Spring or the 2008 presidential campaign. Indeed, social media has become increasingly essential to associational activity; Swire pointed out that virtually all charities in the U.S. use some form of social media and political campaigns are increasingly relying on social media, too. Social networks, therefore, are an important way people can exercise their First Amendment rights to freedom of association and speech.

Privacy, Swire pointed out, can both help and hinder freedom of association and speech on social networks. Privacy can be essential to protect politically unpopular organizations from harassment. For example, in 1958 the Supreme Court found that the NAACP did not have to divulge its membership lists to the state of Alabama, as doing so would chill free speech. Conversely, privacy controls can make it very difficult for political organizations to find new members, for example by preventing them from easily contacting potential members.

Swire also explored how the concepts of data empowerment and data protection impact social networks. In the U.S., the debate frequently centers on how to use data in innovative ways (data empowerment). Technology can consequently be used to empower the creation of new associations, a right enshrined in the First Amendment. Meanwhile, in the E.U., the debate frequently focuses on how to protect privacy (data protection). Data protection implies that data is fundamentally risky, and that users’ rights frequently outweigh the utility of new uses for data. The tension between these two outlooks can lead to a conflict between rights: freedom to associate vs. privacy.

There are therefore a number of complex legal issues surrounding social networks that have yet to be resolved. Swire’s presentation and research are an important step in determining what this framework should look like. However, the details of this legal framework will be very difficult to work out; as Swire wryly pointed out, “this may turn out to be just as easy as campaign finance reform.”

-Steven Beale

Please find a link to the presentation here.