Purpose or Interest: that is the question!

We are pleased to present this guest post from Prof. Lokke Moerel, a leading EU privacy lawyer.  We think her blog and paper are fascinating and important contributions to the current discussion of key privacy topics, including big data, the Internet of Things, and EU data protection laws. 

Let us imagine a mobile phone application that traces your movements and phone calls in order to inform you whether you are likely to catch influenza. The app can even tell you which friends you should avoid in order to minimize your risk of catching the flu, even if those friends have not yet been affected by it themselves (this is not fiction; see the research of MIT Professor Alex Pentland). Would you install this application on your smartphone as soon as you had the chance? Now imagine the use of a similar app by the World Health Organization to protect public health during pandemics. Here are two applications that both collect and process personal data for the same purpose: the monitoring and personalized prediction of health and illness. Yet the sentiments that these two applications give rise to are likely very different.

When we pause to reflect on this, the conclusion is that it is not so much the purposes for which personal data might be used that matter most, but rather the interests that are served by the use of the data collected. And yet both the current and the upcoming EU data protection regime are based primarily on the purpose for which data are collected and processed, while the interests served play a much more subordinate role. This raises the question of whether such a legal regime can remain effective and legitimate as we move into a future in which society is driven by data.

In my paper with Prof. Corien Prins, “Privacy for the homo digitalis: Proposal for a new regulatory framework for data protection in the light of Big Data and the Internet of Things,” we analyze innovations in data processing brought about by developments such as big data and the Internet of Things and discuss why these developments undermine the effectiveness and legitimacy of both the current and the upcoming EU data protection regime, with a focus on the private sector. The paper includes a detailed analysis of key data processing principles of the European data protection regime (purpose limitation, informational self-determination and data quality) and argues that, due to social trends and technological developments, the principle of purpose limitation should be abandoned as a separate criterion. Likewise, other principles (such as consent and the performance of an agreement) should no longer be recognized as grounds that independently legitimize data processing. Instead, we propose a single test: whether there is a legitimate interest for the whole life cycle of personal data processing (collection, use, further use and destruction of data). We argue that such a test will provide a more effective and more legitimate data protection regime than the existing assessment, which is based primarily on the purposes for which data may be collected and further used. The test has been drafted in such a way that it enables companies to comply with the new requirements under the upcoming EU General Data Protection Regulation, which will become effective in 2018. We conclude our analysis with proposals to increase the effectiveness of enforcement of the data protection rules.

Read the paper summary

Read the full paper

June 28th Event: Ensuring Individual Privacy in a Data Driven World

Criteo and The Future of Privacy Forum are pleased to invite you to an exceptional conference gathering a high-level selection of regulators, lawyers, advertisers, publishers and politicians to discuss individual privacy in a data-driven world.

You will get valuable insights from Axelle Lemaire (French government), Jean-Baptiste Rudelle (Criteo), Jurgen Van Staden (NAI), Gwendal Le Grand (CNIL) and other speakers.

Save the date for Tuesday, June 28th from 09:00 to 18:00.

REGISTER HERE

The Benefits, Challenges, and Potential Roles for the Government in Fostering the Advancement of the Internet of Things

Yesterday, the Future of Privacy Forum filed comments with the National Telecommunications and Information Administration (NTIA) in response to NTIA’s inquiry into the Internet of Things (IoT).  NTIA asked policy experts and other stakeholders to identify key issues affecting deployment of the IoT – a broad category of devices, appliances, and objects that can be connected via the Internet.  The Internet of Things has been a focus of FPF’s work since our founding in 2008. FPF recognizes the enormous potential benefits to consumers and to society of the inter-connected applications offered through the Internet of Things.

FPF’s comments, “The Benefits, Challenges, and Potential Roles for the Government in Fostering the Advancement of the Internet of Things,” describe the privacy and security challenges presented by IoT technologies, as well as their potential benefits to consumers and to society.  FPF urges NTIA to promote the use of IoT data in ways that will benefit disadvantaged populations and promote inclusion.  Our comments highlight IoT technologies that offer direct, meaningful benefits for individuals who are elderly, infirm, visually impaired, deaf, living with chronic health conditions, suffering from mobility-related disabilities, or economically disadvantaged.  Today, IoT technologies are improving the day-to-day quality of life of traditionally underserved groups.

Emerging IoT technologies promise to broaden inclusiveness for traditionally underserved groups in the immediate future.  Common sense privacy protections can build trust in IoT technologies and help ensure that consumers enjoy the full benefits of IoT sensors and devices.

Read NTIA’s Request for Comment.

Read FPF’s Comments.

Enhancing Usability for Online Privacy Controls

Today, Google announced new features that provide users with additional customized options and controls over personal data, as well as easy-to-follow instructions and notifications that explain users’ choices in simple terms. The new features make privacy controls quicker to find and easier to understand and operate. For example, the changes make Google’s privacy controls more accessible via web search and voice commands; users are increasingly relying on search and voice to quickly get important information and operate mobile devices.

FPF is committed to advancing responsible data practices, including design techniques that create practical, usable tools that help consumers access and control personal data.  Part of using data responsibly means going beyond just posting privacy policies; it should also mean putting the same effort that goes into making a product user friendly into making privacy and data-related functions easy to find and understand.  When these attributes meet, consumers win with well-designed, user-friendly privacy settings and controls.

FPF has long talked about the need for companies to compete on privacy. We welcome developments like Google’s new features, which show that the market for privacy tools is growing. The biggest advances for consumers – privacy tools that consumers most want and use – are increasingly being driven by consumer demand and market competition.

The new Google features are additions to the company’s “My Account” – the hub Google created last year to give users a quick and easy way to safeguard their data and protect their privacy throughout their Google accounts. “My Account” puts these privacy and security tools in one place, making it easy for users to understand and select clear privacy settings from any one of their devices, and to control settings across all their devices.

Now, it is easier than ever to find these controls. From any device, signed-in Google users can simply search their own name and see a shortcut to the My Account hub. This interaction leverages the fact that people rarely remember where account settings live in different programs; they increasingly search for options rather than using menus.  Showing My Account atop Search – and linking directly to the options – promotes the functions that let people easily get to important information. Users already follow this process to check flights, track package deliveries, and review payment accounts, so why not make account and privacy information and options just as easy and responsive?

In addition, Google offers a voice option to get to My Account. Voice controls are an increasingly important interface for mobile functions – to launch apps and to access options buried within apps or sites – so it is great to see voice controls that take you directly to the privacy and security features of an account that is used across every device you have. According to Google, a user can simply say, “Ok Google, show me my Google account,” and it takes the user there.

Finally, “find your phone” is a new feature that will help locate a phone that has been lost or stolen. Phones hold some of our most sensitive data: personal texts, family photos, work emails, financial information, and more. Millions of phones are lost or stolen every year. When users first realize their phone is missing, it’s easy to panic and not always easy to know what to do next. Now, a user can locate and lock their phone, as well as secure their account and leave a callback number on the screen.

These updates are a major step forward for practical, usable design in the privacy field. It is encouraging to see leading design principles applied to privacy controls that are available to more than 1 billion users.

W&L Law Offers DC-based Cyber Law and Privacy Seminar with Future of Privacy Forum

Washington and Lee University School of Law has launched a new summer program in Washington, DC for students interested in studying cyber and privacy law.

The program is part of W&L’s exclusive academic partnership launched last year with the Future of Privacy Forum (FPF), a DC-based think tank that promotes responsible data privacy policies. The FPF was founded by W&L alumnus Christopher Wolf ‘80L, senior partner and former director of the Information Privacy Practice Group of Hogan Lovells.

The course, titled “Cyber Policy and Privacy Law,” will be co-taught by Professor Margaret Hu and Jules Polonetsky, CEO at FPF. The course will examine how the expanding role of the internet, big data, e-commerce, social media, and wearable technology has strained the preexisting regulatory and constitutional frameworks that have guided privacy protections under the law. The seminar will delve into these topics in both corporate and government contexts.

Read the full story.

June 14th Event: A Roundtable on Ethics, Privacy, and Research Reviews

Please join the Future of Privacy Forum (FPF) and the Ohio State University’s Program on Data and Governance in Washington, DC, on Tuesday, June 14, 2016, for a discussion of ethics, privacy and practical research reviews in corporate settings.

This timely event, which follows the White House’s call to develop strong data ethics frameworks, will convene corporate and academic leaders to discuss how to integrate ethical and privacy considerations into innovative data projects and research.  The roundtable is an important extension of FPF’s December 2015 workshop, “Beyond IRBs: Designing Ethical Review Processes for Big Data Research,” supported by the National Science Foundation and Alfred P. Sloan Foundation.

A major focus of the roundtable will be on new papers by Facebook’s Molly Jackman and Lauri Kanerva entitled, “Evolving the IRB: Building Robust Review for Industry Research” and by CDT’s Michelle De Mooy and Fitbit’s Shelten Yuen entitled “Toward Privacy Aware Research and Development in Wearable Health.”

The event is free and open to the public, but please register as space is limited.

We look forward to seeing you there.

REGISTER HERE

Agenda


9:30-9:45 am.  Welcome

9:45-10:15 am.  Paper: Evolving the IRB: Building Robust Review for Industry Research

10:15-10:45 am.  Paper: Toward Privacy Aware Research and Development in Wearable Health

10:45-11:00 am.  Coffee Break

11:00-12:00 pm.  Panel: Ethical Reviews and Research Data Governance in Corporate Settings

12:00-12:30 pm.  Closing Discussion


WHEN


Tuesday, June 14, 2016 from 9:30 am to 12:30 pm (EDT)

WHERE


Future of Privacy Forum – 1400 I Street Northwest, Suite 450, Washington, DC 20005


PAPERS

Read Facebook’s paper

Read Facebook’s blog

Read Jules Polonetsky’s and Dennis Hirsch’s joint op-ed in Re/code

Read Beyond IRBs: Designing Ethical Review Processes for Big Data

Student data privacy: Moving from fear to responsible use

Data has always been an inherent part of the educational process – a child’s age, correlated with her grade level, tracked to specific reading or math skills that align with that grade, measured by grades and tests that rank her against her peers. Today this data is ever more critical. Education professionals seek to understand what the data reflect about the teacher’s role and influence, evaluating student outcomes across classrooms. Parents seek similar measures on individual K-12 schools and districts, and desperately seek insight into the value of education at individual colleges and universities to justify the cost or debt incurred.

In the last two years, there has been a perfect storm on the topic of student data privacy. The role of technology within schools expanded at an unprecedented rate, general awareness of consumer data security and breaches increased, and student databases at the state or national level were established or proposed, drawing great public scrutiny and fear. This maelstrom yielded a tremendous output of legislative activity targeted at education technology companies that was overwhelmingly focused on protecting and limiting the sharing and use of student data – in rare instances, to the point of forbidding research uses almost completely. There are signs that this wave of fear-driven response has finally crested and that more measured conversations are occurring: conversations that prioritize the fundamental requirement for appropriate privacy and security, but with a clear focus on the invaluable role of research and analysis and the need to enable it.

Read the full piece on Brookings

 

Future of Privacy Forum and ConnectSafely Release Educator's Guide to Student Data Privacy

FOR IMMEDIATE RELEASE                     

May 23, 2016

Contact: Melanie Bates, Director of Communications, [email protected]

Contact: Hani Khan, Outreach Manager, [email protected]

FUTURE OF PRIVACY FORUM AND CONNECTSAFELY RELEASE
EDUCATOR’S GUIDE TO STUDENT DATA PRIVACY

Washington, DC –  Today, the Future of Privacy Forum (FPF) and ConnectSafely are releasing the Educator’s Guide to Student Data Privacy. Technology tools and applications are changing the way schools and teachers educate students across the country. New resources are making it easier for teachers and students to communicate in and outside of the classroom, making learning a 24/7 activity. When schools use technology, a student’s personal information is often collected and shared for the purpose of furthering their education. The Educator’s Guide will help teachers utilize technology in the classroom responsibly and protect their students’ privacy.

The Educator’s Guide follows the success of the Parent’s Guide to Student Data Privacy, which was jointly developed by ConnectSafely and FPF to help parents understand the laws and regulations in place to protect their child’s personally identifiable information. The Educator’s Guide will help teachers understand their role in protecting students’ information and ensuring students use classroom technology in a responsible way.

“Schools across the country are rapidly integrating new technologies meant to deliver a quality education to students in a more effective and efficient manner. The Educator’s Guide will be a vital new resource for teachers when helping students use educational applications and technology in a safe way that protects their data privacy,” said Kobie Pruitt, Education Policy Manager, FPF.

“As a teacher who knows first-hand how technology tools have helped my students feel empowered to find the resources they need, collaborate with one another and with me, and create professional-quality work, I know we are in the midst of a transformational time for education. Teachers want to both inspire and protect their students, and this guide will help them feel confident they are doing both when using technology,” said Kerry Gallagher, Director of K-12 Education for ConnectSafely.

FPF and ConnectSafely are proud to help educators implement new technologies responsibly in order to promote successful student outcomes. To view and download the Educator’s Guide to Student Data Privacy and many other online privacy and safety resources, please visit FERPA|SHERPA and ConnectSafely.

###
The Future of Privacy Forum (FPF) is a Washington, DC based think tank that seeks to advance responsible data practices. Learn more about FPF.
ConnectSafely.org is a Silicon Valley, Calif.-based nonprofit organization dedicated to educating users of connected technology about safety, privacy and security with research-based safety tips, parents’ guidebooks, advice, news and commentary on all aspects of tech use and policy.

Comprehensive Online Tracking is Not Unique to ISPs

Last week, the Senate Judiciary Committee (Subcommittee on Privacy, Technology, and the Law) held a hearing to explore the FCC’s proposed privacy rules regulating Broadband Internet Access Service providers (a subset of Internet Service Providers, or ISPs).

The discussion among top privacy regulators returned at several points to the value of consistency of privacy protection across the ecosystem. The proposed FCC rules would restrict ISPs from using customer proprietary information—defined broadly to include things like IP addresses, unique identifiers, and other personal information—for any purpose outside of providing their own services and marketing their own (or affiliates’) communications-related services, at least without seeking the customer’s affirmative Opt In consent. The justification given for these rules is that ISPs are uniquely “in a position to develop highly detailed and comprehensive profiles of their customers.” See para. 4, Notice of Proposed Rulemaking.

Near the end of the discussion, the FCC Chairman stated:

“When I go to Google [. . .] that is a decision that I am making. [. . .] I go to WebMD, and WebMD collects information on me. I go to Weather.com and Weather.com collects information on me. I go to Facebook and Facebook collects information on me. But only one entity connects all of that information, that I’m going to all those different sites, and can turn around and monetize it.” – Chairman Wheeler (approx. 1h30m)

This framing of the issue reflects a fundamental misunderstanding of the current online advertising ecosystem, which is fully capable of tracking individual behavior across the Internet as well as between devices.

Mozilla’s Lightbeam extension for Firefox quickly shows how interwoven and comprehensive the third-party tracking industry is. After installing the Firefox browser and visiting only one website (WebMD.com), I had connected with 24 third-party sites (see Fig. 1, below).


Fig. 1. Mozilla’s “Lightbeam for Firefox” extension demonstrates that my visit to a single website generated 24 third party connections. Circular nodes represent websites visited, and triangular nodes are third party sites. Purple lines identify when a site has stored data on the browser (cookies).

After visiting three additional sites—for a grand total of four websites—I had connected with 119 third-party entities (see Fig. 2, below). Each “connection” means that the entity can identify the web page the consumer is visiting and can share that data with other parties to which it is interconnected. Some third parties are linked to many websites. And even entities that are not directly connected to a particular site can buy and sell this data through third-party data exchanges. These exchanges, by linking and compiling data from hundreds of different online and offline sources, can “match up” consumer behavior across the Internet, creating comprehensive and detailed individual profiles.


Fig. 2. Mozilla “Lightbeam for Firefox” display after visiting only four websites. Circular nodes represent websites visited, and triangular nodes are third party sites. Purple lines identify when a site has stored data on the browser (cookies).
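You do not need a dedicated extension to reproduce this kind of count. The same third-party connections can be tallied from a HAR file exported from the browser’s developer tools (Network panel). The short Python sketch below is illustrative only; the file name webmd.har and the choice of first-party domain are assumptions for the example, not part of Lightbeam.

```python
# Illustrative sketch: count distinct third-party hosts contacted while
# loading a page, using a HAR file exported from the browser's Network panel.
# "webmd.har" and the first-party domain below are assumptions for the example.
import json
from urllib.parse import urlparse

FIRST_PARTY = "webmd.com"  # the site deliberately visited

with open("webmd.har", encoding="utf-8") as f:
    har = json.load(f)

third_party_hosts = set()
for entry in har["log"]["entries"]:
    host = urlparse(entry["request"]["url"]).hostname or ""
    # Anything that is not the first-party domain (or one of its
    # subdomains) counts as a third-party connection.
    if host != FIRST_PARTY and not host.endswith("." + FIRST_PARTY):
        third_party_hosts.add(host)

print(f"{len(third_party_hosts)} third-party hosts contacted:")
for host in sorted(third_party_hosts):
    print(" ", host)
```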

The third-party advertising networks and data partners visualized above use a variety of methods designed to create comprehensive profiles of a user’s entire web browsing history. These methods include persistent identifiers (cookies), IP addresses, device identifiers, direct authentication (such as email addresses), and probabilistic methods (such as browser fingerprinting). For a more extensive explanation of these tracking methods, see our 2015 report on Cross-Device Tracking. Furthermore, this information can be combined with offline data (appended data), such as a user’s in-store purchase history, for an even more comprehensive consumer profile.
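To make the probabilistic methods concrete, here is a minimal sketch of the idea behind browser fingerprinting: several individually unremarkable browser attributes are combined and hashed into a quasi-stable identifier that can recur across sites without any cookie being set. The attributes and values below are hypothetical; real fingerprinting scripts draw on many more signals (canvas rendering, installed fonts, and so on).

```python
# Minimal sketch of probabilistic browser fingerprinting: combining several
# weak signals into one quasi-stable identifier. Attribute values are
# hypothetical examples, not data from any real tracker.
import hashlib

def fingerprint(attributes: dict) -> str:
    # Serialize attributes in a fixed order so the same browser
    # configuration always yields the same hash.
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

browser = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
    "screen": "1920x1080x24",
    "timezone": "America/New_York",
    "language": "en-US",
    "plugins": "pdf-viewer,widevine",
}

# The same inputs produce the same identifier on every site that runs the
# script, which is what makes cross-site correlation possible.
print(fingerprint(browser))
```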

Many of the leading online platforms also correlate data across websites. For example, many websites (including WebMD, seen above) carry social media plug-ins that allow those social media platforms to compile browsing histories of individuals across the Internet and link that browsing activity to the same user’s social media behavior. When a consumer browsing the web visits a site carrying a Twitter button, that visit is reported to Twitter, which helps Twitter serve ads on its own platform.

Mobile apps often collect even more granular information, such as the user’s in-app behavior and other mobile data like calendar entries or contacts. Access to some mobile data (such as Location Services) requires the user’s Opt In permission, but access to other mobile information (such as nearby Wi-Fi networks, from which location can be inferred) sometimes does not.  Some leading apps serve ads based on knowing what other apps are installed on a user’s device. WebMD, for example, in addition to tracking and sharing the data visualized above, has a mobile app that enables it to track users across desktop and mobile platforms.

This data collection usually occurs without directly sharing explicitly personal information—rather, for security and privacy reasons, industry players typically match up individual behavior using “hashed” identifiers. And many, including Commissioner Ajit Pai, have pointed out that online tracking has generated benefits for consumers, including the availability of free and reduced-cost online content subsidized by online advertising that can be made more efficient and relevant through information about online audiences.
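As a rough illustration of how such hashed identifiers work, consider a one-way hash of a normalized email address: two parties that each hold the address can derive the same opaque key and match records for the same user without exchanging the address in the clear. This is a minimal sketch of the general idea; real match systems differ in their normalization rules and hash choices.

```python
# Minimal sketch of a hashed match key: two parties holding the same email
# address derive the same opaque identifier without sharing the address itself.
# The normalization and hash choice here are illustrative assumptions.
import hashlib

def match_key(email: str) -> str:
    normalized = email.strip().lower()  # a common normalization step
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Both parties compute the same key independently:
print(match_key("Jane.Doe@example.com"))
print(match_key("  jane.doe@EXAMPLE.com "))  # same key after normalization
```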

There have been many industry efforts in recent years to self-regulate the market in order to alleviate these privacy concerns and build consumer trust. For example, the National Advertising Initiative and Digital Advertising Alliance have enforceable codes and guidelines covering the uses of online data collected and used for online behavioral advertising (interest-based advertising). The Wireless Association (CTIA) has issued high-level voluntary guidelines around mobile data, especially geo-location. And the Future of Privacy Forum (FPF) has a Location & Ad Practices Working Group that is developing best practices and consumer awareness information around online advertising and another group that has developed best practices for data from wearables and wellness apps.

Despite these industry efforts, the comprehensive and pervasive nature of online tracking remains a subject of debate among advocates and consumers. But one thing is certain: it is not unique to ISPs.

Scientists Are Just as Confused About the Ethics of Big-Data Research as You

And the patchwork of review boards responsible for overseeing those risks is only slowly inching into the 21st century. Under the Common Rule in the US, federally funded research has to go through ethical review. Rather than one unified system though, every single university has its own institutional review board, or IRB. Most IRB members are researchers at the university, most often in the biomedical sciences. Few are professional ethicists.

Even fewer have computer science or security expertise, which may be necessary to protect participants in this new kind of research. “The IRB may make very different decisions based on who is on the board, what university it is, and what they’re feeling that day,” says Kelsey Finch, policy counsel at the Future of Privacy Forum. There are hundreds of these IRBs in the US—and they’re grappling with research ethics in the digital age largely on their own.

Read the full article in Wired.