De-Identification: Practice and Policy, April 13 in San Francisco

The Future of Privacy Forum, EY, and Privacy Analytics are hosting an event to share and advance practices and policies around de-identification. This all-day forum will include panel discussions on topics such as emerging policy questions, de-identification case studies, implementation and best practices, and the role of controls. We encourage audience participation and knowledge sharing.

Wednesday, April 13, 2016 from 1:00 PM to 6:00 PM (PDT)

San Francisco, California

The program will include:

Panel 1: Law, Self-Regulation, and Standards

This panel discusses the regulatory, operational, and technical frameworks guiding de-identification practices today, including perspectives on the evolving definition of personally identifiable information, the development of self-regulatory mechanisms, and international standard-setting efforts.

Panel 2: Sector-Specific Case Studies

This panel discusses practical considerations and critical issues when implementing de-identification practices, including real-world examples from a diverse range of industries.

Panel 3: The Role of Controls

This panel discusses managing de-identification in the context of a comprehensive privacy and security program, including balancing technical and administrative controls, weighing the benefits and risks of data use, and evaluating safeguards for sharing data.

Closing and Group Discussion led by FPF, EY, and Privacy Analytics

* * *

Following the event, please join us for a reception sponsored by Privacy Analytics at Roy’s Restaurant, located across the street from the EY office at 575 Mission Street.

To register, please click HERE. This event is free, but space is limited. Contact Kelsey Finch at [email protected] with any questions.


Student Privacy Pledge Hits 250 with Launch of New Site!

The Student Privacy Pledge, a public commitment by education technology companies to the responsible handling of student data, has reached the milestone of 250 signatories. We are also pleased to announce the launch of the newly redesigned Student Privacy Pledge website. The site, studentprivacypledge.org, now provides more information, including a Frequently Asked Questions section, and makes it easier for visitors to navigate, find signatory companies, and inquire about signing the Pledge.

The K-12 Student Privacy Pledge was introduced by the Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) in October 2014 with 14 original signatories and took effect in January 2015 as a legally enforceable agreement for companies that provide services to schools. The twelve specific commitments in the Pledge detail ongoing industry practices that both meet the demands of families and schools and track key federal and state laws. By signing the Pledge, school service providers clearly articulate to schools and parents their adherence to these practices regarding the collection, use, maintenance, and retention of student data.

As we enter 2016, we have seen a rapid increase in inquiries and companies taking the Pledge, which continues to provide accountability for signatory school service providers. The result is a bolstering of the public trust necessary for continued technology access for school operations and student learning – technology that is critical to the nation’s continued educational and economic competitiveness.

The Pledge adds to the existing framework of student data protections, which also includes laws, contracts, and company privacy policies. A company’s security and other commitments made under the Student Privacy Pledge are legally enforceable under Section 5 of the Federal Trade Commission Act.

“The sustained support and interest in the student privacy pledge demonstrate the commitment education service providers have to protecting student information. As many states have passed legislation on this issue, the strength of the Pledge and its commitments show the providers’ awareness of community expectations in addition to their legal responsibilities.”

– Mark MacCarthy, Senior Vice President, Public Policy, Software & Information Industry Association (SIIA)

FPF and SIIA are proud to facilitate the efforts of education technology companies to lead in the responsible use of student data by signing the Student Privacy Pledge. We look forward to a continuing increase in the number of companies joining this effort and agreeing to be held publicly accountable to the safeguards embodied in the Pledge.

Read the full text of the Student Privacy Pledge here.

19 Times Data Analysis Empowered Students and Schools

Which Students Succeed and Why?


“Thoughtful use of education data has tremendous potential to improve and address inequities in America’s education system. Scientists better understand how the brain incorporates new information and skills. Educators have a more accurate sense of student progress and potential risk for dropping out. Students and teachers use more detailed information about their strengths, weaknesses, and individual academic performance to diagnose and address learning gaps. Schools can correlate patterns with failing or dropping out, and intervene early with at-risk students. Districts and schools can use data to allocate resources and create institutional reform to better meet student needs in a world where students take increasingly personalized or non-traditional paths to graduation.”

Thus begins FPF’s newest paper, by Elana Zeide, which goes on to demonstrate the power of data to show schools, districts, parents, and students the trends and outcomes that are occurring, and to inspire ways to make those outcomes better.

Student data, as part of the education record from each student’s school experience, is most importantly a tool for that student, reflecting their achievements and informing their future decisions. In addition, however, data aggregated across students and over time enables teachers, administrators, districts, and states to identify trends, show patterns, and evaluate the success of educational changes to ensure that new programs or services achieve the desired results.

This paper identifies 19 studies – a relatively small sample – where data was successfully used to evaluate a program, create a new strategy, or delve into equity and bias issues. The appropriate protection and responsible use of student data in such studies is a fundamental value. But the power of data to shed light on current student and educational system outcomes and improve the opportunity for individual success is overwhelming.

New data analysis techniques provide the opportunities to understand and transform learning theory and practice. As Ms. Zeide concludes: “Properly used, mindfully implemented, and with appropriate privacy protections, student data is a tremendous resource to help schools fulfill the great promise of providing quality education for all.”

Read the full paper here.

Broadband Privacy and the FCC: Protect Consumers from Being Deceived and from Unfair Practices


Left to right: Jon Leibowitz, Davis Polk & Wardwell LLP, former Chairman of the Federal Trade Commission; Professor Peter Swire, Huang Professor of Law and Ethics, Scheller College of Business, Georgia Institute of Technology; Katharina Kopp, Ph.D., Director of Privacy and Data Project, Center for Democracy & Technology; Debra Berlyn, President, Consumer Policy Solutions; and Jules Polonetsky, CEO, Future of Privacy Forum.

Yesterday, the Future of Privacy Forum hosted an event to discuss the direction the FCC could take to best advance consumer protections as it considers how to regulate broadband providers’ use of consumer data. The key question was whether the FCC should adopt the “no deception or unfairness” model used successfully for decades by the FTC and many State Attorneys General. Even local consumer regulators use this model: as Consumer Affairs Commissioner of New York City under Mayor Giuliani, I enforced NYC’s “mini FTC act” to protect consumers. Or, as some have argued, should the FCC come up with its own privacy regime of rules specific to ISPs?

Some important concepts we examined at the event included the following:

Broad Privacy Regimes or a Sectoral Approach?

Privacy and consumer advocates have long criticized the US sectoral approach to privacy, arguing that it is confusing and less effective than a broad, general set of privacy rules for all data. The Obama administration embraced this view when it proposed a broad-based Consumer Privacy Bill of Rights, which would have promised protection for any online personal information collected about consumers. Globally, the trend toward comprehensive privacy regulation that started in Europe has spread throughout almost all of the Western world. Katharina Kopp of the Center for Democracy & Technology noted that CDT’s long-term goal was comprehensive privacy legislation across all sectors.

Does adding another area of sector-specific privacy legislation take a step backwards and make achieving broad privacy legislation less likely? Likely so, in my view, and even more so if the path the FCC takes is out of sync with the broad, general approach applicable across the rest of the economy.

Is the FTC an Effective Enforcer of Online Privacy?

The FTC has been an aggressive actor, using its broad Section 5 authority to bring numerous actions against companies of every shape and size. Tech giants Google, Microsoft, and Facebook are all subject to 20-year consent decrees following FTC enforcement actions. The FTC has been able to bring actions in cases of consumer harm or deception even when the harm or deception has been fairly conceptual, as in the Nomi case, where the company failed to provide an opt-out that it wasn’t required to provide. Despite the near certainty that no consumer entering a store had ever heard of Nomi or read its policy, the FTC took action based on its very broad view of its Section 5 authority. Former FTC Chairman Jon Leibowitz discussed the important lead role the FTC has played in successfully policing online practices using its deception and unfairness authority.

Are ISPs unique in the types or amount of data they collect?

One important consideration for regulators is the rapid pace of change in technology and the uses of data. A decade ago, the leaders in the world of ad tracking and targeting were the companies that had access to the most data. Today, data has been democratized: it is available to any vendor with a credit card. BlueKai, the key data provider in Oracle’s new data division, offers more than 80 comprehensive sources of data to its customers. Every online player, large or small, has access to detailed data about every American consumer.

Professor Peter Swire has published a new paper that provides an incredibly detailed and extensive review of the types of data collected by ISPs. Swire shows that the conventional wisdom assuming ISPs can access every bit of a consumer’s online activity is off base, as a number of factors limit the visibility ISPs have. Swire also shows that much of the data used by tracking and targeting companies is widely available via social networks, search engines, ad networks, app stores, and other companies that collect cross-device and cross-context data about consumers.

Will Consumers See Any Difference if the FCC Takes a Restrictive Approach?

Today, any company with a budget can bid for data at advertising and data exchanges or can license “data as a service” from a wide range of providers. Restrictive FCC rules could keep ISPs out of the ad tech business, but consumers will see no change in their online experience: the ads they see will still be targeted based on data from the plethora of companies they interact with online.

Is the FTC deception and unfairness standard a license for ISPs to take wide liberties with consumer data?

The FTC deception and unfairness standards can be quite strict. They take into account context, the sensitivity of data, the risk of harm, and a wide range of other factors. But the standard is flexible, allowing the FTC to demand higher standards when appropriate and to permit more practical uses of data when appropriate.

How can the FCC promulgate consumer friendly rules here that help simplify the intertwined and complex ad tech environment?

I recently came across the announcement and agenda for the First Annual Privacy and Data Protection Summit in May 2001. The event was presented by the 50-member-strong Privacy Officers Association, the predecessor of today’s 15,000-member International Association of Privacy Professionals. The small group of us who gathered debated the best ways to provide consumer protection at a time when internet business models were still developing.

Speaking at the event, I explained to the audience how easy it was to decline web tracking and ad targeting. Just use your browser’s cookie settings! Block all cookies, block just third-party cookies, or clear your cookies, and ad networks would no longer recognize your browser. Consumer controls were fairly basic and effective.

How things have changed.

Today, meaningful control for consumers has become incredibly complex. Cookie controls are increasingly meaningless, because companies that fingerprint consumer devices can track without cookies. Central ad industry opt-outs are effective for declining ads targeted based on web surfing, but they allow continued tracking, as well as targeting based on appended data. Apps don’t use cookies for tracking, so users who want to use the industry opt-out program need to download a special app to opt out of app-related ad targeting. Or consumers can use the “Limit Ad Tracking” settings that iOS and Android provide, but not every ad network cooperates. And the Do Not Track option offered by web browsers? Only about a dozen or so companies respect that setting. If you live in California, online companies need to tell you whether they respect the Do Not Track setting, unless they cooperate with the central industry opt-out program, in which case they do not need to tell you.

Has your head exploded yet? No? Then let’s keep going.

If you don’t want your home Wi-Fi network linked to your home location, add the letters “_NOMAP” to the end of your home router’s network name. Google and Mozilla will then opt you out of their location services databases. But for Skyhook, Microsoft, and many others, you must find your home router’s MAC address and submit it at each of the opt-out pages provided by those companies.

Today, ISPs are part of the equation, as they have entered the advertising technology market. Digital signage increasingly includes tracking capabilities, as do in-store Wi-Fi networks and more.

I could go on, but it should be abundantly clear that today’s online tracking and targeting options are likely only understood by a handful of experts who work at the intersection of ad tech and privacy.

Today, most of this ad targeting activity is subject to FTC jurisdiction, no matter the source of the data, so an unhappy consumer can complain to that agency regardless of the technology involved. But the FCC is extending its privacy rules to ISPs, which would mean that consumers will need to turn to that agency if an ad was targeted based on tracking or targeting enabled by an ISP. Since ad targeting involves multiple actors, the FTC and FCC will need to cooperate on ad tech investigations, but each regulator will have a different standard for the same activity if the FCC comes up with its own regime.

The FCC proposal has been released as I write this post, and it seems to take a more restrictive regulatory approach, although it invites comments on other, more consumer-friendly paths to accomplish its consumer protection goal. I hope the FCC will take the time to understand the complex ad tech ecosystem and will consider the strong but flexible deception and unfairness rules that could provide its enforcement staff with tools that have stood the test of time.

Jules Polonetsky is CEO of the Future of Privacy Forum. He is a former Chief Privacy Officer of AOL and DoubleClick, and was the Consumer Affairs Commissioner of New York City under Mayor Giuliani.


March 10th Event: A Path Forward for Broadband Privacy

As the FCC begins to consider its role in regulating broadband provider data practices, how should it proceed? What are the practices that are at issue? What aspects of the current online ecosystem can the FCC impact? Please join us for a panel discussion to explore the issues.

Opening Comments:

Jon Leibowitz

Davis Polk & Wardwell LLP

Former Chairman of the Federal Trade Commission

Panel:

Professor Peter Swire

Huang Professor of Law and Ethics

Scheller College of Business

Georgia Institute of Technology

Senior Counsel

Alston & Bird LLP

Working Paper: Online Privacy and ISPs by Peter Swire, Justin Hemmings, and Alana Kirkland

Debra Berlyn

President

Consumer Policy Solutions

Jim Halpert

Partner

DLA Piper

Katharina Kopp, Ph.D.

Director of Privacy and Data Project

Center for Democracy & Technology

Jules Polonetsky

CEO

Future of Privacy Forum

Additional background reading:

Effective Regulators, Effective Privacy Choices

A light breakfast will be served.
When: Thursday, March 10, 2016, 8:45 AM to 10:30 AM (EST)
Where: 1400 Eye Street Northwest – Suite 450, Washington, DC 20005

RSVP Here