De-Identification: Practice and Policy, April 13 in San Francisco

The Future of Privacy Forum, EY, and Privacy Analytics are hosting an event to share and advance practices and policies around de-identification. This all-day forum will include panel discussions on topics such as emerging policy questions, de-identification case studies, implementation and best practices, and the role of controls. We encourage audience participation and knowledge sharing.

Wednesday, April 13, 2016 from 1:00 PM to 6:00 PM (PDT)

San Francisco, California

The program will include:

Panel 1: Law, Self-Regulation, and Standards

This panel discusses the regulatory, operational, and technical frameworks guiding de-identification practices today, including perspectives on the evolving definition of personally identifiable information, the development of self-regulatory mechanisms, and international standard-setting efforts.

Panel 2: Sector-Specific Case Studies

This panel discusses practical considerations and critical issues when implementing de-identification practices, including real-world examples from a diverse range of industries.

Panel 3: The Role of Controls

This panel discusses managing de-identification in the context of a comprehensive privacy and security program, including balancing technical and administrative controls, weighing the benefits and risks of data use, and evaluating safeguards for sharing data.

Closing and Group Discussion led by FPF, EY, and Privacy Analytics

* * *

Following the event, please join us for a reception sponsored by Privacy Analytics at Roy’s Restaurant, located across the street from the EY office at 575 Mission Street.

To register, please click HERE. This event is free, but space is limited. Contact Kelsey Finch at [email protected] with any questions.

 

The program is co-hosted by the Future of Privacy Forum, EY, and Privacy Analytics.

Student Privacy Pledge – Hits 250 with Launch of New Site!

The Student Privacy Pledge, a public commitment by education technology companies to the responsible handling of student data, has reached the milestone of 250 signatories. We are also pleased to announce the launch of the newly redesigned Student Privacy Pledge website. The site, studentprivacypledge.org, now provides more information, including a Frequently Asked Questions section, and makes it easier for visitors to navigate, find signatory companies, and inquire about signing the Pledge.

The K-12 Student Privacy Pledge was introduced by the Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) in October 2014 with 14 original signatories and took effect in January 2015 as a legally enforceable agreement for companies that provide services to schools. The twelve specific commitments in the Pledge detail ongoing industry practices that both meet the demands of families and schools and track key federal and state laws. By signing the Pledge, school service providers clearly articulate their adherence to these practices to schools and parents regarding the collection, use, maintenance, and retention of student data.

As we enter 2016, we have seen a rapid increase in inquiries and companies taking the Pledge, which continues to provide accountability for signatory school service providers. The result is a bolstering of the public trust necessary for continued technology access for school operations and student learning – technology that is critical to the nation’s continued educational and economic competitiveness.

The Pledge adds to an existing framework of student data protections, which also includes existing laws, contracts, and company privacy policies. A company’s security and other commitments made under the Student Privacy Pledge are legally enforceable under Section 5 of the Federal Trade Commission Act.

“The sustained support and interest in the student privacy pledge demonstrate the commitment education service providers have to protecting student information. As many states have passed legislation on this issue, the strength of the Pledge and its commitments show the providers’ awareness of community expectations in addition to their legal responsibilities.”

– Mark MacCarthy, Senior Vice President, Public Policy, Software & Information Industry Association (SIIA)

FPF and SIIA are proud to facilitate the efforts of education technology companies to lead in the responsible use of student data by signing the Student Privacy Pledge. We look forward to a continuing increase in the number of companies joining this effort and agreeing to be held publicly accountable to the safeguards embodied in the Pledge.

Read the full text of the Student Privacy Pledge here.

19 Times Data Analysis Empowered Students and Schools

Which Students Succeed and Why?

 

“Thoughtful use of education data has tremendous potential to improve and address inequities in America’s education system. Scientists better understand how the brain incorporates new information and skills. Educators have a more accurate sense of student progress and potential risk for dropping out. Students and teachers use more detailed information about their strengths, weaknesses, and individual academic performance to diagnose and address learning gaps. Schools can correlate patterns with failing or dropping out, and intervene early with at-risk students. Districts and schools can use data to allocate resources and create institutional reform to better meet student needs in a world where students take increasingly personalized or non-traditional paths to graduation.”

Thus begins FPF’s newest paper, by Elana Zeide, which goes on to demonstrate the power of data to show schools, districts, parents, and students the trends and outcomes that are occurring, and to inspire ways to make those outcomes better.

Student data, as part of the education record from each student’s school experience, is first and foremost a tool for that student to reflect on their achievements and inform their future decisions. In addition, however, data across students and over time enables insights for teachers, administrators, districts, and states to identify trends, show patterns, and evaluate the success of educational changes, ensuring that new programs or services achieve the desired results.

This paper identifies 19 studies – a relatively small sample – where data was successfully used to evaluate a program, create a new strategy, or delve into equity and bias issues. The appropriate protection and responsible use of student data in such studies is a fundamental value. But the power of data to shed light on current student and educational system outcomes and improve the opportunity for individual success is overwhelming.

New data analysis techniques provide the opportunities to understand and transform learning theory and practice. As Ms. Zeide concludes: “Properly used, mindfully implemented, and with appropriate privacy protections, student data is a tremendous resource to help schools fulfill the great promise of providing quality education for all.”

Read the full paper here.

Broadband Privacy and the FCC: Protect Consumers from Being Deceived and from Unfair Practices


Left to right: Jon Leibowitz, Davis Polk & Wardwell LLP, Former Chairman of the Federal Trade Commission; Professor Peter Swire, Huang Professor of Law and Ethics, Scheller College of Business, Georgia Institute of Technology; Katharina Kopp, Ph.D., Director of Privacy and Data Project, Center for Democracy & Technology; Debra Berlyn, President, Consumer Policy Solutions; and Jules Polonetsky, CEO, Future of Privacy Forum.

Yesterday, the Future of Privacy Forum hosted an event to discuss the direction the FCC could take to best advance consumer protections as it considers how to regulate broadband providers’ use of consumer data. The key question was whether the FCC should adopt the “no deception or unfairness” model successfully used by the FTC and many State Attorneys General for many decades. Even local consumer regulators use this model — as Consumer Affairs Commissioner of New York City under Mayor Giuliani, I enforced NYC’s “mini FTC act” to protect consumers. Or, as some have argued, should the FCC come up with its own privacy regime of rules specific to ISPs?

Some important concepts we examined at the event included the following:

Broad Privacy Regimes or a Sectoral Approach?

Privacy and consumer advocates have long criticized the US sectoral approach to privacy, arguing that it is confusing and less effective than a broad, general set of privacy rules for all data. The Obama administration embraced this view when it proposed a broad-based Consumer Privacy Bill of Rights, which would have promised protection for any online personal information collected about consumers. Globally, the trend toward comprehensive privacy regulation that started in Europe has spread throughout almost all of the Western world. Katharina Kopp of the Center for Democracy & Technology noted that CDT’s long-term goal is comprehensive privacy legislation across all sectors.

Does adding an additional area of sector-specific privacy legislation take a step backwards and make achieving broad privacy legislation less likely? Likely so, in my view, but even more likely so if the path the FCC takes is out of sync with the general broad approach that is applicable across the rest of the economy.

Is the FTC an Effective Enforcer of Online Privacy?

The FTC has been an aggressive actor, using its broad Section 5 authority to bring numerous actions against companies of every shape and size. Tech giants Google, Microsoft, and Facebook are all subject to 20-year consent decrees following FTC enforcement actions. The FTC has been able to bring actions in cases of consumer harm or deception even when the harm or deception has been fairly conceptual, as in the Nomi case, where the company failed to provide an opt-out that it wasn’t required to provide. Despite the near certainty that no consumer entering a store had ever heard of Nomi or read its policy, the FTC took action based on its very broad view of its Section 5 authority. Former FTC Chairman Jon Leibowitz discussed the important lead role the FTC has played in successfully policing online practices using its deception and unfairness authority.

Are ISPs unique in the types or amount of data they collect?

One important consideration for regulators is the rapid pace of change in technology and the uses of data. A decade ago, the leaders in the world of ad tracking and targeting were the companies that had access to the most data. Today, data has been democratized: it is available to any vendor with a credit card. BlueKai, the key data provider in Oracle’s new data division, offers more than 80 comprehensive sources of data to its customers. Every online player, large or small, has access to detailed data about every American consumer.

Professor Peter Swire has published a new paper which provides an incredibly detailed and extensive review of the types of data collected by ISPs. Swire shows that the conventional wisdom assuming that ISPs can access every bit of a consumer’s online activity is off base, as a number of factors limit the visibility ISPs have. Swire also shows that much of the data used by tracking and targeting companies is widely available via social networks, search engines, ad networks, app stores, and other companies that collect cross-device and cross-context data about consumers.

Will Consumers See Any Difference if the FCC Takes a Restrictive Approach?

Today, any company with a budget can bid for data at advertising and data exchanges or can license “data as a service” from a wide number of providers. Restrictive FCC rules could keep ISPs out of the ad tech business, but consumers will see no change in their online experience – the ads they see will still be targeted based on data from the plethora of companies they interact with online.

Is the FTC deception and unfairness standard a license for ISPs to have wide liberty with consumer data?

The FTC deception and unfairness standards can be quite strict. They take into account context, sensitivity of data, risk of harm and a wide range of factors. But, the standard is flexible and allows the FTC to demand higher standards when appropriate and to allow more practical uses of data when appropriate.

How can the FCC promulgate consumer friendly rules here that help simplify the intertwined and complex ad tech environment?

I recently came across the announcement and agenda for the First Annual Privacy and Data Protection Summit in May 2001. The event was presented by the 50-member-strong Privacy Officers Association, the predecessor of today’s 15,000-member International Association of Privacy Professionals. The small group of us who gathered debated the best ways to provide consumer protection at a time when internet business models were still developing.

Speaking at the event, I explained to the audience how easy it was to decline web tracking and ad targeting. Just use your browser’s cookie settings! Block all cookies, block just third-party cookies, or clear your cookies, and ad networks would no longer recognize your browser. Consumer controls were fairly basic and effective.

How things have changed.

Today, meaningful control for consumers has become incredibly complex. Cookie controls are increasingly meaningless, because companies that fingerprint consumer devices track without cookies. Central ad industry opt-outs are effective for declining ads targeted based on web surfing, but they allow continued tracking, as well as targeting based on appended data. Apps don’t use cookies for tracking, so users who want to use the industry opt-out program need to download a special app to opt out of app-related ad targeting. Or consumers can use the “Limit Ad Tracking” settings that iOS and Android provide, but not every ad network cooperates. And the Do Not Track option offered by web browsers? Only about a dozen companies respect that setting. If you live in California, online companies need to tell you whether they respect the Do Not Track setting, unless they cooperate with the central industry opt-out program, in which case they do not need to tell you.

Has your head exploded yet? No? Then let’s keep going.

If you don’t want your home WiFi router linked to your home location, add the suffix “_NOMAP” to the name of your home router. Google and Mozilla will then opt you out of their location services databases. But for Skyhook, Microsoft, and many others, you must find your home router’s MAC address and submit it at each of the opt-out pages provided by those companies.

Today, ISPs are part of the equation, as they have entered the advertising technology market. Digital signage increasingly includes tracking capabilities, as do in-store Wi-Fi networks and more.

I could go on, but it should be abundantly clear that today’s online tracking and targeting options are likely only understood by a handful of experts who work at the intersection of ad tech and privacy.

Today, most of this ad targeting activity is subject to FTC jurisdiction no matter the source of the data, so an unhappy consumer can complain to that agency regardless of the technology involved. But the FCC is extending its privacy rules to ISPs, which means consumers will need to turn to that agency if an ad was targeted using tracking enabled by an ISP. Since ad targeting involves multiple actors, the FTC and FCC will need to cooperate on ad tech investigations, and each regulator will have a different standard for the same activity if the FCC comes up with its own regime.

The FCC proposal has been released as I write this post and seems to take a more restrictive regulatory approach, although it invites comments on other, more consumer-friendly paths to accomplish its consumer protection goal. I hope the FCC will take the time to understand the complex ad tech ecosystem and will consider the strong but flexible deception and unfairness rules that could provide its enforcement staff with tools that have stood the test of time.

Jules Polonetsky is CEO of the Future of Privacy Forum. He is a former Chief Privacy Officer of AOL and DoubleClick, and was the Consumer Affairs Commissioner of New York City under Mayor Giuliani.

 

 

 

 

March 10th Event: A Path Forward for Broadband Privacy

As the FCC begins to consider its role in regulating broadband provider data practices, how should it proceed? What are the practices that are at issue? What aspects of the current online ecosystem can the FCC impact? Please join us for a panel discussion to explore the issues.

Opening Comments:

Jon Leibowitz

Davis Polk & Wardwell LLP

Former Chairman of the Federal Trade Commission

Panel:

Professor Peter Swire

Huang Professor of Law and Ethics

Scheller College of Business

Georgia Institute of Technology

Senior Counsel

Alston & Bird LLP

Working Paper: Online Privacy and ISPs by Peter Swire, Justin Hemmings, and Alana Kirkland

Debra Berlyn

President

Consumer Policy Solutions

Jim Halpert

Partner

DLA Piper

Katharina Kopp, Ph.D.

Director of Privacy and Data Project

Center for Democracy & Technology

Jules Polonetsky

CEO

Future of Privacy Forum

Additional background reading:

Effective Regulators, Effective Privacy Choices

Light breakfast will be served.
WHEN
Thursday, March 10, 2016 from 8:45 AM to 10:30 AM (EST)
WHERE
1400 Eye Street Northwest – Suite 450, Washington, DC 20005

RSVP Here

Privacy and the Connected Vehicle: A Global Event, March 9 in Detroit

The Future of Privacy Forum and EY are hosting an event to advance the conversations around the management and use of personal information in the vehicle ecosystem. We will have a half day of panel discussions led by our team of privacy professionals and colleagues from the privacy and automotive space in the US and EU. If you work in the connected car ecosystem on automotive privacy, security, or compliance management, contact Lauren Smith at [email protected] to request an invitation.

Wednesday, March 9, 2016 from 8:00 AM to 12:00 PM (EST) 

Detroit, Michigan

The program will include:

Welcome Remarks

Panel 1: Legal and Self-Regulatory Standards for Automotive Privacy

This panel discusses steps companies are taking to adopt the Auto Alliance and Global Automakers Consumer Privacy Protection Principles, data issues in the automotive ecosystem, international considerations and practices, and the privacy impact of emerging technologies such as V2V and V2I.

Panel 2: Practical Considerations and Compliance

This panel discusses managing privacy in a distributed ecosystem, critical issues in managing the supply chain, responsibility for good privacy practices, and ensuring the Privacy Principles are integrated into new technologies.

Panel 3: Communications and Policy

This panel discusses managing consumer communications around privacy, security, and new features, the challenges of ensuring policymakers understand the complexity of data flows, legislative actions in the short term, and critical questions about autonomous technologies, ethics and privacy considerations.

This program is invitation-only and closed to press.

Important logistical information:

Time: Registration opens at 7:30am. Breakfast refreshments will be served. The program begins promptly at 8:00am. Lunch will be provided following closing remarks.

This live event will be available for videoconferencing in select EY offices throughout Europe: Sweden, Germany, France, and the UK.

Contact [email protected] to request an invitation.

 

 

The FBI and the iPhone in Your Pocket

Consider the data on your iPhone for a moment. Emails, pictures, passwords, credit cards, location history, contacts and more. Imagine your phone unlocked in the hands of a criminal who snatched it, or someone who wanted to embarrass you who peeked at it, or a hacker who remotely accessed it.

Today, if you have a good password protecting your phone, none of this is easy to do. Encryption ensures that without the password, your data stays locked up, even if your phone is taken apart or attacked while booting up. Rate limiting ensures that it isn’t possible to brute-force the phone by entering thousands of passwords: a small mandatory delay between password attempts means thousands of attempts would take a very long time. Another security feature, if enabled, will delete all the data on an iPhone after 10 failed password attempts. With iCloud Activation Lock, as a deterrent for thieves, if the iPhone is remotely wiped and locked, the original owner’s username and password are required to reactivate the phone for use with a wireless carrier; without these credentials, the iPhone remains encrypted and locked, preventing anyone from using it. The iPhone is so secure that even Apple has no way to get into the phone, whether you bring it to a Genius at a local Apple store or send it back to Apple headquarters.
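The lockout behavior described above can be sketched in a few lines of code. This is an illustrative model only, not Apple’s actual implementation: the delay schedule and class names are invented for demonstration, and real iOS enforces these limits in dedicated hardware.

```python
MAX_ATTEMPTS = 10
# Escalating delays (seconds) before the next try is permitted after the
# Nth failure -- loosely modeled on iOS behavior: no delay at first,
# then increasingly long lockouts. Values are illustrative.
DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]

class LockedDevice:
    """Toy model of passcode rate limiting and erase-after-10-failures."""

    def __init__(self, passcode, erase_after_ten=True):
        self._passcode = passcode
        self.erase_after_ten = erase_after_ten
        self.failures = 0
        self.erased = False

    def attempt(self, guess):
        """Try a passcode. Returns (state, seconds_to_wait) where state is
        'unlocked', 'locked', or 'erased'."""
        if self.erased:
            return ("erased", 0)          # data is gone; nothing to unlock
        if guess == self._passcode:
            self.failures = 0
            return ("unlocked", 0)
        self.failures += 1
        if self.erase_after_ten and self.failures >= MAX_ATTEMPTS:
            self.erased = True            # all data destroyed
            return ("erased", 0)
        # mandatory wait before the next attempt is accepted
        wait = DELAYS[min(self.failures, len(DELAYS)) - 1]
        return ("locked", wait)
```

Even without the erase option, the escalating delays mean that exhaustively guessing a four-digit passcode stretches from minutes to days; with the erase option enabled, an attacker gets only nine wrong guesses before the data is destroyed.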

These protections ensure that your phone is no longer an easy target for thieves. iPhone thefts have plummeted by as much as 50% in some cities after these features were introduced.

Unfortunately, the FBI now wants to put the safety of all iPhone users at risk. The FBI wants access to the iPhone of one of the San Bernardino killers. Phones used by the killers were destroyed, but a phone reportedly provided by an employer is in the hands of the FBI, locked. The FBI has the iPhone’s backups stored with Apple through late October, but hasn’t been able to break into the iPhone itself to determine whether any clues are stored on the device. The FBI wants Apple to devise a method of breaking into iPhones that can be used here.

But if Apple does so, this method will be used by others. Once Apple creates a bypass, the methodology will be analyzed, studied, and exploited, by sophisticated criminals at first and then by others. There may or may not be any useful clues on the iPhone of the San Bernardino killers, but it is certain that many future criminals will find data they want on consumers’ iPhones if an exploit is created to bypass passwords, encryption, and other protections.

Of course, there are many other negative implications if iPhone security is circumvented. Repressive governments will seek to force Apple to unlock iPhones, investigating and persecuting those who oppose them. As Future of Privacy Forum Senior Fellow Peter Swire explained, “I wish there was some magic way to break security for exactly one phone, without breaking security for millions of smartphones. There isn’t.” A former White House Deputy CTO put it clearly, noting that once those back doors are there for the FBI, “all of our private communications become much more vulnerable to attack by malicious criminals and terrorists.”

The FBI needs every tool it can get to investigate terror attacks and prevent them in advance. But forcing Apple to create a tool that risks the security and privacy of every iPhone in the world is asking for a tool that will cause more harm than it prevents.

Jules Polonetsky is Executive Director of the Future of Privacy Forum.

Google Responds to Sen. Franken

Google provided a response this week to Senator Franken’s request for information on its policies and practices with regard to its Google Apps for Education (GAFE) suite of services. Ed Week reviewed Google’s letter and asked FPF to comment on the response.

Full article here.

ACLU, Tenth Amendment Center Join Forces on Data Privacy

“In consultation with the center—a think tank that advocates strict limits on federal power—the ACLU wrote model legislation that both organizations are urging legislators around the country to support. …

“The Future of Privacy Forum—a Washington-based think tank and a co-author of the Student Privacy Pledge, a commitment by ed-tech companies to safeguard data—offered a measured endorsement of the provisions in the ACLU’s model bill.

“The forum applauded the model legislation’s language on parental-release mechanisms and its calls for teacher professional development on basic data-privacy issues, but is worried that an overly strict definition of “personally identifiable information” and the risk of personal legal liability for teachers who make mistakes could undermine both the ed-tech industry and the work of classroom educators.”

 

Read full article here.

FPF Supports Connect Safely for Safer Internet Day

Our friends at Connect Safely have put together an amazing program to support student awareness and education as part of Safer Internet Day on Tuesday, February 9, 2016.  Below is their writeup – we invite you all to participate and spread the word – particularly to the students in your life and their educational institutions – this is too good to miss!

They’ve built a stellar agenda for the 300+ students attending Safer Internet Day at Universal Studios Hollywood, and fortunately, for those of us who can’t make it in person, we can watch the live stream. Along with a professional video crew, the LA Unified School District has arranged for student journalists to use Periscope as roving reporters to live stream the offstage action, including when the students are in small groups answering the tough questions and as they interact with the exhibits and walk the red carpet.

There is an exciting lineup of great speakers assembled with help from the Yale Center for Emotional Intelligence and Facebook. Young panelists from Beyond Differences, #ICANHELP, and inspirED will discuss the topic “Rejecting Hate, Building Resilience & Growing the Good Online,” moderated by college filmmaker, Instagram personality, and transgender activist Leo Sheng.

Professional wrestler and reality TV star Mike “The Miz” Mizanin will speak about how it’s possible to play a bad guy on TV but be a gentle soul in real life, sharing his own experiences with bullying.

LA Media personality Tshaka Armstrong, a parent and founder of Digital Shepards, will inspire the kids to take responsibility for making the world a better place.

Connect Safely’s newest team member – K-12 education director Kerry Gallagher – will lead the students as they break into small groups with the help of distinguished coaches and judges who will pick the top student proposals for special recognition.

Finally, Connect Safely brings in Zoë Quinn, founder of Crash Override, who works hard to fight the online harassment that she herself faced as “patient zero of Gamergate.”

Join the live stream and Periscope webcasts, and when you do, be part of the discussion – tweet using #SIDUS16, #SID2016 and #SaferInternetDay.

The link for the live stream:  http://saferinternetday.us/livestream/

Congratulations and thanks to Connect Safely for their excellent work to keep children and students safe online. See you Tuesday!