University of Amsterdam's Summer Course on Privacy Law and Policy

The University of Amsterdam’s Institute for Information Law (IViR) is accepting applications for its fourth annual Summer Course on Privacy Law and Policy, which will be held from July 4-8, 2016. The course focuses on privacy law and policy related to the internet, electronic communications, and online and social media, and explores both the broader trends and the most recent developments in this rapidly changing field. The course will be held in De Rode Hoed, a historic building on one of Amsterdam’s most beautiful canals. The interactive seminars will be led by distinguished European and US academics, regulators, and practitioners who will investigate the EU and US legal frameworks and how they operate together. Enrollment is limited to 25 participants.

View additional information and register

For questions, please contact the course organizer Dr. Kristina Irion.

FPF Asks Lawmakers: "Send a Message to States that Privacy is a Priority"

FPF supported Data Quality Campaign’s (DQC) recent initiative to bring an important student privacy issue to the attention of lawmakers. Signing on with DQC and 20 other education and privacy groups, FPF agrees that it is critical that states have the resources they need to ensure adequate privacy protection for student data. Since states already receive federal funding to support their State Longitudinal Data Systems (SLDS), allowing states to use those funds for privacy work would be a straightforward way to fund this central responsibility.

As Rachel Andersen at DQC summarizes:

Why Does this Issue Matter?

Why This Particular Ask?

FPF is pleased to join in this “ask” to federal policymakers to send a clear message to states that privacy is a priority in the collection and use of student data – and to provide an avenue to the resources to support it.

A Visual Guide to Practical Data De-Identification

For more than a decade, scholars and policymakers have debated the central notion of identifiability in privacy law. De-identification, the process of removing personally identifiable information from data collected, stored and used by organizations, was once viewed as a silver bullet allowing organizations to reap data benefits while at the same time avoiding risks and legal requirements.

De-Id Infographic

However, the concept of de-identification has come under intense pressure to the point of being discredited by some critics. Computer scientists and mathematicians have come up with a re-identification tit for every de-identification tat. At the same time, organizations around the world necessarily continue to rely on a wide range of technical, administrative, and legal measures to reduce the identifiability of personal data to enable critical uses and valuable research while providing protection to individuals’ identity and privacy.

The debate around the contours of the term personally identifiable information, which triggers a set of legal and regulatory protections, continues to rage, with scientists and regulators frequently referring to certain categories of information as “personal” even as businesses and trade groups define them as “de-identified” or “non-personal.”

The stakes in the debate are high. While not foolproof, de-identification techniques unlock value by enabling important public and private research, allowing for the maintenance and use – and, in certain cases, sharing and publication – of valuable information, while mitigating privacy risk.

* * *

Omer Tene, Kelsey Finch and Jules Polonetsky have been working to address these thorny issues in a paper titled Shades of Gray: Seeing the Full Spectrum of Practical Data De-Identification. The paper is published in the Santa Clara Law Review.

Read our opinion piece via The International Association of Privacy Professionals.

Accompanying this paper is our Visual Guide to Practical Data De-Identification.

Get updates on FPF’s de-identification papers and projects.

Progress on Drone Privacy Best Practices

Today, the National Telecommunications & Information Administration (NTIA) circulated a Best Practices document proposed by a diverse subgroup of stakeholders, including leading privacy advocates, drone organizations and companies, and associations. The proposed Best Practices will be presented and discussed at the next meeting of the NTIA-convened multi-stakeholder process concerning privacy, transparency, and accountability issues in the commercial and private use of unmanned aircraft systems.

“These proposed draft principles recognize the value of drones for beneficial purposes, but also address in a practical way the privacy concerns they raise. Much careful negotiation and compromise went into ensuring privacy issues could be addressed in a way that is practical, so operators both large and small can comply,” said Jules Polonetsky, CEO, Future of Privacy Forum, a member of the subgroup proposing the Best Practices.

Google Provides Open Source Platform for Beacon Security

After an initial splash, news about beacon technology has been fairly quiet recently, but last week an advance was announced that will support easier access to privacy and security capabilities for this unique technology.

Beacons are sometimes misunderstood – thought to collect or retain data on nearby people, or to track smartphones without their owners’ awareness. In fact, beacons only transmit data; they never collect it. And location tracking is possible only if you have given the specific app permission to use your phone’s location functions and have Bluetooth turned on. You can control both in your phone’s settings, and you can deny an app permission to contact you via notifications.

The use of low-powered beacons has spread slowly and steadily – stores, museums, airports, and other spaces set up a device that broadcasts a unique code. If your phone has Bluetooth turned on, you have downloaded the app for that location, and you have granted it permission to use Bluetooth and location, the app can detect that beacon. Once the app determines that your phone is near the beacon, it can enable location-specific features.

Beacons positioned near an airport security checkpoint might trigger your airline’s app to show your boarding pass. A beacon in a museum might signal the museum app to show information about the artist of a painting you’re looking at. Retail-store beacons may help users locate products or indicate sale items. Beacons are inexpensive, simple to deploy and are supported by most mobile operating systems.
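Because a beacon only broadcasts an identifier, all of the logic lives in the app. A minimal sketch of that pattern, with the beacon IDs and feature names invented purely for illustration (real apps receive IDs through their platform’s Bluetooth APIs):

```python
# Hypothetical mapping from broadcast-only beacon identifiers to app features.
# The beacon transmits nothing but its code; the app decides what to do.
BEACON_ACTIONS = {
    "airport-checkpoint-17": "show_boarding_pass",
    "museum-gallery-3": "show_artist_info",
    "store-aisle-9": "show_sale_items",
}

def on_beacon_detected(beacon_id: str) -> str:
    """Return the feature to enable when a known beacon is detected.

    Unknown identifiers are simply ignored, since a beacon carries no
    data of its own and means nothing without an app-side mapping.
    """
    return BEACON_ACTIONS.get(beacon_id, "ignore")
```

This is why the same inexpensive hardware can power a boarding pass in one app and a museum guide in another: the broadcast code is just a trigger, and each app supplies its own meaning.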

Since their introduction on a broader scale, retailers, shopping centers, public attractions, airports, and sports arenas have explored how to use beacons in many new and different ways. As consumers become more familiar with the advantages, they have grown to enjoy the benefits of a more personalized experience.

However, one continued challenge to a broader array of beacon applications is security. Beacons work well for delivering a standardized response to any member of the public who enters their zone, but applications that require an individualized response have been limited by the need to protect the information involved. Since unencrypted beacon signals are also susceptible to long-term tracking, this security shortfall has slowed the pace of adoption.

Last week, however, Google announced an open-source answer with the rollout of Eddystone-EID. Per the design team, other companies may offer similar technologies, but those are proprietary, with little transparency into how the encryption is achieved. This is where Eddystone-EID shines: its technical specifications are open source.

According to Google’s statement:

“Eddystone-EID enables a new set of use cases where it is important for users to be able to exchange information securely and privately. Since the beacon frame changes periodically, the signal is only useful to clients with access to a resolution service that maps the beacon’s current identifier to stable data. In other words, the signal is only recognizable to a controlled set of users.”

Google has developed the entire suite of Eddystone platforms as open-source technology, available on GitHub. This newest addition – EID – turns the beacon’s broadcast identifier into an encrypted, moving target. To a phone in the area that doesn’t have the shared key, the EID is just gibberish. With the new tools, the exchange can’t be tracked or spoofed, and developers also gain safety features such as proximity awareness, device authentication, and encryption of transmitted packets.
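The core idea – an identifier that rotates over time and is recognizable only to a resolver holding the shared key – can be sketched in a few lines. This is not the actual Eddystone-EID algorithm (the published spec derives identifiers with AES and a rotation exponent); it is a simplified illustration of the rotating-identifier concept using Python’s standard-library HMAC, with the key and time window chosen arbitrarily:

```python
import hmac
import hashlib

def ephemeral_id(shared_key: bytes, time_counter: int) -> bytes:
    """Derive a short, rotating identifier from a shared key and a coarse
    time counter. Observers without the key see only opaque bytes that
    change each window, so the beacon cannot be tracked long-term."""
    counter_bytes = time_counter.to_bytes(4, "big")
    digest = hmac.new(shared_key, counter_bytes, hashlib.sha256).digest()
    return digest[:8]  # truncate to a compact, broadcast-sized identifier

def resolve(shared_key: bytes, broadcast: bytes, time_counter: int) -> bool:
    """Resolution-service check: recompute the expected identifier for this
    time window and compare it (in constant time) to the broadcast."""
    expected = ephemeral_id(shared_key, time_counter)
    return hmac.compare_digest(broadcast, expected)
```

Only a resolver that registered the key can map the changing broadcast back to a stable beacon identity, which is what confines the signal to “a controlled set of users” in Google’s description.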

Now, in addition to being able to find your way around the airport, you will be able to track your luggage without anyone else knowing which bag is yours. During sporting events, the facility can communicate with individual patrons in the “nosebleed” sections to offer them better seats, when available. A UK company will use this to offer subscribers personalized commuting information.

Introduced along with other new offerings – Beacon Tools and the Eddystone GATT Service – this new open-source platform for secure encryption represents an important moment in beacon technology for the increased security and protection of personal data.


Using Student Data Is Essential for Research that Empowers Students

In our nation’s schools, we have seen widespread use of zero tolerance policies that lead to suspension, expulsion, and other extreme disciplinary measures. Do these policies work or do they cause more harm than good?

Thanks to research that studied student data over time, we now know that these procedures are effective neither at preventing future misbehavior nor at improving student outcomes.

Without studies that looked at this issue and others, our policies and education practices would be lacking key insights.

In Huffington Post Education, Jules Polonetsky writes about “Making a Digital Difference in the Classroom With Data.” Jules reviews a recent FPF report written by NYU academic Elana Zeide, which summarizes an extensive collection of research studies that relied on student data to gain insights used to improve student education.

Read the full report titled 19 Times Data Analysis Empowered Students and Schools.

EU-US Privacy Shield Gets Nuanced Review by EU Privacy Regulators

On April 13, 2016, the Article 29 Working Party (Working Party) released its review of the EU-US Privacy Shield (Privacy Shield), the proposed new framework for US companies to transfer data from the EU to the US. The Working Party’s review was nuanced, giving strong credit to the Privacy Shield for improvements over the previous Safe Harbor agreement for commercial uses of data and praising new protections related to government surveillance. But the Working Party also cited various issues of concern that it wants to see addressed.

The Working Party recognized that some of the issues raised might be addressed in a future review of the Privacy Shield after the new GDPR is in place or after EU Court decisions inform the appropriate limits on bulk collection of data and surveillance. Other issues might be addressed by documents that more clearly explain the Privacy Shield or by a new glossary that explains Privacy Shield key terminology.

The Working Party also pointed out some areas where, in its view, important EU concepts are not captured in the Privacy Shield, such as limits on data retention and the right to object to automated processing.

EU Commission spokesman Christian Wigand reacted to the review, saying, “EU Data Protection Authorities welcome significant improvements to Privacy Shield. We aim for adoption in June.”

“While policymakers and regulators debate the next steps on Privacy Shield, they should keep in mind who is most impacted by uncertainty about EU-US data flows,” stated FPF CEO Jules Polonetsky. “51% of the companies in Safe Harbor were there to transfer the human resources data of EU employees to the US, for payroll, promotions and bonuses.”

A previous FPF study also revealed that Safe Harbor included 152 companies headquartered or co-headquartered in Europe, spanning a wide range of industries and countries.

Click here to view the Working Party’s full opinion.

May 10th Event: The Higher Education Privacy Conference

The fifth annual Higher Education Privacy Conference (HEPC) will be held on Tuesday, May 10, 2016 at the George Washington University Marvin Center in Washington, DC.

The HEPC is a one-day event that focuses on privacy and information management in higher education. The event combines speakers and smaller breakout discussion groups to foster interactivity and engagement. Participants include higher education CIOs, security professionals, privacy professionals, compliance professionals, and general counsel, along with key individuals from industry, law firms, associations, and government regulatory agencies.

WHAT:

The Higher Education Privacy Conference

WHEN:

Tuesday, May 10, 2016, 8 AM – 5 PM

WHERE:

George Washington University Marvin Center

800 21st Street Northwest

Washington, DC 20052

Click here for more information and to register


The National Network to End Domestic Violence Discusses Protecting Victim Privacy While Holding Offenders Accountable

Future of Privacy Forum Advisory Board member Cindy Southworth, Executive Vice President and Founder of the Safety Net Technology Project at the National Network to End Domestic Violence (NNEDV), shared a post we thought was important. In its article, “Smartphone Encryption: Protecting Victim Privacy While Holding Offenders Accountable,” NNEDV recognizes the significance of smartphone encryption for law enforcement’s ability to hold offenders accountable, but also states that smartphone encryption does not prevent law enforcement from investigating technology-facilitated domestic violence, sexual assault, and stalking.

NNEDV points out that in most cases law enforcement can successfully investigate and build a domestic violence or sexual assault case without the perpetrator’s smartphone. The article explains that evidence of harassment via emails, texts, or social media will also exist on other technology platforms, so access to the smartphone is not required.

Essentially, NNEDV contends that the issue of smartphone encryption comes down to balancing victim privacy and offender accountability. It believes that both are equally important and that neither should be compromised for the other. NNEDV suggests that instead of finding ways to get around smartphone encryption, law enforcement agencies deserve and need far more resources to investigate crimes facilitated through technology.

Click here to read the full article. 

FPF Hires Director of Communications – Melanie Bates

We are delighted to welcome Melanie Bates to the Future of Privacy Forum (FPF) as of April 11, 2016 as our new Director of Communications. In this new position, Melanie will be responsible for all FPF communications, including website updates, media relations, internal member communications, and social media presence. She will also help develop FPF’s strategic communication plan and support written and in-person representation of FPF’s positions on important consumer privacy policy questions.

Melanie came to us from her role as the Director of Policy & Communications at the American Civil Liberties Union of the Nation’s Capital (ACLU-DC). Prior to ACLU-DC, she was the Legislative Director for Ward 6 Councilmember Tommy Wells at the Council of the District of Columbia. Melanie was the 2014-2015 President of the Greater Washington Area Chapter, Women’s Lawyers Division, National Bar Association (GWAC). She also served on the National Bar Association’s Board of Governors. Melanie is a graduate of the DC Bar Leadership Academy and the New Leaders Council Institute (NLC), Washington, DC Chapter.

Melanie earned her Bachelor of Science in Marketing from Hampton University in 2007 and her Juris Doctor from North Carolina Central University School of Law in 2011.

We are excited to have Melanie on board as FPF continues to grow its impact within the public policy discussion on the responsible use of data in consumer and commercial privacy issues. For input or questions about FPF’s work, please contact Melanie at [email protected].