W&L Law Offers DC-based Cyber Law and Privacy Seminar with Future of Privacy Forum

Washington and Lee University School of Law has launched a new summer program in Washington, DC for students interested in studying cyber and privacy law.

The program is part of W&L’s exclusive academic partnership launched last year with the Future of Privacy Forum (FPF), a DC-based think tank that promotes responsible data privacy policies. The FPF was founded by W&L alumnus Christopher Wolf ‘80L, senior partner and former director of the Information Privacy Practice Group of Hogan Lovells.

The course, titled “Cyber Policy and Privacy Law,” will be co-taught by Professor Margaret Hu and Jules Polonetsky, CEO at FPF. The course will examine how the expanding role of the internet, big data, e-commerce, social media, and wearable technology has strained the preexisting regulatory and constitutional frameworks that have guided privacy protections under the law. The seminar will delve into these topics in both corporate and government contexts.

Read the full story.

June 14th Event: A Roundtable on Ethics, Privacy, and Research Reviews

Please join the Future of Privacy Forum (FPF) and the Ohio State University’s Program on Data and Governance in Washington, DC, on Tuesday, June 14, 2016, for a discussion of ethics, privacy and practical research reviews in corporate settings.

This timely event, which follows the White House’s call to develop strong data ethics frameworks, will convene corporate and academic leaders to discuss how to integrate ethical and privacy considerations into innovative data projects and research. The roundtable is an important extension of FPF’s December 2015 workshop, “Beyond IRBs: Designing Ethical Review Processes for Big Data Research,” supported by the National Science Foundation and the Alfred P. Sloan Foundation.

A major focus of the roundtable will be two new papers: “Evolving the IRB: Building Robust Review for Industry Research,” by Facebook’s Molly Jackman and Lauri Kanerva, and “Toward Privacy Aware Research and Development in Wearable Health,” by CDT’s Michelle De Mooy and Fitbit’s Shelten Yuen.

The event is free and open to the public, but please register as space is limited.

We look forward to seeing you there.

REGISTER HERE

Agenda


9:30-9:45 am  Welcome

9:45-10:15 am  Paper: Evolving the IRB: Building Robust Review for Industry Research

10:15-10:45 am  Paper: Toward Privacy Aware Research and Development in Wearable Health

10:45-11:00 am  Coffee Break

11:00 am-12:00 pm  Panel: Ethical Reviews and Research Data Governance in Corporate Settings

12:00-12:30 pm  Closing Discussion


WHEN

Tuesday, June 14, 2016, from 9:30 am to 12:30 pm (EDT)

WHERE

Future of Privacy Forum – 1400 I Street Northwest, Suite 450, Washington, DC 20005

PAPERS

Read Facebook’s paper

Read Facebook’s blog

Read Jules Polonetsky’s and Dennis Hirsch’s joint op-ed in Re/code

Read Beyond IRBs: Designing Ethical Review Processes for Big Data Research

Student data privacy: Moving from fear to responsible use

Data has always been an inherent part of the educational process – a child’s age, correlated with her grade level, tracked to specific reading or math skills that align with that grade, measured by grades and tests that rank her against her peers. Today this data is ever more critical. Education professionals seek to understand what the data reflect about a teacher’s role and influence, evaluating student outcomes across classrooms. Parents seek similar measures for individual K-12 schools and districts, and desperately seek insight into the value of education at individual colleges and universities to justify the cost or debt incurred.

In the last two years, there has been a perfect storm on the topic of student data privacy. The role of technology within schools expanded at an unprecedented rate, general awareness of consumer data security and breaches increased, and student databases at the state or national level were established or proposed, drawing great public scrutiny and fear. This maelstrom yielded a tremendous output of legislative activity targeted at education technology companies, overwhelmingly focused on protecting and limiting the sharing and use of student data—in rare instances, to the point of forbidding research uses almost completely. There are signs that this wave of fear-driven response has finally crested and that more measured conversations are occurring: conversations that prioritize the fundamental requirement for appropriate privacy and security, but with a clear focus on the invaluable role of research and analysis and the need to enable it.

Read the full piece on Brookings


Future of Privacy Forum and ConnectSafely Release Educator's Guide to Student Data Privacy

FOR IMMEDIATE RELEASE                     

May 23, 2016

Contact: Melanie Bates, Director of Communications, [email protected]

Contact: Hani Khan, Outreach Manager, [email protected]

FUTURE OF PRIVACY FORUM AND CONNECTSAFELY RELEASE
EDUCATOR’S GUIDE TO STUDENT DATA PRIVACY

Washington, DC – Today, the Future of Privacy Forum (FPF) and ConnectSafely are releasing the Educator’s Guide to Student Data Privacy. Technology tools and applications are changing the way schools and teachers educate students across the country. New resources are making it easier for teachers and students to communicate in and outside of the classroom, making learning a 24/7 activity. When schools use technology, a student’s personal information is often collected and shared for the purpose of furthering their education. The Educator’s Guide will help teachers use technology in the classroom responsibly and protect their students’ privacy.

The Educator’s Guide follows the success of the Parent’s Guide to Student Data Privacy, which was jointly developed by ConnectSafely and FPF to help parents understand the laws and regulations in place to protect their child’s personally identifiable information. The Educator’s Guide will help teachers understand their role in protecting students’ information and ensuring students use classroom technology in a responsible way.

“Schools across the country are rapidly integrating new technologies meant to deliver a quality education to students in a more effective and efficient manner. The Educator’s Guide will be a vital new resource for teachers when helping students use educational applications and technology in a safe way that protects their data privacy,” said Kobie Pruitt, Education Policy Manager, FPF.

“As a teacher who knows first-hand how technology tools have helped my students feel empowered to find the resources they need, collaborate with one another and with me, and create professional-quality work, I know we are in the midst of a transformational time for education. Teachers want to both inspire and protect their students, and this guide will help them feel confident they are doing both when using technology,” said Kerry Gallagher, Director of K-12 Education for ConnectSafely.

FPF and ConnectSafely are proud to help educators implement new technologies responsibly in order to promote successful student outcomes. To view and download the Educator’s Guide to Student Data Privacy and many other online privacy and safety resources, please visit FERPA|SHERPA and ConnectSafely.

###
The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. Learn more about FPF.
ConnectSafely.org is a Silicon Valley, Calif.-based nonprofit organization dedicated to educating users of connected technology about safety, privacy and security with research-based safety tips, parents’ guidebooks, advice, news and commentary on all aspects of tech use and policy.

Comprehensive Online Tracking is Not Unique to ISPs

Last week, the Senate Judiciary Committee (Subcommittee on Privacy, Technology, and the Law) held a hearing to explore the FCC’s proposed privacy rules regulating Broadband Internet Access Service providers (a subset of Internet Service Providers, or ISPs).

The discussion among top privacy regulators returned at several points to the value of consistency of privacy protection across the ecosystem. The proposed FCC rules would restrict ISPs from using customer proprietary information—defined broadly to include things like IP addresses, unique identifiers, and other personal information—for any purpose outside of providing their own services and marketing their own (or affiliates’) communications-related services, at least without seeking the customer’s affirmative Opt In consent. The justification given for these rules is that ISPs are uniquely “in a position to develop highly detailed and comprehensive profiles of their customers.” See para. 4, Notice of Proposed Rulemaking.

Near the end of the discussion, the FCC Chairman stated:

“When I go to Google [. . .] that is a decision that I am making. [. . .] I go to WebMD, and WebMD collects information on me. I go to Weather.com and Weather.com collects information on me. I go to Facebook and Facebook collects information on me. But only one entity connects all of that information, that I’m going to all those different sites, and can turn around and monetize it.” – Chairman Wheeler (approx. 1h30m)

This framing of the issue reflects a fundamental misunderstanding of the current online advertising ecosystem, which is fully capable of tracking individual behavior across the Internet as well as between devices.

Mozilla’s Lightbeam for Firefox extension quickly shows that the third-party tracking industry is interwoven and comprehensive. After installing the Firefox browser and visiting only one website (WebMD.com), I had connected with 24 third-party sites (see Fig. 1, below).


Fig. 1. Mozilla’s “Lightbeam for Firefox” extension demonstrates that my visit to a single website generated 24 third-party connections. Circular nodes represent websites visited, and triangular nodes are third-party sites. Purple lines identify when a site has stored data on the browser (cookies).

After visiting three additional sites—for a grand total of four websites—I had connected with 119 third-party entities (see Fig. 2, below). Each “connection” means that the entity can identify the web page the consumer is visiting and can share that data with the other parties to which it is interconnected. Some parties are linked to many websites. But even third-party entities that are not directly connected to a particular site can buy and sell this data at third-party data exchanges. These data exchanges, by linking and compiling data from hundreds of different online and offline sources, can “match up” consumer behavior across the Internet, creating comprehensive and detailed individual profiles.


Fig. 2. Mozilla “Lightbeam for Firefox” display after visiting only four websites. Circular nodes represent websites visited, and triangular nodes are third-party sites. Purple lines identify when a site has stored data on the browser (cookies).

The third-party advertising networks and data partners visualized above use a variety of methods designed to create comprehensive profiles of a user’s entire web browsing history, including persistent identifiers (cookies), IP addresses, device identifiers, direct authentication (such as email addresses), and probabilistic methods (such as browser fingerprinting). For a more extensive explanation of these tracking methods, see our 2015 report on Cross-Device Tracking. Furthermore, this information can be combined with offline (appended) data, such as a user’s in-store purchase history, for an even more comprehensive consumer profile.
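
To make the probabilistic approach concrete, the sketch below shows, in Python, how a fingerprint-style identifier can be derived by hashing a handful of browser attributes. The attribute set and hashing scheme are illustrative assumptions, not a description of any particular vendor’s script:

```python
import hashlib

def browser_fingerprint(attributes: dict) -> str:
    """Derive a stable identifier by hashing browser attributes.

    Real fingerprinting scripts collect many more signals (canvas
    rendering, installed fonts, audio stack, etc.); this sketch uses
    only a few illustrative attributes.
    """
    # Sort the keys so the same attributes always hash to the same value.
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Two visits from the same browser produce the same identifier,
# without any cookie ever being set.
visit = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
    "screen": "1920x1080",
    "timezone": "America/New_York",
    "language": "en-US",
}
print(browser_fingerprint(visit))
```

Because the identifier is recomputed from the browser’s own characteristics on each visit, it can persist even when the user clears cookies.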

Many of the leading online platforms also correlate data across websites. For example, many websites (including WebMD, seen above) carry social media plug-ins that allow those social media platforms to compile browsing histories of individuals across the Internet and link that browsing activity to the same user’s social media behavior. If a consumer browsing the web encounters a Twitter button on a website they visit, that data goes to Twitter to help serve ads on Twitter.

Mobile apps often collect even more granular information, such as information about the user’s in-app behavior, and other mobile data such as the Calendar or Contacts. Access to some mobile data (such as Location Services) requires the user’s Opt In permission, but access to other mobile information (such as the nearby Wi-Fi networks, from which location can be inferred) sometimes does not.  Some leading apps serve ads based on knowing what other apps are installed on a user’s device. WebMD, for example, in addition to tracking and sharing the data visualized above, has a mobile app that enables it to track users across desktop and mobile platforms.

This data collection usually occurs without directly sharing explicitly personal information—rather, for security and privacy reasons, industry players typically match up individual behavior using “hashed” identifiers. And many, including Commissioner Ajit Pai, have pointed out that online tracking has generated benefits for consumers, including the availability of free and reduced-cost online content subsidized by online advertising that can be made more efficient and relevant through information about online audiences.
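
As a rough sketch of how matching on hashed identifiers can work, suppose two parties each hold the same user’s email address; hashing it with a common function yields a shared join key without the raw address changing hands. The data values and the unsalted SHA-256 scheme here are assumptions for illustration (an unsalted hash of an email remains vulnerable to dictionary attacks, which is one reason hashing alone is only a partial privacy measure):

```python
import hashlib

def hashed_id(email: str) -> str:
    """Normalize an email address and hash it into a match key."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Party A's ad-exposure log and Party B's purchase log, keyed by email hash.
party_a = {hashed_id("Jane.Doe@example.com"): "saw_ad_campaign_42"}
party_b = {hashed_id("jane.doe@example.com"): "bought_running_shoes"}

# The two datasets join on the hash, linking behavior across parties
# without either side transmitting the plain-text address.
for key in party_a.keys() & party_b.keys():
    print(key[:12], party_a[key], "->", party_b[key])
```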

There have been many industry efforts in recent years to self-regulate the market in order to alleviate these privacy concerns and build consumer trust. For example, the National Advertising Initiative and Digital Advertising Alliance have enforceable codes and guidelines covering the uses of online data collected and used for online behavioral advertising (interest-based advertising). The Wireless Association (CTIA) has issued high-level voluntary guidelines around mobile data, especially geo-location. And the Future of Privacy Forum (FPF) has a Location & Ad Practices Working Group that is developing best practices and consumer awareness information around online advertising and another group that has developed best practices for data from wearables and wellness apps.

Despite these industry efforts, the comprehensive and pervasive nature of online tracking remains a subject of debate among advocates and consumers. But one thing is certain: it is not unique to ISPs.

Scientists Are Just as Confused About the Ethics of Big-Data Research as You

And the patchwork of review boards responsible for overseeing those risks is only slowly inching into the 21st century. Under the Common Rule in the US, federally funded research has to go through ethical review. Rather than one unified system, though, every single university has its own institutional review board, or IRB. Most IRB members are researchers at the university, most often in the biomedical sciences. Few are professional ethicists.

Even fewer have computer science or security expertise, which may be necessary to protect participants in this new kind of research. “The IRB may make very different decisions based on who is on the board, what university it is, and what they’re feeling that day,” says Kelsey Finch, policy counsel at the Future of Privacy Forum. There are hundreds of these IRBs in the US—and they’re grappling with research ethics in the digital age largely on their own.

Read the full article in Wired.

Multi-Stakeholder Group Finalizes Agreement on Best Practices for Drone Use

FOR IMMEDIATE RELEASE             

May 18, 2016

Contact: Melanie Bates, Director of Communications, [email protected]

 

MULTI-STAKEHOLDER GROUP FINALIZES AGREEMENT ON

BEST PRACTICES FOR DRONE USE

Washington, DC – Today, a wide range of privacy groups and industry stakeholders participating in the National Telecommunications & Information Administration (NTIA) Multi-Stakeholder process concerning privacy, transparency, and accountability issues regarding commercial and private use of unmanned aircraft systems (drones) agreed on a set of best practices.

The best practices are intended to encourage operators to use this technology in a responsible, ethical, and respectful way. They provide enough flexibility to support innovative uses of this emerging technology while still setting firm privacy standards. The best practices acknowledge that the principles are qualified by the understanding that they are to be implemented as “reasonable” and “practical,” in order to allow flexibility for smaller operators, hobbyists, or circumstances where compliance would be impractical. The full best practices document is available here. The Future of Privacy Forum (FPF) has also created an easy-to-read summary of the best practices, available here, to help educate drone operators.

“Drones are already being used for search and rescue and to assist farmers, home contractors, photographers, newsgatherers, and may soon be used for wireless internet and delivery. These standards will help ensure these technologies are deployed with privacy in mind,” said Jules Polonetsky, CEO, FPF. “This agreement is also a great boost for self-regulation and multi-stakeholder efforts and demonstrates that with good leadership industry and advocates can come together to advance responsible practices.”

The list of groups supporting the best practices includes Amazon, AUVSI, Center for Democracy and Technology, Consumer Technology Association, CTIA, FPF, Intel, X (formerly Google X), New America’s Open Technology Institute, PrecisionHawk, SIIA, Small UAV Coalition, and a wide range of news media organizations.

###

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. Learn more about FPF’s work by visiting www.fpf.org.

FPF Advisory Board Member Awarded Tenure and Named to Endowed Chair

We are pleased to share that the Samford University Board of Trustees recently voted to award tenure to FPF Advisory Board Member and Cumberland School of Law Associate Professor Woodrow “Woody” Hartzog, and to name him the W. Stancil Starnes Professor of Law.

According to Cumberland School of Law Dean Henry C. Strickland, III, Professor Hartzog follows in many traditions of the best Cumberland professors. He is a prolific scholar, having already established himself as a global leader in his field with over 25 major articles and book chapters in the last four years (including in such leading journals as the California Law Review, the Columbia Law Review, and the Michigan Law Review), not counting countless articles for blogs and popular media. He is also a sought-after speaker, having given invited lectures at institutions such as Cambridge University, New York University, Stanford Law School, and Yale Law School. He has also given talks at the Federal Trade Commission, Facebook, and Google, and has testified before Congress.

Read the full announcement.

Privacy plays major role in new federal government guidance on transgender student rights

Recently, the Department of Justice and the state of North Carolina filed countersuits over the state’s so-called “bathroom bill.” The North Carolina “Public Facilities Privacy & Security Act” requires students to use public restrooms that correspond with their sex assigned at birth rather than the gender with which they identify. On Friday, the Department of Education and the Department of Justice co-authored a Dear Colleague letter directed at public school districts across the country to provide guidance on transgender students’ access. In this guidance, privacy considerations play a significant role in protecting student rights.

The primary law dealing with student privacy in public schools is the Family Educational Rights and Privacy Act (FERPA), which limits access to student records to parents and to members of the school community who have a clear need for the data for educational purposes. FERPA limits further disclosures to specified situations, and primarily to the extent that the specific student data is necessary for the educational purpose being performed. The Departments’ guidance spells out that any nonconsensual disclosure of personally identifiable information (PII), such as the student’s sex or name at birth, is an unwarranted disclosure that has the potential to harm a student.

One broad exception under FERPA allows disclosure of “Directory Information,” under which a school may choose to release previously specified information such as name, birthdate, home address, email, and school grade and activities. However, the Departments’ letter makes it clear that FERPA prevents schools from designating a student’s transgender status as directory information, due to concerns about student privacy.

FERPA is also not the only law at issue. As the letter points out, “Protecting transgender students’ privacy is critical to ensuring they are treated consistent with their gender identity. The Departments may find a Title IX violation when a school limits students’ educational rights or opportunities by failing to take reasonable steps to protect students’ privacy related to their transgender status, including their birth name or sex assigned at birth.”

Consistent with FERPA, the current guidance does not require schools to amend educational records to reflect a student’s gender identity or name change upon request; however, it does emphasize a student’s right to request a change to, or include comments about, information in the educational record they deem to be inaccurate.

Echoing the language of the law, the letter reminds schools that “If the school does not amend the record, it must inform the requestor of its decision and of the right to a hearing. If, after the hearing, the school does not amend the record, it must inform the requestor of the right to insert a statement in the record with the requestor’s comments on the contested information… That statement must be disclosed whenever the record to which the statement relates is disclosed.”

Title IX will apply to the procedures for change requests as well since “under Title IX, a school must respond to a request to amend information related to a student’s transgender status consistent with its general practices for amending other students’ records.” That is – if a student or parent requests a change to the record, the school must respond consistent with its usual process for such requests. If the parent complains about the school’s handling of such a request, the school must “promptly and equitably” resolve it under the Title IX grievance procedure the school has in place.

The responsible collection, use, and protection of student data is critical to each child’s success in school. Those who need access to data to provide educational services must have it, but further disclosure, whether within or outside the school community, must be carefully controlled and permitted only in the context of other FERPA exceptions. Particularly in the context of sensitive data like a student’s gender or gender identity, schools must be mindful that disclosing information without consent could do irreparable harm to a student and, as the letter makes clear, would be a breach of the student’s right to privacy. Any deviation from these standards would constitute a clear disregard for one of our most vulnerable student populations.

The CNIL released its inspection program for 2016, revealing its sectors of focus

In 2016, the CNIL plans to conduct between 400 and 450 inspections.

The total number of inspections will be divided in the following way:

The themes for the 2016 annual program cover both the public and private sectors and pertain to people’s daily lives in an international context.

The three themes of focus are:

  1. Data brokers: defined as intermediaries between entities that collect personal data and entities that use such data for their economic activity. The CNIL highlights that profiling based on this data is increasingly accurate and relevant, and represents a major concern for privacy protection in the 21st century. In this context, the CNIL will check that data brokers comply with EU privacy law, in particular the principles of data accuracy, information given to individuals, consent, individuals’ rights of access and objection, and security.
  2. SNIIRAM (Social Security Inter-regimes National Information System): this database comprises tens of millions of files containing data such as age, gender, diagnosis, date of death, city and county of residence, and reimbursed treatments. This data is “pseudonymized” (a minimal sketch of what pseudonymization can look like appears after this list). Inspections will aim to check the conformity of this data processing with French privacy law (Act n°78-17 of 6 January 1978 on information technology, data files and civil liberties) and the robustness of the pseudonymization.
  3. The API-PNR system (Advance Passenger Information-Passenger Name Record System): used notably to fight terrorism.
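
For readers unfamiliar with the technique, here is a minimal sketch of one simple form pseudonymization can take: replacing a direct identifier with a keyed hash so that records about the same person remain linkable while the identifier itself is removed. The field names, key, and HMAC scheme below are assumptions for illustration, not a description of SNIIRAM’s actual mechanism:

```python
import hashlib
import hmac

# Hypothetical secret key; in practice this would be tightly controlled.
SECRET_KEY = b"replace-with-a-well-guarded-key"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed (HMAC-SHA256) pseudonym.

    Unlike a plain hash, a keyed hash prevents anyone without the key
    from recomputing the pseudonym from a known identifier.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Illustrative health record with an invented national identifier.
record = {
    "national_id": "1850763123456",
    "age": 64,
    "gender": "F",
    "county": "75",
    "reimbursed_treatment": "J01CA04",
}
record["pseudonym"] = pseudonymize(record.pop("national_id"))
print(record)  # the direct identifier is gone; linkage is still possible
```

Even after such a transformation, quasi-identifiers such as age, county of residence, and dates can enable re-identification when combined, which is precisely why the robustness of pseudonymization warrants inspection.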

Additionally, the CNIL will continue collaborating with its fellow European Data Protection Authorities on connected devices as part of the fourth Internet Sweep Day.

You can read CNIL’s original post here.