Future of Privacy Forum Releases Statement on FTC's Settlement with Retail Tracking Company

“Today’s settlement by the Federal Trade Commission with Nomi, a mobile location tracking company, demonstrates the need for companies with emerging technologies to be clear about the choices they provide consumers,” said FPF Executive Director Jules Polonetsky.

The need for transparent privacy policies in the field of mobile location analytics was the impetus for FPF’s Mobile Location Analytics “Code of Conduct”. Signatories of this binding set of privacy principles must provide consumers with clear opt-out options for tracking by MAC address or similar identifiers.

Consumers can easily and quickly opt out of mobile location analytics by entering their device’s Wi-Fi or Bluetooth MAC address at www.smart-places.org.
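
The opt-out registry operates on device hardware (MAC) addresses. As a rough sketch of how an analytics provider might honor such a registry (an illustration only, not the actual smart-places.org implementation; the one-way hashing scheme is an assumption, though it reflects a common industry practice), observed addresses can be normalized, hashed, and checked against the registered opt-out set:

```python
import hashlib

def normalize_mac(mac: str) -> str:
    """Canonicalize a MAC address: strip separators, lowercase."""
    return mac.replace(":", "").replace("-", "").lower()

def hashed_id(mac: str) -> str:
    """One-way hash of the normalized MAC, so raw addresses need not be stored."""
    return hashlib.sha256(normalize_mac(mac).encode()).hexdigest()

def should_track(mac: str, opt_out_hashes: set) -> bool:
    """Honor the opt-out: skip any device whose hashed MAC is registered."""
    return hashed_id(mac) not in opt_out_hashes

# A device registered at the hypothetical opt-out registry:
opt_outs = {hashed_id("AA:BB:CC:DD:EE:FF")}
assert not should_track("aa-bb-cc-dd-ee-ff", opt_outs)  # same device, different formatting
assert should_track("11:22:33:44:55:66", opt_outs)      # unregistered device
```

Hashing the normalized address means the provider never needs to retain raw MACs, and formatting differences (colons, dashes, letter case) do not defeat the opt-out.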

Comcast Newsmakers: Jules Talks Consumer Privacy and Location-Based Services

In a recent episode of Comcast Newsmakers, Jules Polonetsky, Executive Director and Co-Chair of the Future of Privacy Forum, discussed the many new ways that data about your location is being used, such as navigation or helping you connect with friends, and what you can do to protect the privacy of that data.

When Opting Out of Student Data Collection Isn't the Solution

Opting out, whether of testing or other activities, is getting a lot of press in the education world right now. Jules and I recently wrote for EdSurge about when opting out is, or isn’t, the right policy decision.

The bottom line? “Opt-out rights should be an opportunity for parents to decline uses of data that truly are secondary to the functioning of our educational system – not an opportunity to avoid resolution of education policy issues that affect all students.”

Check out the full article here.

Posted by: Brenda Leong, Legal and Policy Fellow/Education Privacy

Comparing the Data Broker Bill to the Consumer Privacy Bill of Rights

Considering the privacy concerns raised by data brokers, we thought it would be useful to compare how data brokers are treated under Senator Edward Markey’s recent data broker bill, which has considerable support from privacy and consumer advocates (as well as Senators Blumenthal, Franken, and Whitehouse), and under the Consumer Privacy Bill of Rights.

The different reception each bill has received is interesting in light of the fact that Sen. Markey’s bill echoes much of the Consumer Privacy Bill of Rights (CPBR) by giving consumers greater access to and control over personal data collected about them. While the CPBR has a broader scope and attempts to set out privacy and security obligations across sectors and industries, its provisions would still apply to data brokers and could accomplish some of the same aims as the Markey bill.

Scope: Who and What Gets Covered?

The Markey bill applies exclusively to data brokers, defined as any “commercial entity that collects, assembles, or maintains personal information concerning an individual who is not a customer or an employee of that entity in order to sell the information or provide third party access to the information.” The CPBR applies to any “covered entity” that collects, creates, processes, retains, uses, or discloses personal data, which would include data brokers. Though it does provide several carve-outs, it is unlikely that most data brokers within the meaning of the Markey bill would fall under any of the exceptions in the CPBR.

The CPBR has a far more detailed definition of personal information, which echoes the definition used by the Federal Trade Commission. It focuses on any information that is linked or linkable to a specific individual – or that is linked to a device that is associated with or routinely used by an individual. The definition sets forth a non-exhaustive list of types of personal data (notably “unique persistent identifiers” and “unique identifiers or other uniquely assigned or descriptive information about personal computing or communication devices”). It carves out de-identified data (detailing the requirements for data to be considered de-identified), deleted data, employee information and some “cyber threat indicators.”

The Markey bill does not give a precise definition of personal information but does differentiate between non-public and public record information, placing different correction requirements on data brokers for each category. The Markey bill emphasizes that non-public information is “of a private nature,” but it is unclear whether non-public information would precisely capture all of the types of data envisioned by the CPBR definition.

Transparency Obligations

Both bills would oblige data brokers to provide individuals with clear and conspicuous notice. The Markey bill requires the data broker to maintain an Internet website that allows individuals to review information about them and to express their preferences. The CPBR does not have this requirement, but the broader bill is more precise with regard to the content and format of the notice, which must inform the consumer about the company’s privacy and security practices.

Accuracy, Access, and Correction Rights

Both bills require data brokers to maintain reasonable procedures to ensure that personal data under their control is accurate. However, the core of both bills is focused on improving consumer access to their personal data, as well as their ability to correct any inaccuracies.

1. Access

The Markey bill and the CPBR would set out obligations for data brokers to provide consumers with access to their information upon request. In terms of access requirements, the Markey bill requires data brokers to “provide an individual a means to review any personal information or any other information that specifically identifies that individual, that the data broker collects, assembles, or maintains on that individual.” The CPBR requires that individuals be given “reasonable access to, or an accurate representation of, personal data that both pertains to such individual and is under the control of such covered entity.”

Although both bills offer an access right, the CPBR contains some limitations that could result in consumers being denied access by data brokers. Specifically, the bill states that the “degree and means of any access shall be reasonable and appropriate for the risks associated with the personal data, the risk of adverse action against the individual if the data is inaccurate, and the cost to the covered entity of providing access.” Considerable questions remain about how these considerations might limit access.

2. Correction

Both bills require data brokers to give individuals the ability to challenge the accuracy and completeness of any personal data they hold about a consumer. If a consumer can prove an inaccuracy, the Markey bill requires the data broker to correct the information. Notably, the CPBR would allow a data broker to decline to correct an inaccuracy in cases where the use of the incorrect data cannot result in an adverse action against the individual, but it then gives consumers the right to demand that the information be deleted.

While the CPBR places a number of limitations on consumer access, the Markey bill places similar limits on the ability of consumers to correct information. For example, the CPBR limits access requests that are “frivolous and vexatious,” and the Markey bill allows data brokers to deny requests to correct information that it believes are “frivolous or irrelevant.”

Individual Control and Accountability

The CPBR emphasizes the importance of individual control and would require data brokers to provide individuals with “reasonable means to control the processing of personal data about them in proportion to the privacy risk to the individual and consistent with context.” The bill is largely silent as to what “reasonable means” could entail, but it allows companies to satisfy the right of individual control by permitting individuals to request that their personal information be de-identified. The Markey bill is more direct, giving individuals the right to stop data brokers from using, sharing, or selling their personal information for marketing purposes through an opt-out mechanism.

The Markey bill’s accountability obligation consists only of an auditing requirement: each data broker must “establish measures that facilitate the auditing or retracing of any internal or external access to, or transmission of, any data containing personal information collected, assembled, or maintained by the covered data broker.” The accountability obligations in the CPBR are much broader, including employee training, audits, “privacy by design,” and contractual requirements, among other measures.

***

Despite the very different reception each bill received, this short analysis suggests that both would impose similar obligations on data brokers. Both work to improve transparency around data practices and to improve consumers’ access to the vast array of personal information held by data brokers. There is no question that many provisions in the CPBR have been sharply criticized, but the bill could largely advance the same goals as the Markey bill. In some respects, the broad scope of the CPBR even allows it to go further, offering important security obligations and contextual considerations that the Markey bill does not address at all.

-Joseph Jerome and Bénédicte Dambrine

FPF Senior Fellow Peter Swire Receives Privacy Leadership Award

The Future of Privacy Forum congratulates our Senior Fellow, Peter Swire, on receiving the 2015 Privacy Leadership Award from the International Association of Privacy Professionals. Peter has worked with FPF since 2010 on a wide range of privacy and cybersecurity issues, including encryption and Big Data. His current work with FPF includes research on de-identification, Mutual Legal Assistance Treaties, and privacy for the Internet of Things.

Here are Peter’s remarks from the March 6 IAPP Summit, upon receiving the award:

I’d like to thank the Academy.

I am honored to receive this award and humbled to do so with many people in this audience who have inspired me and done so much to protect privacy internationally, and over so many years.

My thanks to the IAPP, its Board, and Trevor Hughes for his amazing leadership.  Isn’t this Summit amazing?

This moment has led me to reflect on how I first became involved in privacy, and why.  I would highlight four things.

First, I have had a life-long fascination with the intersection of technology, policy, and law.  I love science fiction, and I especially love stories about how people and societies respond to new technological challenges.  In many ways that’s what we do as privacy professionals.

Second, I love doing research. How can we make sense of the complex issues that face us?  My first article on the law of the Internet was in 1993, and the issues have kept coming fast and furious ever since.

Third, I am drawn to public service and solving real-world problems.  Some of those experiences were mentioned in the introduction by Jim – my work as Chief Counselor for Privacy in the Clinton Administration, the efforts to craft a global DNT standard, and then President Obama’s Review Group on Intelligence and Communications Technology.

Fourth, working on privacy gives me an opportunity to teach, and hopefully inspire, a new generation of students and privacy professionals.  One great pleasure of attending IAPP functions is the opportunity to talk with former students and see how they have grown into leaders in their own right.

Today and moving forward, I feel fortunate to be part of some amazing organizations, as we study and address some of the most pressing privacy problems in the world.

First is Georgia Tech, my new home since 2013.  Each fall I co-teach a privacy and technology course with the one and only Annie Antón.  One exciting part of being at Georgia Tech is the work we are doing to bring technologists and engineers together with law and policy, and we have multiple research streams on IoT, cybersecurity, and numerous other topics.

Second, I recently started as a Senior Counsel with Alston & Bird.  Jim Harvey and David Keating lead an outstanding team of privacy and cyber lawyers.  I am excited to be solving real-world problems for clients in this new setting.  I welcome all of you to come speak with us if we can be of assistance.

Next, I continue as a fellow with the Future of Privacy Forum.  Jules Polonetsky leads the day-to-day efforts with his incredible energy and intelligence, including a growing list of successful self-regulatory agreements.  Chris Wolf, the other co-founder, many of you know for the grace, class, and insight he brings to every endeavor.

Finally, on the list of organizations I am proud to be affiliated with, the Center for Democracy and Technology this year welcomed Nuala O’Connor as its new leader.  I was on a panel just yesterday with Nuala, and it will be exciting for all of us to see where she will lead.

A passion for research, solving practical problems, providing thought leadership, and articulating moral vision – those are themes, I hope, for the work of many of you in this audience.

We are fortunate to be privacy professionals in this era when privacy is at the center of so many important debates in our society.  My thanks to the IAPP for this award today.  My bigger thanks to all of you for what we can build together in the years to come.

White House Consumer Privacy Bill Starts an Important Conversation

This afternoon the White House released a discussion draft of its Consumer Privacy Bill of Rights Act. Jules Polonetsky and Chris Wolf issued the following response:

Today’s release of the text of Consumer Privacy Bill of Rights demonstrates the U.S.’s continuing commitment to advance privacy protection for consumers.

Although the current system of FTC enforcement actions and strong sectoral laws provides important tools to address privacy harms, the ideas proposed in the bill are certain to help frame the discussion about privacy practices by companies and not-for-profits in the future.  International data regulators should recognize that this bill is not a critique of the current system, but the opening of a nuanced conversation that seeks to balance benefit and risk while being considerate of consumer rights.

The impact of the bill is not likely to be legislative, but the ideas it raises will have an impact on the privacy debate.  Key concepts in the bill that will advance the privacy discussion in a very practical way include the focus on context to shape appropriate uses of data, the recognition that assessing benefit and risk is important, and the consideration of Privacy Review Boards as internal or external structures that could help assess beneficial uses of data that would otherwise be constrained by law.

What Privacy Papers Should Policymakers Be Reading?

Each year, FPF invites privacy scholars and authors interested in privacy questions to submit articles and papers to be considered by members of our Advisory Board, with an aim toward showcasing those articles that should inform any conversation about privacy among policymakers in Congress, as well as at the Federal Trade Commission and in other government agencies. For our fifth annual Privacy Papers for Policymakers, we received a record number of submissions covering topics ranging from data use in elections and government surveillance to the ever-present cloud and even the emergence of app stores for the brain.

However, in a year where the White House launched a review into the privacy implications of “Big Data,” scholars and privacy advocates were particularly focused on looking at how algorithms are changing our society – and what that means for individuals’ privacy. Our Advisory Board selected papers that addressed this challenge head-on. It also selected papers that describe how information about consumers is being collected, gathered, and used across the Internet, and what role the FTC should specifically play in policing the privacy and data security practices around those activities.

Our top privacy papers for 2014 are, in alphabetical order:

Big Data’s Disparate Impact

Solon Barocas & Andrew Selbst

Four Privacy Myths

Neil Richards

Free: Accounting for the Costs of the Internet’s Most Popular Price

Chris Jay Hoofnagle & Jan Whittington

The Scope and Potential of FTC Data Protection 

Woodrow Hartzog & Dan Solove

The Scored Society

Danielle Citron & Frank Pasquale

The Scoring of America: How Secret Consumer Scores Threaten Your Privacy and Your Future 

Pam Dixon and Robert Gellman

These papers illuminate concerns that will continue to drive privacy debates in 2015. Already in the new year, we have seen the White House push new proposals to address student privacy and identity theft. The Internet of Things has made headlines, moving from something that is coming to something that is here.

We want to thank EY for their special support of this project. And we thank the scholars, advocates, and Advisory Board members who are engaged with us to explore the future of privacy. We look forward to celebrating the formal release of FPF’s Privacy Papers for Policymakers digest at an event the evening of March 3rd, ahead of the IAPP Global Privacy Summit. If you are interested in attending, please reach out to us at [email protected].

The Student Privacy Pledge and Security

We know it is critical for ed tech companies to get security right.

The Student Privacy Pledge developed by FPF and SIIA requires signatories to maintain “a comprehensive security program that is reasonably designed to protect the security . . . of personal student information . . . appropriate to the sensitivity of the information.” “Reasonableness” in this context is not a subjective standard, open to interpretation by each company, but rather a standard used and interpreted across a range of contexts by the Federal Trade Commission. It is also the basis of California’s new Student Online Personal Information Protection Act.

A company’s security and other commitments made under the Student Privacy Pledge are legally enforceable. Under Section 5 of the FTC Act, the Federal Trade Commission (FTC) can take action against companies that engage in deceptive trade practices. It is a form of deception to make a public statement, such as signing the Student Privacy Pledge, but then implement practices that do not conform to that statement. The FTC and various State Attorneys General have brought enforcement actions against companies that made privacy promises to their consumers and then violated those promises.

Companies with security practices that fall short can therefore face legal liability. The Pledge does not designate specific security technologies, because those measures need to be tailored to the service, context, and sensitivity of the protected information.  What constitutes reasonable security may depend on the specific company and the nature of the data it handles, and must evolve over time as new threats and solutions emerge.

For services that hold sensitive student data, login password encryption or equally protective measures are basic measures that companies must implement. Of course, effective security requires ongoing training of company employees, and toward that end, we have also kicked off a series of workshops starting next week to help companies further hone their security and privacy practices.
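
The Pledge deliberately leaves the choice of technology open. As one illustrative baseline (a sketch of a widely accepted practice, not a Pledge-mandated design), login credentials can be stored as salted, iterated one-way hashes rather than in recoverable form, for example using PBKDF2 from the Python standard library:

```python
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 200_000) -> tuple:
    """Derive a salted PBKDF2-HMAC-SHA256 digest; store salt, iterations, and digest."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest

def verify_password(password: str, salt: bytes, iterations: int, digest: bytes) -> bool:
    """Recompute the derivation and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)

# At account creation, store (salt, rounds, digest); at login, verify:
salt, rounds, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, rounds, stored)
assert not verify_password("wrong guess", salt, rounds, stored)
```

Parameters such as the iteration count would need to follow current guidance and be revisited as hardware improves, consistent with the idea above that what is “reasonable” must evolve over time.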

When a company signs the Pledge, it publicly commits to the Pledge’s responsible and appropriate standards for student privacy and data security, and the Pledge allows the public – the media, parents, educators, and federal regulators – to hold these companies accountable. It is exactly this sort of public scrutiny that makes the Pledge an effective means of ensuring data accountability. This accountability requires that all stakeholders understand the Pledge’s security standard, enforceability, and other elements.

-FPF and SIIA

White House Return to Big Data Focuses on Price Discrimination

Today, the White House released an interim progress report detailing the Administration’s efforts on privacy in big data since its landmark report last spring. The update highlights the President’s recent calls for new privacy legislation, including efforts on student privacy and the Consumer Privacy Bill of Rights, and also calls for deeper understanding of differential pricing — or what is commonly called price discrimination. The White House Council of Economic Advisors released a companion report, exploring how companies can use the information they collect to more effectively charge different prices to different customers.

The nineteen-page report notes that industry is already using big data for targeted marketing and beginning to experiment with personalized pricing. According to FPF Senior Fellow Peter Swire, the report is “a readable discussion of price discrimination from an economist’s perspective. In a non-ideological way, it explains the terminology used by professional economists.”

Much of the report summarizes existing concerns about the use of big data and price discrimination in general. Most economists, the report notes, consider price discrimination in the context of differences among consumers in their willingness to pay for a good. While the report focuses on this sort of value-based pricing, it also notes the need for further discussion about the impact of big data on “risk-based” pricing, where sellers charge prices based on differences in the cost of serving different groups. Without much elaboration, the report cautions that “[b]ig data encourages risk-based pricing by enabling more fine-grained measurement of various risks,” citing the ability to track individual driving behaviors as a potential example.

As for the rise of value-based pricing, the report concedes that “current knowledge is mainly anecdotal.” It suggests that companies are either “moving slowly or remaining quiet, perhaps due to fears that consumers will respond negatively, but also because the methods are still being developed.”

It concludes that many concerns about price discrimination could be addressed through existing antidiscrimination, privacy, and consumer protection laws, and it recommends that companies provide more transparency about industry data practices. Swire further explained that two quotations distill the report’s key takeaways:

(1) “The challenge for policy in this area will be to promote the application of big data where it can discourage excessive risk-taking and help solve adverse selection problems, while preventing unfair discrimination against consumers who have little control over newly-measureable risk factors.”

(2) “However, given the speed at which both the technology and business practices are evolving, commercial applications of big data deserve ongoing scrutiny, particularly where companies may be using sensitive information in ways that are not transparent to users and fall outside the boundaries of existing regulatory frameworks.”

While further conversation about the potentially negative impacts of big data is warranted, the report takes a sanguine view of concerns about price discrimination. As our digital footprints grow, the report states, broad trends suggest price discrimination is not yet having a negative impact on online consumer activities; instead, consumers “are making use of the Internet and the variety of choices and tools it provides to ensure that they get a good deal.”

-Joseph Jerome, Policy Counsel

Student Privacy Pledge Crosses Milestone with 100 Signatories

Media Contacts:

FPF: Nicholas Graham, (571) 291-2967, [email protected]

SIIA: Sabrina Eyob, (202) 789-4480, [email protected]

PR Agency: Farrah Kim, (202) 568-8986, [email protected]

STUDENT PRIVACY PLEDGE CROSSES MILESTONE WITH 100 SIGNATORIES

Responsible Privacy Practices Affirmed by Growing Number of Ed-Tech Companies

WASHINGTON, D.C. – Wednesday, February 4, 2015 – The Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) today announced that the groundbreaking Student Privacy Pledge now has 108 signatories. The Pledge is a list of 12 commitments that school service providers have made to affirm that K-12 student information is kept private and secure.

The Pledge was launched in October 2014 with 14 signatory companies, grew to 75 by early January, and has now reached a milestone – surpassing 100 signatories. The recent increase was fueled in part by President Obama’s strong support of the Pledge, announced on January 12th as part of a suite of policy proposals designed to further student privacy.


Unlike proposed legislative or regulatory actions, which may not go into effect for some period of time, the Pledge is binding and enforceable as soon as each company signs it. Signatory companies are listed online at www.studentprivacypledge.org.


“Passing 100 signatories to the Student Privacy Pledge is a clear affirmation of the industry’s commitment to the responsible use of student data,” said Jules Polonetsky, executive director, FPF. “We are grateful to the President for championing the Pledge, and we applaud the companies on the Pledge for their leadership on this issue.”


“The Pledge has strong momentum, with more than 100 high-tech companies signing to articulate their safeguarding of student information,” said Mark Schneiderman, SIIA’s senior director of education policy. “Along with existing laws and school agreements, the Pledge is part of a strong legal framework that ensures teachers and students can feel safe about technology use in school.”


In addition to the Pledge, SIIA and FPF continue other student privacy leadership efforts. On February 17-18 in Washington, D.C., FPF – in partnership with ReThink Education and with participation from SIIA – is organizing and hosting its first-ever Student Privacy Boot Camp for start-ups, small, and medium-sized ed-tech companies. Similarly, SIIA has provided an analysis of existing and new student privacy laws for its member companies. These and related programs help ensure vendors handling student data understand and comply with privacy laws and best practices.


The Student Privacy Pledge outlines a dozen commitments regarding the responsible collection, maintenance, and use of student personal information. The Pledge was developed by FPF and SIIA with guidance from school service providers, educator organizations, and other stakeholders following a convening by U.S. Representatives Jared Polis (CO) and Luke Messer (IN). The Pledge has also been endorsed by the National PTA and the National School Boards Association, among others.

The full text of the Pledge, more information about how to support it, and a list of current signatories are available at studentprivacypledge.org.

About FPF

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board composed of leading figures from industry, academia, law, and advocacy groups.  For more information, visit fpf.org.

About SIIA

SIIA is the leading association representing the software and digital content industries. SIIA represents approximately 800 member companies worldwide that develop software and digital information content. SIIA provides global services in government relations, business development, corporate education and intellectual property protection to the leading companies that are setting the pace for the digital age. For more information, visit www.siia.net.