Future of Privacy Forum Comment Regarding Senator Bill Nelson's Report, “Children's Connected Toys: Data Security and Privacy Concerns”
Today, Senator Nelson’s office released a report outlining several privacy and security implications of “connected toys,” based on the office’s conversations with six major toy manufacturers. The report emphasizes the unique sensitivity of children’s personal information; urges toymakers to build privacy and security into their toys from the outset; and suggests that the FTC has authority to monitor the market and bring enforcement actions under Section 5 of the FTC Act and the Children’s Online Privacy Protection Act (COPPA).
“Connected toys can help entertain and educate kids,” said Stacey Gray, Policy Counsel at the Future of Privacy Forum. “But, as Senator Nelson makes clear, companies cannot play around with children’s data. If toymakers run afoul of the strong requirements of COPPA, the monetary penalties can be financially devastating. Leading companies are building trust by providing enhanced disclosures and implementing strong security standards – others should follow suit. I commend Senator Nelson for pressing this important issue.”
Two weeks ago, FPF released a white paper, Kids & the Connected Home: Privacy in the Age of Connected Dolls, Talking Dinosaurs, and Battling Robots, detailing the privacy and security implications of the diverse range of “smart toys” and “connected toys” available today. The paper provides a thorough legal analysis of how COPPA applies to connected toys. Further, FPF urges companies to provide enhanced disclosures regarding their toys. For example, although not required by COPPA, companies can provide notices on toy packaging that make it easy for parents to understand at the point of sale whether they will be asked to consent to the toy’s collection of their child’s information. Finally, the paper details a number of important security steps that leading toy manufacturers are taking; Senator Nelson’s report mentions several of these steps, for example, implementing strong security standards (HTTPS / SSL) to prevent toys from communicating with unauthorized devices or servers.
The future for connected toys is promising. Toymakers that follow leading privacy and security best practices, including those described in Kids & the Connected Home and Senator Nelson’s report, will mitigate financial risks under COPPA and support a thriving connected toy marketplace.
December 14: Lorrie Cranor with FPF Capital Area Academic Network and Consumer Business Dialogue
FPF’s Capital Area Academic Network and Consumer Business Dialogue invites you to join us for a discussion with:
FTC Chief Technologist Lorrie Faith Cranor
During this joint meeting of the FPF Capital Area Academic Network and Consumer Business Dialogue, Lorrie Faith Cranor will discuss her role as FTC Chief Technologist, and her academic research and policy development priorities.
Lorrie Faith Cranor joined the US Federal Trade Commission as Chief Technologist in January 2016. She is on leave from Carnegie Mellon University, where she is a Professor of Computer Science and of Engineering and Public Policy, Director of the CyLab Usable Privacy and Security Laboratory (CUPS), and Co-director of the MSIT-Privacy Engineering master’s program. She also co-founded Wombat Security Technologies, an information security awareness training company. Cranor has authored over 150 research papers on online privacy and usable security, and has played a central role in establishing the usable privacy and security research community, including founding the Symposium on Usable Privacy and Security. She was previously a researcher at AT&T Labs-Research. Cranor holds a doctorate in Engineering and Policy from Washington University in St. Louis. She is a Fellow of the ACM and IEEE.
* * * *
Lunch will be served
* * * *
If you are unable to join us in person, you may join via dial-in. Please just select “RSVP-Dial-in” under the registration link.
New Survey Finds Parents Support School Tech and Data, But Want Privacy Assurances
FOR IMMEDIATE RELEASE
December 8, 2016
Contact: Melanie Bates, Director of Communications, [email protected]
New Survey Finds Parents Support School Tech and Data, But Want Privacy Assurances
Washington, DC – Today, the Future of Privacy Forum (FPF) released a new survey, Beyond One Classroom: Parental Support for Technology and Data Use in Schools. The survey asked parents in depth about their goals and concerns regarding the use of technology and student data in schools. Their answers, and the conclusions that can be drawn from them, should inform the debate over local, state, and national policies concerning K-12 education and data use.
Beyond One Classroom follows FPF’s 2015 survey, which showed that parents were generally aware of and understood the technology used in their children’s schools, but lacked knowledge of many of the specific laws and practices that provide guidelines and important protections for children’s information.
“Parents are the strongest advocates for their children’s educational success, and all other stakeholders in the educational system should embrace the opportunity to communicate and work with parents as partners in addressing these issues,” said Amelia Vance, FPF Policy Counsel.
The survey found that rates of technology use in schools, both by students and by parents, went up by 20% since 2015 (see graph below). Not only are students using more technology provided by schools, but more parents are using school-related technology to keep up with their child’s educational progress and to communicate with the school.
The key findings of Beyond One Classroom indicate that the closer the use of data is to individual classrooms and to the parent’s child, the more strongly parents support, and desire, the benefits of student data collection and use. According to most parents, the most convincing reasons to use individual student information are to:
Identify students who are struggling so that schools can provide appropriate support earlier (85%);
Personalize the learning process by identifying the strengths and weaknesses of individual students (82%); and
Help schools build profiles on individual students, such as those used to predict best fits for future vocations or professions (57%).
The results indicate that even as data use becomes less directly tied to individual students, parents still want to understand the benefit to the classroom. Moreover, parents support research that can be used in a school or classroom to directly benefit students.
“Communicating and demonstrating these additional benefits to parents is key to establishing and maintaining trust in an ongoing relationship between parents, their communities, and the schools and vendors that serve them,” Vance said.
The findings also illustrated that parents may be seeing the value school districts gain from the use of traditionally “sensitive” information. Support for the collection and use of parents’ marital status, family income, and social security numbers all increased significantly:
parental marital status — 45%, up 8%,
family income — 37%, up 10%, and
Social Security numbers — 35%, up 11%.
Over half of parents of school-age children now agree that race and ethnicity are appropriate data for schools to collect and use.
“The use of this type of data, appropriately controlled and protected, is critical for research that identifies potentially discriminatory policies and practices, and it is heartening to see that parents appreciate the value that this data can provide when it is used responsibly,” Vance said.
“Overall, 2016 showed the increasing prevalence of technology use by both parents and students, increasing levels of support by parents of the appropriate collection and use of data by schools, and continued strong belief in the possibilities of technology to improve their child’s educational opportunities,” said Brenda Leong, FPF’s Senior Counsel and Director of Operations. “The goals for educators, advocates, and policymakers remain to communicate policies clearly; establish transparent practices; and work with parents as key partners in the educational system to achieve the best learning outcomes for our children.”
Beyond One Classroom was produced with funding provided by the Bill & Melinda Gates Foundation.
###
The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.
Parents Support School Tech and Data, But Want Privacy Assurances: FPF 2016 Parent Survey
In 2015, the Future of Privacy Forum (FPF) set out to gain a better understanding of what public school parents actually know and want concerning the use of technology and collection of data in their children’s schools, as well as their perspectives on the benefits and risks of student data use within the educational system. Media reports routinely quote parents who are afraid or reluctant to support the use of technology, electronic education records, and student data within their own schools or throughout the educational process. The original survey sought to understand the views of parents, the critical stakeholders in the education policy discussion. Results showed that parents were generally aware of and understood the technology used in their children’s schools, but lacked knowledge of many of the specific laws and practices that provide guidelines and important protections for children’s information.
Since we reached those conclusions, the public conversation on this topic has barely slowed. Therefore, this year we returned to parents to find out – has their understanding grown? Have their concerns changed? And we had new questions to ask, as technology is used in ever-expanding ways, and the effects of those newly passed laws begin to be felt.
Today, we are releasing our 2016 survey, Beyond One Classroom: Parental Support for Technology and Data Use in Schools. Largely, our findings parallel those from last year. Unsurprisingly, it remains true that the closer the use of data is to the individual classroom and to their own child, the more strongly parents support, and desire, the benefits of student data collection and use. As data use becomes less directly tied to individual students, parents still want to understand the benefit to the classroom, and they continue to support research that can be used in a school or classroom to directly benefit students.
What changed from last year to now? Technology use is spreading, fast. Almost eighty percent of parents are now using school-related technology to keep up with their child’s educational progress, and ninety percent of children are using technology provided or recommended by their school.
In addition, parents increasingly see the value school districts gain from the use of a variety of personal data, with growing percentages saying that, in addition to categories like grades and attendance, it is appropriate for schools to use data concerning disciplinary records and participation in school lunch programs. Even more noteworthy, parents may be seeing the value of broader research based on analysis that necessarily includes traditionally “sensitive” information. Support for the collection and use of parents’ marital status, family income, and Social Security numbers all increased significantly; perhaps most importantly, over half of parents of school-age children now agree that race and ethnicity are appropriate data for schools to collect and use. The use of this type of data, appropriately controlled and protected, is critical for research that identifies potentially discriminatory policies and practices, and it is heartening to see that parents appreciate the value this data can provide when it is used responsibly.
One new finding was that nearly all parents of school-age children believe they should be informed with whom, and for what purpose, their child’s record is being shared. In addition to schools telling parents when the record is shared for educational purposes, we also asked about a growing trend in state laws that may limit parents’ rights: while schools have the ability to share the educational record with partner vendors for core educational functions, parents may want to preserve their ability to authorize the release of their child’s electronic record to external third parties, for example for tutoring programs, non-school-sponsored educational clubs or activities, or financial aid and advanced educational programs.
However, many state laws are being written that either prohibit this parental control altogether or narrowly limit it to colleges and prospective employers. Those leaving the school system for their next opportunity are not the only ones who may wish to use their own record for expanded purposes, and parents overall want this ability. Some want it via “opt-in,” others prefer “opt-out” or by direct request only, but fewer than half agree that it should be limited to colleges and employers only, or prohibited altogether. Policymakers should take note and, rather than legislating this limitation, leave it to schools and their communities to make this decision for themselves. As FPF’s Jules Polonetsky and Brenda Leong wrote in a recent article on this topic, “Parents, as those most in-tune with their individual child’s needs, have the right to be an active partner and make the final decision about additional sharing and use of their child’s information.”
An important area that remains a prime target for better communication and awareness is helping parents understand current laws and practices that protect student data. Slightly fewer parents than last year felt confident that they know what federal laws currently protect student data, or what those laws require. This is such a clear issue that advocates and educators at all levels should focus part of their future outreach on making parents aware of these existing requirements. The FPF Parent Guide To Student Privacy, written in cooperation with the National PTA and ConnectSafely, can be a great start in providing parents with that information. The Foundation for Excellence in Education recently released a “Student Data Privacy Communications Toolkit” that provides districts and states with templates for webpage content, letters to send home to parents, and many other key ways to communicate.
Overall, our 2016 survey showed the increasing prevalence of technology use by both parents and students, growing parental support for the appropriate collection and use of data by schools, and continued strong belief in the potential of technology to improve children’s educational opportunities. The goals for educators, advocates, and policymakers remain to communicate policies clearly; establish transparent practices; and work with parents as key partners in the educational system to achieve the best learning outcomes for our children.
Beyond One Classroom was produced with funding provided by the Bill & Melinda Gates Foundation.
Brenda Leong is senior counsel and director of operations at the Future of Privacy Forum. Amelia Vance is policy counsel for education privacy at the Future of Privacy Forum.
Uber and Location Permission
Uber recently announced that its iOS app will require access to location data either “Always” or “Never.” Given some of the confusion about the change, we are writing to help consumers better understand what Uber modified and why.
For context, it is important to understand how smartphone location permissions work. Until 2014, on the two major smartphone platforms—iOS and Android—granting the location permission to an app meant that app had access to that user’s location at all times, whether or not the app was open.
In 2014, with the rollout of iOS 8, Apple created new location permission options, which allowed iOS app developers to offer users the option to permit location access “While Using the App” in addition to the preexisting options of “Never” or “Always.”
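For developers, these options map onto two distinct authorization requests in Apple’s CoreLocation framework. The sketch below, in Swift, shows the general pattern for requesting each level; the class name and the prompt flow are illustrative assumptions, not Uber’s actual code.

```swift
import CoreLocation

// Minimal sketch (illustrative, not any particular app's code) of how an
// iOS app requests the location permission levels described above.
final class LocationPermissionExample: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // "While Using the App": location is delivered only while the app is in
    // the foreground. Requires NSLocationWhenInUseUsageDescription in Info.plist.
    func requestWhileUsingAccess() {
        manager.requestWhenInUseAuthorization()
    }

    // "Always": location can also be delivered while the app is in the
    // background. Requires the corresponding "Always" usage-description key.
    func requestAlwaysAccess() {
        manager.requestAlwaysAuthorization()
    }

    // The user's choice (including "Never", i.e. denied) arrives via the delegate.
    func locationManager(_ manager: CLLocationManager,
                         didChangeAuthorization status: CLAuthorizationStatus) {
        switch status {
        case .authorizedAlways:
            print("Location available even when the app is not open")
        case .authorizedWhenInUse:
            print("Location available only while the app is in the foreground")
        case .denied, .restricted:
            print("No location access")
        case .notDetermined:
            print("No decision yet")
        @unknown default:
            break
        }
    }
}
```

Whichever level an app requests, the user’s choice is recorded in the system Settings and can be changed there at any time.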
Under iOS 8 and until recently, Uber provided users with all three options. When users selected the “While Using the App” option, Uber only received location data when the app was actually open and in the foreground of a user’s phone – i.e. when a user was actively calling an Uber or checking the location of the vehicle on the map.
As a result, key information about how far riders were from the vehicle they were seeking to meet was not available to the company. The only location information available was the pin denoting the location the rider chose originally when hailing the Uber. If the rider minimized the app or switched to a different app and then exited their building around the corner from the main door, or if they were down the block or across the street from the pin’s location, accurate location information was not available to help the Uber driver find them.
Similarly, at drop-off, for riders using the “While Using the App” option, Uber did not receive information about whether the user was dropped off at the intended location; any route information came solely from the driver’s phone. For example, if the map leads the driver to a spot where riders must cross a dangerous street and walk around the building to reach the entrance, or must wave down the driver to be picked up again because they were not taken to the intended location, that information is today usually not available to Uber; this lack of data limits Uber’s ability to correct the error for future passengers.
Because of these shortcomings, Uber is now asking users to allow the company to access location information when riders may not have the app in the foreground of their phones, specifically from the time of hailing the Uber until 5 minutes after drop-off. Although doing so technically grants Uber access to location “Always,” the company has committed to accessing location only for specified purposes: to improve pickups, drop-offs, and customer service, and to enhance safety.
Some critics have objected to Uber’s change, focusing on the fact that the company will be able to collect user location for 5 minutes after drop-off. Some Uber users may want to keep where they are heading after drop-off confidential, or may not want that information stored in a database. Perhaps so, but those users would be far wiser to turn off location services for their phone entirely, given the number of apps that could be collecting their location information. Riders can still disable location when exiting the car, or they can choose to “Never” share their location with Uber and instead type in their pickup location and destination.
Location information is often sensitive and users should think carefully before allowing apps to access location data. Many apps don’t need location data and request it simply to share it with advertising companies or other third parties. But here, the convenience and service improvements intended by this change promise real value for riders.
What apps are getting your location, and are they collecting it “Always” or only “While Using the App”? On iOS, check Settings/Privacy/Location Services to see this information and to revoke permission from any app that doesn’t have a good reason to be collecting your location.
Kids & The Connected Home: Privacy in the Age of Connected Dolls, Talking Dinosaurs, and Battling Robots
FOR IMMEDIATE RELEASE
December 1, 2016
Contact:
Melanie Bates, Future of Privacy Forum, 202-596-9837, [email protected]
Emma Morris, Family Online Safety Institute, 202-775-0158, [email protected]
Kids & The Connected Home: Privacy in the Age of
Connected Dolls, Talking Dinosaurs, and Battling Robots
Connected toys can raise privacy concerns, particularly when they use children’s personal information
The Children’s Online Privacy Protection Act (COPPA) applies to many connected toys, helping to ensure that parents are in control of their kids’ data
COPPA does not apply to general connected home devices that serve families
Leading companies should go beyond legal requirements to build trust in toys that can connect to online services
The connected toy marketplace is still young, and many toymakers could take additional steps to make their data practices clear and to better secure children’s data
FPF and FOSI understand that connected toys are creating opportunities for interactive play and education, but also creating new privacy and security challenges. Toys that can become a child’s closest friend, play games, and provide advice through the use of sophisticated cloud-based computing and personal information are raising questions about how to ensure families can make appropriate choices about how data is collected and used.
“At FPF, we recognize the benefits that connected home technologies can provide to individuals, families, and kids,” said Jules Polonetsky, FPF’s CEO. “We also know that privacy issues can make or break adoption of connected home tech – particularly questions about whether kids’ privacy and security are sufficiently safeguarded. Children are playing with dolls that listen and talk, interactive animals, and apps that link toys to digital services. As connected toys become more popular, it is important for toymakers to be transparent about their data practices and to mitigate security risks. Federal law provides key safeguards, but more can be done to build trust.”
“The new world of connected toys offers an extraordinary range of opportunities for learning, exploring and just plain fun,” said Stephen Balkam, FOSI’s Founder and CEO. “However, data is the difference between these ‘smart’ toys and traditional ones. Parents need to be aware of how a toy collects, shares, and stores their child’s information. Industry must ensure the safety and security of that data and find innovative and effective ways to inform parents of how their child’s information is being used.”
Kids & The Connected Home describes the current landscape of connected toys, identifying what distinguishes them from conventional toys and other smart toys. The white paper analyzes existing regulations under COPPA that have established important safeguards for information collected from children, and how those regulations apply. Stacey Gray, FPF Policy Counsel, points out that shopping for connected toys often happens in retail stores, where COPPA does not require a privacy disclosure.
“Parents should be able to understand at the point of sale—before bringing it home to their child—whether or not they will later be asked to consent to the toy’s collection of their child’s personal information,” Gray said. “A full privacy policy on the box is not likely to be helpful, but some sort of cue will help parents decide before purchasing whether they are comfortable with the toy or whether they would like to do more research.” As a prime example, Kids & The Connected Home cites a packaging label notice on Fisher-Price’s connected toy, Smart Toy.
The report also provides several leading privacy and security practices that can help companies build trust, such as: 1) Determine when local processing, remote processing, and third-party sharing is appropriate, and mitigate security risks for the selected approach to data processing; 2) Ensure that strong encryption standards prevent the toy from communicating with unauthorized devices or servers; and 3) Do not use passwords that cannot be changed by users, and do not share a single default password between toys.
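To make the second of those practices concrete, here is a minimal sketch, in Swift, of one common way a companion app (or a device with a similar TLS stack) can refuse to communicate with servers other than the one it expects: pinning the server’s certificate. The host, file name, and setup are hypothetical; this illustrates the general technique rather than any toymaker’s implementation.

```swift
import Foundation
import Security

// Minimal certificate-pinning sketch (hypothetical names throughout).
// The app ships with a copy of the expected server certificate and refuses
// any TLS connection whose leaf certificate does not match it.
final class PinnedSessionDelegate: NSObject, URLSessionDelegate {

    // DER-encoded certificate bundled with the app; "toy-server.der" is a
    // hypothetical resource name used for illustration.
    private let pinnedCertificate: Data? = {
        guard let url = Bundle.main.url(forResource: "toy-server", withExtension: "der") else { return nil }
        return try? Data(contentsOf: url)
    }()

    func urlSession(_ session: URLSession,
                    didReceive challenge: URLAuthenticationChallenge,
                    completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
        // Only interpose on server-trust (TLS) challenges.
        guard challenge.protectionSpace.authenticationMethod == NSURLAuthenticationMethodServerTrust,
              let trust = challenge.protectionSpace.serverTrust else {
            completionHandler(.performDefaultHandling, nil)
            return
        }
        // Compare the server's leaf certificate against the pinned copy.
        // (A production implementation would also evaluate the full chain.)
        if let serverCert = SecTrustGetCertificateAtIndex(trust, 0),
           let pinned = pinnedCertificate,
           SecCertificateCopyData(serverCert) as Data == pinned {
            completionHandler(.useCredential, URLCredential(trust: trust))
        } else {
            // Unknown or unauthorized server: refuse to talk to it at all.
            completionHandler(.cancelAuthenticationChallenge, nil)
        }
    }
}

// Usage sketch: every request through this session requires HTTPS to the pinned server.
let session = URLSession(configuration: .default,
                         delegate: PinnedSessionDelegate(),
                         delegateQueue: nil)
let url = URL(string: "https://api.example-toy.com/status")!   // hypothetical endpoint
session.dataTask(with: url) { _, _, error in
    if let error = error {
        print("Connection refused or failed: \(error)")
    }
}.resume()
```

The point of the pattern is simply that the connection fails closed: when the server cannot prove it is the expected one, no data is exchanged at all.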
On July 20, 2016, FPF, FOSI, and Christian Science Monitor Passcode hosted Kids & the Connected Home in Washington, DC. This event featured discussion by a diverse group of industry experts about kids, connected toys and devices, and privacy. In Kids & The Connected Home, FPF & FOSI discuss and expand upon the issues raised at that event, which concerned the emergence of connected toys and their social and legal implications. Throughout, the report addresses key questions that animate the discussion around children and the connected home:
Connected Toys. Does COPPA apply to connected toys? And does the screen-less nature of many connected toys suggest that an update to COPPA may be required to adequately address privacy concerns?
Connected Homes. Does COPPA apply to general connected home devices that serve families?
Parental Controls. Do parents have appropriate controls and information to make well-informed decisions regarding their children’s interactions with the connected home and toys? If not, how can this be addressed?
Data Security. How do we ensure that connected toys are sufficiently secure?
“Trust is a crucial precondition for widespread adoption of connected toys,” said John Verdi, FPF’s VP of Policy. “Parents must be satisfied that the digital products they invite into their homes will safeguard children’s privacy and keep information secure.”
###
The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.
The Family Online Safety Institute (FOSI) is an international, non-profit organization which works to make the online world safer for kids and their families. FOSI convenes leaders in industry, government and the non-profit sectors to collaborate and innovate new solutions and policies in the field of online safety. Through research, resources, events and special projects, FOSI promotes a culture of responsibility online and encourages a sense of digital citizenship for all. Learn more about FOSI by visiting www.fosi.org.
FPF Comments on NHTSA’s Federal Automated Vehicles Policy
Today, the Future of Privacy Forum (FPF) submitted comments regarding the Department of Transportation’s National Highway Traffic Safety Administration Request for Comment on the Federal Automated Vehicles Policy guidance published in the Federal Register on September 23, 2016.
FPF commended NHTSA for its forward-looking Federal Automated Vehicles Policy guidance and its acknowledgement that privacy will play a key role in promoting trust in connected vehicles. FPF believes that the Guidance, with its emphasis on privacy, is an important first step in building that trust.
Automated vehicle technologies hold tremendous potential to transform the safety and convenience of the vehicles in which we ride. According to NHTSA’s research, a full 94 percent of the 35,092 fatalities in U.S. motor vehicle accidents last year could be attributed to human error. NHTSA is right to recognize that evolving technologies can reduce the number of accidents on our roads and can also increase mobility for the elderly and Americans with disabilities who may be constrained from driving altogether. FPF applauds NHTSA for releasing guidance that enables these technologies to enter the market while retaining the flexibility necessary for them to evolve and improve along the way.
Some safety technologies under development may hinge on the ability of cars to detect and understand what is around them better than a human driver. In addition, decisions that were previously manual or mechanized may now be algorithmic, relying on data inputs collected from each of the many new kinds of sensors and computing being built into vehicles.
As we welcome these new technologies, it is critical that at the front-end of the connected car revolution, we build responsible data practices into connected cars—just as we have in other new and unfamiliar technologies that have disrupted other sectors. Being optimistic about the benefits of new data uses does not mean we need to be naive about the risks. As highly automated vehicles develop and as we better understand the nature of the data and what is needed for these vehicles to operate, we also need to be sensitive to the privacy concerns that develop.
But it is nearly impossible to anticipate today the full range of privacy questions and concerns that will arise, given the diversity of technologies, uses, and models currently being considered, and those we cannot yet imagine. This is especially true as these new technologies begin to transform the relationship of consumers to vehicles altogether, such as through fleet-based and other models.
As these policies advance, it will be critical to ensure alignment among federal, state, and self-regulatory guidance for the automated vehicle ecosystem. Consistency among federal, state, and self-regulatory regimes in this space is critical, given that automotive companies design systems at a national and global level. Patchwork legislation could impede interoperability or render vehicles incapable of driving across state lines. Instead, NHTSA should encourage states to follow its example by issuing guidance that can be easily updated in light of rapidly evolving technology, rather than adopting this guidance as law at this time.
Privacy Papers 2016: Spotlight on the Winning Authors
Today, FPF announced the winners of the 7th Annual Privacy Papers for Policymakers (PPPM) Award. This Award recognizes leading privacy scholarship that is relevant to policymakers in the United States Congress, at U.S. federal agencies, and for data protection authorities abroad.
From a record number of nominated privacy-related papers published in the last year, five were selected by Finalist Judges, after having been first evaluated highly by a diverse team of academics, advocates, and industry privacy professionals from FPF’s Advisory Board. Finalist Judges and Reviewers agreed that these papers demonstrate a thoughtful analysis of emerging issues and propose new means of analysis that can lead to real-world policy impact, making them “must-read” privacy scholarship for policymakers.
by Jennifer Daskal, Associate Professor, American University Washington College of Law
Jennifer Daskal is an Associate Professor of Law at American University Washington College of Law. She teaches and writes in the fields of criminal law, national security law, and constitutional law. From 2009 to 2011, Daskal was counsel to the Assistant Attorney General for National Security at the Department of Justice and, among other things, served on the Secretary of Defense and Attorney General-led Detention Policy Task Force. Prior to joining DOJ, she was the senior counterterrorism counsel at Human Rights Watch, worked as a staff attorney for the Public Defender Service for the District of Columbia, and clerked for the Honorable Jed S. Rakoff. Before joining WCL’s faculty, she spent two years as a national security law fellow and adjunct professor at Georgetown Law Center.
Daskal is a graduate of Brown University, Harvard Law School, and Cambridge University, where she was a Marshall Scholar. Recent publications include The Un-Territoriality of Data, 125 Yale L.J. 326 (2015); Pre-Crime Restraints: The Explosion of Targeted, Non-Custodial Prevention, 99 Cornell L. Rev. 327 (2014); After the AUMF, 5 Harvard Nat’l Sec. L. J. 115 (2014) (co-authored with Steve Vladeck); and The Geography of the Battlefield: A Framework for Detention and Targeting Outside the ‘Hot’ Conflict Zone, 161 U. Pa. L. Rev. 1165 (2013). Daskal has published op-eds in the New York Times, Washington Post, International Herald Tribune, L.A. Times, and Salon.com, and she has appeared on BBC, C-Span, CNN, MSNBC, and NPR, among other media outlets. She is an Executive Editor of and regular contributor to the Just Security blog.
Accountable Algorithms
by Joshua A. Kroll, Engineer, Security Team, Cloudflare; Joanna Huey, Princeton University; Solon Barocas, Princeton University; Edward W. Felten, Princeton University; Joel R. Reidenberg, Stanley D. and Nikki Waxberg Chair in Law, Fordham University School of Law; David G. Robinson, Upturn; and Harlan Yu, Upturn
Joshua Kroll is an Engineer working on cryptography and Internet security at the web performance and security company Cloudflare. He is also an affiliate of the Center for Information Technology Policy at Princeton University, where he studies the relationship between computer systems and human governance of those systems, with a special focus on accountability. His previous work spans cryptography, software security, formal methods, Bitcoin, and cybersecurity policy. He holds a PhD in Computer Science from Princeton University, where he received the National Science Foundation Graduate Research Fellowship in 2011.
by Danielle Keats Citron, Professor of Law, University of Maryland Carey School of Law
Danielle Keats Citron is the Morton & Sophia Macht Professor of Law at the University of Maryland Francis King Carey School of Law. Her work focuses on information privacy, cyber law, automated systems, and civil rights. She received the 2005 “Teacher of the Year” award.
Professor Citron is the author of Hate Crimes in Cyberspace (Harvard University Press 2014). Cosmopolitan and Harper’s Bazaar nominated her book as one of the “Top 20 Best Moments for Women” in 2014; Boston University Law Review held an online symposium on her book in 2015. Her current work focuses on the privacy policymaking of state attorneys general. Professor Citron’s scholarship has appeared, or is forthcoming, in Boston University Law Review (twice), California Law Review, George Washington Law Review, Hastings Law Journal, Michigan Law Review (twice), Minnesota Law Review, Notre Dame Law Review, Southern California Law Review, Washington University Law Review, Washington Law Review (twice), Washington & Lee Law Review, U.C. Davis Law Review, and others. Her opinion pieces have been featured in The Atlantic, New York Times, TIME, CNN, Guardian UK, New Scientist, and Slate. She has appeared on National Public Radio, HBO’s John Oliver Show and the New York Times video series. She is a technology contributor at Forbes.com and a member of Concurring Opinions.
Privacy of Public Data
by Kirsten Martin, Assistant Professor of Strategic Management & Public Policy, George Washington University School of Business; and Helen Nissenbaum, Professor, Media, Culture, and Communication & Computer Science, New York University
Kirsten Martin is an assistant professor of strategic management & public policy at the George Washington University’s School of Business. She is the principal investigator on a three-year grant from the National Science Foundation to study online privacy. Martin is also a member of the advisory board of the Future of Privacy Forum and the Census Bureau’s National Advisory Committee for her work on privacy and the ethics of “big data.” Martin has published academic papers in Journal of Business Ethics, First Monday, Business and Professional Ethics Journal, and Ethics and Information Technology, and is co-author of the textbook Business Ethics: A Managerial Approach. She has written teaching cases for the Business Roundtable Institute for Corporate Ethics, including cases on Google in China and on bailouts and bonuses during the financial crisis. She is regularly asked to speak on privacy and the ethics of big data.
Martin earned her BS in engineering from the University of Michigan and her MBA and PhD from the University of Virginia’s Darden Graduate School of Business. Her research interests center on online privacy, corporate responsibility, and stakeholder theory.
Before beginning her academic career, Martin worked at Sprint Telecommunications developing corporate strategy and Internet solutions. She also provided information system consulting services with Andersen Consulting (now Accenture) to clients in the coal, pharmaceutical, telecommunications, and oil and gas industries.
Helen Nissenbaum is Professor of Media, Culture, and Communication, and Computer Science, at New York University, where she is also Director of the Information Law Institute. Her eight books include Obfuscation: A User’s Guide for Privacy and Protest, with Finn Brunton (MIT Press, 2015), Values at Play in Digital Games, with Mary Flanagan (MIT Press, 2014), and Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford, 2010). Her research has been published in journals of philosophy, politics, law, media studies, information studies, and computer science. Grants from the National Science Foundation, Air Force Office of Scientific Research, and the U.S. Department of Health and Human Services Office of the National Coordinator have supported her work on privacy, trust online, and security, as well as studies of values embodied in design, search engines, digital games, facial recognition technology, and health information systems.
Recipient of the 2014 Barwise Prize of the American Philosophical Association, Prof. Nissenbaum has contributed to privacy-enhancing software, including TrackMeNot (for protecting against profiling based on Web search) and AdNauseam (protecting against profiling based on ad clicks). Both are free and freely available.
Nissenbaum holds a Ph.D. in philosophy from Stanford University and a B.A. (Hons) from the University of the Witwatersrand. Before joining the faculty at NYU, she served as Associate Director of the Center for Human Values at Princeton University.
Risk and Anxiety: A Theory of Data Breach Harms
(Full paper available pending publication)
by Daniel Solove, Professor of Law, George Washington University Law School; and Danielle Citron, Professor of Law, University of Maryland Carey School of Law
Daniel J. Solove is the John Marshall Harlan Research Professor of Law at the George Washington University Law School. He is also the founder of TeachPrivacy, a company that provides privacy and data security training programs to businesses, schools, healthcare institutions, and other organizations. An internationally-known expert in privacy law, Solove has been interviewed and quoted by the media in several hundred articles and broadcasts, including the New York Times, Washington Post, Wall Street Journal, USA Today, Chicago Tribune, the Associated Press, ABC, CBS, NBC, CNN, and NPR.
He has written numerous books including Nothing to Hide: The False Tradeoff Between Privacy and Security (Yale 2011), Understanding Privacy (2008), The Future of Reputation: Gossip, Rumor, and Privacy on the Internet (Yale 2007), and The Digital Person: Technology and Privacy in the Information Age (NYU 2004). He has also written several textbooks including Information Privacy Law (Aspen, 5th ed. 2015), Privacy Law Fundamentals (IAPP, 3d ed. 2015), Privacy and the Media (Aspen, 2d ed. 2015), Privacy, Law Enforcement, and National Security (Aspen, 1st ed. 2015), Consumer Privacy and Data Protection (Aspen, 1st ed. 2015), and Privacy, Information, and Technology (Aspen Publishing, 3rd ed. 2012). All of these books were co-authored with Paul M. Schwartz.
Additionally, Professor Solove has written more than 50 law review articles in the Harvard Law Review, Yale Law Journal, Stanford Law Review, Columbia Law Review, NYU Law Review, Michigan Law Review, U. Pennsylvania Law Review, U. Chicago Law Review, California Law Review, Duke Law Journal, and many others. He has also written shorter works for Wired, Scientific American, the Washington Post, and several other magazines and periodicals.
The Finalist Judges also selected four papers for Honorable Mention on the basis of their uniformly strong reviews from the Advisory Board.
Biometric Cyberintelligence, by Professor Margaret Hu, Washington & Lee University School of Law
Ambiguity in Privacy Policies and the Impact of Regulation, by Professors Joel Reidenberg, Fordham University School of Law, Jaspreet Bhatia, Carnegie Mellon University, Travis Breaux, Carnegie Mellon University, and Thomas B. Norton, Fordham University
Data Driven Discrimination at Work, by Professor Pauline Kim, Washington University in Saint Louis School of Law
Friending the Privacy Regulators, by Professor William McGeveran, University of Minnesota Law School
The winning authors have been invited to join FPF and Honorary Co-Hosts Senator Edward J. Markey, Congressman Joe Barton, and Congresswoman Diana DeGette, to present their work at the U.S. Senate with policymakers, academics, and industry privacy professionals. This annual event will be held on January 11, 2017, the day before the Federal Trade Commission’s PrivacyCon. FPF will subsequently publish a printed digest of summaries of the winning papers for distribution to policymakers, privacy professionals, and the public. To RSVP, please visit privacypapers.eventbrite.com.
This Year's Five Must-Read Privacy Papers: The Future of Privacy Forum Announces Recipients of Annual Privacy Award
FOR IMMEDIATE RELEASE
November 16, 2016
Contact: Melanie Bates, Director of Communications, [email protected]
This Year’s Five Must-Read Privacy Papers: The Future of Privacy Forum Announces Recipients of Annual Privacy Award
Washington, DC – Today, the Future of Privacy Forum (FPF) announced the winners of the 7th Annual Privacy Papers for Policymakers (PPPM) Award. The PPPM Award recognizes leading privacy scholarship that is relevant to policymakers in the United States Congress, at U.S. federal agencies, and for data protection authorities abroad. The winners of the 2017 PPPM Award are:
Accountable Algorithms, by Joshua A. Kroll, Engineer, Security Team, Cloudflare; Joanna Huey, Princeton University; Solon Barocas, Princeton University; Edward W. Felten, Princeton University; Joel R. Reidenberg, Stanley D. and Nikki Waxberg Chair in Law, Fordham University School of Law; David G. Robinson, Upturn; and Harlan Yu, Upturn
Privacy of Public Data, by Kirsten Martin, Assistant Professor of Strategic Management & Public Policy, George Washington University School of Business; and Helen Nissenbaum, Professor, Media, Culture, and Communication & Computer Science, New York University
Risk and Anxiety: A Theory of Data Breach Harms, by Daniel Solove, Professor of Law, George Washington University Law School; and Danielle Citron, Professor of Law, University of Maryland Carey School of Law
From a record number of nominated privacy-related papers published in the last year, these five were selected by Finalist Judges, after having been first evaluated highly by a diverse team of academics, advocates, and industry privacy professionals from FPF’s Advisory Board. Finalist Judges and Reviewers agreed that these papers demonstrate a thoughtful analysis of emerging issues and propose new means of analysis that can lead to real-world policy impact, making them “must-read” privacy scholarship for policymakers.
The Finalist Judges also selected four papers for Honorable Mention: Biometric Cyberintelligence, by Professor Margaret Hu, Washington & Lee University School of Law; Ambiguity in Privacy Policies and the Impact of Regulation, by Professors Joel Reidenberg, Fordham University School of Law, Jaspreet Bhatia, Carnegie Mellon University, Travis Breaux, Carnegie Mellon University, and Thomas B. Norton, Fordham University; Data Driven Discrimination at Work, by Professor Pauline Kim, Washington University in Saint Louis School of Law; and Friending the Privacy Regulators, by Professor William McGeveran, University of Minnesota Law School.
“Policymakers are grappling with privacy issues that are more sophisticated than ever, and academic scholarship can provide a much-needed source of innovative thinking and new ideas,” said Jules Polonetsky, FPF’s CEO. “Through this Award, we aim to bring the very best of academic privacy scholarship to the people who are crafting real-world policy. These are the ‘must-reads’ for any well-informed policymaker who wants to make a difference in privacy.”
The winning authors have been invited to join FPF and Honorary Co-Hosts Senator Edward J. Markey and Congressional Bi-Partisan Privacy Caucus Co-Chairs Congressman Joe Barton and Congresswoman Diana DeGette to present their work at the U.S. Senate with policymakers, academics, and industry privacy professionals. This annual event will be held on January 11, 2017, the day before the Federal Trade Commission’s PrivacyCon. FPF will subsequently publish a printed digest of summaries of the winning papers for distribution to policymakers, privacy professionals, and the public.
PPPM is free, open to the general public, and widely attended. To RSVP, please visit privacypapers.eventbrite.com.
###
The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.