Looking Back at Smart Cities Week at FPF

If you’ve been in Washington, DC this week, you may have noticed a certain buzz in the air – and not just from the Wi-Fi-connected streetlights on Pennsylvania Avenue. It’s Smart Cities Week, and D.C. has been humming with urban leaders, leading companies, tech and civic innovators, open data gurus, and advocates and academics from around the globe. Throughout the week, these diverse stakeholders have come together to ensure we realize the full potential of smart cities.

Here at FPF, we kicked off the week by convening a roundtable addressing “Privacy in the Smart City: Searching for a Middle Ground.” In a room filled with urban and industry thought leaders, academics, and consumer advocates, we worked to tackle difficult questions about how we can secure the social benefits of connected, data-driven technologies in cities while also protecting individual privacy and autonomy.

Featuring speakers such as Claire Barrett, the Chief Privacy Officer of the U.S. Department of Transportation, Barney Krucoff, the Chief Data Officer for the District of Columbia, and Michael Mattmiller, the Chief Technology Officer for the City of Seattle, the roundtable strove to identify and advance responsible data practices in a hyperconnected world.

That same day, the City of New York and more than 20 partner cities around the U.S. announced new Guidelines for the Internet of Things (IoT), which will provide a framework “to help government and our partners responsibly deploy connected devices and IoT technologies in a coordinated and consistent manner.” The first guideline stresses Privacy + Transparency and calls on cities to “protect and respect the privacy of residents and visitors” and to be “open and transparent about the ‘who, what, where, when, why and how’ of data collection, transmission, processing and use.” Other key commitments focus on Data Management, to ensure data quality and accessibility; Infrastructure, to deploy and use IoT in ways that maximize public benefit; Security, to ensure public IoT systems are protected and resilient; and Operations + Sustainability, to deploy IoT in ways that ensure their financial, operational, and environmental sustainability.

Smart Cities were also the focus of key federal initiatives this week. First, the White House announced over $80 million in new federal investments in the Smart Cities Initiative, and doubled the number of participating cities and communities to more than 70. The goal of the Smart Cities Initiative is “to make it easier for cities, Federal agencies, universities, and the private sector to work together to research, develop, deploy, and testbed new technologies that can help make our cities more inhabitable, cleaner, and more equitable.”

Later in the week, the White House also hosted its inaugural “Open Data Innovation Summit,” which sought to celebrate the growing number of data-sharing initiatives and to ensure that the momentum for open data will continue. With federal databases leading the way, the day highlighted how open government data can spark innovation, drive inclusivity, and improve communities globally.

Of course, one of the greatest risks of opening government datasets to the public is the possibility that individuals may be re-identified from those datasets, compromising citizens’ privacy. Given this key concern, it is especially timely that the National Institute of Standards and Technology (NIST) has released a draft of its special publication on “De-Identifying Government Datasets” for comment (to comment, see here). This guidance, authored by Simson Garfinkel, should be a valuable tool for every government agency.
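To make the re-identification risk concrete, consider how quasi-identifiers – fields like ZIP code, age, and sex that are harmless on their own – can single out a person when combined. The short Python sketch below is purely illustrative (it is not drawn from the NIST draft): it counts how many records share each quasi-identifier combination before and after generalizing the fields, a simple k-anonymity-style check. A minimum group size of 1 means at least one record could plausibly be tied back to an individual.

from collections import Counter

# Toy records with quasi-identifiers (zip, age, sex) plus a sensitive field.
records = [
    {"zip": "20001", "age": 34, "sex": "F", "diagnosis": "flu"},
    {"zip": "20009", "age": 38, "sex": "F", "diagnosis": "asthma"},
    {"zip": "20002", "age": 36, "sex": "F", "diagnosis": "diabetes"},
]

def quasi_id(rec, generalize=False):
    """Return the quasi-identifier tuple, optionally generalized."""
    zip_code = rec["zip"][:3] + "**" if generalize else rec["zip"]     # 5-digit ZIP -> 3-digit prefix
    age = f"{rec['age'] // 10 * 10}s" if generalize else str(rec["age"])  # exact age -> decade band
    return (zip_code, age, rec["sex"])

def min_group_size(recs, generalize=False):
    """Smallest number of records sharing any quasi-identifier combination ('k')."""
    counts = Counter(quasi_id(r, generalize) for r in recs)
    return min(counts.values())

print("k without generalization:", min_group_size(records))        # 1 -> every record is unique
print("k with generalization:   ", min_group_size(records, True))  # 3 -> records blend together

Real de-identification involves far more than this, and the NIST draft surveys a range of techniques and governance considerations, but even a simple check like this illustrates why agencies assess disclosure risk before publishing data.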

Throughout the week, these new frameworks and conversations have shed light on several important aspects of the smart city landscape. And while it may not always be easy to balance the risks and opportunities of new civic technologies, or the competing goals of open government and individual privacy, this week shows that we’re off to a good start.

For more information on FPF’s Smart Cities working group, contact Kelsey Finch at [email protected].

October 6th Event: "Owned: How the Internet of Things Took Our Property and Privacy"

FPF’s Capital-Area Academic Network invites you to join us for a discussion of:

“Owned: How the Internet of Things Took Our Property and Privacy”

Chapter 5: Private Property

with Author Joshua Fairfield

Professor of Law, Washington and Lee University School of Law

October 6, 2016; 12:00 PM – 2:00 PM

WATCH LIVE

REGISTER

Summary

This chapter examines the constantly shifting relationship between property and privacy. It suggests that property can be used as a bulwark, a shield, a floor, and a foundation for privacy. Property law’s characteristics of clarity, robustness, and default exclusion help to fortify privacy rights that are fuzzy and fragile, rights that too often operate only when the consumer has taken costly steps to protect her interests. Property law serves to shore up privacy’s weakest points.

Joshua Fairfield is an internationally recognized law and technology scholar, specializing in digital property, electronic contract, big data privacy, and virtual communities. He has written on the law and regulation of e-commerce and online contracts and on the application of standard economic models to virtual environments. Professor Fairfield’s current research focuses on big data privacy models and the next generation of legal applications for cryptocurrencies. His articles on protecting consumer interests in an age of mass-market consumer contracting regularly appear in top law and law-and-technology journals, and his policy pieces on consumer protection and technology have appeared in the New York Times, Forbes, and the Financial Times, among other outlets. Before entering the law, Professor Fairfield was a technology entrepreneur, serving as the director of research and development for language-learning software company Rosetta Stone.

Professor Fairfield consults with U.S. government agencies, including the White House Office of Technology and the Homeland Security Privacy Office, on national security, privacy, and law enforcement within online communities, as well as on strategies for protecting children online. From 2009 to 2012, he provided privacy and civil liberties oversight for Intelligence Advanced Research Projects Activity (IARPA) research programs in virtual worlds. In 2012-13 he was awarded a Fulbright Grant to study trans-Atlantic privacy law at the Max Planck Institute for Research on Collective Goods in Bonn, Germany. He was elected a member of the American Law Institute in 2013.

* * * *

Lunch will be served

* * * *

Feel free to share this invitation with any of your colleagues who may be interested in attending.  This workshop is sponsored by the FPF Capital-Area Academic Network (FPF-CAN) which was created to support networking opportunities for academics and other researchers interested in privacy broadly defined. If you would like to be added to the FPF-CAN mailing list, please email Lauren Smith ([email protected]).

Future of Privacy Forum Awarded National Science Foundation Grant to Support Industry-Academic Collaboration on National Privacy Research Priorities

The Future of Privacy Forum (FPF) has received a $300,000, two-year grant from the National Science Foundation (NSF) to establish a Privacy Research and Data Responsibility Research Coordination Network (RCN). The RCN will build a community of academic researchers and industry practitioners to support industry-academic cooperation on the research priorities identified in the Administration’s recently released National Privacy Research Strategy (NPRS). The NPRS identifies several priorities for privacy research, including increasing the transparency of data collection, sharing, use, and retention; ensuring that information flows and use are consistent with privacy rules; and reducing the privacy risks of analytical algorithms.

The NSF grant will provide FPF with the necessary support to launch and regularly convene the RCN, drawing on its relationships with industry chief privacy officers, academic researchers, regulators, lawyers, social scientists, civil rights advocates, social philanthropists, and government actors, to facilitate communication and sharing of information and ideas. The RCN will prompt industry action and advocate for privacy-aware approaches to collecting and processing personal information in a manner that respects individual privacy, equality and fairness.

Activities under the grant will include:

“The overarching goal of the National Privacy Research Strategy is to produce knowledge and technology that will enable individuals, commercial entities, and the government to benefit from transformative technological advancements, enhance opportunities for innovation, and provide meaningful protections for personal information and individual privacy,” said Jules Polonetsky, FPF’s CEO. “At a time when industry actors are the custodians of a wide range of consumer data, bringing together corporate, academic, and advocacy constituents is critical to making practical privacy progress.”

The RCN will inform the public debate on privacy, provide useful information to policymakers, and contribute to the development of systems and products used to help society realize the benefits of networked information technology without sacrificing privacy and individual rights.

For more information, and to learn how to become involved with the efforts of the RCN, click here or visit fpf.org/rcn.

First Take: Privacy in the Federal Automated Vehicles Policy

In the federal guidance for autonomous vehicles issued yesterday, the Department of Transportation and the National Highway Traffic Safety Administration have wisely recognized that privacy will play a key role in promoting trust in connected vehicles. This guidance and its emphasis on privacy is an important first step in building that trust.

The highly anticipated federal guidance calls for companies to complete annual 15-point safety assessments, and outlines model state policies and regulatory tools that will enable adoption of autonomous vehicles on U.S. roads. As FPF has highlighted previously, new vehicle technologies have the potential to greatly reduce motor vehicle deaths while increasing overall safety and convenience—94% of the 35,000 motor vehicle deaths annually are caused by human error, many of which could be prevented by new accident-avoidance technologies. These life-saving technologies increasingly rely on the new types of data and sensors built into connected cars.

The DOT guidance calls for entities involved in manufacture, design, testing or sale of highly automated vehicle systems in the United States to submit a safety assessment letter that outlines their adherence to the guidelines on 15 specific topics, including privacy and cybersecurity. The letters should be issued at least four months before active public road testing begins on a new automated feature, and again after significant updates. The assessments will initially serve as an optional exercise, but the agencies indicated that they may become mandatory after a public comment period kicked off by yesterday’s guidance and a potential future rulemaking.

The core privacy component of the guidance highlights privacy principles set forth in the White House Consumer Privacy Bill of Rights (CPBR) and the Alliance of Automobile Manufacturers and Global Automakers’ Privacy Principles. It calls for manufacturers, as well as other entities, to take steps to protect consumer privacy by focusing on the Fair Information Practice Principles of transparency; choice; respect for context; data minimization, de-identification and retention; security; integrity and access; and accountability. The specific application of these concepts for satisfactory completion of the safety assessments will be negotiated in the public comment and review periods to come.

Unlike other aspects of the assessment, privacy is a focus for vehicles operating at lower levels of automation, not just for Highly Automated Vehicles. Much of the document applies only to levels 3-5 of autonomy, but the “Cross-Cutting Areas of Guidance” under which the core privacy sections of the document fall apply to all connected vehicles. While more highly automated vehicles rely more heavily on data, cars at all automation levels increasingly incorporate new data-reliant technologies.

Data sharing is also a core component of the guidance, based on the understanding that safety and product improvements may be best achieved through sharing de-identified vehicle data among industry and regulators. The document also makes clear that NHTSA and the DOT increasingly see themselves relying on new vehicle data to carry out their safety and oversight missions, through either special or general authority, and potentially by calling for enhanced data collection tools such as event data recorders (EDRs).

The call for de-identification in data sharing leads the DOT to articulate its reliance on the definition of “personal data” in the Consumer Privacy Bill of Rights, as “data that are under the control of a covered entity, not otherwise generally available to the public through lawful means, and are linked, or as a practical matter linkable by the covered entity, to a specific individual, or linked to a device that is associated with or routinely used by an individual” (emphasis added). Linkability to an individual is considered key, as the guidance cites both the “as a practical matter linkable” standard from the CPBR and the “reasonably linkable” standard set forth by the FTC. This definition is discussed in a footnote to the “Data Recording and Sharing” section. Although the definition does not appear in the privacy section, this standard may represent the DOT’s current operational definition of “personal data.”

A future in which new kinds of mobility will expand transportation opportunities for all segments of society will depend on broad collection and use of data to ensure maximum safety and convenience for consumers. This framework certainly allows that use, and creates accountability guidelines that ensure data drives benefits for consumers and society. We look forward to being engaged in the comment and expert-driven processes to refine and implement this guidance.

Supporting Parental Choice for Student Data

Today, the Future of Privacy Forum (FPF) released Supporting Parental Choice for Student Data. The paper discusses the importance of trusting parents to make the final decision on when and where to share their child’s educational information outside of the school environment.

The paper acknowledges valid concerns – that parents may not realize or understand everything they’ve been asked to share, to whom the data will be sent, or all the purposes for which the data can be used. However, it asserts that the right solution is not to prohibit parental consent outright, but to make it rigorous and informed, and to ensure any data shared will only be used for expressly authorized purposes.

Parents, as those most in tune with their individual child’s needs, have the right to be active partners and to make the final decision about additional sharing and use of their child’s information.

Read Supporting Parental Choice for Student Data

Read FPF’s op-ed in The Hill

FPF Statement on the Department of Transportation's Federal Automated Vehicles Policy

Image Source: https://www.transportation.gov/AV

Today, the Department of Transportation released the Federal Automated Vehicles Policy, which provides guidelines for the safe deployment of automated safety technology. Lauren Smith, FPF Policy Counsel, made the following statement:

“NHTSA has wisely recognized that privacy will play a key role in promoting trust in connected vehicles. Today’s guidance is an important first step in building that trust.

A future in which new kinds of mobility will expand transportation opportunities for all segments of society will depend on broad collection and use of data to ensure maximum safety and convenience for consumers. Today’s framework certainly allows that use, and creates accountability guidelines that ensure data drives benefits for consumers and society.”

READ BLOG POST

READ OP-ED

Student Privacy Pledge Reaches Milestone of 300 Signatories

FOR IMMEDIATE RELEASE            

September 12, 2016

Contact: Melanie Bates, Director of Communications, [email protected]

STUDENT PRIVACY PLEDGE REACHES MILESTONE OF 300 SIGNATORIES

Washington, DC – The Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) are pleased to announce that the Student Privacy Pledge has passed a new milestone – over 300 ed tech company signatories. The Pledge is a list of commitments that school service providers can make to affirm that K-12 student information is kept private and secure. It has been endorsed by President Obama, the National PTA, and the National School Boards Association.

Companies that take the Student Privacy Pledge commit to twelve legally enforceable obligations, including that they will not sell student personal information and will not collect or use student personal information other than what is needed for the given educational purposes. The commitments in the Pledge concisely detail existing federal law and regulatory guidance regarding the collection and handling of student data, and encourage service providers to more clearly articulate these practices.

The Pledge was introduced by FPF and SIIA in October 2014 with 14 signatory companies, and it took effect in January 2015 as a legally enforceable agreement for signing companies that provide services to schools. Since then, the number of Pledge signatories has substantially increased, reaching 200 in November 2015, and now passing 300.

“As students return to school for the Fall and teachers develop their curricula to incorporate the benefits of data and technology, companies that take the Pledge are ensuring that they are accountable for how they safeguard student data,” said Jules Polonetsky, CEO, Future of Privacy Forum.

“The continued strength in growth of this Pledge is indicative of the recognition within the industry of our duty to safeguard students and their personal information,” said Brendan Desetti, SIIA’s Director of Education Policy. “The Pledge’s enforceable provisions have also driven a rapid growth of the privacy-minded culture within companies today that places privacy first in the development process alongside functionality.”

The process for becoming a Pledge signatory is often an opportunity for companies to review their own privacy policies and make helpful updates—for example, to make the policy clearer or more understandable to parents and teachers. When a company requests to be added to studentprivacypledge.org, the team at FPF first reads that company’s privacy policy, and although FPF does not certify compliance, companies are only added as Pledge signatories if their policies do not contain any obvious inconsistencies with the text of the Pledge. Some frequent issues include the following:

FPF and SIIA are proud to facilitate the efforts of education technology companies who demonstrate industry leadership in protecting student privacy by signing the Student Privacy Pledge. Companies and organizations wishing to review the full text of the Pledge and consider participation are invited to visit www.studentprivacypledge.org or email [email protected].

About FPF

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts and includes an advisory board comprised of leading figures from industry, academia, law, and advocacy groups. For more information, visit www.fpf.org.

About SIIA

SIIA is the leading association representing the software and digital content industries. The Education Technology Industry Network (ETIN) of SIIA serves and represents more than 200 of SIIA’s 800 member companies worldwide that provide educational software applications, digital content, online learning services and related technologies across the K-20 sector. SIIA-ETIN shapes and supports the industry by providing leadership, advocacy, government relations, corporate education, intellectual property protection, business development opportunities and critical market information. For more information, visit www.siia.net.