Shedding Light on Smart City Privacy

Today, the Future of Privacy Forum is releasing a new tool for municipal and technology leaders: a visual guide, “Shedding Light on Smart City Privacy.” This tool will help citizens, companies, and communities understand the technologies at the heart of smart city and smart community projects – and their potential impact on privacy. It also connects stakeholders with guidance documents, best practices, and other resources designed to help them implement new technologies in privacy-protective ways.

Cities and communities generate data through a vast and growing network of connected technologies that power new and innovative services ranging from apps that can help drivers find parking spots to sensors that can improve water quality. Such services improve individual lives and make cities more efficient. While smart city technologies can raise privacy issues, sophisticated data privacy programs can mitigate these concerns while preserving the benefits of cities that are cleaner, faster, safer, more efficient, and more sustainable.

Shedding Light on Smart City Privacy highlights the wide range of connected technologies and services appearing throughout our communities – everything from streetlights that measure air and noise pollution to smart electric grids to buses that re-route based on demand. The visual guide also provides important context to these new technologies and services, allowing visitors to sort technologies and services based on what sectors they might serve, what other technologies enable them, and who within their communities might use or deliver them.

The visual guide also describes some of the top privacy concerns raised by smart city technologies and services, both for individuals and for communities. It describes key tools for mitigating those risks, including robust privacy programs, transparency and consent, de-identification, vendor management, and data minimization.

Finally, the tool also acts as a central repository for privacy-related guidance documents, best practices, reports, codes of conduct, and other resources that can help local policymakers, technologists, and citizens navigate these complex issues and integrate digital services in privacy-protective ways.

As cities and communities become more connected, it is critical that they learn to leverage the benefits of a data-rich society while minimizing threats to individual privacy and civil liberties. Our new guide provides a useful tool to help all smart city and community stakeholders hold important discussions and make informed decisions about their privacy policies and practices.

The tool launched at RightsCon Brussels on March 31, 2017.

Future of Privacy Forum Releases Interactive Tool for Understanding the Technologies Powering Smart Cities

FOR IMMEDIATE RELEASE             

March 31, 2017

Contact: Melanie Bates, Director of Communications, [email protected]

Future of Privacy Forum Releases Interactive Tool for Understanding the Technologies Powering Smart Cities

Brussels, Belgium – Today, the Future of Privacy Forum (FPF) released Shedding Light on Smart City Privacy, a new tool designed to help citizens, companies, and communities understand the technologies at the heart of smart city and smart community projects as well as their potential impact on privacy. The guide was released by FPF Policy Counsel Kelsey Finch during the panel Cities of the Future, Data of the Present: Protecting Privacy and Fostering Development (link expired) at RightsCon Brussels, a conference exploring the societal impact of technology and policy.

“As cities and communities become more connected, it is critical that they learn to leverage the benefits of a data-rich society while minimizing threats to individual privacy and civil liberties,” Finch said. “Our new guide provides a useful tool to help all smart city and community stakeholders hold important discussions and make informed decisions about their privacy policies and practices.”

Shedding Light on Smart City Privacy highlights the wide range of connected technologies and services appearing throughout our communities – everything from streetlights that measure air and noise pollution to smart electric grids to buses that re-route based on demand. The visual guide also provides important context to these new technologies and services, allowing users to sort technologies and services based on what sectors they might serve, what other technologies enable them, and who within their communities might use or deliver them.

The visual guide also acts as a central repository for privacy-related guidance documents, best practices, reports, codes of conduct, and other resources that can help local policymakers, technologists, and citizens navigate these complex issues and integrate digital services in privacy-protective ways. The guide is available at fpf.org/2017/03/30/smart-cities/.

###

The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.

Chasing the Golden Goose: What is the path to effective anonymisation?

Searching for effective methods and frameworks of de-identification often looks like chasing the Golden Goose of privacy law. For each answer that claims to unlock the question of anonymisation, there seems to be a counter-answer that declares anonymisation dead. In an attempt to demystify this race and untangle de-identification in practical ways, the Future of Privacy Forum and the Brussels Privacy Hub joined forces to organize the Brussels Privacy Symposium on De-identification – “Identifiability: Policy and Practical Solutions for Anonymisation and Pseudonymisation”. The event brought together researchers from the US and the EU with academic, regulatory, and industry backgrounds to discuss their latest solutions to this important problem. Discussion of the selected research papers was preceded by the presentation of an overview report on “Preserving the Utility of Data and Privacy of Individuals” by Deutsche Telekom.

This contribution (published in Privacy In Germany (PinG 2017)) looks at the work of the invited researchers in detail, puts it in context, and aggregates its results for the essential debate on the anonymisation of personal data. The overview shows a shift away from treating anonymisation and identifiability in binary terms: the risk-based approach is gaining the spotlight, and the idea of a spectrum of identifiability is already generating practical solutions, even under the General Data Protection Regulation.

The Brussels Privacy Symposium was made possible by the generous support of our Lead Founding Sponsor, Deutsche Telekom, with additional support from our Founding Sponsors Information Technology Industry Council, Microsoft, SAP, and TomTom, and a grant from the National Science Foundation.

The Top 10: Student Privacy News (Feb-March 2017)

The Future of Privacy Forum tracks student privacy news very closely and shares relevant news stories with our newsletter subscribers.* Approximately every month, we post “The Top 10,” a blog post with our top student privacy stories.
  1. New America has released an ethical framework to help colleges use predictive analytics to benefit students (this framework follows their report last fall discussing the promise and peril of predictive analytics).
  2. Predictive analytics was a hot topic in the past month. In addition to the New America report and FPF’s SXSWedu panel last week, The Hechinger Report reported that students are worried ed tech will predict failure before they have a chance to succeed. This article argues that the UK should “ditch the humans, leave [routine interventions with students] to the machine” and raises some interesting points. RealClearEducation says it is “Time for a Better Dropout Diagnosis.” Inside Higher Ed published an article worrying that predictive analytics’ probabilities can create self-fulfilling prophecies, and later posted responses from companies that work on analytics.
  3. Chicago Public Schools had a major breach of student data: detailed information about hundreds of K-8 students and the disability services they received was posted publicly on the district’s website in an Excel spreadsheet of district expenditures (also see articles from the parent who found the breach at Diane Ravitch and the Parent Coalition for Student Privacy). This may be a good time to remember the Student Privacy Principles, which say that everyone who handles data, including administrative, financial, and medical staff, should be trained in how to protect it.
  4. Many privacy groups are raising concerns about a bill that would amend the 2016 California Electronic Communications Privacy Act (see this fantastic summary from F3 Law about how the law applies to schools). The law unintentionally applies to schools and significantly limits districts’ control over the electronic devices they issue to employees and students (and likely conflicts with some school responsibilities under CIPA, as I discuss in my previous surveillance and privacy report). However, privacy advocates say that the amendment acts as a “sledgehammer, not a scalpel” and undermines student privacy.
  5. The National School Boards Association released “Data Security for Schools: A Legal and Policy Guide for School Boards.”
  6. The SLDS Grant Program has released a “Data Governance Manual Rubric” and an “Attributes of Effective Data Products Checklist.”
  7. A major student data tool – the Data Retrieval Tool, which automatically filled in FAFSA applications with information from tax returns through a data connection with the IRS – went down without notification due to “concerns that information from the tool could potentially be used by identity thieves.” Completing the FAFSA will now be much more difficult for students and their families, and it is unknown when the tool will be back up.
  8. One of my favorite student privacy scholars, Elana Zeide, has published a new law review article on “The Limits of Education Purpose Limitations” in FERPA and state laws.
  9. The Association of Public & Land-Grant Universities (APLU) partnered with colleges and universities to develop 14 case studies highlighting ways in which institutions use student level data to improve student outcomes.
  10. The intersection of immigration data and schools continues to be a hot-button issue.

*Want more news stories? Email Amelia Vance at avance AT fpf.org to subscribe to our student privacy newsletter.

Regulating the Online Advertising Market: Yesterday, Today, and Tomorrow

Today, the Senate Committee on Commerce, Science, and Transportation held a hearing to examine the broad policy issues facing the Federal Communications Commission (FCC). Commissioners Pai, Clyburn, and O’Rielly outlined their priorities for the FCC, and answered questions about their proposed plans—including for the future of net neutrality and privacy of data collected online.

As we have written previously, the online advertising ecosystem is complex. In considering the role the FCC should play in regulating broadband providers, it is important to understand the history of online advertising, the current realities of an inter-connected system, and the emerging technologies that are allowing for increasingly more detailed audience profiles. In the absence of comprehensive federal privacy legislation, it will be critical in years ahead to understand the nuances of the data landscape.

Early Days of Cookie-Based Tracking and the DoubleClick-Abacus Debate

On June 4, 1999, internet advertising network DoubleClick announced that it had agreed to acquire Abacus, a major data broker, in order to link its own cookie-based web surfing data with the data broker’s information about consumer catalogue purchases. At the time, cookies were the primary mechanism for online tracking, and privacy concerns focused on how to regulate cookie-based tracking, especially when used by third-party ad networks, which were considered invisible to the average user. Other players, such as AOL, Yahoo, AltaVista, and Lycos, were less of a concern to regulators because consumers had a direct relationship with these entities and were assumed to understand that data was being exchanged.

In response to these concerns, industry created self-regulatory regimes, including the principles and commitments outlined by the Network Advertising Initiative (NAI). Following this debate, concerns arose in the 2000s about “adware,” or advertising “spyware”: free programs that many consumers downloaded, unaware that the cost of the free download was ad tracking and, often, pop-ups. Standards and remedies arose to address this new technology, including through the efforts of the Anti-Spyware Coalition to advocate for practical ways to address the issues.

By the end of the 2000s, attention shifted to major search engines, social networks, and commerce platforms. Unlike ad networks, these companies were viewed as having direct relationships with consumers through which they could explain their practices, but their breadth and scale made them the focus of much of the privacy debate in the US and globally.

Enter Apps and the Internet of Things

After the introduction of smartphones in the late 2000s, a new ecosystem of mobile apps emerged and has become a primary way for consumers to access online content and services. Unlike traditional web browsing, apps do not support cookies, and their standards and permissions are enforced by the operating systems (Apple’s iOS and Google’s Android). This framework raises different privacy concerns: developers are in some ways more limited in their possibilities for data collection, but in other ways able to collect more granular data from the smartphone’s large array of sensors.

Today, discussion has turned to the Internet of Things, as companies seek to track users across computers, tablets, phones, Smart TVs, and a range of other connected devices entering the market—including personal assistants, wearable devices, fitness trackers, refrigerators, thermostats, and security cameras. These devices, often inter-connected to comprise a “Smart Home” of automated and responsive systems, present new privacy challenges, as many policymakers seek to understand how and when they may be used to create advertising audience profiles.

Cross-Device Marketing and Emerging Proximity Technology

The online advertising market is intensely competitive. Consumers today frequently prefer and are accustomed to free online content, making it difficult to launch online businesses supported only by subscription fees. Advertising sets the terms for companies large and small, including news publishers, social networks, search engines, apps, and video streaming services. Even companies that are able to charge fees often depend on ads for some portion of their revenue.

Although consumers today have access to an expanding variety of new tools, devices, and platforms, this fragmenting of the consumer audience across platforms has been distressing to advertisers. In the past, a brand could advertise on leading TV networks and feel confident in reaching its intended audiences. Today those same audiences are accessing content across a variety of platforms, including in web browsers, mobile browsers, and apps, and across a variety of devices, including computers, smartphones, tablets, and Smart TVs. To re-link that audience, advertisers have pressed companies to connect the identities of consumers across devices and platforms.

In addition to the strong incentives to link consumers across devices and platforms, the scope and scale of data being appended from a wide range of sources—both online and offline—are growing. As we described in our cross-device report, Oracle’s BlueKai has linked more than 80 sources of data to online IDs that can be used to target consumers based on specific audience profiles or purchasing intents—e.g., “Back to School Shopper” or “Graduation Gift Buyer.” Because targeted ads sell for many times the price of untargeted ads, Oracle’s competitors are not far behind.

The technology continues to advance. Today, the newest data being added to consumer profiles is proximity data, or information about the places or things that can be detected as being in close proximity to a consumer. Proximity data is distinguishable from geo-location data, which has long been integrated into user profiles or available for location-based targeting. Although proximity data can sometimes be used to infer a user’s location—such as through beacons—it can also be used to detect nearby connected devices and allow for inferences based on the presence of those devices.

Proximity data involves the detection of signals from nearby connected devices

For example, app developers can design apps to detect nearby devices equipped with Bluetooth or Wi-Fi. When these devices are on, they broadcast SSIDs, or network names, to allow users to connect with them, and MAC addresses, unique manufacturer-assigned identifiers. MAC addresses are assigned to manufacturers by the IEEE in dedicated ranges, and as a result they often correspond in predictable ways with particular types of devices. If a consumer’s app detects a nearby MAC address starting with 000C8A (a range assigned to Bose), this indicates that the consumer is near, and therefore likely owns, a Bluetooth-enabled Bose speaker. Detection of a nearby MAC address starting with 1800DB (assigned to Fitbit, Inc.) indicates the consumer probably wears a Fitbit and may be interested in health and fitness. 00199D or 006B9E? Different models of Vizio Smart TVs.
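To make the OUI-prefix inference concrete, here is a minimal Python sketch. The prefix-to-manufacturer pairs are the ones cited in this post; the audience inferences and the function name are hypothetical illustrations, not the API of any real ad-tech or analytics product.

```python
# Illustrative sketch: inferring device types (and possible audience attributes)
# from the OUI prefix (first three bytes) of detected MAC addresses.
# The prefixes below come from the examples in this post; the "inference"
# labels are hypothetical.

OUI_TABLE = {
    "000C8A": ("Bose", "owns a Bluetooth-enabled speaker"),
    "1800DB": ("Fitbit, Inc.", "wears a fitness tracker"),
    "00199D": ("Vizio", "owns a Smart TV"),
    "006B9E": ("Vizio", "owns a Smart TV"),
}

def infer_from_mac(mac: str):
    """Return (manufacturer, inference) for a detected MAC address, or None."""
    oui = mac.replace(":", "").replace("-", "").upper()[:6]
    return OUI_TABLE.get(oui)

if __name__ == "__main__":
    for detected in ["18:00:DB:4A:2C:91", "00:0C:8A:11:22:33", "AA:BB:CC:00:11:22"]:
        match = infer_from_mac(detected)
        if match:
            manufacturer, inference = match
            print(f"{detected}: {manufacturer} device, consumer likely {inference}")
        else:
            print(f"{detected}: unknown OUI, no inference drawn")
```

The point is not the lookup itself, which is trivial, but that a publicly broadcast identifier can be turned into an audience attribute with a few lines of code and a freely available registry.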

Consider the wide range of items that today broadcast Wi-Fi or Bluetooth SSIDs and MAC addresses: cars, smart toys, tablets, mobile headsets, soda machines . . . all easily and publicly detectable. Many aggregators are also able to detect the beacons that retailers are deploying, providing detail about what stores you have been nearby, or products to which you have been in proximity.

How to Manage a Complex Consumer Privacy Landscape

Much of the current debate about the appropriate regulatory regime for ISPs has focused on whether ISPs are unique in the type or scale of data they can collect. In fact, much of the web surfing data that ISPs can collect has been widely available in the market for more than a decade. The business trends in the data market that need to be understood are those driven by the consolidation of data companies that are merging with ad tech companies or marketing platforms, the linking of devices across platforms, the addition of location and proximity data, and the use of machine learning to enhance targeting.

How should we protect consumer privacy in this complex environment? In an ideal world, comprehensive federal privacy legislation would provide a baseline level of protection for all consumers. That path is unlikely; fortunately, the FTC has a long history and deep expertise on the subject of privacy across platforms and devices. Although jurisdictional questions remain to be resolved by federal courts or by Congress, the FTC has a broad mandate to protect consumers against deceptive or unfair practices in commerce. Over the years, the agency has brought cases against ISPs, apps, major online platforms, and, most recently, a Smart TV manufacturer. In 2015, the Commission created the Office of Technology Research and Investigation (OTech) to conduct technical research, investigate compliance, and provide guidance to consumers and businesses. Currently, FTC staff are examining the way unmanned aerial vehicles (UAVs) collect and handle data, and hosting an “Internet of Things Challenge” to help consumers evaluate security vulnerabilities in Smart Home devices.

Critics have pointed out that the FTC’s statutory rulemaking authority is more limited than that of other agencies. Nonetheless, the FTC has shown that it can make strong de facto “rules” through the broad manner in which it interprets its authority. In the recent Vizio case, for example, the FTC determined that sharing consumers’ TV viewing histories, linked not to names but to IP addresses and device identifiers, was an unfair sharing of sensitive personal information without clear user notice and consent. This decision sets the baseline for all other Smart TV manufacturers, who now have a clearer standard for responsible data collection in the TV environment.

According to George Washington University’s Professor Daniel Solove and Samford University’s Woody Hartzog:

“The FTC privacy jurisprudence is the broadest and most influential regulating force on information privacy in the United States—more so than nearly any privacy statute and any common law tort. Some assume that standards set by the FTC would be more generous to ISPs than the rules that the FCC was advancing, particularly around sensitive data. But the history of the FTC belies that argument and the agency is quite likely to use its authority to ensure ISPs set high standards for use of data about consumers. However, those standards will also likely be technologically neutral, holding all the companies in the long linked data ecosystem accountable.” – “The FTC and the New Common Law of Privacy”

Connected cars, drones, wearables, smart TVs, smart homes, smart toys – all provide valuable services to consumers. But as these systems and platforms become increasingly interconnected, consumer protection requires a regulator with a broad scope and deep expertise in the advertising, data, and technology markets. The FTC has the flexibility, breadth of knowledge, and expertise to ensure that consumers are protected in this complex, data-driven world.

FPF Comments on NITRD’s Smart Cities and Communities Federal Strategic Plan

Last week, Future of Privacy Forum (FPF) submitted comments regarding the National Coordination Office for Networking and Information Technology Research and Development’s (NITRD) Request for Comment on the Draft Smart Cities and Communities Federal Strategic Plan, published in the Federal Register on January 9, 2017.

FPF commended NITRD for its forward-looking guidance and its acknowledgement that privacy will play a key role in promoting trust in smart cities and communities. FPF believes the guidance, with its emphasis on privacy, is an important first step in building that trust.

The path forward for city/community innovation, both in the U.S. and globally, lies through data- and knowledge-sharing, best practices, and collaboration. Federal support to advance secure, privacy-preserving data sharing is critical to achieving this goal. Several key domains are ripe for Federal support and should be considered for NITRD’s next steps: 1) de-identification resources, training, and expertise; 2) privacy risk assessment frameworks; and 3) formation of a network of privacy leaders for smart cities/communities.

One of the greatest risks of sharing government datasets or opening them to the public is the possibility that individuals may be re-identified or singled out from those datasets, revealing data about them that could be embarrassing, damaging, or even life-threatening. Governments and scholars have recently begun to tackle the difficult question of publishing and de-identifying record-level government data. For example, the City of San Francisco published the first iteration of an “Open Data Release Toolkit” in 2016. FPF and the City of Seattle are currently developing an “Open Data Risk Assessment” in collaboration with a community advisory board and local academics, to be published in July 2017.
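As a rough illustration of the singling-out risk described above (this is not the methodology of the San Francisco toolkit or the Seattle assessment), the following Python sketch flags records whose combination of quasi-identifiers appears fewer than k times in a dataset, the basic intuition behind k-anonymity. The column names, sample records, and threshold are hypothetical.

```python
# Illustrative sketch: flag records that could "single out" an individual
# because their combination of quasi-identifiers is too rare in the dataset.
from collections import Counter

# Hypothetical record-level data a city might consider publishing.
records = [
    {"zip": "98101", "age_range": "30-39", "service": "paratransit"},
    {"zip": "98101", "age_range": "30-39", "service": "paratransit"},
    {"zip": "98118", "age_range": "70-79", "service": "meal delivery"},
]

QUASI_IDENTIFIERS = ("zip", "age_range", "service")
K = 2  # toy threshold; real releases typically require much larger group sizes

def risky_records(rows, quasi_ids=QUASI_IDENTIFIERS, k=K):
    """Return rows whose quasi-identifier combination appears fewer than k times."""
    counts = Counter(tuple(row[q] for q in quasi_ids) for row in rows)
    return [row for row in rows if counts[tuple(row[q] for q in quasi_ids)] < k]

flagged = risky_records(records)
print(f"{len(flagged)} of {len(records)} records fall below the k={K} threshold")
```

A real assessment would also weigh the sensitivity of the fields, the availability of outside datasets that could be linked to them, and the benefits of publication, which is exactly the kind of balancing these toolkits and risk assessments are designed to support.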

Current Privacy Impact Assessment (PIA) practice includes detailed frameworks to help privacy professionals understand and quantify privacy risks. However, traditional private-sector PIAs do not necessarily account for the unique risks created by smart city/community projects. Decision-makers must also assess, prioritize, and, to the extent possible, quantify a project’s benefits in order to understand whether assuming the risk is ethical, fair, legitimate, and cost-effective. Federally supported guidance or convenings to help city/community leaders assess the sensitivity of particular data points would further strengthen cities’ and communities’ ability to collect, use, share, and dispose of data in a consistent and privacy-preserving manner.

Currently, many local governments and officials lack the institutional resources and knowledge to assess and manage the range of privacy risks that might arise from the use of smart city/community technologies and services. Federal support for a network of city/community privacy leaders and a central repository of common tools, terminology, and training would enable privacy-preserving systems to scale across application areas and geographic boundaries.

The Draft Smart Cities and Communities Federal Strategic Plan is a productive first step in establishing a consistent path forward for smart city/community innovation. FPF looks forward to remaining engaged as the guidance evolves.