Comments after the FTC Workshop on Big Data: Tool for Inclusion or Exclusion?

Today, FPF filed an additional set of comments in the wake of the FTC’s fall workshop, Big Data: A Tool for Inclusion or Exclusion? The comments focus on the challenges of defining what exactly “Big Data” is, and on the growing need for a firmer ethical framework for conversations about data use. They call on federal agencies to do more to clarify which legal restrictions already cover potential big data uses, and recommend that industry establish clearer procedures and institutions for determining what ethical considerations ought to apply in the absence of law.

Beacons Help Blind Travelers Navigate Airport


San Francisco Airport is testing a beacon system to help blind travelers navigate around one of its new terminals.

Working with beacon company Indoo.rs, SFO has set up hundreds of beacons all over the terminal, each broadcasting a signal using Bluetooth Low Energy. If a user downloads the app onto his or her smartphone, the app can detect those signals and push notifications with relevant information whenever the user comes within range of a beacon. For blind or visually impaired travelers, the phone uses VoiceOver technology to announce points of interest, like bathrooms or nearby coffee shops. If the program is successful, it could be launched throughout the entire airport.
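
For the technically curious, here is a minimal Python sketch of the app-side logic. It is a simulation only: the beacon identifiers, signal threshold and points of interest are made up, and a real iOS app would use Apple’s Core Location ranging APIs rather than a callback like this.

```python
# Minimal sketch of the app-side logic. A beacon broadcasts only an
# identifier (a UUID plus "major"/"minor" numbers); the app maps that
# identifier to a point of interest and announces it. All identifiers,
# thresholds and locations below are hypothetical.

POINTS_OF_INTEREST = {
    ("f7826da6", 1, 101): "Restroom, 20 feet ahead on the left",
    ("f7826da6", 1, 102): "Coffee shop, next to Gate 62",
    ("f7826da6", 1, 103): "Information desk, straight ahead",
}

def announce(text: str) -> None:
    # On a real device this string would be handed to VoiceOver /
    # text-to-speech; here we just print it.
    print(f"[VoiceOver] {text}")

def on_beacon_detected(uuid_prefix: str, major: int, minor: int, rssi: int) -> None:
    """Called whenever the phone hears a beacon advertisement.

    Note the one-way flow: the beacon transmits, the phone listens.
    Nothing is ever sent back to the beacon.
    """
    poi = POINTS_OF_INTEREST.get((uuid_prefix, major, minor))
    if poi is None:
        return  # a beacon the app does not recognize; ignore it
    if rssi > -70:  # signal strength suggests the user is close by
        announce(poi)

# Simulate hearing one advertisement at close range:
on_beacon_detected("f7826da6", 1, 101, rssi=-55)
```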

This great use of beacon technology to help blind travelers navigate airports is just one of the reasons we like beacons when they are paired with apps that provide a clear and explicit opt-in.


Google Taps the YubiKey for Better Account Security

With identity theft and cybersecurity issues in the news seemingly daily, better tools to protect our data – and our privacy – are always welcome. For some time, FPF has endorsed two-factor authentication as an “extra” step consumers can take to protect their accounts across a variety of online services. But while everyone at FPF uses it for everything from our email accounts to our social media networks, two-factor authentication can be cumbersome and inconvenient. Every time one logs into an account on a different machine, a code has to be retrieved from a mobile phone. Lose the phone, and you have to hope you kept a set of paper-based fallback authentication codes.
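
For the curious: the codes generated by authenticator apps are typically produced by the time-based one-time password (TOTP) algorithm of RFC 6238. The phone and the service share a secret, and each independently derives a short code from the current time. A minimal sketch in Python (the shared secret below is made up):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // period           # 30-second time step
    msg = struct.pack(">Q", counter)               # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Hypothetical shared secret; phone and server compute the same code,
# so the server can check the code the user types in.
print(totp("JBSWY3DPEHPK3PXP"))
```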

Enter the physical security key. Yubico’s new “YubiKey” physical security key, for example, supports both USB and NFC communications and makes two-factor authentication as simple (and as fun) as tapping the device itself. The device is supported by the FIDO Alliance, a non-profit group dedicated to creating strong, interoperable authentication standards. The FIDO protocols that the YubiKey follows use standard public key cryptography and build in privacy by design: the security key cannot track you across services as you log in, and local authentication information never leaves the device.
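
A toy Python model illustrates why the key cannot be used to track you. This is a conceptual sketch of the privacy property, not the real FIDO/U2F wire protocol: the device mints a fresh key pair for every service at registration, so the identity it presents to one site is unlinkable to the identity it presents to any other.

```python
from cryptography.hazmat.primitives.asymmetric import ec

class SecurityKeyModel:
    """Toy model of FIDO-style privacy by design (not the real protocol).

    A fresh key pair is minted for every service at registration, so the
    public key given to one site is unlinkable to the public key given
    to any other, and the private keys never leave the device.
    """

    def __init__(self):
        self._keys = {}  # origin -> private key; never exported

    def register(self, origin: str):
        self._keys[origin] = ec.generate_private_key(ec.SECP256R1())
        return self._keys[origin].public_key()  # only the public half leaves

key = SecurityKeyModel()
pub_a = key.register("https://accounts.google.com")
pub_b = key.register("https://example.com")

# Different public keys per service: no shared identifier to follow
# the user from one site to another.
assert pub_a.public_numbers() != pub_b.public_numbers()
```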

Today Google announced its support for the key, making its use with the Chrome browser and Google Accounts enrolled in two-factor authentication much easier.

Google has been interested in physical security keys for a while. The company has been using them internally for years, and last year used its experiences to publish an in-depth look at “Authentication at Scale.” In it, Google explains that investing in password alternatives like public-key-based technologies will help users more easily protect sensitive accounts – like their primary email account or a company’s spokesperson account – from common security threats.

After registering the device with your Google Account, the YubiKey can be used to sign into Google services simply by inserting the key into a USB port and tapping it when prompted. No more reaching for your cell phone every time you encounter a new computer or clear your cookies. Instead, just plug in and tap! When you’re done, just stick the YubiKey on your keychain and go – it’s waterproof, battery-free and nearly indestructible.

In addition to being potentially easier to use, there are also significant security benefits to a physical key like the YubiKey. Physical security keys can’t be phished and can’t be fooled by fake look-alike websites. They also reduce the threat of man-in-the-middle attacks. And because the YubiKey is touch-based, malware can’t silently activate it when you’re looking away. The key can also help users keep an eye out for suspicious activity. For example, when you log onto a computer for the first time, you’ll be prompted to tap. If your account suddenly receives a log-in request from a new location, that will also trigger a tap request. If your YubiKey doesn’t authenticate the request, your account stays locked and Google can flag the failed log-in attempt. Although this may be a small step for security, it is a huge leap for usability.
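
The phishing resistance comes from binding every signature to the website’s origin. Here is a simplified Python sketch of that idea (again a hypothetical illustration, not the actual U2F protocol): the device signs the server’s fresh challenge together with the origin the browser reports, so a signature harvested on a look-alike domain fails verification at the real site.

```python
import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Device side (registration): a key pair minted for one origin only.
device_key = ec.generate_private_key(ec.SECP256R1())
registered_public_key = device_key.public_key()  # stored by the server

def sign_assertion(origin: str, challenge: bytes) -> bytes:
    """Device side: sign the challenge *and* the origin the browser reports."""
    return device_key.sign(challenge + origin.encode(),
                           ec.ECDSA(hashes.SHA256()))

def verify_login(origin: str, challenge: bytes, signature: bytes) -> bool:
    """Server side: verify against our own origin; reject anything else."""
    try:
        registered_public_key.verify(signature, challenge + origin.encode(),
                                     ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

challenge = os.urandom(32)  # fresh random challenge per login attempt

# Normal login: the browser reports the genuine origin.
assert verify_login("https://accounts.google.com", challenge,
                    sign_assertion("https://accounts.google.com", challenge))

# Phishing: a look-alike site yields a signature bound to *its* origin,
# which the real server rejects and can flag as a failed attempt.
phished = sign_assertion("https://accounts-goog1e.example", challenge)
assert not verify_login("https://accounts.google.com", challenge, phished)
```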

The benefits from an enterprise standpoint are obvious, but physical keys also point toward a more secure future for consumers, as well. Online, passwords are roughly analogous to the keys we all use to lock our front doors, but the proliferation of online services and the need for ever-stronger and more varied passwords have overwhelmed consumers. Two-factor authentication has helped to make our cell phones our de facto “keys” to the Internet, but permanent security keys may offer even better online security – and convenience. With Google’s support, FPF looks forward to seeing how devices like the YubiKey develop in the future.

-Kelsey Finch & Joseph Jerome, Policy Counsels

"Databuse" as the Future of Privacy?

Is “privacy” such a broad concept as to be meaningless from a legal and policy perspective? On Tuesday, October 14th, the Center for Democracy & Technology hosted a conversation with Benjamin Wittes and Wells Bennett, of the national security blog Lawfare, to discuss their recent scholarship on “databuse” and the scope of corporate responsibilities for personal data.

Coming from a world of FISA and ECPA, and the detailed statutory guidance that accompanies privacy in the national security space, Wittes noted that privacy law on the consumer side is vague and amorphous, and largely “amounts to don’t be deceptive and don’t be unfair.” Part of the challenge, as a number of privacy scholars have noted, is that privacy encompasses a range of different social values and policy judgments. “We don’t agree what value we’re protecting,” Wittes said, explaining that government privacy policies rest on values and distinctions, such as national borders and the line between citizens and non-citizens, that mean something.

Such distinctions are much harder to find in consumer privacy. Wittes’ initial work on “databuse” in 2011 was considerably broader and more provocative, applying to all data controllers — first and third party alike — but his follow-up work with Bennett limits its scope to the duties owed to consumers exclusively by first parties. According to the pair, this core group of duties “lacks a name in the English language” but “describe a relationship best seen as a form of trusteeship.”

Looking broadly at law and policy around data use, including FTC enforcement actions, the pair argue that there is broad consensus that corporate custodians face certain obligations when holding personal data, including (1) obligations to keep it secure, (2) obligations to be candid and straightforward with users about how their data is being exploited, (3) obligations not to materially misrepresent their uses of user data, and (4) obligations not to use them in fashions injurious to or materially adverse to the users’ interests without their explicit consent. According to Wittes, this core set of requirements better describes reality than any sort of “grandiose conception of privacy.”

“When you talk in the broad language of privacy, you promise consumers more than the legal and enforcement system can deliver,” Wittes argued. “If we want useful privacy policy, we should focus on this core,” he continued, noting that most of these requirements are not directly required by statute.

Bennett detailed how data uses fall into three general categories. The first, a “win/win” category, covers cases where the interests of businesses and consumers align; he cited the many uses of geolocation information on mobile devices as a good example. The second category covers cases where businesses directly benefit while consumers face a neutral value proposition; Bennett suggested online behavioral advertising fits here. The third category covers uses where businesses benefit at consumers’ expense, and he argued that regulatory action would be appropriate to limit these behaviors.

Bennett further argued that this categorization fits well with FTC enforcement actions, if not the agency’s privacy rhetoric. “FTC reports often hint at subjective harms,” Bennett explained, but most of the Commission’s actions target objective harms done to consumers by companies.

However, the broad language of “privacy” distorts what harms the pair believe regulators — and consumers, as well — are legitimately concerned about. Giving credit to CDT for initially coining the term “databuse,” Wittes defines the term as follows:

[T]he malicious, reckless, negligent, or unjustified handling, collection, or use of a person’s data in a fashion adverse to that person’s interests and in the absence of that person’s knowing consent. . . . It asks not to be left alone, only that we not be forced to be the agents of our own injury when we entrust our data to others. We are asking not necessarily that our data remain private; we are asking, rather, that they not be used as a sword against us without good reason.

CDT’s Justin Brookman, who moderated the conversation, asked whether (or when) price discrimination could turn into databuse.

“Everyone likes [price discrimination] when you call it discounts,” Wittes snarked, explaining that he was “allergic to the merger of privacy and antidiscrimination laws.” Where personal data was being abused or unlawful discrimination was transpiring, Wittes supported regulatory involvement, but he was hesitant to see both problems as falling into the same category of concern.

The conversation quickly shifted to a discussion of the obligations of third parties — or data brokers generally — and Wittes and Bennett acknowledged they dealt with the obligations of first parties because it’s an easier problem. “We punted on third parties,” they conceded, though Wittes’ background in journalism led him to question how “data brokers” are functionally different from the press. “I haven’t thought enough about the First Amendment law,” he admitted, but he wasn’t sure what principle would allow advocates to divine “good” third parties from “bad” third parties.

But even if the pair’s theory of “databuse” can’t answer every question about privacy policy, we might at least admit the term deserves a place in the privacy lexicon.

-Joseph Jerome, Policy Counsel

Promoting Innovation and Protecting Privacy in the Classroom

Today, FPF announces the release of two new student privacy-related papers:

Who Is Reading Whom Now: Privacy in Education from Books to MOOCs, Jules Polonetsky and Omer Tene (October 7, 2014), Vanderbilt Journal of Entertainment & Technology Law, forthcoming. Available at SSRN: http://ssrn.com/abstract=2507044; and

Student Data: Trust, Transparency and the Role of Consent, Jules Polonetsky and Joseph Jerome (October 2014). Available at http://fpf.org/wp-content/uploads/FPF_Education_Consent_StudentData_Oct2014.pdf.

The first paper, Who Is Reading Whom Now: Privacy in Education from Books to MOOCs, discusses how education technologies present tremendous opportunities to make education more personalized, accountable, collaborative and engaging through social media, gamification and interactive content, while also engendering privacy concerns that arise from combining enhanced data collection with highly sensitive information about children and teens.

The paper seeks to separate the core ed tech privacy issues from the broader policy debates surrounding education standardization, the Common Core, longitudinal data systems and the role of business in education. It argues that decision makers should unpack the distinct policy issues surrounding ed tech to facilitate levelheaded discussion and appropriate policy responses. It further separates privacy problems related to “small data,” the personalization enabled by optimization solutions that “read students” even as they read their books, from concerns about “big data” analysis and measurement, including algorithmic biases, discreet discrimination, narrowcasting and chilling effects.

The paper proposes a broad range of solutions, from deployment of traditional privacy tools, such as contractual and organizational governance mechanisms, to greater data literacy by teachers and increased parental involvement. It warns against kneejerk reactions to ed tech data processes, which may unwittingly accentuate a growing equity gap between the “haves” and “have nots.” It advocates enhanced transparency to increase parental engagement in and understanding of the role of technology in their children’s education. The paper builds on the authors’ previous work to balance big data rewards against privacy risks, while complying with applicable regulation.

The second paper, Student Data: Trust, Transparency and the Role of Consent, discusses how over the past decade, new technologies in schools have generated an “explosion of data” for public school systems to use and analyze. Accordingly, the Department of Education has identified the use of student data systems to improve education as a top national priority even as an increased focus on data has raised legitimate privacy concerns.

With parents worried that student data is being used for marketing purposes without appropriate contractual and legal safeguards, a “notice and choice” regime has had intuitive appeal.  However, the paper shows that providing parents more notice and choice may do little to further student privacy while at the same time unintentionally excluding children from necessary education services.

The paper discusses the practical implications of consent requirements; explores how existing federal laws protect student data; and compares the activities of data vendors and the role of individual consent in the health and financial sectors. It argues that legitimate privacy concerns should not resign parents to opting their children out of new technologies. The paper concludes that policymakers must devise better ways to inform parents about how their children’s data is being used, and to provide students and parents with better tools to inform learning.

Other FPF efforts on student privacy include a resource website at studentprivacycompass.org and the launch of a Student Privacy Pledge with SIIA and leading tech companies. See www.studentprivacypledge.org to sign on to the Pledge or for more details.

To learn more about FPF’s ongoing student privacy work, contact Brenda Leong, FPF Education Privacy Counsel at [email protected].

White House Office of Science and Technology Policy blog

We are pleased to see the White House Office of Science and Technology Policy take note of the Student Privacy Pledge. Read more at Promoting Innovation and Protecting Privacy in the Classroom, October 9, 2014.

K-12 Student Privacy Pledge Announced

Today the Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) issued the following press release. Additional FPF resources and publications on this topic are listed below the announcement.

Leading K-12 School Service Providers Announce Pledge

To Advance Student Data Privacy Protection

New Effort Builds on Legal Protections to Enhance Confidence in the Handling of Student Personal Information; Addresses Sales, Retention, Security, Profiling, Advertising and More

Washington, D.C. – October 7, 2014 – The Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) today announced a K-12 school service providers Pledge to safeguard student privacy built around a dozen commitments regarding the collection, maintenance, and use of student personal information.

An initial leadership group of major school service providers is joining SIIA and FPF to introduce and sign the Pledge. The group is made up of some of the leading names in education technology, including Amplify, Code.org, DreamBox Learning, Edmodo, Follett, Gaggle, Houghton Mifflin Harcourt, Knewton, Knovation, Lifetouch, Microsoft, MIND Research Institute, myON (a business unit of Capstone), and Think Through Math. SIIA and FPF will continue reaching out to educators, parent groups and companies over the next several months to promote the Pledge and garner participation from other school service providers.

“These commitments clearly and concisely articulate a set of expectations that parents and education officials have for the safeguarding of children’s sensitive data,” said Jules Polonetsky, executive director and co-chair, Future of Privacy Forum. “The Pledge will enhance the trust between families, schools and third party service providers necessary to support the safe and effective use of student information for student, teacher and school success.”

“We introduce this Pledge as a clear industry commitment to safeguard the privacy and security of all student personal information,” said Mark Schneiderman, senior director of education policy, Software & Information Industry Association. “Current law provides extensive restrictions on the use of student information, and this industry pledge will build on and detail that protection to promote even greater confidence in the appropriate use of student data.”

The commitments are intended to detail ongoing industry practices that meet and go beyond all federal requirements and to encourage service providers to more clearly articulate these practices to further ensure confidence in how they handle student data. The Pledge would apply to all student personal information whether or not it is part of an “educational record” as defined by federal law, and whether collected and controlled by the school but warehoused offsite by a service provider or collected directly through student use of a mobile app or website assigned by their teacher. It would apply to school service providers whether or not there is a formal contract with the school.

The Pledge will make clear that school service providers are accountable for the responsible collection, maintenance, and use of student personal information.

The Pledge was developed by the FPF and SIIA with guidance from the school service providers, educator organizations, and other stakeholders following a convening by U.S. Representatives Jared Polis (CO) and Luke Messer (IN).

“The potential of using student data to drive effective instruction and personalize education is promising,” said Rep. Polis. “While there can be tremendous benefits from this data, we must ensure that there are appropriate safeguards to protect student privacy. I am pleased that these companies have taken an important step in making a commitment to parents, educators, and communities. This voluntary pledge can help address parents’ legitimate concerns about privacy issues and help keep us on track towards new and exciting educational developments for all students.”

“I applaud these companies for their commitment to protecting the privacy of sensitive student information,” said Congressman Luke Messer.  “Their willingness to comply with these standards and not misuse student information shows they care about protecting student privacy while leveraging technology to unleash each child’s talents and enhance their ability to learn and succeed.”  Messer, who serves on the House Committee on Education and the Workforce, added that the principles  “will better inform the debate about what, if any, legislative remedy may be needed to ensure that child privacy is protected and this information is only used for academic purposes.”

“While technology is a powerful tool for teaching and learning, it is imperative that students’ personal information is protected at all times,” said Otha Thornton, president of National PTA. “National PTA applauds K-12 school service providers that have pledged to safeguard student data and privacy and effectively communicate with parents about how student information is used and protected. We look forward to even more support going forward.”

“This industry-led pledge to honor student data privacy is an important step in the right direction,” said Thomas J. Gentzel, executive director, National School Boards Association. “Those vendors who opt to take the pledge are demonstrating their public commitment to responsible data practices in a manner that will help support school boards’ efforts to safeguard student privacy.”

School service providers support schools – including their teachers, students and parents – to manage student data, carry out school operations, support instruction and learning opportunities, and develop and improve products/services intended for educational/school use. In so doing, it is critical that school service providers effectively communicate with parents, teachers and education officials about how student information is used and safeguarded.

About FPF

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board of leading figures from industry, academia, law and advocacy groups. For more information, visit fpf.org.

About SIIA

SIIA is the leading association representing the software and digital content industries. SIIA represents approximately 800 member companies worldwide that develop software and digital information content. SIIA provides global services in government relations, business development, corporate education and intellectual property protection to the leading companies that are setting the pace for the digital age. For more information, visit www.siia.net.

About National PTA

National PTA® comprises millions of families, students, teachers, administrators, and business and community leaders devoted to the educational success of children and the promotion of parent involvement in schools. PTA is a registered 501(c)(3) nonprofit association that prides itself on being a powerful voice for all children, a relevant resource for families and communities, and a strong advocate for public education. Membership in PTA is open to anyone who wants to be involved and make a difference for the education, health, and welfare of children and youth.

About NSBA

NSBA represents state school boards associations and their more than 90,000 local school board members. We believe education is a civil right, and public education is America’s most vital institution.

Media Contacts

FPF Nicholas Graham, 571-291-2967 [email protected]

SIIA Sabrina Eyob, 202-789-4480 [email protected]


Further FPF materials can be found at:

The Pledge and more information about how to support it are available at http://studentprivacypledge.org/.

studentprivacycompass.org – FERPA|SHERPA aims to provide service providers, parents, school officials, and policymakers with easy access to materials that help guide responsible use of students’ data.

The Ethics of Student Privacy: Building Trust for Ed Tech


Do Beacons Track You? No, You Track Beacons


BuzzFeed News today reports that phone booths in NYC are tracking people and can send them ads.

Let’s explain this rapidly spreading new technology, which we often see described inaccurately.

First, let’s step back and understand how your phone, or the apps you have granted permission to access your location, is able to determine where you are.

In general, there are a number of useful reasons for your phone to know its location. Location can be useful to find a lost phone, for example, or so that apps can provide location-related services like maps, finding friends nearby, or location-specific offers.

Apps can access your phone’s location services – with your permission. iPhone apps must use Apple’s location services, while Google Play apps generally use Google’s. These location services leverage GPS signals and the locations of nearby cell towers to estimate where you are. In recent years, they also began noting the MAC addresses of nearby Wi-Fi networks, adding further precision: estimates can be accurate to within 10 to 12 feet, depending on your proximity to Wi-Fi routers.
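
As a toy illustration of the Wi-Fi piece, the Python sketch below estimates a phone’s position as a signal-strength-weighted average of the known positions of the routers it can hear. Every MAC address and coordinate is invented, and real location services use far more sophisticated models; this only conveys the basic idea.

```python
# Toy Wi-Fi positioning: estimate position as a signal-strength-weighted
# average of the known positions of routers the phone can hear.
# MAC addresses and coordinates are made up.

KNOWN_ROUTERS = {  # MAC address -> (latitude, longitude) from a survey database
    "a4:2b:8c:00:11:22": (37.6213, -122.3790),
    "a4:2b:8c:00:33:44": (37.6215, -122.3788),
    "a4:2b:8c:00:55:66": (37.6211, -122.3786),
}

def estimate_position(scan: dict) -> tuple:
    """scan maps heard MAC addresses to RSSI in dBm (e.g. -40 is strong)."""
    weights, lat_sum, lon_sum = 0.0, 0.0, 0.0
    for mac, rssi in scan.items():
        if mac not in KNOWN_ROUTERS:
            continue  # router not in the database; ignore it
        weight = 10 ** (rssi / 10)  # stronger signal -> much larger weight
        lat, lon = KNOWN_ROUTERS[mac]
        lat_sum += weight * lat
        lon_sum += weight * lon
        weights += weight
    if weights == 0:
        raise ValueError("no known routers in range")
    return (lat_sum / weights, lon_sum / weights)

print(estimate_position({
    "a4:2b:8c:00:11:22": -45,  # close by
    "a4:2b:8c:00:33:44": -70,  # farther away
}))
```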

What’s new is the advent of low-powered beacon technologies, where stores, museums, airports or other spaces set up devices that each broadcast a unique code. If your phone has Bluetooth turned on and you have downloaded an app that you allow to use Bluetooth and location, the app can detect that beacon. If the app has a relationship with the beacon provider, it can determine more precisely where you are. This enables features like displaying information about a museum painting when you are standing in front of that painting, or letting a retailer app, restaurant app or any other app you have enabled send precisely targeted offers tied to your exact location – in front of a display at a store, or in this case, near a specific phone booth. This is possible ONLY if you have given that specific app permission to use notification functions; your phone also allows you to deny an app access to contact you via notifications.

Beacons themselves don’t collect any data. They do not send marketing messages to your phone. They simply broadcast location marks that your phone, and the apps on it, can use to understand more precisely where you are.
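
The advertisement format itself shows why. The Python sketch below parses a made-up beacon advertisement, assuming the widely documented iBeacon layout: the payload carries nothing but identifiers and a calibration constant, with no room for anything about you, and the beacon never receives data back.

```python
import struct

def parse_ibeacon(manufacturer_data: bytes):
    """Parse the manufacturer-specific portion of an iBeacon advertisement.

    The payload is nothing but identifiers and a calibration constant:
    a company ID, a beacon type, a 16-byte UUID, major and minor numbers,
    and the measured transmit power used for distance estimates.
    """
    company_id, beacon_type, length = struct.unpack_from("<HBB", manufacturer_data, 0)
    if company_id != 0x004C or beacon_type != 0x02 or length != 0x15:
        raise ValueError("not an iBeacon advertisement")
    uuid = manufacturer_data[4:20].hex()
    major, minor = struct.unpack_from(">HH", manufacturer_data, 20)
    tx_power = struct.unpack_from("b", manufacturer_data, 24)[0]  # dBm at 1 m
    return uuid, major, minor, tx_power

# A made-up advertisement: Apple company ID, iBeacon type and length,
# then a hypothetical 16-byte UUID, major=1, minor=2, power -59 dBm.
example = (bytes.fromhex("4c000215")
           + bytes(range(16))
           + struct.pack(">HH", 1, 2)
           + struct.pack("b", -59))
print(parse_ibeacon(example))
```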

Unlike mobile location analytics technologies, beacons do not detect or collect MAC addresses. For more info about mobile location analytics, go to smart-places.org.

In short, beacons are not tracking your phone. In fact, your phone is tracking them.

We don’t suggest that beacons or other location technologies raise no privacy issues. Apps using location must do an effective job of explaining what they collect and why. Apps “listening” for Bluetooth beacons require that you grant them location permission “always.” In iOS 8, users who allow apps to always have location will be reminded by the operating system and invited to turn location off or allow the app to continue collecting it. Apps collecting location of any sort should have policies for deleting the history of a user’s location. There is real work to be done in this area to ensure that location services are used in ways that benefit users and properly manage information. A good way to start is by understanding the technology.


Photo: Phil Roeder / Flickr

Facebook Moves Forward with an Ethics Review Panel

Jules and Omer opine in The Hill on today’s announcement from Facebook, describing the move as an “essential accountability mechanism for companies that are struggling to balance the risks and opportunities of big data.”

The Hill

“Facebook’s announcement — establishing guidelines, review processes, training and enhanced transparency for research projects — marks another milestone in the emergence of data ethics as a crucial component of corporate governance programs.

With the proliferation of personal data generated from smartphones, apps, social networks and ubiquitous sensors, companies have come under increasing pressure to put in place internal institutional review processes more befitting of academic philosophy departments than corporate boardrooms.”  Continue at:

http://thehill.com/blogs/pundits-blog/technology/219620-facebook-calls-in-the-philosophers