SIIA/FPF Webinar: Responding to Student Privacy Concerns
Does your company collect or maintain personal student data? Are you looking for a set of guidelines you can adopt to demonstrate your adherence to data privacy best practices?
On October 7, more than a dozen leading K-12 school service providers announced their signing of a pledge to advance student data privacy protection. Since then, two dozen additional organizations have signed the pledge introduced by the Software & Information Industry Association (SIIA) and the Future of Privacy Forum (FPF).
Why is it essential for your organization to also take the pledge? Register today for SIIA’s November 17 webinar, directed to school service providers, to learn background and contextual information, understand specific pledge commitments, garner perspectives of signatory companies and education leaders, and get your questions answered about the pledge and the signature process.
Presenters include:
Inna Barmash, Associate General Counsel, Amplify
Jules Polonetsky, Executive Director and Co-chair, Future of Privacy Forum
Mark Schneiderman, Senior Director of Education Policy, Software & Information Industry Association (SIIA)
Cameron Kerry Queries Whether Law Enforcement Is Really "Going Dark"
Writing in Forbes today, Cam Kerry, formerly of the Department of Commerce and a member of the FPF Advisory Board, discusses some of the challenges facing law enforcement as technology continues to race past the law. In recent weeks, FBI Director James Comey has criticized tech companies like Apple and Google for embracing stronger levels of encryption, encryption that increasingly hampers the ability of law enforcement to get access to information.
Kerry notes that not only did last year’s Snowden revelations make “it hard to argue the U.S. government lacks visibility into communications,” but he also recognizes a fundamental tension between the needs of law enforcement and technological innovation. “[Law enforcement’s] main mission to catch the bad guys constrains the airing of civil liberties and privacy issues that matter to Internet users, providers and others,” he writes. “[F]rom a technical standpoint, the FBI’s front door is a hacker’s or spy’s back door.”
There’s the rub: any lawful intercept solution will be exploited by third parties. Without a basic recognition of this fact, the honest debate the FBI Director is seeking may be difficult to properly frame.
Jules Polonetsky Statement Following Home Depot Announcement
Today, The Home Depot released new findings from its investigation of the company’s recent payment data breach. Jules Polonetsky, Executive Director of the Future of Privacy Forum, had the following statement:
More important than legal compliance after a breach is a company’s efforts to make sure that consumer concerns are addressed. It’s great to see The Home Depot take this extra step of notifying individuals whose email addresses were located in files apparently taken during a previously-reported payment breach. Since passwords and other protected account information weren’t affected, there is no legal obligation for the company to disclose that email addresses have been taken, but clearly the consumers affected will benefit from The Home Depot’s consumer outreach and can be on guard against suspicious emails.
"Big Data: Putting Heat on the Hate" by Chris Wolf and Jules Polonetsky
Today, Re/code ran an essay by Chris Wolf and Jules Polonetsky, marking the fifth anniversary of the signing of the Hate Crimes Prevention Act. The two discussed big data’s ability to put the “heat on hate,” concluding that while “[t]hese are still the early days of development for big data and civil rights . . . it is becoming ever more clear that big data’s value for empowering groups and fighting discrimination is incumbent upon its continued application to novel endeavors.”
Android 5.0, Lollipop: Major New Privacy Features
Earlier this month, Google announced the final release of Android 5.0 Lollipop, also known as Android L. Lollipop includes a number of valuable new privacy features worth special applause.
Default Encryption
New phones and tablets with Lollipop come with encryption automatically turned on to help protect data on lost or stolen devices or from anyone who lacks password access. Android devices running the Jelly Bean and KitKat operating systems have had the capability of encrypting data, but this feature had to be expressly activated by users. Now, all users will automatically benefit from the added protection.
Guest Mode
Lollipop allows others to use your device in a restricted guest mode where they don’t have access to your personal data. In guest mode, first-time guest users will see a newly-installed system with only the stock apps. Guest users can then sync their contacts, email, and photos from their Google account or install apps from the Play Store.
Screen Pinning
Have you ever shown someone a photo, and then watched in a panic as they kept swiping to see other photos on your phone? Screen pinning comes in handy when you have a friend or family member who you want to show something on your phone without letting them see other sensitive information. Screen pinning locks the screen to the content you wish to display – a photo or a video or a specific text screen – until you put in your password.
Android Smart Lock
If you’re tired of entering your PIN or swipe pattern every time you want to use your phone, Smart Lock may be a feature you want to use. You can program any Bluetooth device to be a trusted device. When your phone is within range of a trusted device, the phone will unlock and let you simply swipe to use it. If the phone moves farther away from the trusted device, the phone will go back to being locked and you will need to enter your PIN or swipe pattern. The cool thing about this is that trusted devices do not need to be stationary; users can pair their phones with a wearable or even a car. For example, users can have their phone stay unlocked when near their Fitbit or laptop and can be sure that if they walk away from it, the phone will automatically lock.
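The proximity logic described above can be sketched in a few lines (a hypothetical illustration only; Android’s real Smart Lock lives inside the platform’s trust-agent framework, and the device address below is made up):

```python
# Sketch of Smart Lock-style logic: require credentials only when no
# trusted Bluetooth device is in range. Illustrative, not Android's
# actual implementation.

TRUSTED_DEVICES = {"AA:BB:CC:DD:EE:FF"}  # hypothetical paired "trusted" address

def should_require_credentials(connected_devices: set[str]) -> bool:
    """Ask for the PIN/pattern only if no trusted device is connected."""
    return TRUSTED_DEVICES.isdisjoint(connected_devices)
```

Walking out of Bluetooth range empties the connected set, so the next wake of the screen falls back to the PIN or pattern.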
New Notification Settings
With Lollipop, you will now be able to control the visibility of notifications when the device is locked. You can set notifications from particular friends as ‘sensitive’ or ‘blocked’ to control their visibility. You can also turn on a Priority mode, which acts like a Do Not Disturb sign and allows the phone to only show the most important notifications.
BLE Mac-Address Rotation
Bluetooth MAC addresses broadcast by mobile devices are one of the identifiers often tracked by retailers or other venues in order to create location analytics reports. See FPF’s Smart-Places privacy code for how this works and for opt-out options. In Lollipop, Bluetooth MAC addresses will now rotate when the device is scanning (central mode) and advertising (peripheral mode), making it harder for your phone to be tracked via its Bluetooth MAC address.
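The rotation idea can be illustrated with a short sketch that generates a fresh non-resolvable random private address, the address class the Bluetooth Core Specification defines for exactly this anti-tracking purpose (for illustration only; on an actual device the rotation happens inside the Bluetooth stack, not in app code):

```python
import secrets

def random_private_address() -> str:
    """Generate a non-resolvable random private BLE address.

    Per the Bluetooth Core Specification, the two most significant bits
    of a non-resolvable random private address must be zero; the
    remaining 46 bits are random. A device that switches to a fresh
    address each scan or advertising cycle can't be trivially tracked
    by its MAC address.
    """
    addr = bytearray(secrets.token_bytes(6))
    addr[0] &= 0x3F  # clear the two MSBs -> non-resolvable random private
    return ":".join(f"{b:02X}" for b in addr)
```

Each call yields an unlinkable address, which is why retail analytics systems keyed to a stable Bluetooth MAC stop working against Lollipop devices in central or peripheral mode.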
WiFi SSID Suppression
Many phones broadcast the recent SSIDs of routers to which they have connected, in order to make reconnecting easier. If the last router your phone connected with was your home router, perhaps named “Smith Family home”, this information can be detected by the next WiFi network you join. The new WiFi SSID suppression in Lollipop eliminates the broadcasting of recently connected routers (except for hidden SSIDs, which need to be broadcast for automatic connection).
Encryption by Default is a Very Big Deal
Turning on encryption by default is a major step forward for users and will do a great deal to ensure that the sensitive information on users’ phones can’t be accessed without their permission. Many of these other features also add user-friendly options that should make day-to-day use of Android devices more convenient, more private and more secure.
In popular imagination, big data can apparently do everything and anything. Its evangelists would suggest data holds near magical potential to change the world, while skeptics increasingly worry that it poses the biggest civil rights threat of our generation. Even if reality likely falls somewhere in between, there is little question that the advent of big data has altered our conversations about privacy. Privacy has been in tension with technological advances since Louis Brandeis worried that “recent inventions and business methods”— such as the widespread availability of the Kodak camera to consumers — necessitated a “right to be let alone.” Yet the phenomenon of big data, alongside the emerging “Internet of Things,” makes it ever more difficult to be left entirely alone. The ubiquitous collection and unparalleled use of personal information is breaking down some of society’s most common conceptions of privacy. Law and policy appear on the verge of redefining how they understand privacy, and data collectors and privacy advocates are trying to present a path forward.
In Big Data: Catalyst for a Privacy Change, Joseph Jerome discusses the rise of big data and the role of privacy in both the Fourth Amendment and consumer contexts, and argues that the future of privacy will have to be built upon a foundation of trust.
Comments after the FTC Workshop on Big Data: Tool for Inclusion or Exclusion?
Today, FPF filed an additional set of comments in the wake of the FTC’s fall workshop, Big Data: A Tool for Inclusion or Exclusion? The comments focus on some of the challenges around defining what exactly “Big Data” is, and the increasing need for a firmer ethical framework for conversations about data use. They call on federal agencies to do more to address what legal restrictions already cover potential big data uses, and recommend that industry establish clearer procedures and institutions to figure out what ethical considerations ought to apply in the absence of laws.
Beacons Help Blind Travelers Navigate Airport
San Francisco Airport is testing a beacon system to help blind travelers navigate around one of its new terminals.
Working with beacon company Indoo.rs, SFO has set up hundreds of beacons all over the terminal. Each beacon broadcasts signals using Bluetooth low energy. If a user downloads the app onto his or her smartphone, the app can detect these signals and push notifications with relevant information whenever the user comes within range of a beacon. For the blind or visually impaired, the mobile phone uses voiceover technology to announce points of interest, like bathrooms or nearby coffee shops. If the program is successful, it could be launched throughout the entire airport.
This great use of beacon technology to help blind travelers navigate airports is just one of the reasons we like beacon technologies, when paired with apps providing a clear and explicit opt-in.
Google Taps the YubiKey for Better Account Security
With identity theft and cybersecurity issues in the news seemingly on a daily basis, better tools to protect our data – and our privacy – are always welcome. For some time, FPF has endorsed the use of two-factor authentication as an “extra” step consumers can take to protect their accounts across a variety of online services. While everyone at FPF uses two-factor authentication for everything from our email accounts to our social media networks, two-factor authentication can be cumbersome and inconvenient. Every time one logs into a different account on a different machine, a code has to be retrieved from a mobile phone. Lose the phone, and you have to hope you have a set of paper-based fallback authentication codes.
Enter the physical security key. Yubico’s new “YubiKey” physical security key, for example, supports both USB and NFC communications and makes two-factor authentication as simple (and as fun) as tapping the device itself. The device is supported by the FIDO Alliance, a non-profit group dedicated to creating strong, interoperable authentication standards. The FIDO protocols that YubiKey follows use standard public key cryptography, and build in privacy by design. This means that the security key cannot track you across services as you log in, and that local authentication information never leaves the device.
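The per-service key isolation behind that privacy claim can be shown with a toy sketch of the challenge-response flow. This is illustrative only: real FIDO keys use per-origin public-key pairs and ECDSA signatures, while this stand-in uses a keyed hash so the example runs with the standard library alone. The structural point survives the simplification: the secret never leaves the “device,” and each origin gets an independent key, so services can’t correlate you across log-ins.

```python
import hashlib
import hmac
import os

class ToySecurityKey:
    """Toy stand-in for a FIDO security key (HMAC in place of ECDSA)."""

    def __init__(self):
        self._master = os.urandom(32)  # secret material; never leaves the "device"

    def _key_for(self, origin: str) -> bytes:
        # Derive an independent key per origin: per-service isolation.
        return hmac.new(self._master, origin.encode(), hashlib.sha256).digest()

    def register(self, origin: str) -> bytes:
        # A real key would return a public key; our toy returns the MAC key,
        # which a real protocol would never do.
        return self._key_for(origin)

    def sign(self, origin: str, challenge: bytes) -> bytes:
        # The user taps the key; it "signs" the server's fresh challenge.
        return hmac.new(self._key_for(origin), challenge, hashlib.sha256).digest()

def server_verify(registered: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(registered, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

Because the key for a phishing domain differs from the key for the genuine site, a response produced for the wrong origin fails verification, which is the mechanism behind the phishing resistance described below.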
Today Google announced its support for the key, making its use with the Chrome browser and Google Accounts enrolled in two-factor authentication much easier.
Google has been interested in physical security keys for a while. The company has been using them internally for years, and last year used its experiences to publish an in-depth look at “Authentication at Scale.” In it, Google explains that investing in password alternatives like public-key-based technologies will help users more easily protect sensitive accounts – like their primary email account or a company’s spokesperson account – from common security threats.
After registering the device with your Google Account, the YubiKey can easily be used to sign into Google services by simply inserting the key into a USB port and tapping it when prompted. No more reaching for your cell phone every time you encounter a new computer or clear your cookies. Instead, just plug in and tap! When you’re done, just stick the YubiKey on your keychain and go – it’s waterproof, battery-free and nearly indestructible.
In addition to being potentially easier to use, there are also significant security benefits to using a physical key like YubiKey. Physical security keys can’t be phished and can’t be fooled by fake look-alike websites. They also reduce the threat of man-in-the-middle attacks. And because YubiKey is touch-based, malware can’t silently activate it when you’re looking away. The key can also help users keep an eye out for suspicious activity. For example, when you log onto a computer for the first time, you’ll be prompted to tap. If your account suddenly receives a log-in request from a new location, it will also trigger a tap request. If your YubiKey doesn’t authenticate the request, then your account stays locked and Google can flag the failed log-in attempt. Although this may be a small step for security, it is a huge leap for usability.
The benefits from an enterprise standpoint are obvious, but physical keys also point toward a more secure future for consumers, as well. Online, passwords are roughly analogous to the keys we all use to lock our front doors, but the proliferation of online services and the need for ever-stronger and more varied passwords have overwhelmed consumers. Two-factor authentication has helped to make our cell phones our de facto “keys” to the Internet, but permanent security keys may offer even better online security – and convenience. With Google’s support, FPF looks forward to seeing how devices like the YubiKey develop in the future.
-Kelsey Finch & Joseph Jerome, Policy Counsels
"Databuse" as the Future of Privacy?
Is “privacy” such a broad concept as to be meaningless from a legal and policy perspective? On Tuesday, October 14th, the Center for Democracy & Technology hosted a conversation with Benjamin Wittes and Wells Bennett, frequently of the national security blog, Lawfare, to discuss their recent scholarship on “databuse” and the scope of corporate responsibilities for personal data.
Coming from a world of FISA and ECPA, and the detailed statutory guidance that accompanies privacy in the national security space, Wittes noted that privacy law on the consumer side is vague and amorphous, and largely “amounts to don’t be deceptive and don’t be unfair.” Part of the challenge, as a number of privacy scholars have noted, is that privacy encompasses a range of different social values and policy judgments. “We don’t agree what value we’re protecting,” Wittes said, explaining that government privacy policies rest on values and distinctions, such as national borders and the citizen/non-citizen divide, that mean something.
Important distinctions are much harder to find in consumer privacy. Wittes’ initial work on “databuse” in 2011 was considerably broader and more provocative, applying to all data controllers, first and third party alike, but his follow-up work with Bennett attempted to limit its scope to the duties owed to consumers exclusively by first parties. According to the pair, this core group of duties “lacks a name in the English language” but “describe a relationship best seen as a form of trusteeship.”
Looking broadly at law and policy around data use, including FTC enforcement actions, the pair argue that there is broad consensus that corporate custodians face certain obligations when holding personal data, including (1) obligations to keep it secure, (2) obligations to be candid and straightforward with users about how their data is being exploited, (3) obligations not to materially misrepresent their uses of user data, and (4) obligations not to use it in fashions injurious to or materially adverse to the users’ interests without their explicit consent. According to Wittes, this core set of requirements better describes reality than any sort of “grandiose conception of privacy.”
“When you talk in the broad language of privacy, you promise consumers more than the legal and enforcement system can deliver,” Wittes argued. “If we want useful privacy policy, we should focus on this core,” he continued, noting that most of these requirements are not directly required by statute.
Bennett detailed how data uses fall into three general categories. The first, a “win/win” category, describes cases where the interests of business and consumers align, and he cited the many uses of geolocation information on mobile devices as a good example of this. The second category reflects cases where businesses directly benefit but consumers face a neutral value proposition, and Bennett suggested online behavioral advertising fit into this second category. Finally, a third category covers uses where businesses benefit at consumers’ expense, and he argued that regulatory action would be appropriate to limit these behaviors.
Bennett further argued that this categorization fit well with FTC enforcement actions, if not the agency’s privacy rhetoric. “FTC reports often hint at subjective harms,” Bennett explained, but most of the Commission’s actions target objective harms to consumers by companies.
However, the broad language of “privacy” distorts what harms the pair believe regulators — and consumers, as well — are legitimately concerned about. Giving credit to CDT for initially coining the term “databuse,” Wittes defines the term as follows:
[T]he malicious, reckless, negligent, or unjustified handling, collection, or use of a person’s data in a fashion adverse to that person’s interests and in the absence of that person’s knowing consent. . . . It asks not to be left alone, only that we not be forced to be the agents of our own injury when we entrust our data to others. We are asking not necessarily that our data remain private; we are asking, rather, that they not be used as a sword against us without good reason.
CDT’s Justin Brookman, who moderated the conversation, asked whether (or when) price discrimination could turn into databuse.
“Everyone likes [price discrimination] when you call it discounts,” Wittes snarked, explaining that he was “allergic to the merger of privacy and antidiscrimination laws.” Where personal data was being abused or unlawful discrimination was transpiring, Wittes supported regulatory involvement, but he was hesitant to see both problems as falling into the same category of concern.
The conversation quickly shifted to a discussion of the obligations of third parties – or data brokers generally – and Wittes and Bennett acknowledged they dealt with the obligations of first parties because it’s an easier problem. “We punted on third parties,” they conceded, though Wittes’ background in journalism forced him to question how “data brokers” were functionally different from the press. “I haven’t thought enough about the First Amendment law,” he admitted, but he wasn’t sure what principle would allow advocates to divine “good” third parties and “bad” third parties.
But if the pair’s theory of “databuse” can’t answer every question about privacy policy, at least we might admit the term should enter the privacy lexicon.