Radio Interview – Lauren Smith, FPF Policy Counsel, Discusses the "Textalyzer"

Today, Lauren Smith, FPF Policy Counsel, joined The Takeaway to discuss the legal issues behind the “Textalyzer,” a technology that can tap into a driver’s phone, and whether it is an effective deterrent to texting while driving.

Listen to the full interview.

Challenges with the Implementation of a Right to be Forgotten in Canada

Today, Eloïse Gratton, Partner and National Co-Leader, Privacy and Data Security Practice Group, Borden Ladner Gervais LLP, and Jules Polonetsky, CEO, Future of Privacy Forum, filed a joint submission to the Office of the Privacy Commissioner of Canada (OPC) as part of its consultation and call for essays on online reputation, which ends today (April 28, 2016). The OPC recently chose reputation and privacy as one of its priorities for the next five years and is currently focusing on the reputational risks stemming from the vast amount of personal information posted online, and on existing and potential mechanisms for managing those risks. In January 2016, the OPC published a discussion paper, entitled “Online Reputation: What are they saying about me?”, in which it asks whether a right to be forgotten can find application in the Canadian context and, if so, how.

The paper, entitled “Privacy above all other Fundamental Rights? Challenges with the Implementation of a Right to be Forgotten in Canada,” explores whether it would be advisable to import into Canada a right to be forgotten that would allow individuals to stop search engines from providing links to information deemed irrelevant, no longer relevant, inadequate, or excessive.

Read Eloïse’s full blog describing the paper.

Read an article describing the paper in the Canadian Bar Association’s National Magazine.

The Future of Privacy Forum and EY Examine Speech Recognition and Smart Devices in New Paper

FOR IMMEDIATE RELEASE

April 28, 2016

Contact: Melanie Bates, Director of Communications, [email protected]

 

THE FUTURE OF PRIVACY FORUM AND EY

EXAMINE SPEECH RECOGNITION AND SMART DEVICES IN NEW PAPER

Washington, DC – Today, the Future of Privacy Forum (FPF), in collaboration with Ernst & Young LLP, released Always On: Privacy Implications of Microphone-Enabled Devices, a new paper that explores how speech recognition technology fits into a broader scheme of “always listening” technologies. The paper identifies emerging practices by which manufacturers and developers can alleviate privacy concerns and build consumer trust in the ways that data is collected, stored, and analyzed.

Is your Smart TV listening to your conversations? Are your children’s toys spying on your family? These types of questions are increasingly raised as the next generation of internet-connected devices enters the market.

“While we work to ensure that the appropriate privacy protections are in place, it is important to remember that the benefits of speech recognition are undeniable,” said Jules Polonetsky, CEO, FPF. “Hands-free control of technology improves the lives of people with physical disabilities, makes healthcare and other professional services more efficient through accurate voice dictation, enhances automobile safety, and makes everyday tasks more convenient.”

“Increasingly ‘smart’ devices challenge the product development lifecycle,” said Sagi Leizerov, an Executive Director with Ernst & Young LLP and EY’s Global Privacy leader. “The implications of new features, how those features should be made known to impacted individuals, the decision of what the default setting should be and what privacy controls should be provided, are at the heart of building trust when adopting additional ‘Internet of Things’ solutions in the daily lives of consumers.”

FPF and EY conclude that the colloquial term “always on” is often not an effective way to describe the range of technologies that use audio and video recording hardware. Instead, the paper proposes three general categories of microphone-enabled devices:

(1)  manually activated (requiring a press of a button, a flip of a switch, or other intentional physical action);

(2)  speech activated (requiring a spoken “wake phrase”); and

(3)  always on (devices, such as home security cameras, that are designed to constantly transmit data, including devices that “buffer” so the user can capture only the most recent period of time).
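To make the “buffering” behavior in category (3) concrete, here is a minimal Python sketch of a rolling audio buffer that retains only the most recent window of sound; the chunk and window sizes are illustrative assumptions, not figures from the paper.

```python
from collections import deque

# Illustrative parameters (assumptions, not from the paper):
# audio arrives in 0.5-second chunks and the device retains
# only the most recent 10 seconds.
CHUNK_SECONDS = 0.5
WINDOW_SECONDS = 10
MAX_CHUNKS = int(WINDOW_SECONDS / CHUNK_SECONDS)

# A deque with maxlen discards the oldest chunk automatically,
# so audio older than the window is never retained.
ring_buffer = deque(maxlen=MAX_CHUNKS)

def on_audio_chunk(chunk: bytes) -> None:
    """Called for each incoming audio chunk; older data falls away."""
    ring_buffer.append(chunk)

def capture_recent_window() -> bytes:
    """A user action (e.g., pressing 'save') snapshots only the
    buffered window, not a continuous recording."""
    return b"".join(ring_buffer)
```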

“Ultimately, companies should keep consumers’ expectations in mind when designing the default frameworks of a device,” said Stacey Gray, Legal and Policy Fellow, FPF. “Our expectations will evolve more quickly in some areas than others, and so the manufacturers of devices that are introducing microphones for the first time—like televisions and toys—should go the extra distance to provide additional transparency and in many cases greater levels of control and choice.”

 

###

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. FPF includes an advisory board of leading figures from industry, academia, law, and advocacy groups. Learn more about FPF’s work by visiting www.fpf.org.

Always on: Privacy Implications of Microphone-Enabled Devices

Today, the Future of Privacy Forum (FPF), in collaboration with Ernst & Young LLP, released Always On: Privacy Implications of Microphone-Enabled Devices, a new paper that explores how speech recognition technology fits into a broader scheme of “always listening” technologies. In an opinion piece for Recode, Stacey Gray explains:

Is your smart TV listening to your conversations? Are your children’s toys spying on your family?

These questions are being raised as the next generation of Internet-connected devices enters the market. Such devices, often dubbed “always on,” include televisions, cars, toys and home personal assistants, many of which now include microphones and speech-recognition capabilities.

Voice is an increasingly useful interface for engaging with our devices. Consider the Amazon Echo, which is activated by a spoken wake word (“Alexa”); Mattel’s Hello Barbie; or Apple’s familiar personal assistant Siri, activated by “Hey, Siri.” The growing prevalence of voice as the primary way to interact with devices enables companies to collect, store and analyze increasing amounts of personal data. But consumers don’t always understand when and in what ways these devices are actually collecting information.
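The wake-phrase pattern described above can be sketched roughly as follows; this is a minimal Python illustration, with detect_wake_phrase and stream_for_recognition as hypothetical stand-ins for an assistant’s on-device keyword spotter and cloud recognizer, not any vendor’s actual API.

```python
# Hypothetical names: WAKE_PHRASE, detect_wake_phrase, and
# stream_for_recognition are illustrative, not a real assistant's API.
WAKE_PHRASE = b"hey device"

def detect_wake_phrase(chunk: bytes) -> bool:
    # Toy stand-in: real devices run a small on-device keyword-spotting
    # model here, so pre-wake audio never needs to leave the device.
    return WAKE_PHRASE in chunk

def stream_for_recognition(chunk: bytes) -> None:
    # Hypothetical upload of post-wake audio for full speech recognition.
    print(f"sending {len(chunk)} bytes for recognition")

def listen_loop(microphone):
    """Discard audio locally until the wake phrase is heard, then stream."""
    awake = False
    for chunk in microphone:          # continuous stream of audio chunks
        if not awake:
            # Before the wake phrase, each chunk is examined locally
            # and then discarded, not transmitted.
            awake = detect_wake_phrase(chunk)
        else:
            stream_for_recognition(chunk)
```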

Additional Resources

FPF Testifies at NHTSA Meeting on Autonomous Vehicles

Lauren Smith, FPF Policy Counsel, testified today at the National Highway Traffic Safety Administration’s (NHTSA) second public meeting on autonomous vehicles. The NHTSA is seeking input on planned guidelines for the safe deployment and operation of automated vehicles.

Lauren’s testimony focused on the benefits of autonomous vehicles and the importance of proper data management. “The benefits of facilitating the deployment of autonomous vehicles are so compelling that policymakers should be doing all they can to smooth and speed the way for these technologies to improve as quickly as possible,” Lauren stated. “Applying current best practices around data privacy, paired with existing federal enforcement mechanisms, should facilitate, not stall, this opportunity.”

Read our written testimony.

University of Amsterdam's Summer Course on Privacy Law and Policy

The University of Amsterdam’s Institute for Information Law (IViR) is accepting applications for its fourth annual Summer Course on Privacy Law and Policy, which will be held July 4-8, 2016. The course focuses on privacy law and policy related to the internet, electronic communications, and online and social media, and explores both the broader trends and the most recent developments in this rapidly changing field. The course takes place in De Rode Hoed, a historic building on one of Amsterdam’s most beautiful canals. The interactive seminars will be led by distinguished European and US academics, regulators, and practitioners who will investigate the EU and US legal frameworks and how they operate together. Enrollment is limited to 25 participants.

View additional information and register

For questions, please contact the course organizer Dr. Kristina Irion.

FPF Asks Lawmakers: "Send a Message to States that Privacy is a Priority"

FPF supported the Data Quality Campaign’s (DQC) recent initiative to bring an important student privacy issue to the attention of lawmakers. Signing on with DQC and 20 other education and privacy groups, FPF agrees that it is critical that states have the resources they need to ensure adequate privacy protection for student data. Since states already receive some federal funding to support their State Longitudinal Data Systems (SLDS), one way for states to fund a central privacy function is to be allowed to use those funds for that purpose.

As Rachel Andersen at DQC summarizes:

Why Does this Issue Matter?

Why This Particular Ask?

FPF is pleased to join in this “ask” to federal policymakers to send a clear message to states that privacy is a priority in the collection and use of student data – and to provide an avenue to the resources to support it.

A Visual Guide to Practical Data De-Identification

For more than a decade, scholars and policymakers have debated the central notion of identifiability in privacy law. De-identification, the process of removing personally identifiable information from data collected, stored and used by organizations, was once viewed as a silver bullet allowing organizations to reap data benefits while at the same time avoiding risks and legal requirements.

De-Id Infographic

However, the concept of de-identification has come under intense pressure to the point of being discredited by some critics. Computer scientists and mathematicians have come up with a re-identification tit for every de-identification tat. At the same time, organizations around the world necessarily continue to rely on a wide range of technical, administrative, and legal measures to reduce the identifiability of personal data to enable critical uses and valuable research while providing protection to individuals’ identity and privacy.

The debate around the contours of the term “personally identifiable information,” which triggers a set of legal and regulatory protections, continues to rage, with scientists and regulators frequently referring to certain categories of information as “personal” even as businesses and trade groups define them as “de-identified” or “non-personal.”

The stakes in the debate are high. While not foolproof, de-identification techniques unlock value by enabling important public and private research, allowing for the maintenance and use – and, in certain cases, sharing and publication – of valuable information, while mitigating privacy risk.
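As one small, concrete example of the technical measures referred to above, the sketch below pseudonymizes a direct identifier with a keyed hash (HMAC-SHA256). This is a generic illustration of one point on the identifiability spectrum, not a technique prescribed by the paper; real deployments would add key management and other administrative and legal safeguards.

```python
import hashlib
import hmac

# The secret key must be stored separately from the data set; anyone
# holding it can link pseudonyms back to identifiers, so this is
# pseudonymization, not irreversible de-identification.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (e.g., an email address) with a
    stable keyed hash, so records can still be linked within the
    data set without exposing the identifier itself."""
    digest = hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                      hashlib.sha256)
    return digest.hexdigest()

# The same input always yields the same pseudonym, which preserves
# the ability to do longitudinal analysis on the pseudonymized data.
record = {"user": pseudonymize("alice@example.com"), "visits": 12}
```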

* * *

Omer Tene, Kelsey Finch and Jules Polonetsky have been working to address these thorny issues in a paper titled Shades of Gray: Seeing the Full Spectrum of Practical Data De-Identification. The paper is published in the Santa Clara Law Review.

Read our opinion piece via The International Association of Privacy Professionals.

Accompanying this paper is our Visual Guide to Practical Data De-Identification.

Get updates on FPF’s de-identification papers and projects.

Progress on Drone Privacy Best Practices

Today, the National Telecommunications and Information Administration (NTIA) circulated a Best Practices document proposed by a diverse subgroup of stakeholders, including leading privacy advocates, drone organizations and companies, and trade associations. The proposed Best Practices will be presented and discussed at the next meeting of the NTIA-convened multistakeholder process concerning privacy, transparency, and accountability issues in the commercial and private use of unmanned aircraft systems.

“These proposed draft principles recognize the value of drones for beneficial purposes, but also address in a practical way the privacy concerns they raise. Much careful negotiation and compromise went into ensuring privacy issues could be addressed in a way that is practical, so operators both large and small can comply,” said Jules Polonetsky, CEO, Future of Privacy Forum, a member of the subgroup proposing the Best Practices.

Google Provides Open Source Platform for Beacon Security

After an initial splash, news about beacon technology has been fairly quiet recently, but last week an advance was announced that makes privacy and security capabilities easier to build into this technology.

Beacons are sometimes misunderstood: they are thought to collect or retain data on nearby people, or to track smartphone movements without their owners’ awareness. In fact, beacons only transmit data; they never collect it. And location tracking is possible only if you have given a specific app permission to use your phone’s location functions and have Bluetooth turned on. You can control both in your phone’s settings, and you can deny an app permission to contact you via notifications.

The use of low-power beacons has spread slowly and steadily – stores, museums, airports, and other spaces set up a device that broadcasts a unique code. If your phone has Bluetooth turned on, you have downloaded the app for that location, and you have granted the app permission to use Bluetooth and location, the app can detect that beacon. By determining the location of your phone, the app can then enable location-based features.

Beacons positioned near an airport security checkpoint might trigger your airline’s app to show your boarding pass. A beacon in a museum might signal the museum app to show information about the artist of a painting you’re looking at. Retail-store beacons may help users locate products or indicate sale items. Beacons are inexpensive, simple to deploy and are supported by most mobile operating systems.
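For the technically curious, here is a short Python sketch that parses the service-data payload of an Eddystone UID frame, the open frame format behind the “unique code” described above. The field layout follows the public Eddystone specification; the sample bytes are made up.

```python
import struct

EDDYSTONE_UID = 0x00  # frame-type byte for a UID frame

def parse_eddystone_uid(service_data: bytes) -> dict:
    """Parse the service data of an Eddystone UID frame
    (layout per the open Eddystone specification):
      byte 0       frame type (0x00 = UID)
      byte 1       calibrated TX power at 0 m (signed dBm)
      bytes 2-11   10-byte namespace ID
      bytes 12-17  6-byte instance ID
    """
    frame_type, tx_power = struct.unpack_from(">Bb", service_data, 0)
    if frame_type != EDDYSTONE_UID:
        raise ValueError("not a UID frame")
    return {
        "tx_power_dbm": tx_power,
        "namespace": service_data[2:12].hex(),
        "instance": service_data[12:18].hex(),
    }

# Made-up example payload: type 0x00, TX power -20 dBm,
# then a 10-byte namespace and a 6-byte instance.
sample = bytes([0x00, 0xEC]) + bytes(range(10)) + bytes(range(6))
print(parse_eddystone_uid(sample))
```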

Since their introduction on a broader scale, retailers, shopping centers, public attractions, airports, and sports arenas have explored how to use beacons in many new and different ways. As consumers have become more familiar with beacons, they have grown to enjoy the benefits of a more personalized experience.

However, one continuing challenge to a broader array of beacon applications is security. Beacons work well for a standardized response triggered by any member of the public who enters their zone, but applications that must protect information in order to deliver an individualized response have had only limited protection available. Since unencrypted beacon signals are also susceptible to long-term tracking, this security shortfall has limited the pace of adoption.

Last week, however, Google announced an open-source answer with the rollout of Eddystone-EID. Per the design team, other companies may have similar technologies, but they are proprietary, with little transparency into how the encryption is achieved. This is where Eddystone-EID shines: its technical specifications are open source.

According to Google’s statement:

“Eddystone-EID enables a new set of use cases where it is important for users to be able to exchange information securely and privately. Since the beacon frame changes periodically, the signal is only useful to clients with access to a resolution service that maps the beacon’s current identifier to stable data. In other words, the signal is only recognizable to a controlled set of users.”

Google has developed the entire Eddystone suite as open-source technology, available on GitHub. This newest addition – EID – turns the beacon’s broadcast into an encrypted, moving target: the identifier changes periodically, and to any phone in the area without the shared key, the EID is just gibberish. With the new tools, the exchange can’t be tracked or spoofed, and developers also gain access to safety features such as proximity awareness, device authentication, and encryption of transmitted data.
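The rotating-identifier idea can be sketched as follows. This simplified Python illustration substitutes a truncated keyed hash (HMAC) for the AES-based construction in the actual Eddystone-EID specification, but it shows why only a resolution service holding the shared key can map the changing broadcast back to stable data; the rotation period and registry contents are made-up examples.

```python
import hashlib
import hmac
import time

ROTATION_SECONDS = 512  # illustrative rotation period (assumption)

def current_eid(shared_key: bytes, now: float = None) -> bytes:
    """Simplified rotating identifier: a truncated keyed hash of a time
    counter. (The real Eddystone-EID construction uses AES; this HMAC
    stand-in just illustrates the rotate-and-resolve idea.)"""
    counter = int((now if now is not None else time.time()) // ROTATION_SECONDS)
    mac = hmac.new(shared_key, counter.to_bytes(8, "big"), hashlib.sha256)
    return mac.digest()[:8]  # Eddystone-EID broadcasts an 8-byte identifier

# A resolution service holds each beacon's key, so it alone can map the
# current broadcast back to stable data; to everyone else it is gibberish.
REGISTRY = {b"museum-beacon-key": "Gallery 3: information about the painting"}

def resolve(broadcast: bytes):
    for key, stable_data in REGISTRY.items():
        if hmac.compare_digest(current_eid(key), broadcast):
            return stable_data
    return None  # unknown key: the signal cannot be interpreted
```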

Now, in addition to being able to find your way around the airport, you will be able to track your luggage without anyone else knowing which bag is yours. During sporting events, the facility can communicate with individual patrons in the “nosebleed” sections to offer them better seats, when available. A UK company will use this to offer subscribers personalized commuting information.

Introduced along with other new offerings – Beacon Tools and the Eddystone GATT service – this new open-source platform for secure encryption represents an important moment in beacon technology for the increased security and protection of personal data.