IoT Devices Should Deal with Privacy Impacts for People with Disabilities

FOR IMMEDIATE RELEASE

January 31, 2019

FPF Recommends Approaches to Incorporate Privacy, Accessibility by Design

WASHINGTON, DC – The Future of Privacy Forum today released The Internet of Things (IoT) and People with Disabilities: Exploring the Benefits, Challenges, and Privacy Tensions. The paper explores the nuances of privacy for people with disabilities who use IoT services and offers recommendations to address considerations such as transparency, individual control, respect for context, focused collection, and security.

“Internet of Things devices in homes, cars and on our bodies can improve the quality of life for people with disabilities—if they are designed to be accessible and account for the sensitive nature of the data they collect,” said Jules Polonetsky, CEO of the Future of Privacy Forum. “We expect this first-of-its-kind paper to inspire collaboration among advocates, academia, government, and industry to ‘bake in’ privacy and accessibility from the start of the design process.”

“Data-driven innovation has created new tools that can improve disabled people’s safety, mobility, and independence, leading to enhanced privacy,” said Henry Claypool, Policy Director of the Community Living Policy Center at the University of California, San Francisco, Technology Consultant to the American Association of People with Disabilities and FPF Senior Fellow. “However, companies and advocates should recognize that the IoT can bring unique privacy considerations.”

FPF recommends that companies and policymakers take the following steps to improve the experiences of people with disabilities who use IoT-enabled devices and to respect their privacy:

  1. Prioritize inclusive design. Accessibility and the privacy of people with disabilities should not be an afterthought for IoT and new technology developers—people with disabilities should be included in the design of IoT technologies, and accessibility should be integrated at the earliest possible stage of design.
  2. Promote research. In order to successfully build the IoT with universal or accessible design, research—both qualitative and quantitative—is needed to understand how people with disabilities utilize the IoT and feel about the current privacy landscape of the IoT.
  3. Privacy by Design approaches should consider people with disabilities. Companies should take into account the sensitive nature of the data collected by the IoT devices that people with disabilities use and address those considerations in the design of IoT products.
  4. Foster cross-sector collaborations. Advocates, academia, government, and industry should work together to develop IoT solutions that meet the needs of people with disabilities.
  5. Enhance awareness of data risks and benefits. Policymakers should consider not only the potential enhanced risks that people with disabilities face when using the IoT, but also the enhanced autonomy that these very same technologies provide. Members of the disability community should consider becoming engaged in policy processes and voicing their views on the privacy challenges that they face when using IoT devices and services.

IoT devices and services are empowering people with disabilities to participate more fully and autonomously in everyday life by reducing some needs for human intermediaries or accommodations. In addition to the potential benefits of IoT devices and services for people with disabilities, unique privacy risks and challenges can be raised by the collection, use, and sharing of user data. Depending on the circumstances, privacy can be enhanced or diminished by IoT technologies, creating potential tensions between privacy gains and losses.

FPF received support for the paper from the Comcast Innovation Fund and consulted with the American Association of People with Disabilities (AAPD) Technology Forum.

FPF and Comcast Innovation Fund host event today in Washington, DC

Today, Thursday, January 31, 2019, 4:30-5:30pm ET, FPF and the Comcast Innovation Fund are hosting an event about the IoT and people with disabilities at the XFINITY Store in Chinatown, 715 7th St. NW, Washington, DC 20001. Remarks and a panel discussion will be followed by audience Q&A, refreshments and networking. The remarks and panel discussion will be streamed via Facebook Live at https://www.facebook.com/FutureofPrivacy/.

###
The Future of Privacy Forum is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.

Media Contact:

Nat Wood

[email protected]

410-507-7898


FPF's John Verdi on Privacy Papers for Policymakers

In recognition of the Future of Privacy Forum’s 10th anniversary, FPF policy experts are sharing their thoughts on FPF’s work over the past decade, the current privacy landscape, and their vision of the future of privacy. This week, FPF Vice President for Policy John Verdi discusses the Privacy Papers for Policymakers project, which began in 2010. To read previous installments in this series, click here.

What do you expect the next 10 years of privacy to look like? Share your thoughts by clicking here.


Q&A: John Verdi on Privacy Papers for Policymakers

FPF’s Privacy Papers for Policymakers program brings expertise from academic, tech and policy circles to Members of Congress, leaders from executive agencies, and their staffs to better inform policy approaches to thorny data protection issues. The event highlights the year’s most influential, practical academic work and connects academics with thought leaders from government, industry, and the advocacy community. Awarded articles are chosen both for their scholarly value and because they offer policymakers concrete solutions and practical insights into real-world challenges. Winners are selected by a diverse team of academics, advocates, and industry privacy professionals from FPF’s Advisory Board. Honorary Co-Hosts Senator Edward J. Markey and Congresswoman Diana DeGette will host FPF and this year’s winning authors as they present their work in the Russell Senate Office Building at 5:30pm on February 6, 2019. The event is free, open to the general public, and widely attended. A reception will follow. To RSVP, please visit privacypapersforpolicymakers.eventbrite.com.

How have the award-winning privacy papers changed over the last nine years?

The first award winners tended to deal with what we now consider broad topics in corporate privacy practices – the role of Chief Privacy Officers and federal law enforcers, how to value privacy, regulatory innovation, and so on. While those issues are still quite pertinent, recent award winners have tended to examine more precise aspects of privacy, specific technical advances, or policies at the local, national, or international levels. A few examples from this year’s winners:

What’s been consistent from the beginning?

For nine years, the research and analysis into consumer beliefs, corporate practices, technological solutions and legal theory compiled in FPF’s Privacy Papers for Policymakers has informed the policy debate in Congress, in the states, and around the world. They are a valuable tool for legislators and staff considering the structure and elements of a national privacy framework.

Many of the papers have dealt with calls for an effective national privacy law in the US. In fact, here is the first line of the first Privacy Paper for Policymakers recognized in 2010 by the Future of Privacy Forum. “Privacy on the Books and on the Ground,” by Kenneth Bamberger and Deirdre Mulligan:

“U.S. privacy law is under attack. Scholars and advocates criticize it as weak, incomplete and confusing, and argue that it fails to empower individuals to control the use of their personal information…”

They continued, “as Congress and the Obama Administration consider privacy reform, they encounter a drumbeat of arguments favoring the elimination of legal ambiguity by adoption of omnibus privacy statutes, the EU’s approach.” If you substitute “Trump” for “Obama” you could write the same words today; you would also find experts making many of the same arguments against imposing the top-down, prescriptive aspects of the EU’s approach in the US.

What are some of the research techniques authors use?

We’ve honored papers from academics, practitioners, technologists and lawyers, which means we’ve seen a wide range of approaches to research and analysis.

Some survey the privacy landscape and make recommendations based on the real-world practices they discover. For Shattering One-Way Mirrors. Data Subject Access Rights in Practice, Jef Ausloos (Postdoctoral Researcher, University of Amsterdam’s Institute for Information Law) and Pierre Dewitte (Researcher, KU Leuven Centre for IT & IP Law) contacted sixty information service providers and requested access to data. They concluded that data access rights in the EU are largely underused and not properly accommodated. Their research not only uncovered what they called an “often-flagrant lack of awareness, organization, motivation, and harmonization,” but also identified concrete suggestions aimed at data controllers, such as relatively easy fixes in privacy policies and access rights templates.

For Designing Without Privacy, Ari Ezra Waldman (Professor of Law and Founding Director, Innovation Center for Law and Technology at New York Law School) conducted an ethnographic study of how, if at all, people designing technology products think about privacy, integrate privacy into their work, and consider user needs in the design process. His paper references and expands upon the work of Kenneth Bamberger and Deirdre Mulligan – work that FPF recognized as one of our first award winners in 2010. Professor Waldman looks at how CPOs’ robust privacy norms can best be diffused throughout tech companies and the industry as a whole.

What’s next for Privacy Papers for Policymakers?

The Privacy Papers for Policymakers program will continue to highlight top scholarship, promote pragmatic solutions to privacy challenges, and generate thoughtful dialogue in Washington DC. We look forward to promoting constructive approaches to data protection around legislative drafting tables and in corporate boardrooms.

Most immediately, we are looking forward to a fantastic event honoring this year’s winning authors. On February 6, 2019, at 5:30pm, FPF and Honorary Co-Hosts Senator Edward J. Markey and Congresswoman Diana DeGette will host the winning authors as they present their work in the Russell Senate Office Building. The event is free, open to the general public, and widely attended. To RSVP, please visit privacypapersforpolicymakers.eventbrite.com.

FPF's Amelia Vance on the Future of Student Privacy

Amelia Vance, Policy Counsel and Director of the FPF Education Privacy Project, is one of the foremost experts in the nation on education privacy. She has a knack for making complex regulations and technical trends accessible to individuals who are not lawyers or computer scientists, but who care deeply about student privacy – school administrators, parents, students, and others. This skill is very much in demand; whether testifying before Congress or sharing her expertise at conferences across the country, Amelia has been a valuable resource for anyone who wants to understand how federal and state laws impact data practices in the classroom. In this installment of our 10th anniversary series, Amelia discusses FPF’s work in education, the current student privacy landscape, and the debates of the future.

Why do you like working on student privacy?

Student privacy is a microcosm of every privacy issue out there, except we’re talking about kids, which makes things so much more sensitive. FPF works on algorithms and ethics, health privacy, research privacy, IoT, online trackers – which are all part of the student privacy landscape. And because children are recognized – both legally and developmentally – as especially vulnerable, any privacy discussions must be nuanced and thoughtful because getting it wrong means you can end up derailing a child’s future. Student privacy requires not only legal expertise, but also the ability to put yourself in the shoes of a parent, a teacher, an edtech company, or other stakeholders so you can figure out their concerns and how best to respond to them. Framing the conversation correctly is just as important as getting the policies right.

What sort of work has FPF done on education privacy over the past 10 years? What challenges have arisen during that time?

FPF kicked off its education privacy program in 2014 with the launch of the Student Privacy Pledge. That year, over 100 student privacy bills were introduced in 39 states; the prior year, only one student privacy law had passed. This flurry of legislation highlighted a major gap in law and resources on this issue; the federal student privacy law, FERPA, was passed in 1974, and if you get 15 FERPA experts in a room, they’ll come up with 16 interpretations of any provision. The new state laws were creating mandates for states, districts, and companies without providing the resources and training necessary to implement the laws with fidelity. FPF decided to step in and worked with partners like the Data Quality Campaign, the Software and Information Industry Association (SIIA), ConnectSafely, and the National PTA to create actionable resources for different audiences.

The Student Privacy Pledge has been one of our most successful projects. Co-founded with SIIA, the Pledge is a Federal Trade Commission-enforceable code of conduct for edtech vendors. Now with nearly 330 companies as signatories, the Pledge was designed to both raise awareness of best practices and facilitate their implementation.

We are also particularly proud of FERPAǀSherpa, the student privacy resources website. Whether you’re a student, parent, educator, administrator, policymaker, or higher education staffer, we hope to make the vast student privacy landscape more understandable. The explosion in state privacy laws has made this a challenge, but it’s one we embrace. My primary goal is that everything we release be useful and move the student privacy conversation forward.

What should our readers know about the current student privacy landscape?

39 states and DC have passed 125 new laws since 2013, so right now most stakeholders are focused on implementation and seeing how these laws play out on the ground. There are now 450+ resources on student privacy to help stakeholders on the issue, but few state legislatures provide funding and training to districts and state education agencies to implement student privacy best practices. We have also seen many unintended consequences play out over the past few years. It is unfortunately easy to mess this up – for example, a complete ban on selling student data can result in banning school pictures! One state’s law made parents opt into almost all data sharing and caused some schools to stop announcing football players’ names, hanging student artwork in the hallways, and even referring some students to the state scholarship fund. It’s easy to get lost in sensationalism and misunderstandings when discussing issues that affect children; our work injects nuance and informed analysis into the public debate.

What about the next 10 years? What privacy challenges can we expect to emerge in this space?

Now that most of the new student privacy laws have been in place for a couple years, we are likely to start seeing enforcement actions – which, in some states, could mean jail time. There are fewer big student privacy bills being introduced in states at this point, but we’re seeing more legislation with idiosyncratic student privacy requirements that could trip up schools or edtech companies. We also see legislation that should have privacy requirements but doesn’t – that’s a big issue with school safety legislation! We’re also likely to see a re-write of FERPA pass in Congress at some point in the next five years. Finally, as we’ve seen over the past year, the privacy conversation has now spread past education into the general news; this means edtech companies will have to attempt to reconcile the burgeoning universe of consumer privacy law with parallel developments in education. There will likely be times when legal obligations conflict, and it will be interesting to see how well legislators take the lessons learned from student privacy laws to avoid unintended consequences in general consumer privacy laws.

 

The Future of Mobility in a Connected World: FPF’s Lauren Smith on Connected Cars, Data, and more

FPF is celebrating our tenth anniversary. In recognition of this milestone, FPF policy experts will be sharing their thoughts on FPF’s progress and the work ahead in a series of blogs over the coming weeks. Our 10th anniversary celebration will be on April 30. RSVP here.

This week, Policy Counsel Lauren Smith discusses connected cars and the future of mobility.

What do you expect the next 10 years of mobility and privacy to look like? Share your thoughts by clicking here.


FPF Policy Counsel Lauren Smith didn’t see herself as a “car person” when she joined FPF in 2016. Three years later, she leads FPF’s Data and Mobility Working Group and regularly shares her expertise on the subject in speaking engagements, media interviews, and with state and federal regulators, among other stakeholders. She recently answered some questions on the state of connected cars, FPF’s work in the space, and the future of mobility.

So, what drew you to work on data and mobility at FPF? How did it differ from your prior work?

I had been advising on tech policy, privacy, and big data at the White House Office of Science and Technology Policy, where I contributed to a report on Big Data and Privacy. Those efforts raised several important questions around privacy and data access, the emergence of new technologies, and the social and ethical impact of these advancements. When I got to FPF, I was fascinated to see how rapidly the auto industry was changing, but I also noticed that it faced many of the same issues as the other “Internet of Things” technologies I was familiar with in my previous work. The maxim at most of the conferences I attend now is that we will see more change in the transportation industry over the next five years than we’ve seen in the past 50. Much of this transition is driven by technological advances and opportunities presented by data, and it has been fascinating to get to know an industry during such a unique period.

What sort of work has FPF done on connected cars over the past 10 years? What challenges have arisen during that time?

Data collection in cars isn’t new; computer systems have been common in most cars since at least the 1990s. The biggest change has been an explosion in the variety, volume, and connectivity of the data collected. As Americans, we tend to associate cars with personal autonomy, but we need to start thinking about cars the way we think about our smartphones, rather than as mechanical chassis that get us from point A to point B.

A few years ago, the auto industry anticipated these changes, and nearly every automaker committed to a set of privacy principles for auto data that enable the benefits of these new technologies while establishing baseline privacy protections for consumers. FPF contributed to the effort to establish these principles and has remained one of the only groups deeply focused on this space. Our work has included creating a Consumer Guide to the Connected Car that auto dealers can hand to consumers; leading efforts to map the vehicle data ecosystem through projects like Data and the Connected Car 1.0; filing comments in federal, state, and local regulatory efforts; hosting convenings for thought leaders in this space throughout the U.S. and Israel; and continuous media and public speaking efforts to better educate consumers, regulators, and lawmakers alike on new developments. My TEDx talk was a fun highlight that allowed us to reach new audiences. It has been a pleasure learning from and working with our Data and Mobility Working Group members, who include representatives from auto manufacturers, ridesharing companies, mapping and telecommunications firms, mobility startups, and more.

What are some current hot topics in mobility and connected cars?

Right now, we’re seeing a rapidly changing industry encounter a series of new scenarios. First, the mobility ecosystem includes so much more than connected cars. Scooters and shared bikes have grown immensely popular over the past few years, and they produce data that is increasingly of interest to state and local regulators. Mobility data has the potential to make our city transportation infrastructure and planning far more efficient, but any such efforts need to be thorough in creating credible privacy regimes around the data they collect.

Next, we’re seeing a growing number of sophisticated technologies, but we still haven’t answered some basic questions around mobility data, such as who can access it, who manages the consumer relationship in a vehicle, which of this data is “personal,” and how to efficiently wipe basic personal data when vehicles are transferred between users. At the same time, we’re seeing regulators who care about privacy trying to ascertain how to regulate such a new space without limiting the opportunities that these technologies can bring. As the industry navigates its compliance with new privacy laws like GDPR and CCPA, I’m hoping we will see new tools and have the needed conversations to ensure we build a trusted mobility data ecosystem.

As these conversations evolve, cars are gaining powerful on-board processing systems and advanced sensor technologies, with a growing ability to protect safety, monitor happenings inside the car, and gather information on driving habits and consumer preferences. We emphasize the word “ecosystem” because there are a growing number of entities involved. The impact of these technologies will extend far beyond just automakers, impacting the insurance industry, mapping companies, telecommunications providers, public transit, city planners, and more. As driver assistance and autonomous features enable serious safety benefits across our roads, we think it’s our job to ensure we can build a privacy-protective ecosystem that supports them and that earns consumer trust.

What about the next 10 years? What technological advancements and privacy challenges can we expect to emerge in this space?

It’s exciting to look at this space 10 years out. I get asked these questions a lot: Will there be flying cars? Will my 3-year-old ever need a driver’s license? The only constant right now is change, and it’s raising a lot of really interesting questions.

By some estimates, the global revenue pool from connected car data is expected to hit $750 billion by 2030. The companies using and generating this data will face many of the same questions around privacy and advertising and data sharing and access that we’ve tackled in other FPF verticals. The good news is that we aim to help the mobility industry learn from other sectors that have already tackled similar issues surrounding data privacy management and regulatory infrastructure. For example, our mobility work has often merged with our location work recently, and I expect that to continue as we face questions around sensitivity of the geolocation that is fundamental to mobility technologies.

I would expect to see major developments not just in advanced driver assistance systems and autonomous vehicles, but also in vehicle-to-vehicle and vehicle-to-infrastructure communication. More advanced sensor technology using lidar, high-definition mapping, radar, and video sensors, paired with greater processing power and advanced connectivity options, will enable the collection and transmission of significant datasets that will need to be privacy-protected to ensure consumer trust in the technology. Vehicles will be more connected to each other and to the larger transportation grid as a result, and they may be managed by fleets rather than individual owners. These advancements will save lives and make transportation more efficient, but they will also pose challenges for regulators, policymakers, and consumers.

I can’t wait to see how this space changes going forward—there is rarely a dull moment and it’s a pleasure to navigate these exciting questions.


What do you expect the next 10 years of connected cars and privacy to look like? We’d love to hear from you on this subject or any other thoughts you have on privacy.

Advisory Board Reviewers: PPPM 2018

Each year, FPF awards the Privacy Papers for Policymakers Award to the authors of leading privacy research and analytical work that is relevant to policymakers in the United States Congress, at U.S. federal agencies, and for data protection authorities abroad. The Award showcases work that analyzes current and emerging privacy issues and proposes achievable solutions or new means of analysis that could lead to real-world policy impact.

PPPM submissions receive an initial ranking from our Advisory Board Reviewers — a diverse team of academics, consumer advocates, and industry privacy professionals.

Winning Authors are invited to join FPF in Washington, DC to discuss their work at the United States Senate with policymakers, academics, and privacy professionals. This year, Privacy Papers for Policymakers will be held at 5:30 PM on February 6, 2019 in Room SR-325 (Kennedy Caucus Room), Russell Senate Office Building. For more information and to register, click here.

FPF would like to extend a special Thank You to our 2018 Advisory Board Reviewers, including:

Eduard Bartholme, Call For Action

Monica Bulger, Future of Privacy Forum

Maureen Cooney, Sprint

Philip Fabinger, HERE Technologies

Jonathan Fox, Cisco

Dona Fraser, The Children’s Advertising Review Unit 

Claire Gartland, Facebook

Lauren Gelman, BlurryEdge Strategies

Scott Goss, Qualcomm

John Grant, Palantir

Rita Heimes, International Association of Privacy Professionals 

Joseph Jerome, Center For Democracy & Technology

Barbara Lawler, Looker Data Services

Knut Mager, Novartis

Magnolia Mobley, LegalMatters, LLC

Lisa Martinelli, Highmark Health

Estelle Masse, Access Now

Drew Mitnick, Access Now

Robyn Mohr, Loeb & Loeb, LLP

Vivek Narayanadas, Shopify

Kara Selke, StreetLight Data

Amie Stepanovich, Access Now

Thomas van der Valk, Facebook

Heather West, Mozilla

Thank you for all your hard work!

 

Spotlight on PPPM Finalist Judges (2018)

On December 17th, the Future of Privacy Forum announced the winners of the 2018 Privacy Papers for Policymakers Award. Each year, FPF awards the Privacy Papers for Policymakers Award to the authors of leading privacy research and analytical work that is relevant to policymakers in the United States Congress, at U.S. federal agencies, and for data protection authorities abroad.

The goal of the Award is to advance academic-industry collaboration by showcasing work that analyzes current and emerging privacy issues and proposes achievable solutions or new means of analysis that could lead to real-world policy impact.

How are PPPM papers chosen?

This year, Privacy Papers for Policymakers will be held at 5:30 PM on February 6, 2019 in Room SR-325 (Kennedy Caucus Room), Russell Senate Office Building. For more information and to register, click here.

Finalist Judges:

Our Finalist Judges for 2018 include representatives from FPF, as well as one representative from Academia, one from Consumer Advocacy, and one from Industry.

Judges include Jules Polonetsky, CEO, Future of Privacy Forum; Christopher Wolf, Founder and Board Chair, Future of Privacy Forum; Mary Culnan, Professor Emeritus, Bentley University, and Board Vice President, Future of Privacy Forum; John Breyault, Vice President of Public Policy, Telecommunications and Fraud, National Consumers League; and Mark MacCarthy, Senior Vice President, Public Policy, Software & Information Industry Association.

More on our PPPM Judges:

Jules Polonetsky

CEO, Future of Privacy Forum


Jules serves as CEO of the Future of Privacy Forum. Jules’ previous roles have included serving as Chief Privacy Officer at AOL and, before that, at DoubleClick; as Consumer Affairs Commissioner for New York City; as an elected New York State legislator; as a congressional staffer; and as an attorney. Jules serves on the Advisory Board of the Center for Copyright Information. He has served on the boards of a number of privacy and consumer protection organizations including TRUSTe, the International Association of Privacy Professionals, and the Network Advertising Initiative. From 2011-2012, Jules served on the Department of Homeland Security Data Privacy and Integrity Advisory Committee. In 2001, Crain’s NY Business magazine named Jules one of the top technology leaders in New York City. Jules is a regular speaker at privacy and technology events and has testified or presented before Congressional committees and the Federal Trade Commission.

 

Mary Culnan

Professor Emeritus, Bentley University

Vice President, Future of Privacy Forum Board of Directors

Dr. Mary J. Culnan is Professor Emeritus at Bentley University. She also serves as a Senior Research Fellow in the Center for IT and the Global Economy (CITGE) at the Kogod School of Business, American University. Mary has testified before Congress, the Massachusetts Senate, and other government agencies on a range of privacy issues. Mary’s primary research interest is governance of privacy and security. She has also conducted research on how organizations can gain value from social media. Mary’s work has been published in a range of academic journals as well as the New York Times, the Washington Post and the Wall Street Journal. Mary was employed for seven years as a systems analyst by the Burroughs Corporation prior to earning her Ph.D. in management from UCLA. Before joining the faculty at Bentley in fall 2000, she held faculty positions at the University of Virginia, University of California, Berkeley, the American University and Georgetown University.

 

Christopher Wolf

Founder and Board Chair, Future of Privacy Forum

Christopher Wolf is the founder and Board Chair of the Future of Privacy Forum. Chris is also a senior partner in the Washington, DC office of Hogan Lovells LLP, where he is a leader of that firm’s Privacy and Information Management practice. He has been in private law practice in Washington, DC since 1982. Chris has served as an adjunct law professor on Internet and privacy law, and is a frequent lecturer in continuing legal education programs on the subject.

MSNBC called Chris Wolf a “pioneer in Internet law”, reflecting his involvement in some of the earliest and precedent-setting cases involving technology agreements, copyright, domain names, jurisdiction — and privacy. As the ability to collect, store, share and transfer personal information over the Internet increased, privacy became the main focus of Chris’ law practice, and Chris became known as a pioneer in privacy law too. It was for that reason that the prestigious Practising Law Institute (PLI) tapped Chris to be Editor and Lead Author of its first-ever treatise on privacy law. He also is co-editor of the PLI book, “A Practical Guide to the Red Flag Rules”, the identity theft prevention regulations issued by the FTC and financial regulators.

John Breyault

Vice President of Public Policy, Telecommunications and Fraud, National Consumers League

John joined the National Consumers League — America’s oldest consumer organization — in September 2008. His focus at NCL is advocating for stronger consumer and worker protections before Congress and federal agencies on a range of issues including telecommunications and technology policy, fraud, and consumer financial protections. In addition, John directs NCL’s Fraud Center, an online hub for consumer education and advocacy related to fraud.

Prior to coming to NCL, John was Research Director at the Telecommunications Research and Action Center (TRAC), a non-profit consumer organization dedicated to promoting the interests of telecommunications consumers. Concurrent with his work at TRAC, John was Director of Research at Amplify Public Affairs (APA) where he helped launch the firm’s Web 2.0-based public affairs practice.

Prior to joining APA, John worked at Sprint in its International Carrier Services Division, at BellSouth in its Government Affairs office and at the American Center for Polish Culture. John has served on numerous Boards and advisory committees including the Federal Communications Commission’s Consumer Advisory Committee, the Commodity Futures Trading Commission’s Technology Advisory Committee and the Board of the Arlington-Alexandria Coalition for the Homeless.

 

Mark MacCarthy

Senior Vice President, Public Policy, Software & Information Industry Association

Mark MacCarthy is an adjunct faculty member in the Communication, Culture & Technology Program at Georgetown University. He teaches courses and conducts research in information privacy, AI and the future of work, global Internet freedom, algorithmic fairness and the development of electronic media. He also teaches courses in philosophy & privacy, philosophy & free speech, and AI and ethics in the Philosophy Department. He is an Affiliate of the Center for Business and Public Policy at Georgetown’s McDonough School of Business. He is also Senior Vice President for Public Policy at the Software & Information Industry Association, where he advises member companies and directs public policy initiatives in technology policy, information privacy, trade, Internet governance, intellectual property and educational technology. He has been a consultant on technology policy issues for the Organization for Economic Cooperation and Development and for the Aspen Institute. His previous public policy experience includes senior positions with Visa, Inc., the Wexler|Walker Group, Capital Cities/ABC and the Energy and Commerce Committee of the U.S. House of Representatives. He holds a Ph.D. in philosophy from Indiana University and an MA in economics from the University of Notre Dame.

Thank you to our 2018 PPPM Finalist Judges!

GDPR: A Year On – IEEE calls for articles

Do you have an interesting perspective on Europe’s General Data Protection Regulation or insightful information about GDPR to share? IEEE Security and Privacy seeks articles from scholars and practitioners from various disciplines and countries to examine GDPR: A Year On. Successful submissions will address (among other topics) the GDPR’s:

• position at the intersection of law and technology;

• global impact;

• implications for global multinationals and for small and medium size enterprises;

• implementation by engineers, economists, and lawyers;

• potential macroeconomic and competitive impact; and

• effect on debates about ethics beyond the law.

Submissions are due by March 1, 2019, with publication in November/December 2019. Articles should be understandable to a broad audience of people interested in security and privacy, and run between 4,900 and 7,200 words. IEEE’s website has more information about submission requirements, or you can email the guest editors at [email protected].

Guest editors for the special issue are:

• Omer Tene, IAPP and FPF (lead editor)

• Katrine Evans, Hayman Lawyers

• Bruno Gencarelli, European Commission

• Gabe Maldoff, Bird & Bird

• Gabriela Zanfir-Fortuna, FPF

FPF at 10: Envisioning the Future of Privacy

This year, Future of Privacy Forum is celebrating our tenth anniversary as a catalyst for privacy leadership and scholarship. In recognition of this milestone, we will host an anniversary celebration on April 30 and release a report on rising privacy issues. We also are publishing a series of blog posts over the next several weeks in which our policy experts will share their thoughts on FPF’s work over the past decade, the current privacy landscape, and their vision for the future of privacy.

We will also be asking you to share your thoughts with us: What do you expect the next ten years of privacy to look like?

Read the first post below with Jules Polonetsky’s Q&A and sign up for our mailing list to receive updates when new posts are published. 


Q&A: Jules Polonetsky on the Future of Privacy

Jules Polonetsky has been CEO of the Future of Privacy Forum since its founding 10 years ago. He is uniquely suited to bring together privacy experts from industry, consumer advocacy and government. Before he joined FPF, Jules was the Chief Privacy Officer at AOL and DoubleClick, New York City Consumer Affairs Commissioner, a New York state legislator, a congressional staffer and an attorney.

As we observe FPF’s 10th anniversary, how would you describe the idea behind the organization?

Our goal at the Future of Privacy Forum is to provide a roadmap for how our world can experience the benefits of data in a way that is ethical and moral and that maintains our individual sense of self and autonomy.

There are so many areas where data holds the opportunity to improve our health, safety, and happiness, but every one of those opportunities is also a source of great risk. We may come up with new medical advances by studying electronic health records, but we need to do that in a manner that respects individual privacy. We want a world that is safer from threats like terrorism, but we don’t want the government monitoring every email and phone call. We believe we can integrate privacy protections with responsible data use that will improve our lives.

So we convene experts from businesses, government, academia and civil society to get the best thinking and promote insightful research. We’ve also spurred industry to take actions with real-world impacts to protect consumer privacy. For example, the 300 companies that have taken the Student Privacy Pledge submit to legally enforceable obligations, such as not to sell students’ personal information. Likewise, companies that support FPF’s Privacy Best Practices for Consumer Genetic Testing Services agree to a set of standards for the collection and use of genetic data, like not sharing individual genetic data without express permission.

And we’re not just influencing industry practices. Our Open Data Risk Assessment used in Seattle and other cities helps government officials navigate the complex policy, technical, organizational and ethical standards that support privacy-protective open data programs.

How do you balance enthusiasm about innovative technology with an awareness that there can be pitfalls?

As a think tank director working with companies and advocates and government and foundations, I’m excited about the latest breakthroughs in technology. I’m also aware of the incredible consequences if we don’t put the right policies and structures and laws in place to make sure society benefits in a way that lifts us all up and takes us in a positive direction.

At FPF, we try to be at the center of the world of privacy. We work with companies to make sure when they use data, they are doing it in a responsible way. We also work with academics and civil society folks who worry that the government or companies could take us down an Orwellian path.

What are you looking for in potential privacy legislation from this Congress?

The White House, Congress, industry and civil society are increasingly in agreement about the need for comprehensive federal privacy legislation, so I hope a productive bill can be passed and signed into law. There are a few things I’ll be looking for in new legislation.

First, the rules should be technology neutral and cover every type of company and business model. Data is collected across platforms — on the web, on mobile, with wearables, smart home devices, and phone and facial tracking — and one clear set of rules is easiest for consumers to understand.

Legislation should recognize that data flows internationally, and move us closer to interoperability with Europe and other countries, while preserving room for new ideas and entrepreneurship.

Another foundational principle is fairness, which means using data only as people would reasonably expect and not using it in ways that have unjustified adverse effects. This concept is very similar to how the current FTC Act authorizes legal action against deceptive or unfair practices.

Also, any legislation should consider the increasingly sophisticated privacy tools that are emerging, including differential privacy to measure privacy risk, homomorphic encryption that can enable privacy safe data analysis, and many new privacy compliance tools that are helping companies better manage data. A law that will stand the test of time and successfully protect privacy rights while enabling valuable uses of data should include mechanisms to incentivize new privacy-friendly technology.
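To make the first of those tools concrete: differential privacy works by adding carefully calibrated random noise to the results of queries over personal data, so that no individual's presence in the dataset can be confidently inferred. The sketch below is purely illustrative (it is not FPF code, and the function name and parameters are hypothetical); it shows the standard Laplace mechanism applied to a simple counting query.

```python
import math
import random


def dp_count(values, predicate, epsilon):
    """Return an epsilon-differentially-private count of matching items.

    A counting query has sensitivity 1 (adding or removing one person's
    record changes the result by at most 1), so perturbing the true count
    with Laplace noise of scale 1/epsilon satisfies epsilon-differential
    privacy. Smaller epsilon means more noise and stronger privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    # A Laplace(0, b) sample is the difference of two independent
    # Exponential(1/b) samples; here b = 1/epsilon.
    scale = 1.0 / epsilon
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_count + noise


# Hypothetical usage: a noisy count of survey respondents aged 40 or over.
ages = [18, 25, 41, 63, 52]
noisy = dp_count(ages, lambda age: age >= 40, epsilon=0.5)
```

Each call returns a different noisy answer near the true count (3 in this example); an analyst learns the aggregate trend, while any single person can plausibly deny being in the data.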

What is a cutting-edge privacy issue that you think will grow in importance over the next ten years?

The world of artificial intelligence and machine learning is fascinating and important. We’re just beginning to scratch the surface of the opportunities, but we’re also starting to understand what a dangerous and dark path the misuses of those types of data could lead to. There is a risk of being overly optimistic and not understanding that we could take ourselves to a place of no return.

I hope that in ten years – when FPF celebrates its 20th anniversary – we won’t be fixing ineffective rules and laws from this era. For example, the GDPR, if interpreted narrowly, could pose some challenges for AI and machine learning, since it specifies that personal data must be collected only for a specified purpose, and must be deleted or minimized when not needed for that purpose. For many machine learning processes today, large and representative data sets are required to power new models, to ensure accuracy and to avoid bias. As European regulators seek to advance trusted AI, ensuring the right balance here will be essential.

And as Congress considers a U.S. privacy framework, we will be advising policymakers on the ways that uses of data for machine learning and other innovations can be supported, when responsible safeguards are in place.

I’m looking forward to seeing the work of FPF’s AI and Machine Learning Working Group – and all of our program areas – evolve in the coming years!

Check back in coming weeks for more discussions with FPF policy experts and sign up for our mailing list to stay informed about important privacy issues.

FPF, EFPIA, and CIPL Workshop Report Now Available: "Can GDPR Work for Health Scientific Research?"

On October 22, 2018, the Future of Privacy Forum (FPF), the European Federation of Pharmaceutical Industries and Associations (EFPIA), and the Centre for Information Policy Leadership (CIPL) hosted a workshop in Brussels, “Can GDPR Work for Health Scientific Research?,” to discuss the processing of personal data for health scientific research purposes under the European Union’s General Data Protection Regulation (GDPR).

The use of health data in research, whether it arises in the course of hospital treatment or from personal management of care, has the potential to improve the lives of individuals, as well as to transform health care systems and health-related science and innovation. Yet researchers and public and private stakeholders are currently struggling to understand how to comply with the GDPR when processing personal data for health scientific research. Further, national Data Protection Authorities (DPAs), health authorities, and ethics committees are providing differing guidance on the appropriate basis for processing special categories of personal data in scientific research, and the divergences are, if anything, widening.

The workshop highlighted multiple issues at the center of this challenge:

Legal and regulatory harmonization of approaches to health data research will be critical to the advancement of digital health to improve care and health outcomes. The European Data Protection Board (EDPB) will play a key role in working with the privacy and public research sectors to ensure harmonized application of the GDPR and legal certainty, as well as to clarify the situation and reconcile the needs of research while maintaining the rights of individuals to exercise choice and understand how their data is being used.

READ THE WORKSHOP REPORT