Infographic: Data and the Connected Car – Version 1.0


On June 27, 2017, the Future of Privacy Forum released an infographic, “Data and the Connected Car – Version 1.0,” describing the basic data-generating devices and flows in today’s connected vehicles. The infographic will help consumers and businesses alike understand the emerging data ecosystems that power incredible new features—features that can warn drivers of an accident before they see it, or jolt them awake if they fall asleep at the wheel.

Many of these new features are enabled by the collection of new types of data, putting the topic of privacy in connected cars on the agenda of industry, policymakers, and regulators. On June 28, 2017, the Federal Trade Commission and the National Highway Traffic Safety Administration hosted a workshop on the privacy and security issues around automated and connected vehicles. This was the first workshop co-hosted by the two agencies, and their partnership is a recognition of the convergence of the automotive and technology sectors. Lauren Smith, FPF’s Connected Car Policy Counsel, spoke on a panel about cybersecurity and data. Watch a clip below.

“The benefits of connected vehicle technologies are crucial to addressing the 94% of car accidents that are caused by human error,” said Smith. “But we need to foster transparency and communication around consumer data use in order to deploy them responsibly. Conversations between lawmakers, consumers, and businesses such as those happening tomorrow need to go beyond the current day and focus on building trustworthy data practices—and communicating them—as vehicles advance. We think that explaining cars’ data-transmitting devices and flows is an important first step.”

This infographic accompanies a project FPF launched earlier this year, a first-of-its-kind consumer guide to Personal Data in Your Car. The Guide includes tips to help consumers understand the new technologies powered by data inside the car.


Download Infographic

Download Consumer Guide

In the Press

Future of Privacy Forum Releases "Data and the Connected Car" Infographic in Advance of FTC & NHTSA Workshop


FOR IMMEDIATE RELEASE                   

June 27, 2017

Contact:

Lauren Smith, Connected Cars Policy Counsel, [email protected]

Melanie Bates, Director of Communications, [email protected]

Future of Privacy Forum Releases Infographic Mapping Data and the Connected Car

in Advance of FTC & NHTSA Workshop

Washington, DC – Today, the Future of Privacy Forum released an infographic, “Data and the Connected Car – Version 1.0,” describing the basic data-generating devices and flows in today’s connected vehicles. The infographic will help consumers and businesses alike understand the emerging data ecosystems that power incredible new features—features that can warn drivers of an accident before they see it, or jolt them awake if they fall asleep at the wheel.

Many of these new features are enabled by the collection of new types of data, putting the topic of privacy in connected cars on the agenda of industry, policymakers, and regulators. Tomorrow, Wednesday, June 28, the Federal Trade Commission and the National Highway Traffic Safety Administration will host a workshop on the privacy and security issues around automated and connected vehicles. This is the first workshop co-hosted by the two agencies, and their partnership is a recognition of the convergence of the automotive and technology sectors.

“The benefits of connected vehicle technologies are crucial to addressing the 94% of car accidents that are caused by human error,” said Lauren Smith, FPF’s Connected Cars Policy Counsel. “But we need to foster transparency and communication around consumer data use in order to deploy them responsibly. Conversations between lawmakers, consumers, and businesses such as those happening tomorrow need to go beyond the current day and focus on building trustworthy data practices—and communicating them—as vehicles advance. We think that explaining cars’ data-transmitting devices and flows is an important first step.”

This infographic accompanies a project FPF launched earlier this year, a first-of-its-kind consumer guide to Personal Data in Your Car. The Guide includes tips to help consumers understand the new technologies powered by data inside the car. It describes common types of collected data, the Privacy Principles that nearly all automakers have committed to, and includes a “privacy checklist” for renting or selling a car. Did you delete your synced contacts list? How about your garage door programming? And don’t forget to wipe your home address on that navigation system! These easy, simple steps can help consumers protect their own data and start thinking about the types of information involved in today’s new mobile ecosystem.

Data and the Connected Car

###

The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.

Federal Trade Commission: COPPA Applies to Connected Toys

This week, the Federal Trade Commission (FTC) updated its guidance on COPPA, the Children’s Online Privacy Protection Act, to clarify that the 1998 statute applies not just to websites and online service providers that collect data from children, but also to Internet of Things devices, including children’s toys. The updated guidance has been applauded by advocates and is a welcome clarification that COPPA’s strong protections apply to toys like Hello Barbie, Dino, and Fisher-Price’s Smart Toy. The guidance acknowledges the potential harm to children of deceptive data practices, writing that “when companies surreptitiously collect and share children’s information, the risk of harm is very real, and not merely speculative.”

In December 2016, Future of Privacy Forum and Family Online Safety Institute published “Kids & The Connected Home: Privacy in the Age of Connected Dolls, Talking Dinosaurs, and Battling Robots,” an early analysis of the privacy and security implications of connected children’s toys. At the time, some advocates were calling for a legal update to cover the unique issues of screen-less dolls and teddy bears that might collect information from children. In our white paper, we were one of the first to analyze COPPA in the context of children’s toys and concluded that it almost certainly already applied to the wide range of Internet-connected toys on the market:

“Although COPPA was written long before a mainstream market for connected toys existed, there is a growing consensus that the federal statute applies to the wide range of modern toys that connect to the Internet. Most connected toys available today connect to the Internet through a mobile app or other mechanism … and it is well-established that COPPA applies to Internet-connected devices and platforms, including smartphones, tablets, and apps. The FTC is vested with the legal authority to interpret COPPA, and it has promulgated more detailed requirements in the COPPA Rule. COPPA applies to any provider (“operator”) of “a Website or online service directed to children, or any operator that has actual knowledge that it is collecting or maintaining personal information from a child . . .”. Although the FTC has not yet taken an enforcement action against a connected toy operator, the Commission has stated that the term “online service” broadly covers any service available over the Internet or that connects to the Internet or a wide-area network.”

Although COPPA’s protections are strong, we recommend that providers of connected toys go even further in protecting sensitive information collected from children; our white paper discusses suggested best practices and privacy-conscious steps.

Children’s personal information, and parents’ ability to make informed, meaningful choices, should be given the highest level of legal protection. The FTC’s updated guidance represents an important step towards this goal, as well as towards protecting privacy in the growing Internet of Things.

The Top 10: Student Privacy News (May – June 2017)

The Future of Privacy Forum tracks student privacy news very closely, and shares relevant news stories with our newsletter subscribers.* Approximately every month, we post “The Top 10,” a blog with our top student privacy stories.

The Top 10

  1. FPF has relaunched FERPA|Sherpa! The site now includes:
  2. The House Subcommittee on Early Childhood, Elementary, and Secondary Education is holding the hearing “Exploring Opportunities to Strengthen Education Research While Protecting Student Privacy” on June 28th. If this sounds familiar, that’s because a very similar hearing was held on March 22nd last year. Check out my op-ed on this topic. Could the scheduling of this hearing foreshadow a FERPA re-write (re-)introduction?
  3. As reported in the previous newsletter, both the House and Senate have introduced the College Transparency Act, which would overturn the current federal ban on having a student-level data system at the U.S. Department of Education. There was a hearing on the CTA in the House, and many people continue to weigh in on whether the CTA is a good or bad idea.
  4. Following in the footsteps of the ACLU of Massachusetts, the ACLU of Rhode Island released their research on a lack of privacy protections in 1:1 programs in Rhode Island schools. It will be interesting to see the reverberations from the report. Already, Rhode Island schools are changing their policies and the ACLU is pushing its model bill to fix this problem. EdSurge reports on the Rhode Island report and other student privacy issues: “From High School to Harvard, Students Urge for Clarity on Privacy Rights.” While not specifically about privacy, this article, “California law spurs reforms after heartbreaking student suicide cluster,” has implications for the discussions many advocates and schools are having about when schools should be accessing student devices (either 1:1 devices or BYOD). In related news, too much district filtering can actually undermine student willingness to use a 1:1 device, as one district in Texas discovered.
  5. There have been a couple big breaches in higher ed in the past month, including a major breach at the University of Oklahoma where a student journalist discovered that she was able to access sensitive information like student financial aid records and grades through the school’s use of Microsoft Delve. Following the discovery of the breach, the U.S. Department of Education contacted the school to “further assess the institution’s compliance with its data security safeguard requirements according to the Gramm-Leach-Bliley Act.” Don’t know much about that law? Check out the FSA’s letter (and second letter) to institutions.
  6. The U.S. Department of Education’s Privacy Technical Assistance Center has a fantastic newly designed website and a new data breach response training toolkit. There is also another important federal update this month: the FTC has updated the COPPA compliance plan for businesses.
  7. The Consortium for School Networking has released an updated version of their incredibly useful privacy toolkit – I highly recommend that everyone reading this newsletter check it out!
  8. There was a fair amount of focus this month on philanthropic tech funders in education. Natasha Singer from the New York Times wrote about “The Silicon Valley Billionaires Remaking America’s Schools,” Inside Philanthropy asked “Can Technology Turbo-Charge K-12 Learning? Chan Zuckerberg Is Betting On It,” and EdWeek reported “Gates, Zuckerberg Philanthropies Team Up on Personalized Learning.” The Economic Times also reported “Mark Zuckerberg, Bill Gates try opposite paths to education tech in India.”
  9. “Schools are watching students’ social media, raising questions about free speech” and privacy, via a feature on PBS NewsHour on June 20. This relates to privacy and surveillance questions I discussed in my report on that topic.
  10. IEEE has formed a working group to develop a “Standard for Child and Student Data Governance.” You can contact them at this website if you are interested in joining.

*Want more news stories? Email Amelia Vance at avance AT fpf.org to subscribe to our student privacy newsletter.

Image: “20111105-student2-2” by Devon Christopher Adams is licensed under CC BY 2.0.

Honoring Jessica Rich

During the 2017 Annual Advisory Board Meeting, FPF issued its first-ever award to Jessica Rich, the former Director of the Bureau of Consumer Protection at the Federal Trade Commission (FTC), for her leadership in responsible data use and consumer privacy.

“We are thrilled to honor Jessica with our inaugural award celebrating leadership in responsible data use and consumer privacy,” said Jules Polonetsky, CEO, Future of Privacy Forum. “She is widely credited with building the FTC’s privacy program from a small team in the 1990s to the signature program that it is today.”

In her role as Director, Jessica managed over 450 attorneys, investigators, and support staff charged with stopping consumer fraud and false advertising, and protecting consumers’ privacy.  Under her tenure, the Bureau brought a series of major law enforcement actions to halt ongoing law violations. The Bureau also issued groundbreaking reports on data brokers, the Internet of Things, Cross Device Tracking, Big Data, mobile security, and kids’ apps.

Prior to being named Director, Jessica served in a number of senior roles at the FTC, including Deputy Director of the Bureau, Associate Director of the Division of Financial Practices, and Acting Associate Director of the Division of Privacy and Identity Protection.

On May 8, 2017, Jessica joined Consumer Reports as its new Vice President of Consumer Policy and Mobilization.

FPF salutes Jessica for her leadership in responsible data use and consumer privacy!

Future of Privacy Forum and the Data Quality Campaign Relaunch the FERPA|Sherpa Education Privacy Resource Center


FOR IMMEDIATE RELEASE                      

June 6, 2017

Contact: Melanie Bates, Director of Communications, [email protected]

Future of Privacy Forum and the Data Quality Campaign Relaunch the

FERPA|Sherpa Education Privacy Resource Center

Washington, DC – Today, the Future of Privacy Forum (FPF) and the Data Quality Campaign (DQC) relaunched FERPA|Sherpa, the leading resource for information about education privacy issues. Named after the core federal law that governs education privacy, FERPA|Sherpa provides students, parents, schools, ed tech companies, and policymakers with easy access to the resources, best practices, and guidelines that are essential to understanding the complex privacy issues arising at the intersection of kids, schools, and technology.

The newly improved website includes:

“FERPA|Sherpa is the best place to easily find the most current, relevant, and authoritative resources regarding student privacy,” said Amelia Vance, Education Policy Counsel for FPF, who runs and creates content for the website. “Stakeholders have created so many great resources and models – some quite recently – and FERPA|Sherpa is the trusted one-stop shop for anyone who wants to access the latest best practices and guidance.”

“Data should be used to open doors, never to close them,” said Aimee Rogstad Guidera, President and CEO of DQC. “Parents want and deserve assurances that their child’s information is used to help them, never to hurt them and that this data is safeguarded and used responsibly and ethically. That’s why we are pleased to partner with the Future of Privacy Forum on the FERPA|Sherpa website to ensure the education sector is prioritizing the effective and responsible use of data in the service of student learning. FERPA|Sherpa provides information, messages, tools, and emerging best practices around safeguarding data to parents, educators, and policymakers so they can be informed actors and advocates for the ethical use of data in education.”

More than at any other time in the evolution of education, data-driven innovations and emerging technologies – such as online textbooks, apps, tablets and mobile devices, and internet-based learning – are bringing advances and critical improvements in teaching and learning, with profound implications.

At the same time, the increased use of vendors and data in schools is matched by the need for heightened responsibility to manage and safeguard student data and implement policies that benefit education and minimize risk. Concerns have been raised about how student data is collected and used in a next-stage learning ecosystem buzzing with social media, mobile devices, central databases, student records, Big Data, and an array of vendors and software. Since 2013, over 100 new student privacy laws have passed in 40 states.

“The Future of Privacy Forum is committed to creating a better landscape for education privacy,” said Jules Polonetsky, CEO of FPF. “The relaunch of FERPA|Sherpa will enable more effective collaboration between stakeholders and better education privacy practices from schools and companies.”

“Technology and the internet are powerful tools for teaching, learning and family-school communication. At the same time, it is imperative that students’ academic and personal information is protected,” said Laura Bay, president of National PTA. “It is a top priority of National PTA to safeguard children’s data and make certain that parents have appropriate notification and consent as to what and how data is collected and used. National PTA is pleased to collaborate with the Future of Privacy Forum and the Data Quality Campaign to bring the FERPA|Sherpa online resource center to families nationwide to ensure they are knowledgeable about the laws that protect student data as well as students’ and parents’ rights under the laws.”

The new FERPA|Sherpa website builds on FPF’s work to ensure the responsible use of student data and education technology in K-12 and higher education, helping educators with resources and information, and seeking input from all stakeholders to ensure privacy while allowing for effective data and technology use in education. FERPA|Sherpa initially launched in spring 2014.

FPF and the Data Quality Campaign are proud to support responsible education technologies in order to promote successful student outcomes. If you have questions or resources that you think should be part of FERPA|Sherpa, please contact Amelia Vance at [email protected].

June 22nd Event: Ensuring Individual Privacy in a Data Driven World

Criteo and Future of Privacy Forum are pleased to invite you to an exceptional conference gathering a very high-level selection of regulators, lawyers, advertisers, and publishers to discuss individual privacy in a data driven world.

Save the date for Thursday, June 22, 2017 from 8:30 am to 8:00 pm.

REGISTER HERE

FPF Capital-Area Academic Network Speaker Series

Future of Privacy Forum, Washington & Lee University School of Law, and the International Association of Privacy Professionals recently collaborated in a Call for Papers focused on the privacy impact of current and projected technological advancements, focusing on the transparency, sharing, and algorithmic implications of data collection and use – topics identified in the National Privacy Research Strategy.

The accepted papers from this call introduce new thinking by taking a closer look at how data flow maps can be leveraged to increase data processing transparency and privacy compliance in the enterprise, how the market has either created or failed to create privacy-enhancing standards, and how the traditional notice and consent model might be rethought in the context of real-time M2M communication. These papers have been published in the latest issue of the Washington & Lee Law Review, Online Roundtables. The winning papers are:

The authors of these papers will be sharing their work at a series co-hosted by the FPF Capital-Area Academic Network and IAPP’s Washington D.C. KnowledgeNet Chapter. The first event in the series will be held on Tuesday, June 6th, featuring Chetan Gupta, CIPP/US, UC Berkeley School of Law, who will discuss conclusions from his research on how individuals can have an impact on the market’s adoption of privacy standards and security technologies for the tools we use every day. Following Mr. Gupta, we will be joined by Carrie A. Goldberg, Esq. for a question and answer session about her work focused on justice for individuals who are under attack. Carrie is the founder of C.A. Goldberg, PLLC, a law firm operating out of Brooklyn, New York, that focuses on internet privacy and abuse, domestic violence, and sexual consent.

REGISTER HERE

WannaCry About Backdoors

There are many lessons to learn from the spread of the WannaCry ransomware attacks across the globe.  One lesson that needs more attention is the danger that exists when a government attempts to create mandatory backdoors into computer software and systems.

The ransomware attacks began May 12 and soon spread to over 150 countries and over 10,000 organizations, encrypting files and demanding payment in the online currency Bitcoin for the hackers to unlock those files.  The attacks contained relatively unsophisticated ransomware.  By contrast, the software that spread the ransomware from system to system was very sophisticated, based on the EternalBlue exploit that was stolen from the National Security Agency and leaked in April by a group called the Shadow Brokers.

An initial lesson is a reminder that leaks from intelligence agencies can and do happen – from Edward Snowden’s disclosures, through the publication of CIA hacking code, to the Shadow Brokers’ release of NSA hacking tools. In an era where leaks happen at scale and get disseminated globally, agencies face a “declining half-life of secrets,” and must anticipate that their actions and techniques will be made public far sooner than historically was true.

An important lesson picked up by tech policy experts has been the need to improve what is called the “vulnerabilities equities process” (VEP).  The NSA has long had this process to weigh the benefits of a spying tool (such as breaking into an adversary’s computer system) with the costs (such as leaving civilian computers open to the same attack).  In 2013, I was part of President Obama’s NSA Review Group, and that administration accepted our recommendation to shift the VEP to the White House and involve more agencies and perspectives, especially to highlight the risk to the economy and our own infrastructure from vulnerabilities that are not patched.

Experience with WannaCry shows, however, that improving the VEP is not enough to create good security.  After the government learned about the Shadow Brokers theft, it alerted Microsoft to the vulnerability exploited by the ransomware. Microsoft released a patch in March, before the Shadow Brokers published the key attack mechanism.  Nonetheless, Britain’s National Health Service and the other victims world-wide did not update their systems in time.  These failures show the need to update quickly and systematically, an issue whose importance will only increase as myriad devices connect online as part of the Internet of Things, where many devices have no mechanism for updates.

Along with these lessons, however, WannaCry should inform us about the egregious risks that come from mandatory vulnerabilities in software, what are often called “backdoors.”  The greatest public attention to backdoors arose when the FBI sought to require Apple to write software that would gain access to an encrypted iPhone in the San Bernardino terrorism case.  Apple CEO Tim Cook refused, saying “There have been people that suggest that we should have a back door. But the reality is if you put a back door in, that back door’s for everybody, for good guys and bad guys.”  Strong encryption is permitted even under the 1994 U.S. law that requires phone companies to build their networks to respond to court orders.  As the ACLU’s Chris Soghoian has emphasized, that law “explicitly protected the rights of companies that wanted to build encryption into their products – encryption with no backdoors, encryption with no keys that are held by the company.”

The risk of government-mandated backdoors goes far beyond the U.S., however.  Late last year, the United Kingdom passed the Investigatory Powers Act, which allows the government to compel communications providers to remove “electronic protection applied … to any communications or data.”  The Electronic Frontier Foundation reports that they don’t believe the U.K. government has taken advantage of this requirement to break encryption yet, but the law is now on the books and companies could face severe consequences for non-compliance.

Even more broadly, China’s new cybersecurity law can be read to require encryption backdoors, Brazil temporarily blocked the encrypted app WhatsApp when seeking access to user data, the European Union Justice Minister is considering measures to force companies to cooperate with law-enforcement requests, and India has proposed sweeping encryption legislation that would require backdoor access as well.

The difficulty with these mandated backdoors, however, is that a computer vulnerability that exists in China, Brazil, or India typically will exist in the United States as well. In all of these countries, users rely on largely the same hardware and software – the same phones, laptops, operating systems, and applications.

The WannaCry attack thus teaches us lessons about the likelihood of leaks, the need for a better vulnerabilities process, and the importance of better software updating.

Most importantly, however, it teaches us that a backdoor required in one nation opens up the data and devices of users everywhere in the world.  Over 150 countries suffered the effects of the WannaCry ransomware.  Over 150 countries will also have their systems exposed if any one country succeeds in mandating a backdoor in the devices and software upon which we all rely.

Peter Swire teaches cybersecurity at the Georgia Tech Scheller College of Business, and is a Senior Fellow at the Future of Privacy Forum.

In Memory of Elise Berkower


We learned yesterday of the passing of Elise Berkower, a dear friend and one of the unsung heroes in the world of digital privacy.

Elise was Chief Privacy Counsel at The Nielsen Company and was a valued member of the Future of Privacy Forum Advisory Board. If you didn’t know Elise, it is because she was incredibly modest and humble, despite being one of the most knowledgeable people working at the intersection of data and online ads, mobile ads, traditional TV and the emerging smart TV and smart home.

I first met Elise when I was Commissioner of Consumer Affairs for New York City in 1998 and she was the Deputy Chief Administrative Law Judge at the agency. Her commitment to consumer protection was deep and her ability to manage the agency court system to ensure it served the city and consumers was remarkable. When the opportunity arose, I appointed her as Chief Judge and saw her continue to work to resolve agency bottlenecks, while ensuring we did justice to the businesses and the individual consumers we served.

When I became Chief Privacy Officer at DoubleClick, Elise was my first hire. While Nuala O’Connor and I rushed around the world, putting out fires and negotiating with regulators, Elise built and ran the first comprehensive online advertising privacy compliance program in the industry. This involved not only ensuring DoubleClick’s policies and practices were proper, but working with thousands of advertisers and publishers who were clients. In those days, web sites were first beginning to adopt privacy policies and few had language or choices that reflected the uses of ad technology. Elise corrected and edited thousands of policies, helping companies large and small learn about and comply with emerging ad tech best practices – many of which we were working with others in the sector to develop as new uses of data emerged.

Elise did her work with a deep commitment to ensuring fair practices for individuals and carried that same devotion to Chappell & Associates and then to Nielsen. When she reached out to fill me in on a new product, the briefing was never about the compromises made to balance privacy and data use, or the hope that the end solution was good enough – it was always a review of all the protections she had worked with the business to put in place, and a question of whether there was anything else we could think of that would help do more.

At each company, Elise was devoted to her peers and staff as if they were her family – and indeed we were. My only complaint is that she was so selfless that it was hard to learn about the challenges she was facing over the years. She was severely impacted by 9/11, as she lived in an apartment very close to the World Trade Center, and she struggled with health issues, including a recent stroke. When I called to check up on her recently, she was more interested in finding out how my family was doing than sharing her own challenges.

Elise seemed to be doing better recently and was even at the FPF annual meeting last week, sharing her views and input. We will be coordinating an effort to honor her friendship and service.

Our condolences to our friends at Nielsen and to peers at ESOMAR and NAI, where she for many years was a leader in advancing industry self-regulation and was a respected and beloved partner.

The funeral will take place on June 4th in Manhattan.

May her memory be a blessing.