ICYMI: FPF Webinar Examines Policies to Protect Child Privacy Online

As policymakers worldwide reexamine how to more effectively protect children’s privacy online without imposing broad age restrictions across the internet, the Future of Privacy Forum (FPF) recently hosted a webinar to assess diverse approaches to addressing child privacy concerns. The webinar also explored how these policies can help address the many potential risks children face online, including oversharing, identity theft, physical safety, and exposure to inappropriate content.

“The majority of child privacy laws and proposals are focused on limiting commercialization,” said FPF Director of Youth & Education Privacy Amelia Vance. “This includes preventing targeted or behavioral advertising to children, limiting or eliminating the ability to sell or share children’s data, or other protections aimed at limiting children’s exposure to marketing and protecting data from being used in inappropriate ways by companies.”

In addition to child privacy proposals from the European Union, United Kingdom, South Korea, and California, FPF experts highlighted the federal child privacy law in the U.S., the Children’s Online Privacy Protection Act (COPPA), and several of its key limitations.

While two new federal proposals and California’s new consumer privacy law extend the age of COPPA protections to 16, most children in the U.S. are covered only until age 13. Additionally, the Federal Trade Commission, which is currently reviewing COPPA, has varied considerably over the statute’s nearly 20-year history in how it interprets and enforces the law’s standards for determining whether a business has ‘actual knowledge’ that a specific user of its website is a child, or is providing services that are ‘directed’ at children.

“How ‘actual knowledge’ is defined has really changed over time,” Vance said. “We saw in the recent YouTube settlement, for example, the FTC noting that YouTube was telling potential advertisers that there were children on the platform that they could reach.”

FPF’s recent comments to the FTC in response to its ongoing review of COPPA also underscore the need for guidance on the law’s “actual knowledge” definition, as well as for the agency to modernize its policies related to voice-enabled technologies and provide greater alignment with the primary federal student privacy law, FERPA.

When it comes to developing child privacy legislation, Vance cautioned that unintended consequences are “incredibly easy to occur.” To mitigate this risk, Vance advised policymakers to be as intentional and clear as possible, and to get input from those on the ground, including parents, teachers, school superintendents, attorneys, and children and teens themselves.

“When looking at child privacy, it is important to be focused and ask, ‘What are you trying to regulate?’” Vance noted. “Being specific about what potential risks or harms you are trying to mitigate or prevent lends itself to a more targeted bill and one that is more likely to achieve whatever that end goal is.”

Finally, policymakers may need to acknowledge that children today are growing up in a vastly different world. “Look broadly to the stakeholders who you are talking to because schools and homes are very different from how we all grew up as children,” Vance advised. “You want to make sure you’re not limiting some aspect of the digital world that can be important.”

“It’s worth noting that all of this is up for discussion right now,” Vance ultimately concluded. “This is very much an evolving space in the U.S.”

Click here to watch the full webinar, part of FPF’s ongoing Privacy Legislation Series, which to date has also covered preemption, commercial research, and defining covered data. Access the slide deck from the presentation and additional recommended materials on child privacy here.

To learn more about the Future of Privacy Forum, visit www.fpf.org and subscribe to FPF’s student privacy newsletter.

Contact:

[email protected]

 

Award-Winning Papers: "Antidiscriminatory Privacy" and "Algorithmic Impact Assessments under the GDPR"

For the tenth year, FPF’s annual Privacy Papers for Policymakers program is presenting lawmakers and regulators with award-winning research representing a diversity of perspectives. Among the papers to be honored at an event at the Hart Senate Office Building on February 6, 2020, are two papers broadly addressing the impact of algorithms on transparency and fairness: Antidiscriminatory Privacy by Ignacio N. Cofone and Algorithmic Impact Assessments under the GDPR: Producing Multi-layered Explanations by Margot E. Kaminski and Gianclaudio Malgieri. Cofone assesses how privacy rules can both facilitate and protect against discriminatory behavior, while Kaminski and Malgieri discuss how impact assessments serve to link the individual and systemic regulatory subsystems within the European General Data Protection Regulation (GDPR).


Law often blocks the flow of sensitive personal information to prevent discrimination. In Antidiscriminatory Privacy, Ignacio Cofone, assistant professor at McGill University Faculty of Law, presents a framework for reducing discrimination against minorities. To build this framework, Cofone explored two case studies that “illustrate when rules that regulate the flow of personal information (privacy rules) are compatible with antidiscrimination efforts and when they are not.” Through an analysis of anonymous orchestra auditions and the “Ban the Box” initiative, Cofone reveals how blocking an information flow can be successful at combatting discrimination in some cases, but not all of them. Cofone states that “privacy can protect against discrimination as well as enable a discriminatory dynamic.” He notes that certain data points may serve as proxies for categories that the law aims to protect, arguing that information about certain proxies must be blocked when employing privacy rules to fight discrimination. In the case of the “Ban the Box” initiative, for example, when employers were prohibited from asking about an applicant’s criminal history, they were more likely to discriminate against black applicants they thought might have criminal histories. Cofone found that in the “Ban the Box” case, applicants’ race became a proxy for criminal history, fostering discriminatory behavior. Cofone’s analysis offers a framework for determining the effectiveness of antidiscrimination measures based on information restrictions, including questions to consider to identify proxies for protected information.

In Algorithmic Impact Assessments under the GDPR: Producing Multi-layered Explanations, authors Margot Kaminski of Colorado Law School and Gianclaudio Malgieri of Vrije Universiteit Brussel propose that impact assessments should link the GDPR’s dual methods of regulating algorithmic decision-making by providing systemic governance and also safeguarding individual privacy rights. The authors state that, in the context of decision-making algorithms, the GDPR’s existing Data Protection Impact Assessment (DPIA) should serve as an Algorithmic Impact Assessment that addresses problems of algorithmic discrimination, bias, and unfairness. Beyond serving as a tool in the GDPR’s systemic governance regime, the authors state that the DPIA should serve as an element of the GDPR’s protection of individual rights, connecting the two regulatory subsystems that underlie the GDPR. The way that the DPIA links these two subsystems within the GDPR, the authors note, mandates the creation of “multi-layered explanations” for algorithmic decision-making that are targeted to everyone from oversight bodies and auditors to individuals. Privacy professionals will benefit from the authors’ suggestions for improving Algorithmic Impact Assessments in the GDPR context, which call for the expansion of the right to explanation to include a “whole web of explanations…of differing degrees of breadth, depth, and technological complexity.”

If you’re interested in learning more about the relationship between privacy and discrimination, you’ll want to read the full papers from Cofone and from Kaminski and Malgieri.


The Privacy Papers for Policymakers project’s goal is to put diverse academic perspectives in front of policymakers to inform the development of privacy legislation. You can view all of this year’s award-winning papers on the FPF website.

ICYMI: Future of Privacy Forum Highlights Potential “Unintended Consequences” of Child Privacy Policies at TechFreedom Event

The Future of Privacy Forum (FPF) recently joined top YouTube creators, FTC Commissioner Noah Phillips, and privacy experts from Google, the Georgetown Institute for Public Representation, and others on Capitol Hill for TechFreedom’s event, Will Kids’ Privacy Break the Internet? The COPPA Rule. FPF Director of Youth & Education Privacy Amelia Vance participated in an expert panel discussion about the Federal Trade Commission’s (FTC) ongoing review of the Children’s Online Privacy Protection Act (COPPA).

The event focused on the controversy surrounding the FTC’s September 2019 settlement with YouTube over COPPA, and YouTube’s response in November announcing changes regarding “child-directed” content. To help dispel the resulting confusion among creators, FPF published a “mythbusters” blog post addressing common misperceptions, including that creators could “stop COPPA” by filing comments with the FTC. The FTC received more than 175,000 comments – including from FPF – as part of the agency’s ongoing review of COPPA. FPF’s comments urged the FTC to modernize COPPA in three key areas: policies related to voice-enabled technologies, guidance on COPPA’s “actual knowledge” definition, and greater alignment with the primary federal student privacy law, FERPA.

As YouTube content creators face this new uncertainty, Vance emphasized the importance of keeping the conversation focused on the facts, and potential solutions. “I think the key here is to provide as much insight about what laws creators, in particular, have to follow,” Vance said. “We should be talking about how to make it more practical for the people who have to actually implement these new provisions.” However, she noted that, at the end of the day, child privacy in the U.S. may be “out of the FTC’s hands,” since rulemaking will take a significant amount of time and both Congress and state legislatures have indicated that they are eager to legislate on child privacy.

Vance, who spoke at the FTC’s COPPA workshops in 2017 and in late 2019, reminded the audience that a lot has changed since the FTC’s last review of COPPA in 2013. Europe and California have both passed significant new consumer privacy laws with child privacy protections, and some European countries are considering even stronger protections for children. Additionally, two new federal proposals call for extending the age of COPPA protections to 16, and one of those bills also includes an update to COPPA’s “actual knowledge” definition, a key enforcement mechanism.

Additionally, Vance cautioned against legislators or regulators expanding child privacy through “opt-in” parental consent, citing the example of students in Louisiana who, under a strict opt-in regime, missed out on the state’s scholarship program because they couldn’t get parental sign-off.

“It’s really important to remember that there are unintended consequences here,” Vance noted. “Where we’re going with privacy protections is an underlying framework of protections that would apply across the board, and not protections that parents have to consent to. Exactly what the boundaries of that…remains to be seen.”

Click here to watch the full TechFreedom event, read FPF’s comments to the FTC about COPPA here, and access additional FPF child privacy resources here.

To learn more about the Future of Privacy Forum, visit www.fpf.org and subscribe to FPF’s student privacy newsletter.

Contact:

[email protected]

Privacy 2020: 10 Privacy Risks and 10 Privacy Enhancing Technologies to Watch in the Next Decade

Today, FPF is publishing a white paper co-authored by CEO Jules Polonetsky and hackylawyER Founder Elizabeth Renieris to help corporate officers, nonprofit leaders, and policymakers better understand privacy risks that will grow in prominence during the 2020s, as well as rising technologies that will be used to help manage privacy through the decade. Leaders must understand the basics of technologies like biometric scanning, collaborative robotics, and spatial computing in order to assess how existing and proposed policies, systems, and laws will address them, and to support appropriate guidance for the implementation of new digital products and services.

The white paper, Privacy 2020: 10 Privacy Risks and 10 Privacy Enhancing Technologies to Watch in the Next Decade, identifies ten technologies that are likely to create increasingly complex data protection challenges. Over the next decade, privacy considerations will be driven by innovations in tech linked to human bodies, health, and social networks; infrastructure; and computing power. The white paper also highlights ten developments that can enhance privacy – providing cause for optimism that organizations will be able to manage data responsibly. Some of these technologies are already in general use, some will soon be widely deployed, and others are nascent.

Read the White Paper


Child Privacy Protections Compared: California Consumer Privacy Act v. Proposed Washington Privacy Act

By Anisha Reddy, Tyler Park, and Amelia Vance

As legislatures consider enacting broad consumer privacy legislation, officials must consider whether, and how, to address children’s and teens’ privacy. The leading consumer privacy models contain child privacy language that differs in significant ways. Many states have introduced legislation that mirrors the framework of the California Consumer Privacy Act (CCPA). The proposed Washington Privacy Act (SB 6281) has also emerged as an influential framework. CCPA and SB 6281 differ in many respects, including with regard to child privacy. As described below, the frameworks take different approaches to the age of youth protected, the statutory knowledge standards, and the consumer rights granted.

As FPF previously wrote, SB 6281 would create a comprehensive data protection framework for Washington residents that includes both individual rights and obligations on data “controllers” (both for-profit businesses and nonprofits) that go beyond the rights and obligations in CCPA. A bill similar to SB 6281 failed to pass the Washington legislature in 2019, but SB 6281 is an influential model for states considering alternatives to California’s approach to consumer privacy legislation.

Both CCPA and SB 6281’s approaches to child privacy build on the federal Children’s Online Privacy Protection Act (COPPA), which requires “operators of commercial websites and online services directed to children under 13 or knowingly collecting personal information from children under 13 to obtain verifiable parental consent prior to the collection, use, or disclosure of children’s personal information.” A chart with a full comparison of the relevant language in COPPA, CCPA, and SB 6281 is below. 

CCPA adds new consumer rights for children and also extends child privacy protections to teens. SB 6281 would add new consumer rights for children, such as data portability, but would not extend child-centric protections to teens. The approaches differ in how they craft protections for children: CCPA contains specific requirements regarding the sale of children’s data, while SB 6281 would place children’s data in a larger category of “sensitive data” that would enjoy heightened protection – the category would also include types of data not related to age such as biometrics. 

CCPA and SB 6281 differ in three key ways: 

Age of youth who are protected: CCPA extends child privacy protections to youth under age 16, while SB 6281 would provide heightened protections for children under 13. Though SB 6281’s approach mirrors COPPA’s age threshold, CCPA’s expansion of special protections to teens is likely to become more common. Most consumer privacy bills introduced since CCPA was passed in 2018 would also extend protections to teens. Creating special protections for teens’ data is consistent with international trends – Europe’s GDPR sets age 16 as the threshold for special privacy protections, but permits member states to reduce the age as low as 13; many countries have chosen to retain age 16 as the age of consent for data processing. Though SB 6281 would not apply age-based protections to teens’ data, it would apply strong protections to sensitive data for all consumers, not just young people. Therefore, teens using online services in Washington would still experience a meaningful increase in their privacy rights and protections.

Knowledge Standards: CCPA and SB 6281 contain different “knowledge standards” – they have different thresholds for determining when a regulated entity “knows” a user is a child. SB 6281 would categorize the “personal data from a known child” as “sensitive data,” which would require parental permission before a controller could process the child’s data. What constitutes a “known child” in this context is not clear.

In contrast, CCPA adopts the “actual knowledge” standard from COPPA and adds that a business that willfully disregards a consumer’s age has actual knowledge of the consumer’s age. Entities are subject to COPPA if they have “actual knowledge that they are collecting, using, or disclosing personal information from children under 13.” While many experts have traditionally advised companies that this standard was applicable only if the company knew a specific child was using its platform, the FTC’s recent YouTube settlement has raised questions about whether more generalized knowledge that children are using an entity’s service could be interpreted as falling under the actual knowledge standard. For example, the FTC’s complaint noted that YouTube enticed brands to market on YouTube by highlighting that children were using the service.

Consumer rights granted: While CCPA creates protections for children and teens that relate to the sale of their data, SB 6281 would require parental consent before a controller may “process” (meaning perform “any operation” on) the data of a known child. Under CCPA, children’s data cannot be sold unless parents (if a child is under 13) or teens (ages 13–15) opt in to the sale. CCPA also provides new protections to all consumers regardless of age: they are given the right to request deletion of personal information and the right to know how their information is used, and businesses may not discriminate against consumers for exercising CCPA rights. While CCPA’s protections for children currently apply only to the sale of data, a ballot initiative to amend CCPA would expand the legal protections to cover the “sharing” of children’s data as well; the initiative will be voted on in November 2020.

SB 6281 would require controllers to obtain opt-in consent from parents before taking a much wider variety of actions than those covered by the child privacy provisions in CCPA. Controllers would not be permitted to process “sensitive data,” a category that includes data from a known child, without obtaining consent from the child’s parent or guardian. “Sensitive data” also includes non-age-related types of data such as religious beliefs, mental or physical health information, sexual orientation, unique biometric or genetic identifiers, and specific geolocation data. While CCPA’s child-specific protections address only the sale of children’s data, SB 6281 would govern “any operation or set of operations which are performed on personal data or on sets of personal data, whether or not by automated means, such as the collection, use, storage, disclosure, analysis, deletion, or modification of personal data.” This scope of protections for children is wider than the protections included in either CCPA or COPPA.

Preemption? It is important to note that COPPA preempts some child privacy laws, preventing states from enacting requirements that conflict with COPPA provisions. Privacy expert Peter Swire has written that COPPA preempts states’ attempts to regulate activities covered by COPPA. Though the scope of COPPA preemption has not been decided by courts, the Federal Trade Commission, which enforces COPPA, wrote in an amicus brief that it believes Congress did not intend for COPPA to displace state laws that create additional protections for teens. Even if a court finds that COPPA preempts some or all of the child privacy protections in CCPA or SB 6281 (if it is enacted), both frameworks are nevertheless influential as Congress considers how to craft a comprehensive federal privacy law or update COPPA.

 

Washington state and California share a commitment to youth privacy, but the CCPA and SB 6281 approaches diverge in notable ways that could eventually create headaches for businesses attempting to comply with differing U.S. and international standards. We’ve seen this in student privacy, where edtech companies need to examine more than 100 state and federal student privacy laws to determine their legal obligations. Moving ahead in 2020, we expect to see other states introduce bills based on CCPA and SB 6281 that include additional protections for children, as well as standalone state and federal bills governing child privacy.

 

Child Privacy Protections in COPPA, CCPA, and SB 6281

Age
COPPA: Applies to children under age 13.
CCPA: Applies to children under age 16.
SB 6281: Would apply to children under age 13.

Who can consent to data collection/use?
COPPA: Parents or guardians.
CCPA: Parents or guardians (when the child is under age 13) or the teen (when they are age 13 or over).
SB 6281: Parents or guardians.

Information Covered
COPPA: Personal information from a child collected or maintained by operators of commercial websites and online services directed to children, or where the operator has actual knowledge it is collecting, using, or disclosing children’s data.
CCPA: Personal information of consumers if the business has actual knowledge that the consumer is less than 16 years of age.
SB 6281: Personal data from a known child.

Rights and Protections
COPPA: Parental consent must be obtained before data is collected. Parents also have rights to access and delete their child’s information. Operators must also have a privacy policy; maintain information only as long as necessary to fulfill the purpose for which it was collected; and maintain the confidentiality, security, and integrity of information.
CCPA: Children and 13–15 year olds must opt in for data to be sold; consumers of all ages have rights to access, delete, opt out of the sale of their data, be informed of collection, and not be discriminated against for exercising CCPA rights.
SB 6281: Parental consent would need to be obtained before processing data from a “known child.” Consumers of all ages would have rights to access, correct, delete, port, and opt out of data processing.

How the Law/Bill Incorporates Child Privacy
COPPA: Entirely focused on protections for children under 13.
CCPA: Sale of data is opt-in instead of opt-out for children under 16; all other protections apply to all consumers, including children.
SB 6281: “Personal data from a known child” would be a type of “sensitive data,” and “sensitive data” requires opt-in consent before processing.

Knowledge Standard
COPPA: Applies to operators with products that are “directed to children” or that have “actual knowledge” they are collecting data from a child.
CCPA: Applies to businesses with “actual knowledge” that the consumer is under 16; willful disregard constitutes actual knowledge.
SB 6281: Would apply to personal data from a “known child,” with child defined as under 13. “Known” is not defined.

Exceptions
COPPA: Information used for internal operations is exempt from the consent requirement.
CCPA: None applicable.
SB 6281: Collectors in compliance with the verifiable parental consent mechanisms under COPPA, and personal data regulated by FERPA.

 

Takeaways from the Understanding Machine Learning Masterclass

Yesterday, the Future of Privacy Forum provided bespoke training on machine learning as a side event during the Computers, Privacy and Data Protection Conference (CPDP2020) in Brussels. The Understanding Machine Learning masterclass is a training aimed at policymakers, law scholars, social scientists, and others who want to more deeply understand the data-driven technologies that are front of mind for data protection discussions. The training drew strong interest from academics, civil society, and key policy staff in Brussels.

The expert speakers consisted of:

The speakers opened the black box of machine learning step by step. A key question the speakers answered was: How do you get from mathematical regression analysis to (unsupervised) learning and end up with a neural network?

The presentations shed light on the black box by bringing the details of the technology to an audience without an in-depth computer science background. Starting with a primer on the basics of the field, the speakers examined issues of particular consequence to policymakers such as transparency, fairness, bias, and discrimination.
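To give a flavor of that progression, the short sketch below is purely illustrative and is not drawn from the masterclass materials: it fits the same synthetic data with closed-form least-squares regression, with gradient descent, and with a one-hidden-layer neural network. All variable names, data, and parameters are invented for the example.

```python
# Illustrative sketch: from linear regression to a tiny neural network.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                      # 200 samples, 3 features
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

# 1. Classic linear regression: closed-form least squares.
w_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# 2. The same model fit by gradient descent, the workhorse of ML training.
w = np.zeros(3)
for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(y)
    w -= 0.1 * grad

# 3. A one-hidden-layer neural network: stacked regressions plus a nonlinearity,
#    trained with backpropagation (the chain rule applied layer by layer).
W1, b1 = 0.1 * rng.normal(size=(3, 8)), np.zeros(8)
W2, b2 = 0.1 * rng.normal(size=8), 0.0
for _ in range(2000):
    h = np.maximum(X @ W1 + b1, 0.0)               # ReLU hidden layer
    err = h @ W2 + b2 - y                          # prediction error
    gW2, gb2 = h.T @ err / len(y), err.mean()
    dh = np.outer(err, W2) * (h > 0)
    gW1, gb1 = X.T @ dh / len(y), dh.mean(axis=0)
    W2 -= 0.05 * gW2; b2 -= 0.05 * gb2
    W1 -= 0.05 * gW1; b1 -= 0.05 * gb1

print("least squares:    ", np.round(w_ols, 2))
print("gradient descent: ", np.round(w, 2))
```

The point of the comparison mirrors the masterclass framing: a neural network is, at bottom, a stack of regressions with nonlinearities, trained by the same gradient-based optimization used for simpler statistical models.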

The slides are available for download here. Attendees also received a copy of FPF’s Privacy Expert’s Guide to Artificial Intelligence and Machine Learning, a guide that explains the technological basics of AI and ML systems at a level of understanding useful for non-programmers, and addresses certain privacy challenges associated with the implementation of new and existing ML-based products and services. Learn more about FPF’s presence in Europe.

A Privacy Playbook for Connected Car Data

Drivers and passengers expect cars to be safe, comfortable, and trustworthy. Individuals often consider the details of their travels—and the vehicles that take them between their home, the office, a hospital, their place of worship, or their child’s school—to be sensitive, personal data.

The newest cars contain numerous sensors, from cameras and GPS to accelerometers and event data recorders. Carmakers, rideshare services, tech companies, and others are increasingly using data about cars to reduce emissions, manage traffic, avoid crashes, and more. The benefits of connected vehicles for individuals, communities, and society are clear. So are the privacy risks posed by increased collection, use, and sharing of personal information about drivers, passengers, cyclists, and pedestrians.

It is crucial that companies, advocates, academics, technical experts, and policymakers craft creative solutions that promote the benefits of connected vehicles while mitigating the privacy risks. Global legal frameworks have a role to play in assuring meaningful data protection and promoting trust, as do voluntary, enforceable codes of conduct and technical standards.

However, it is plain that entities must look beyond legal obligations and consider how they will earn and maintain consumer trust. With this white paper, Otonomo has taken an important step to advance the dialogue on connected car data privacy.

Originally released in October 2019, Otonomo’s Privacy Playbook for Connected Car Data presents nine plays for putting privacy at the center of your data business practices.

Read the White Paper

Award-Winning Paper: "Privacy Attitudes of Smart Speaker Users"

For the tenth year, FPF’s annual Privacy Papers for Policymakers program is presenting lawmakers and regulators with award-winning research representing a diversity of perspectives, including those of students and academics. Among the papers to be honored at an event at the Hart Senate Office Building on February 6, 2020, is Privacy Attitudes of Smart Speaker Users by Nathan Malkin, a PhD student in computer science at the University of California, Berkeley, and his coauthors. The study surveys the privacy attitudes of smart speaker users and presents an evaluation of users’ comprehension and use of existing privacy settings and controls.


In Privacy Attitudes of Smart Speaker Users, study authors Nathan Malkin, Joe Deatrick, Allen Tong, Primal Wijesekera, Serge Egelman, and David Wagner surveyed 116 owners of Amazon and Google smart speakers.

In an effort to understand whether smart speaker users are making informed decisions about the privacy consequences and controls offered by smart speakers, the authors used recordings of real interactions between the study participants and their devices. The authors found that “almost half did not know that their recordings were being permanently stored and that they could review them; only a quarter reported reviewing interactions, and very few had ever deleted any.” The authors found that the way smart speakers default to permanent storage of interactions with users places an “undue burden” on the user, and “is almost certain to result in most interactions going unreviewed.” However, the authors observe that, after the conclusion of the study, both Google and Amazon updated their voice assistants to allow for automatic data deletion after three or 18 months (Google) or the deletion of a day’s worth of recorded interactions (Amazon).

While more than 71% of smart speaker users had not raised privacy concerns related to their device in the past, the authors are careful to state that people are not “apathetic” about their privacy. Instead, people’s acceptance of smart speakers is tied closely to what is happening with their data, as well as the specific subjects in the speaker’s recordings. According to the study, more than 72% of users found recordings reviewed by a computer to be acceptable, while users were more likely to view human review of recordings as unacceptable. Additionally, users found certain interactions with smart speakers more sensitive than others. Recordings of children, financial information, sexual or medical topics, locations, and personally identifying information were viewed as particularly sensitive.

The authors conclude that privacy controls on smart speakers are underutilized. Based on the views of study participants, the authors suggest that voice assistants adopt shorter retention periods, despite the fact that users did not feel that their stored recordings presented a grave privacy danger. The authors may have revealed an important insight about individual perspectives on privacy: “people seem more protective of the privacy of others” than their own privacy.

If you’re interested in reading more about the attitudes of smart speaker users toward privacy, you’ll want to check out the full paper.


The Privacy Papers for Policymakers project’s goal is to put diverse academic perspectives in front of policymakers to inform the development of privacy legislation. You can view all of this year’s award-winning papers on the FPF website.

The Future Is Now: FPF at CPDP2020

Computers, Privacy and Data Protection (CPDP) Conference 2020 commences next week in Brussels, bringing together academics, data protection authorities, policymakers, data scientists, and civil society to network, exchange ideas, and talk over the latest trends. Check out the panels and events FPF will be participating in below. 

Algorithmic Regulation of Transportation

Wednesday, January 22 at 11:45, Petite Halle

We are bringing together experts across the privacy, mobility, and civic space to discuss the challenges of transforming—and enforcing—transportation regulations through the use of code and algorithms. This panel aims to build upon the issue as framed by the ITIF report released earlier this year, which introduced multiple potential frameworks for integrating automated enforcement mechanisms in the transportation industry. At CPDP, we hope to reexamine this issue through the specific lens of privacy and data protection and, ultimately, identify concrete steps cities and mobility operators can take to share data responsibly. Specific questions we hope to address in this panel:

The speakers are Simon Hania, Uber; Ger Baron, City of Amsterdam; Karen Vancluysen, Polis; and Kara Selke, Streetlight Data. The panel is moderated by Rob van Eijk, FPF. 


The Future Is Now: Autonomous Vehicles, Trolley Problem(s) and How to Deal with Them 

Wednesday, January 22 at 14:15, Petite Halle

Autonomous and highly automated vehicles are likely the first products that will bring AI to the masses in a life-changing way. They rely on AI for a variety of uses, from mapping, perception, and prediction to self-driving technologies. Their promise is great: increasing the safety and convenience of our cities and roads. But so are the challenges that come with them, from solving life-and-death questions to putting in place a framework that protects the fundamental rights of drivers, passengers, and everyone physically around them. This panel proposes an EU-US comparative perspective to discuss essential questions. Are existing legal frameworks well-equipped to deal with these challenges? How much data, and what types of data, run through the systems of an autonomous vehicle? What rights are affected? What ethical considerations might play into decision-making algorithms around accidents?

Moderated by IAPP’s Trevor Hughes, the panel includes speakers Sophie Nerbonne, Director of Economic Co-Regulation, CNIL; Andreea Lisievici, Volvo Cars; Chelsey Colbert, FPF; and Mikko Niva, Vodafone. 


Turning the Tables: Academics in the Hot Seat

Wednesday, January 22 at 16:00, Grande Halle 

In numerous privacy and data protection conferences and workshops, academics moderate discussions between policymakers, regulators and industry players. Academics are tough inquisitors and harsh critics, pointing out the shortcomings of legislation, the slow turn of the wheels of justice, the practical challenges of enforcement and the tangled web of interests of businesses. In this session we turn the tables. Helen Dixon, Data Protection Commissioner for Ireland, will be asking the questions. The academics will be in the hot seat providing direct and complete answers. Are their theories sound and coherent? Do they influence the world outside the ivory tower? Did their writings withstand the test of time?

Speakers include Franziska Boehm, Karlsruhe Institute of Technology; Neil Richards, Washington University School of Law; Omer Tene, IAPP; Gabriela Zanfir-Fortuna, Future of Privacy Forum. The session will be moderated by Helen Dixon, Data Protection Commissioner for Ireland. 


SIDE EVENT

Masterclass: Understanding Machine Learning

Thursday, January 23 from 16:00-18:00, Area 42, 46 Rue des Palais, 1030 Bruxelles, Belgium

This Masterclass is aimed at policymakers, law scholars, social scientists and others who want to more deeply understand the data-driven technologies that are front of mind for data protection discussions. Technology experts will present a training session, structured as an interactive lesson, focused on Artificial Intelligence and Machine Learning.

Attendees will be provided with a copy of “The Privacy Expert’s Guide to Artificial Intelligence and Machine Learning” and will join leading machine learning experts for a presentation geared toward bringing the details of the technology to an audience without an in-depth computer science background. In addition to a primer on the basics of the field, issues of particular consequence to policymakers such as fairness, bias, and data minimization will be examined.

Expert Speakers: 

RSVP for the event here.


TECH POLICY HAPPY HOUR

Friday, January 24 from 17:00-20:00, Ginette Bar 

Join us at Ginette Bar in Place du Luxembourg for an evening of drinks and networking. No RSVP needed. 


ADDITIONAL INFORMATION

Please view the entire program for additional details.

If you would like to discuss FPF’s expanding activities in Europe, please contact us at [email protected].

FPF Welcomes New Staff to Focus on Artificial Intelligence and Mobility

FPF is pleased to announce the addition of two new members to its team: Dr. Sara Jordan and Chelsey Colbert.

As individuals’ personal data is increasingly used by algorithmic systems that employ machine learning and artificial intelligence technologies, the benefits to consumers, businesses and society are evident, but so are the privacy risks. In her role as policy counsel, Sara will lead FPF’s efforts to create an ethical review process that can provide trusted vetting of research projects. She will also support the organization’s work on artificial intelligence and machine learning and emerging ethical questions relating to privacy and data protection. 

Her portfolio includes the privacy implications of data sharing, data and AI review boards, privacy analysis of AI and Machine Learning (AI/ML) technologies, and analysis of the ethics challenges of AI/ML. Sara is an active member of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. Prior to working at FPF, Sara was faculty in the Center for Public Administration and Policy at Virginia Tech (2014-2020) and in the Department of Politics and Public Administration at the University of Hong Kong (2007-2013).

Similarly, individuals’ geolocation data is increasingly collected, used, and shared as part of businesses ranging from connected cars and scooter rentals to mobile apps and online advertising. While FPF recognizes the benefits of geolocation data that lead to the development of safer vehicles and more personalized services, we acknowledge the privacy risks associated with sensitive data use. Chelsey will serve as policy counsel, leading FPF’s portfolio on mobility and location data, including connected cars, autonomous vehicles, ride-sharing, micro-mobility, drones, and robotics. 

Prior to FPF, Chelsey was an associate at an international business law firm in Canada and was seconded as in-house privacy and data governance counsel to Sidewalk Labs, an Alphabet company that designs and builds urban innovations to help cities meet their biggest challenges. Chelsey holds a J.D. with a major in technology law and policy from the University of Ottawa.