Close to the Finish Line: Observations on the Washington Privacy Act

By: Stacey Gray and Gabriela Zanfir-Fortuna *

We wrote last week that Washington State seems poised to become the second US state to pass a major comprehensive privacy bill. The proposed Washington Privacy Act (WPA) would be mostly aligned with the EU’s GDPR, the global gold standard for data protection (although there are still some significant differences).  Read our full comparison of the WPA with GDPR and other privacy laws. At a minimum, the WPA goes much further than the California Consumer Privacy Act (CCPA). Perhaps the most significant difference between the WPA and CCPA is that the WPA would require companies and even non-profits to obtain affirmative (“opt in”) consent for the collection of sensitive data, including biometric data and geolocation data.

Despite this, some lawmakers in Olympia have expressed skepticism over a bill that contains a variety of potential weaknesses, wondering if it might be worse than no bill at all. This is surprising given the overall strength of the proposed WPA compared to anything that has come before it in the United States. Are Washington policymakers letting the perfect be the enemy of the good?

We offer a few observations on the process unfolding in Washington:

If Washington lawmakers choose not to pass a comprehensive privacy law in 2020, they may miss an important opportunity to lead the nation and establish privacy protections for Washington residents. In the absence of a baseline national consumer privacy law, however, we expect these debates to continue as state legislators across the United States tackle these important questions.


* Stacey Gray is a Senior Counsel leading FPF’s US federal and state legislative analysis, outreach, and policymaker education ([email protected]). Gabriela Zanfir-Fortuna is an EU Senior Counsel, leading FPF’s EU and global privacy efforts, and formerly worked for the European Data Protection Supervisor in Brussels ([email protected]).

EDPB Draft Guidelines on Connected Cars Focus on Data Protection by Design and Push for Consent

By Gabriela Zanfir-Fortuna and Chelsey Colbert


The European Data Protection Board recently published its draft Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications, which are open for feedback until March 20. The EDPB writes that the main challenge for complying with European data protection and privacy laws in this field is for ‘each stakeholder to incorporate the protection of personal data dimension from the product design phase, and to ensure the car users enjoy transparency and control in relation to their data’. One of the key points clarified by the Guidelines is that the general provision of the ePrivacy Directive, Article 5(3), which requires consent for any access to information on a telecommunications device – ‘terminal equipment’ (with few exceptions), is applicable to connected vehicles. Thus, obtaining valid ‘consent’ becomes one of the key compliance issues in this environment, even if consent will not be necessary for all subsequent uses of personal data.

The challenges are heightened by the complexity of actors in the connected vehicle ecosystem, the different types of ‘data subjects’, the sensitivity of some data, such as geolocation data, and the different tiers of the legal obligations that may apply: Article 5(3) of the ePrivacy Directive, the General Data Protection Regulation (GDPR), the specific EU law on eCall and the framework for deployment of Cooperative Intelligent Transport Systems (C-ITS). In fact, the C-ITS is specifically left outside the scope of these Guidelines since it is an ongoing complex discussion at EU level, with the EDPB pointing to its older Opinion on this topic adopted by the Article 29 Working Party. 

In this post, we will look closer at: (1) the scope of the Guidelines, including defining personal data in connected vehicles; (2) the complex ecosystem of actors and their roles as controllers, joint controllers and processors; (3) key recommendations for implementing Data Protection by Design and by Default; (4) high risk personal data processing (geo-location data, data related to criminal offences, biometric data) and (5) the role of consent.

1. Scope and Definitions: What is a ‘Connected Vehicle’, what are ‘Personal Data’ 

The EDPB’s definition of a “connected vehicle” is “a vehicle equipped with many electronic control units (ECU) that are linked together via an in-vehicle network as well as connectivity facilities allowing it to share information with other devices both inside and outside the vehicle.” 

The EDPB’s draft Guidance document is focused on non-professional use of connected vehicles by data subjects, such as drivers, passengers, vehicle owners, renters, etc. The scope of the document is quite broad, as it applies to personal data a) processed inside of the vehicle, b) exchanged between the vehicle and personal devices connected to it, which includes standalone mobile apps used to assist drivers since they contribute to the vehicle’s connectivity capabilities, and c) collected within the vehicle and exported to other parties for processing, such as the vehicle manufacturer, insurance companies, or car repairers. 

Mobile apps are an interesting category, and the EDPB provides a non-exhaustive list of examples. To fall within the scope of the Guidelines, apps need to be related to “the environment of driving”. The EDPB provides an example to illustrate the distinction: GPS navigation apps are within scope, while an app that suggests places of interest is not.

Employers that provide company cars to staff and monitor their employees’ actions within the context of employment are outside the scope of this document. Also out of scope is the issue of filming in public spaces, which is important to note given the plethora of cameras and sensors that outfit connected and autonomous vehicles, for example dashcams, parking assistance, or driver monitoring.

(Almost) All Data From Connected Vehicles is Likely to be Personal Data

The EDPB considers most data associated with connected vehicles to be personal data to the extent that it is possible to link it to one or more identifiable individuals. This includes technical data about the vehicle’s movements, such as speed, and data about the vehicle’s condition, such as tire pressure. Even when data is not directly linked to a name and is about the technical aspects and features of the vehicle, such as the driving style, distance driven, or the vehicle’s wear and tear, it concerns the driver or passenger of the vehicle. This is because when this data is cross-referenced with other data, such as the vehicle identification number (VIN), it can be linked to an individual. Similarly, vehicle metadata, such as vehicle maintenance status, may also be personal data.
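To illustrate the point about linkability, here is a toy sketch in Python (the records, the registry, and the name are hypothetical, invented for this example only): once "technical" telemetry carries the VIN, a simple cross-reference against a registration list attributes it to an identifiable person.

```python
# Toy illustration of identifiability: technical telemetry becomes personal
# data once it can be joined to an owner registry via the VIN.

telemetry = {"vin": "1HGCM82633A004352", "speed_kmh": 134, "tire_pressure_kpa": 210}

# Hypothetical registration data held elsewhere (e.g., by a manufacturer or licensing authority).
owner_registry = {"1HGCM82633A004352": "Jane Doe"}

owner = owner_registry.get(telemetry["vin"])
if owner is not None:
    # Cross-referencing links the "technical" record to an identifiable individual.
    print(f"{owner} was driving at {telemetry['speed_kmh']} km/h")
```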

Data collected by cameras and other sensors may also concern driver and passenger behavior, as well as the behavior of those outside of the vehicle. 

2. Complex Ecosystem of Actors: Who are the Controllers, Processors and Joint Controllers

The Guidelines note at the outset that connected vehicles are becoming mainstream and that the data processing takes place within a complex ecosystem, as illustrated by FPF’s infographic “Data and the connected car.” In addition to the traditional players in the automotive industry, the EDPB mentions new players including infotainment service providers, driving assistance systems and services, road infrastructure managers, fleet managers, insurance companies, ride-sharing companies, and telecommunications operators. The Guidance document is directed at this non-exhaustive list of actors.

Things get more complicated when all these actors need to clarify their legal responsibility related to collecting and using personal data: Are they controllers – the entities that are liable for complying with most data protection obligations, and that establish the ‘means and purposes’ of a processing operation? Are they joint controllers – the entities that define jointly the means and purposes of a processing operation and that share liability? Or are they processors – the entities that merely process data on behalf of controllers and that need to follow a strictly defined mandate from the controller, having thus primarily contractual liability? 

It is important to note that car manufacturers are not the only controllers. According to the EDPB, data controllers can also include ‘service providers that process vehicle data to send the driver traffic information, eco-driving messages or alerts regarding the functioning of the vehicle’ and ‘insurance companies offering Pay-as-You-Drive contracts’. The Guidelines do not give examples of joint controllers, but specify that in a joint controllership situation a contract establishing how responsibility is shared is necessary, particularly regarding complying with data subjects’ rights requests. 

As for processors, they can be the ‘equipment manufacturers and automotive suppliers’ that process data on behalf of car manufacturers. The EDPB also recalls that these equipment providers may themselves become controllers if they process personal data collected from vehicles for their own purposes.

3. Key Recommendations for Implementing Data Protection by Design and by Default

The EDPB highlights how important it is to consider data protection from the product design phase and the importance of transparency and data control to vehicle users. According to the Guidelines, ‘technologies should be designed to minimize the collection of personal data, provide privacy-protective default settings and ensure that data subjects are well informed and have the option to easily modify configurations associated with their personal data’.

The EDPB provides some general recommendations to implement Data Protection by Design and by Default (DPbD). In what looks like an invitation for cross-industry bodies or individual DPAs, it also recognizes that ‘specific guidance on how manufacturers and service providers can comply with Data Protection by Design and by Default could be beneficial for the industry’.

The EDPB recommends local (on-vehicle) processing of personal data rather than processing that occurs outside the vehicle, in order to mitigate the potential risks of cloud processing described in the Article 29 Working Party’s Opinion on Cloud Computing. Such local processing guarantees to the user ‘the sole and full control of his/her personal data and, as such, it presents by design less privacy risks’ and ‘fewer cybersecurity risks’. One example of local processing is applications that unlock, start, or activate vehicle commands using biometric data stored within the vehicle (face, voice, fingerprints).

The EDPB highlights that such local processing will likely fall outside the scope of the GDPR as far as natural persons (drivers, car owners, passengers) are concerned, due to the “household exception” in Article 2(2). The GDPR does, however, apply to ‘controllers or processors, which provide the means for processing personal data for such personal or household activities’. Conversely, this would mean that applications involving the transfer of personal data to the cloud could put natural persons in the position of being a controller or processor, a fact which is not immediately obvious and which would benefit from further clarification in the final version of the Guidelines.

When local processing is not possible, ‘hybrid processing’ may be used. The EDPB gives usage-based insurance as an example: the insurance company would not gain access to the raw behavioral data, but only to the aggregate score resulting from processing conducted either locally within the vehicle or by a telematics service provider.
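To make the ‘hybrid processing’ idea concrete, below is a minimal, hypothetical Python sketch of how raw driving events might be reduced to a single usage-based-insurance score inside the vehicle (or by a telematics provider), so that only the aggregate leaves that environment. The event fields, weights, and scoring formula are illustrative assumptions, not drawn from the EDPB Guidelines.

```python
# Hypothetical sketch of "hybrid processing" for usage-based insurance:
# raw behavioural events stay local; only an aggregate score is shared.

from dataclasses import dataclass
from typing import Iterable


@dataclass
class DrivingEvent:
    """One locally recorded event; fields are illustrative."""
    harsh_braking: bool
    speeding_seconds: int
    night_driving: bool


def local_usage_score(events: Iterable[DrivingEvent]) -> int:
    """Reduce raw events to a 0-100 score inside the vehicle / telematics box."""
    penalty = 0
    for e in events:
        penalty += 10 if e.harsh_braking else 0
        penalty += min(e.speeding_seconds, 30)  # cap the per-event speeding penalty
        penalty += 5 if e.night_driving else 0
    return max(0, 100 - penalty)


def payload_for_insurer(events: Iterable[DrivingEvent]) -> dict:
    """Only the aggregate result is exported; raw events never leave."""
    return {"usage_score": local_usage_score(events)}


if __name__ == "__main__":
    trip = [
        DrivingEvent(harsh_braking=True, speeding_seconds=12, night_driving=False),
        DrivingEvent(harsh_braking=False, speeding_seconds=0, night_driving=True),
    ]
    print(payload_for_insurer(trip))  # e.g. {'usage_score': 73}
```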

With regard to ensuring users have control over their data, the EDPB recommends that ‘only data strictly necessary for the vehicle functioning are processed by default’. This means that individuals should have the possibility to ‘activate or deactivate the data processing for each other purpose and controller/processor and have the possibility to delete the data concerned’.

The EDPB recommends that when data must leave the vehicle, it should be anonymized or, at a minimum, pseudonymized. One last recommendation is to conduct Data Protection Impact Assessments even in cases where they are not required by law.
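As an illustration of what pseudonymizing vehicle data before export could look like, the sketch below (a hypothetical Python example, not taken from the Guidelines; the key handling and field names are assumptions) replaces the VIN with a keyed HMAC so the exported record cannot be attributed to a specific vehicle or owner without the secret key retained by the controller. Note that under the GDPR such pseudonymized data generally remains personal data.

```python
# Hypothetical sketch: pseudonymize telemetry before export by replacing the
# VIN with a keyed HMAC. Without the secret key, the exported record cannot
# be attributed back to a specific vehicle or owner.

import hmac
import hashlib

SECRET_KEY = b"controller-held-secret"  # illustrative; store in a key vault in practice


def pseudonymize_vin(vin: str) -> str:
    """Derive a stable pseudonym from the VIN using HMAC-SHA256."""
    return hmac.new(SECRET_KEY, vin.encode("utf-8"), hashlib.sha256).hexdigest()[:16]


def export_record(record: dict) -> dict:
    """Strip the direct identifier and attach a pseudonym before the data leaves the vehicle."""
    out = {k: v for k, v in record.items() if k != "vin"}
    out["vehicle_pseudonym"] = pseudonymize_vin(record["vin"])
    return out


if __name__ == "__main__":
    telemetry = {"vin": "1HGCM82633A004352", "tire_pressure_kpa": 230, "odometer_km": 48210}
    print(export_record(telemetry))
```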

4. High Risk Personal Data Processing: Geolocation Data, Data Related to Criminal Offences and Biometric Data

Some data may warrant special attention if it is sensitive or could impact the rights and interests of data subjects. The EDPB has identified three categories of personal data warranting special attention: geolocation data, biometric data, and data that could reveal criminal offenses or traffic violations.

Geolocation data is particularly revealing of a person’s life habits, allowing inferences about the driver’s residence, place of work, and interests, as well as sensitive details such as religion or sexual orientation. The EDPB cautions that parties in the connected car ecosystem should not collect location data except when doing so is absolutely necessary for the purpose of the processing. For example, when only the vehicle’s movement needs to be detected, the gyroscope is sufficient and there is no need to collect location data. The EDPB sets out several principles for collecting geolocation data. Even when the data subject has given consent, accessing location data more often than necessary is not recommended. Geolocation should not be activated by default or run continuously whenever the car is started; it should only be activated when the user launches a functionality that requires the vehicle’s location. Users should be able to deactivate geolocation at any time, and data controllers should define a limited storage period.

If biometric data are used either to enable access to the vehicle or its settings, or to authenticate the owner or driver, the EDPB recommends that a non-biometric alternative always be available and that the biometric template be stored and compared locally.

Finally, data revealing criminal offences or other infractions are also identified as particularly relevant for high risk processing and are broadly defined. For example, ‘the instantaneous speed of a vehicle combined with precise geolocation data or data indicating that the vehicle crossed a white line could be considered offence related data’. The EDPB considers that processing of such data ‘can only be carried out under the control of official authority or when the processing is authorised by Union or Member State law providing for appropriate safeguards for the rights and freedoms of data subjects stated in art. 10 GDPR’ and must be carried out locally (on vehicle). 

According to the EDPB, with a few exceptions, such as accidentology studies consented to by the owner/driver, external processing of data revealing criminal offences or other infractions is forbidden.

5. The Role of Consent

One of the most significant clarifications brought by the draft Guidelines is that Article 5(3) of the ePrivacy Directive is applicable in the context of connected vehicles, even if the rest of the Directive is not (since the Directive generally applies to providers of publicly available electronic communications networks). However, Article 5(3) is a general provision and it applies to ‘every entity that places on or reads information from a terminal equipment without regard to the nature of the data being stored or accessed’. 

The EDPB considers that a connected vehicle and any device connected to it meet the definition of a ‘terminal equipment’ under the ePrivacy Directive, and thus Article 5(3) is applicable to the data that are either stored on or go through them. Since Article 5(3) requires prior consent of the user for accessing or storing data on a ‘terminal equipment’ (any data, not just personal data), user consent becomes a cornerstone of the data governance environment in connected cars.

Even when the data being accessed or stored on the ‘terminal equipment’ is also ‘personal’ data under GDPR’s definition, Article 5(3) of the ePrivacy Directive takes precedence, as lex specialis, over the GDPR. This means that the lawful grounds for processing under Article 6 GDPR are not available to justify accessing personal data from or storing personal data to connected vehicles.  

However, any processing of personal data that follows the mere access to, or storage of, data on the device must additionally have a legal basis under Article 6 GDPR (for more on the complicated relationship between the ePrivacy Directive and the GDPR, see this Opinion of the EDPB on their interplay), which theoretically could be consent again, legitimate interest, or any of the six possible lawful grounds.

In any case, the EDPB anticipates that since the controller will have to provide notice to users for all purposes for which consent to access data on their device is sought, ‘consent will likely constitute the legal basis both for the storing and gaining of access to information already stored and the processing of personal data following the aforementioned processing operations’. The same conclusion is reinforced elsewhere in the Guidelines, where the EDPB notes that further processing of data collected on the basis of consent under Article 5(3) of the ePrivacy Directive, or on the basis of one of its exceptions, is only possible ‘either if the controller seeks additional consent for this other purpose or if the data controller can demonstrate that it is based on a Union or Member State law to safeguard the objectives referred to in Article 23(1) GDPR’. Technically, the EDPB considers that further processing on the basis of a ‘compatibility of purposes’ test is not possible.

Speaking of exceptions, there are two situations which do not require consent to gain access to a terminal equipment, either to store or retrieve data, under Article 5(3) of the ePrivacy Directive: (1) if this happens for the sole purpose of carrying out the transmission of a communication over an electronic communications network; and (2) when it is strictly necessary in order for the provider of an information society service explicitly requested by the subscriber or user to provide that service. For example, the service of renting or booking a parking space through an application offered by a third-party provider will not need the user’s consent under Article 5(3) of the ePrivacy Directive to access information already stored in the vehicle, such as ‘navigation data’, in order to provide this service explicitly requested by the user. For the processing of personal data stored in the vehicle, as well as for the processing of other personal data through the app, such as contact details, license plate number, and payment information, the lawful ground for processing under the GDPR will be necessity for the performance of a contract, under Article 6(1)(b) (see Scenario 3.1.2 of the Guidelines).

The EDPB further cautions that when data processing is based on consent, data controllers must pay attention to the possible complexities of obtaining consent from different participants, who may be car owners, users, or passengers. Consent cannot be bundled into the contract to purchase or lease a new car; it must be provided separately for specific purposes. Consent may be especially difficult to obtain from drivers or passengers who are not related to the vehicle’s owner.

Conclusion

The EDPB’s Guidance document concludes with five case studies of processing in the context of connected vehicles for various players in the connected car ecosystem, including provision of a service by a third party, such as pay as you drive insurance; users who wish to use geolocation to find their vehicle in the event of theft; and personal data stored on a rental car’s dashboard.

The Guidelines map out a complex compliance environment stemming from the ePrivacy Directive and the GDPR to match the incredibly complex connected cars ecosystem. They are open for consultation until March 20. If you want to contribute, follow this link.

The 10th Annual Privacy Papers for Policymakers Event

The Future of Privacy Forum’s 10th annual Privacy Papers for Policymakers event was a hit! This year’s event featured a keynote speech by FTC Commissioner Christine S. Wilson and facilitated discussions between the winning authors – Ignacio Cofone, Neil Richards, Margot Kaminski, Gianclaudio Malgieri, Arunesh Mathur, and Paul Ohm – and policy and regulatory staff, including Lisa Goldman, Jared Bomberg, Nasreen Djouini, Morgan Kennedy, and Michelle Richardson, followed by a reception.


During her opening remarks, FTC Commissioner Christine S. Wilson spoke on the importance of implementing federal privacy legislation: “We’ve arrived at a tipping point for privacy… All eyes are on Congress at this defining moment.” Commissioner Wilson cited the need for certain characteristics to be incorporated in the development of federal privacy legislation, including:


Summaries of each of the presenters’ papers can be accessed below.


In addition to the presentations and reception, we recorded short video interviews with each of the presenting authors. We will add new videos to this post every week. Check them out below:

Paul Ohm

Neil Richards

Ignacio Cofone

Margot Kaminski

Arunesh Mathur

Gianclaudio Malgieri

The goal of the Privacy Papers for Policymakers project is to put diverse academic perspectives in front of policymakers to inform the development of privacy legislation. You can view all of this year’s award-winning papers on our website.


Call for Public Comments: Resources for Companies Sharing Personal Data with Academic Researchers

In June 2019, FPF launched the Corporate-Academic Data Stewardship Research Alliance, a peer-to-peer network of private companies that share the goal of facilitating privacy-protective data sharing between businesses and academic researchers. The Alliance has worked to support data sharing efforts, help address and mitigate challenges that create barriers to sharing, and promote responsible and privacy-protective practices that enable more data sharing between industry and academic researchers. 

FPF Senior Fellow Mike Hintze has been leading the project, working with 25 prominent organizations to develop usable, privacy-focused resources for companies that share personal data with academic researchers.

Today, FPF releases two resources in working form:

In the spirit of openness and collaboration, FPF invites public comments from companies, researchers, privacy experts, and all other interested individuals and stakeholders regarding these resources.

Following the comment period, FPF will publish revised versions of these two documents so that companies and researchers can use them to help facilitate responsible and protective data sharing. 

How to Comment:

Please email your feedback to: [email protected]. Please submit your feedback no later than Thursday, April 30, 2020.

View the Best Practices

View the Contract Guidelines

FPF Submits Written Statement to the U.S. House Committee on Financial Services Task Force on AI

This week, Future of Privacy Forum (FPF) Senior Counsel and Director of AI & Ethics Brenda Leong submitted a written statement on the use of artificial intelligence and machine learning-based applications in financial products and services. Addressed to the House Committee on Financial Services Task Force on Artificial Intelligence, the statement explores how to protect AI and ML-based financial systems from the impact of undesired or unintended bias. 

“Financial services organizations have the responsibility, both legally and ethically, to treat their customers, whether other businesses or individuals, fairly and equally,” wrote Leong. “As more players in this industry employ AI systems in more use cases, it is incumbent on them to ensure that their algorithms are fair and explainable.”

In the statement, Leong describes beneficial ways that financial institutions are currently using AI, such as combating fraud or extending credit to traditionally underserved individuals. She also identifies several factors that can present fairness and equity concerns unique to, or heightened by, processing within an AI or ML-based system, and outlines the technical, policy, regulatory, and legislative actions that can help mitigate risk and bias arising from the use of these systems.

“While ML and AI are technologies thought of as completely ‘other’ from human thinking, they are so far still always based on algorithms and models created by people,” wrote Leong. “Thus, these algorithms are prone to incorporating the biases of their designers, as well as the biases of the systems they’re designed to serve, because the only data available to train them already reflects decades or even centuries of inequality.”

She urges legislators to exercise caution before enacting new legislation to govern AI, since existing laws and regulations already apply, including legislation protecting consumers from unfair or misleading trade practices, labor and employment laws, applicable privacy laws, and the entire regulatory structure around financial services.

The statement was submitted for the record of the Equitable Algorithms: Examining Ways to Reduce AI Bias in Financial Services hearing held by the Task Force on Artificial Intelligence on February 12. 

Read the full statement here.

A New U.S. Model for Privacy? Comparing the Washington Privacy Act to GDPR, CCPA, and More

By Stacey Gray, Pollyanna Sanderson, and Katelyn Ringrose

Download a printable version of this report (pdf).

As Congress continues to work toward drafting and passing a comprehensive national privacy law, state legislators are not slowing down. In Washington State, a new comprehensive privacy law is moving quickly: last week, the Washington Privacy Act (SSB 6281) was voted out of the Washington Senate Ways & Means Committee, and appears likely to be voted on by the Senate. If approved, it will reach the House, which is currently considering (and amending) an almost identical companion bill. The deadline for the bill to be voted on by both Senate and House (including, if applicable, resolving any differences) is March 12, 2020.

FPF commented at a recent public hearing that, if passed into law, the Washington Privacy Act (as represented by Senator Carlyle’s SSB-6281) would be a significant achievement for US privacy legislation. We have previously noted that the WPA would incorporate protections that go beyond those in the California Consumer Privacy Act, the only existing comprehensive consumer privacy law in the United States.

Is the Washington Privacy Act a good model for U.S. legislation? Lawmakers should consider:

FPF has created the following charts to compare key elements of the current California Consumer Privacy Act (CCPA); the upcoming 2020 California Ballot Initiative (CPRA); the EU General Data Protection Regulation (GDPR); the WPA of 2019 (Senate Bill 5376); and the WPA of 2020 (Substitute Senate Bill 6281). The charts cover the following key features of each law: (1) jurisdictional scope; (2) definitions and structure; (3) pseudonymous data; (4) individual rights; (5) obligations on companies; (6) facial recognition provisions; and (7) preemption and enforcement.

DOWNLOAD THE FULL ANALYSIS HERE.

1. JURISDICTIONAL SCOPE

The 2020 Washington Privacy Act (SSB-6281) would govern legal entities in Washington that collect data from Washington residents. Although narrower than the GDPR, the WPA has a significantly broader scope and territorial reach than the CCPA. Unlike the CCPA (which governs for-profit businesses), the WPA would also govern non-profit organizations, including public charities and foundations. In some cases, the WPA would even govern entities that do not “conduct business” in Washington, if they produce products or services “targeted to” residents of Washington.

Who can exercise rights?
EU GDPR: Natural persons (“data subjects”). CCPA: California residents. CA Ballot Initiative: California residents. WPA 2019: Washington residents. WPA 2020: Washington residents.

Who has obligations?
EU GDPR: All govt and non-govt legal entities and individuals established in the EU or offering goods or services to EU residents. CCPA: For-profit businesses that “[do] business in the State of California” and meet thresholds (below). CA Ballot Initiative: For-profit businesses that “[do] business in the State of California” and meet thresholds (below). WPA 2019: Non-govt legal entities that “conduct business in Washington or produce products or services that are intentionally targeted to residents of Washington.” WPA 2020: Non-gov’t legal entities that “conduct business in Washington or produce products or services that are targeted to residents of Washington.”

Thresholds
EU GDPR: None; however, there is a limited small-business exemption for certain obligations (see e.g. Art. 30(5)). CCPA: $25 million annual revenue; or 50,000+ consumers; or 50% of annual revenue derived from selling consumers’ personal data. CA Ballot Initiative: $25 million annual revenue; or 100,000+ consumers; or 50% of annual revenue derived from selling or sharing consumers’ personal data. WPA 2019: 100,000+ consumers; or derives 50%+ annual revenue from the sale of personal data and processes or controls personal data of 25,000+ consumers. WPA 2020: 100,000+ consumers during a calendar year; or derives 50%+ annual revenue from the sale of personal data and processes or controls personal data of 25,000+ consumers.

 

2. DEFINITIONS AND STRUCTURE

The 2020 Washington Privacy Act (SSB-6281) contains key terms and an overall structure that closely aligns with the GDPR. It would define personal data broadly as “any information that is linked or reasonably linkable to an identified or identifiable natural person.” This definition is in line with long-standing global norms; for example, personal data was defined similarly as early as 1981 in the text of Convention 108, the first binding international data protection agreement. The 2020 WPA also contains different obligations for “controllers” and “processors,” with narrow CCPA-like exemptions for “de-identified” data and “publicly available information.”

EU GDPR CCPA CA Ballot Initiative WPA 2019 WPA 2020
Broad definition of covered data Y Y Y Y
“Controllers” & “Processors” Y Y (“businesses” and “service providers”) Y (“businesses” and “service providers”) Y Y
Excludes “de-identified” data Y * Y Y Y Y
Excludes “publicly available information” N Y Y Y Y
* The GDPR defines personal data very broadly. (Art. 4(1)). Its provisions do not apply to data which does not relate to an “identified or identifiable person” or to personal data “rendered anonymous in such a manner that the data subject is no longer identifiable.” (Recital 26).

 

3. PSEUDONYMOUS DATA

The 2020 Washington Privacy Act treats “pseudonymous data” differently than other covered data. Under the WPA, pseudonymous data – data that “cannot be attributed to a specific consumer without the use of additional information” – is exempted from access, deletion, and correction rights, but not from opt-outs of sale, profiling, or targeted advertising. This provides flexibility for companies processing data that is less identifiable (and therefore harder to associate with individuals in order to fulfill their requests) but still carries some risks to privacy or autonomy. For example, pseudonyms are frequently used in large datasets to conduct scientific research (e.g., in a HIPAA Limited Dataset, John Doe = 5L7T LX619Z).

In contrast, other U.S. laws do not explicitly codify different obligations for pseudonymous data. In practice, however, there is a growing consensus that U.S. privacy law will need to reflect the practical challenges of dealing with data that falls along a spectrum of identifiability. For example, in practice under the CCPA, individual rights to access, delete, or correct their data are almost always more limited for pseudonymous data due to the challenges with (1) linking the request to the data the company holds; and (2) verifying that the request is authentic and that disclosure or deletion is being conducted on behalf of the right person. (For more, see the California Attorney General’s ongoing CCPA rulemaking efforts).

In the EU, the GDPR explicitly recognizes that pseudonymization of personal data decreases risks to the rights and freedoms of individuals (see Recital 28). The GDPR also exempts controllers from complying with individual requests to exercise rights of access and deletion (erasure) when identification in a dataset would require the controller to acquire additional information, unless the individuals themselves provide additional information to help re-identification (see Article 11). Pseudonymization is also considered an important safeguard for the GDPR’s “privacy by design” requirements in Article 25 and for data security measures in Article 32.

EU GDPR CCPA CA Ballot Initiative WPA 2019 WPA 2020
Recognizes pseudonymous data Y * Indirect ** Indirect ** N Y
* More precisely, the GDPR recognizes “pseudonymization” as a method to decrease privacy risks and comply with certain obligations (see description above). ** Indirectly, individual rights to access and delete pseudonymous data in California may be limited in practice due to challenges with verifying consumer requests (see description above).

 

4. INDIVIDUAL RIGHTS

The WPA would codify individual rights for residents of Washington that go beyond both CCPA and the bill introduced in Washington in 2019. For instance, it would offer consumers a right to correct inaccurate data and to exercise broader rights to opt out of not only “sale” but also “profiling” and “targeted advertising.” In comparison, the CCPA does not require an opt out of certain targeted advertising practices if they can be conducted without “selling” data (a limitation that would be eliminated in the Ballot Initiative). Last year’s Washington bill contained a right to object to processing for targeted advertising, but would have allowed other kinds of data processing if outweighed by the interests of the company. The WPA would also grant consumers additional protections by requiring companies to establish internal appeals processes, paralleling certain procedural elements of the GDPR.

Finally, the WPA would require opt-in consent for collection of “sensitive information.” This includes, for example, racial or ethnic origin, biometric data, sexual orientation, or mental or physical health condition or diagnosis. Heightened protection for sensitive data largely aligns with the Ballot Initiative and the GDPR, which requires either “explicit consent” or a very narrow and specifically prescribed justification to process “special categories of data” (see Article 9). Notably, the 2020 WPA also includes “specific geolocation” as a type of sensitive data that requires opt-in consent. This aligns with the Ballot Initiative, but in comparison, under the GDPR, geolocation data can be processed in some situations without consent where other strong safeguards apply (see e.g. guidance on location data collected through Wi-Fi Analytics). In other cases, EU privacy laws like the ePrivacy Directive may apply.

EU GDPR CCPA CA Ballot Initiative WPA 2019 WPA 2020
Right to Access Y Y Y Y Y
Right to Correct Y N Y Y Y
Right to Delete Y Y Y Y
Right to Portability  Y Y Y Y
Internal Appeals Processes Y * N N N Y
Opt out of “Sale” Y ** Y Y N Y
Opt Out for “Targeted Advertising” N *** Y Y Y
Opt Out of “Profiling” N N N Y
Opt In Consent for Sensitive Data Y N Y N **** Y
* Companies engaged in high-volume or high-risk processing must appoint a Data Protection Officer (DPO) who handles requests, communications, and appeals (Article 37, Article 38, and Article 39). ** An individual can object to any processing of their personal data conducted pursuant to certain lawful bases, at which point the controller may no longer process the data unless it demonstrates “compelling legitimate grounds” to override that person’s interests, rights, and freedoms (Article 21). If such processing is conducted with consent, the consent must be easy to withdraw at any time (Article 7.3). Finally, the GDPR includes an absolute right to object to “direct marketing.” (Article 21.2). *** The CCPA does not restrict targeted advertising if it can be conducted without “selling” data. In contrast, the Ballot Initiative contains a broader opt-out provision (of both “sale” and “sharing”) and specifically limits service providers from engaging in any “cross-context behavioral advertising.” **** Except where consent could be used as a way for a company to engage in “high risk” processing, as determined by risk assessments. (s8(3)).

 

5. OBLIGATIONS ON COMPANIES

After the CCPA passed in 2018, it was widely criticized by privacy advocates for placing most of the burden on consumers to exercise their rights, without containing strong restrictions on the collection or uses of data. The California Ballot Initiative would go significantly further than CCPA by incorporating additional consumer rights and restrictions on the collection and use of “sensitive data.” The WPA similarly places additional obligations on companies to act as responsible stewards of information, including mandated risk assessments for high-risk activities. Neither would go as far as the GDPR, which requires that companies have a “lawful basis” to collect data at all, where a “lawful basis” can include, for example, consent, fulfillment of a contract, or “legitimate interests” (for more, see FPF’s report: Deciphering Legitimate Interests).

Both the California Ballot Initiative and the 2020 Washington Privacy Act incorporate elements of data minimization, purpose limitation, and avoidance of “secondary uses.” Neither is as restrictive as the provisions in the GDPR’s Article 5. However, the Ballot Initiative would require that a business’s collection and use of data be “reasonably necessary and proportionate to achieve the purposes for which [it was] collected or processed . . . and not further processed in a manner that is incompatible with those purposes.” (1798.100c). In comparison, the 2020 Washington Privacy Act would require that data collection be “limited to what is reasonably necessary” as well as “adequate, relevant, and limited” in relation to “the purposes for which such data are processed, as disclosed to the consumer,” and prohibit further processing that is not “compatible with” those purposes (absent consent) (Section 8).

EU GDPR CCPA CA Ballot Initiative WPA 2019  WPA 2020
Lawful Bases for Collection Y N N N N
Privacy Policies Y Y Y Y Y
Risk Assessments for High-Risk Activities Y N N Y
Data Minimization Y (strongest) N Y N Y
Purpose Limitation Y (strongest) N Y N Y
Duty to Avoid Secondary Use Y (strongest) N Y N Y
Security Requirements Y Y Y Y Y
Non-Discrimination Y (Indirectly)* Y Y N Y
* The GDPR does not include an explicit provision stating that a data subject must not be discriminated against on the basis of their choices to exercise rights. However, it is implicit from other principles of the GDPR that individuals must be protected from discrimination on these grounds. (Article 5, Article 13, Article 22, and elements of “freely given” consent and fair processing).

 

6. FACIAL RECOGNITION PROVISIONS

The current version of the WPA contains special provisions for commercial uses of facial recognition technologies. Such provisions do not directly exist in the GDPR or other comprehensive privacy laws. However, other laws in the US and EU govern facial recognition technologies, whether as a category of “sensitive data” (e.g., the Ballot Initiative would require consent for uses of biometric data) or as a form of sensitive data processing or automated profiling (under the GDPR).

Specifically, the GDPR regulates facial recognition technologies through several provisions. When facial recognition is used for identification purposes, “explicit consent” is required under Article 9, unless a narrow overriding justification applies, like a substantial public interest provided by law. In addition, GDPR imposes obligations for companies engaged in “solely automated decision-making and profiling” (Article 22), both of which can be part of real-world facial recognition use cases (see, e.g., EU guidance on collecting data through video).

Compared to the facial recognition provisions in 2019, the 2020 WPA provisions are significantly stronger. In 2019, the bill that passed the Washington Senate allowed for implied consent for facial recognition: “The placement of conspicuous notice in physical premises . . . [shall] constitute a consumer’s consent to the use of such facial recognition services . . .” (Section 14). In contrast, the 2020 version does not permit this; instead, it would require businesses to obtain affirmative opt-in consent from consumers prior to their enrollment in a facial recognition system (with narrow, limited exceptions). The 2020 WPA would also require covered entities to enable third-party auditing and to address identified inaccuracies related to bias and discrimination.

EU GDPR CCPA CA Ballot Initiative WPA 2019  WPA 2020
Protections for Commercial Uses of Facial Recognition Y (indirectly)  N Indirectly Y (limited) Y (stronger)

 

7. PREEMPTION AND ENFORCEMENT

The WPA aligns with other privacy laws in that it would preempt local regulations that would govern the same data processing activities. As a result, it would be likely to preempt local regulations for commercial uses of data that fall within the same jurisdictional scope of the law, but might not preempt local regulations of government or municipal entities.

The current WPA would be enforced by the Washington Attorney General. Similarly, the CCPA is enforced by the California Attorney General, whose office is currently engaged in regulatory rulemaking (see California’s draft regulations). The CCPA does not allow for civil enforcement of most of the law’s provisions, but contains a limited private right of action for data breaches. The GDPR allows individuals to exercise rights to “individual redress,” in addition to each EU Member State having its own well-funded Data Protection Authority.

EU GDPR CCPA CA Ballot Initiative WPA 2019 WPA 2020
Preemption Y Y Y Y Y
Enforcement by State AG or Government Body (DPA) Y Y Y Y
Enforcement by Individuals Y (mix of EU judicial rights & individual redress from regulators) N (exception for security breaches) N (exception for security breaches) N N *
* Unlike the 2019 WPA, the 2020 WPA has been amended to clarify that it does not override the existing rights of Washington residents to bring actions under Washington State’s Consumer Protection Act (chapter 19.86 RCW) for conduct or behavior that would amount to an unfair or deceptive practice (Section 11). Similarly, residents of California (and many other states) have the ability to bring lawsuits to challenge privacy violations when they violate unfair and deceptive practices (UDAP) state laws.

 

CONCLUSION

The Senate sponsor of the 2020 WPA, Senator Reuven Carlyle, recently noted: “I don’t think that we’re ever going to be done dealing with the regulatory framework of consumer data and the issue of privacy. We’re living in a new era.” We agree. The United States needs a comprehensive, baseline federal privacy law to set uniform standards and create clarity for companies and strong rights for individuals. In the absence of such a law, the Washington Privacy Act could serve as a useful regulatory model for other states and for Congress that improves upon the CCPA, provides rights to Washington residents, and helps companies build effective data protection programs.

 

Did we miss anything? Let us know at [email protected] or [email protected] as we continue tracking state and federal developments in privacy legislation.

New “Privacy 101” Video Series Helps School District Leaders Protect Student Data

WASHINGTON, D.C. – In recognition of Safer Internet Day, the Future of Privacy Forum (FPF) today released a new “Student Privacy 101” video series that is designed to help school leaders better understand federal and state privacy laws and protect sensitive student data.

“As technology becomes increasingly prevalent in the classroom, faculty, administrators, and district leaders could benefit from a quick and easy guide to understanding how they can help reduce privacy risks and improve transparency around student data,” said FPF Director of Youth & Education Privacy Amelia Vance. “FPF’s new videos provide an animated overview of best practices and tips on how schools can protect student privacy.”

The “Student Privacy 101” video series includes:

Vance added, “Safer Internet Day reminds companies to examine how they can use technology more responsibly in support of a better Internet experience for everyone, with a special focus on advancing positive practices that protect children and young people under 18. As one of the nation’s leading think tanks focused on privacy issues, FPF is a proud supporter of Safer Internet Day and works to provide year-round resources that support a culture of data security.”

FPF also published a new blog post celebrating Safer Internet Day today with additional information and resources about how schools can protect children’s data privacy.

The video series is based on the Siegl Framework, developed by Jim Siegl, the Technology Architect at Fairfax County Public Schools in Virginia, which advises local education agencies to consider the Venn diagram of legal compliance, privacy and security risks, and perception risks when working on student privacy.

The series was created by Monica Bulger, David Sallay, and Amelia Vance, with the animation magic and brilliance of Thought Café.

To learn more about Safer Internet Day, visit www.saferinternetday.org. For more information about FPF’s student privacy work, visit studentprivacycompass.org.

# # #

Contact

Alexandra Sollberger

[email protected]

202-317-0774

About FPF

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by internet privacy experts and includes an advisory board comprised of leading figures from industry, academia, law, and advocacy groups. For more information, visit www.fpf.org.

Privacy Papers 2019: Spotlight on the Winning Authors

FPF recently announced the winners of the 10th Annual Privacy Papers for Policymakers (PPPM) Award. This Award recognizes leading privacy scholarship that is relevant to policymakers in the United States Congress, at U.S. federal agencies, and for data protection authorities abroad.

From the many nominated privacy-related papers published in the last year, five were selected by the Finalist Judges after first being rated highly by a diverse team of academics, advocates, and industry privacy professionals from FPF’s Advisory Board. The Finalist Judges and Reviewers agreed that these papers demonstrate a thoughtful analysis of emerging issues and propose new means of analysis that can lead to real-world policy impact, making them “must-read” privacy scholarship for policymakers.


The winners of the 2019 PPPM Award are:

Antidiscriminatory Privacy

by Ignacio N. Cofone, McGill University Faculty of Law

Ignacio N. Cofone is an Assistant Professor at McGill University’s Faculty of Law, where he teaches about privacy law and artificial intelligence regulation, and an Affiliated Fellow at the Yale Law School Information Society Project. His research explores how law should adapt to technological and social change with a focus on information privacy and algorithmic decision-making. Before joining McGill, Ignacio was a research fellow at the NYU Information Law Institute, a resident fellow at the Yale Law School Information Society Project, and a legal advisor for the City of Buenos Aires. He obtained a joint PhD from Erasmus University Rotterdam and Hamburg University, where he was an Erasmus Mundus Fellow, and a JSD from Yale Law School. His full list of publications is available at www.ignaciocofone.com. He tweets from @IgnacioCofone.


Privacy’s Constitutional Moment and the Limits of Data Protection

by Woodrow Hartzog, Northeastern University, School of Law and Khoury College of Computer Sciences and Neil M. Richards, Washington University, School of Law and the Cordell Institute for Policy in Medicine & Law

Woodrow Hartzog is a Professor of Law and Computer Science at Northeastern University School of Law and the Khoury College of Computer Sciences. He is also a Resident Fellow at the Center for Law, Innovation and Creativity (CLIC) at Northeastern University, a Faculty Associate at the Berkman Klein Center for Internet & Society at Harvard University, a Non-resident Fellow at The Cordell Institute for Policy in Medicine & Law at Washington University, and an Affiliate Scholar at the Center for Internet and Society at Stanford Law School. His research on privacy, media, and robotics has been published in scholarly publications such as the Yale Law Journal, Columbia Law Review, and California Law Review and popular publications such as The New York Times, The Washington Post, and The Guardian. He has testified multiple times before Congress and has been quoted or referenced by numerous media outlets, including NPR, BBC, and The Wall Street Journal. He is the author of Privacy’s Blueprint: The Battle to Control the Design of New Technologies, published in 2018 by Harvard University Press. His book with Daniel Solove, Breached!: Why Data Security Law Fails and How to Improve It, is under contract with Oxford University Press.

Neil M. Richards is one of the world’s leading experts in privacy law, information law, and freedom of expression. He writes, teaches, and lectures about the regulation of the technologies powered by human information that are revolutionizing our society. Professor Richards holds the Koch Distinguished Professorship in Law at Washington University School of Law, where he co-directs the Cordell Institute for Policy in Medicine & Law. He is also an affiliate scholar with the Stanford Center for Internet and Society and the Yale Information Society Project, a Fellow at the Center for Democracy and Technology, and a consultant and expert in privacy cases. Professor Richards serves on the board of the Future of Privacy Forum and is a member of the American Law Institute. Professor Richards graduated in 1997 with graduate degrees in law and history from the University of Virginia, and served as a law clerk to both William H. Rehnquist, Chief Justice of the United States, and Judge Paul V. Niemeyer of the United States Court of Appeals for the Fourth Circuit. Professor Richards is the author of Intellectual Privacy (Oxford Press 2015). His many scholarly and popular writings on privacy and civil liberties have appeared in a wide variety of media, from the Harvard Law Review and the Yale Law Journal to The Guardian, WIRED, and Slate. His next book, Why Privacy Matters, will be published by Oxford Press in 2020. Professor Richards regularly speaks about privacy, big data, technology, and civil liberties throughout the world, and also appears frequently in the media. At Washington University, he teaches courses on privacy, technology, free speech, and constitutional law, and is a past winner of the Washington University School of Law’s Professor of the Year award. He was born in England, educated in the United States, and lives with his family in St. Louis. He is an avid cyclist and a lifelong supporter of Liverpool Football Club.


Algorithmic Impact Assessments under the GDPR: Producing Multi-layered Explanations

by Margot E. Kaminski, University of Colorado Law and Gianclaudio Malgieri, Vrije Universiteit Brussel (VUB) – Faculty of Law

Margot E. Kaminski is an Associate Professor at the University of Colorado Law and the Director of the Privacy Initiative at Silicon Flatirons. She specializes in the law of new technologies, focusing on information governance, privacy, and freedom of expression. Recently, her work has examined autonomous systems, including AI, robots, and drones (UAS). In 2018, she researched comparative and transatlantic approaches to sensor privacy in the Netherlands and Italy as a recipient of the Fulbright-Schuman Innovation Grant. Her academic work has been published in UCLA Law Review, Minnesota Law Review, Boston University Law Review, and Southern California Law Review, among others, and she frequently writes for the popular press. Prior to joining Colorado Law, Margot was an Assistant Professor at the Ohio State University Moritz College of Law (2014-2017), and served for three years as the Executive Director of the Information Society Project at Yale Law School, where she remains an affiliated fellow. She is a co-founder of the Media Freedom and Information Access (MFIA) Clinic at Yale Law School. She served as a law clerk to the Honorable Andrew J. Kleinfeld of the Ninth Circuit Court of Appeals in Fairbanks, Alaska.

Gianclaudio Malgieri is a doctoral researcher at the “Law, Science, Technology and Society” center of Vrije Universiteit Brussel, an Attorney at Law, and Training Coordinator of the Brussels Privacy Hub. He is Work Package Leader of the EU Panelfit Research Project on the legal and ethical issues of data processing in the research sector. He is also an external expert of the EU Commission for the ethics and data protection assessment of EU research proposals. He has authored more than 40 publications in leading international law reviews and is deputy editor of the Computer Law & Security Review (Elsevier). He is a lecturer in Data Protection Law and Intellectual Property for undergraduate and professional courses at the University of Pisa, Sant’Anna School of Advanced Studies, and Strasbourg University. He obtained an LLM with honours from the University of Pisa and a JD with honours from Sant’Anna School of Advanced Studies of Pisa (Italy). He was a visiting researcher at Oxford University, the London School of Economics, the World Trade Institute of the University of Bern, and the École Normale Supérieure de Paris.


Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites

by Arunesh Mathur, Princeton University; Gunes Acar, Princeton University; Michael Friedman, Princeton University; Elena Lucherini, Princeton University; Jonathan Mayer, Princeton University; Marshini Chetty, University of Chicago; and Arvind Narayanan, Princeton University  

Arunesh Mathur is a graduate student in the department of computer science at Princeton University, where he is affiliated with the Center for Information Technology Policy (CITP). Mathur’s research examines how technical systems interface with and impact society in negative ways. His current research focus is dark patterns: empirically studying how commercial, political, and other powerful actors employ user interface design principles to exploit individuals, markets, and democracy. His research has won multiple awards including the best paper awards at ACM CSCW 2018 and USENIX SOUPS 2019.

Gunes Acar is a FWO postdoctoral fellow at KU Leuven’s COSIC research group. His research interests involve web tracking measurement, anonymous communications, and IoT privacy and security. Gunes obtained his PhD at KU Leuven in 2017, and was a postdoctoral researcher between 2017 and 2019 at Princeton University’s Center for Information Technology Policy.


Michael Friedman is a Technical Program Manager at Google. His work focuses on monitoring compliance with privacy regulations and certifications. Michael is broadly interested in the privacy implications of information technology and the enforcement of privacy standards. He earned his Bachelor’s degree in Computer Science at Princeton University with a concentration in societal implications of information technology. While there, he conducted research on the effectiveness of technology privacy policies, with a focus on children’s data privacy. He also collaborated in this work on dark patterns.

Elena Lucherini is a second-year Ph.D. student at the Center for Information Technology Policy at Princeton University. Her advisor is Arvind Narayanan. Lucherini received her bachelor’s degree from the University of Pisa and her master’s from the University of Pisa and Sant’Anna School of Advanced Studies.


Jonathan Mayer is an Assistant Professor at Princeton University, where he holds appointments in the Department of Computer Science and the Woodrow Wilson School of Public and International Affairs. Before joining the Princeton faculty, he served as the technology law and policy advisor to United States Senator Kamala Harris and as the Chief Technologist of the Federal Communications Commission Enforcement Bureau. Professor Mayer’s research centers on the intersection of technology and law, with emphasis on national security, criminal procedure, and consumer privacy. He is both a computer scientist and a lawyer, and he holds a Ph.D. in computer science from Stanford University and a J.D. from Stanford Law School.

Marshini Chetty is an assistant professor in the Department of Computer Science at the University of Chicago. She specializes in human-computer interaction, usable privacy and security, and ubiquitous computing. Marshini designs, implements, and evaluates technologies to help users manage different aspects of Internet use, from privacy and security to performance and cost. She often works in resource-constrained settings and uses her work to help inform Internet policy. She has a Ph.D. in Human-Centered Computing from the Georgia Institute of Technology, USA, and a Master’s and Bachelor’s in Computer Science from the University of Cape Town, South Africa. In her former roles, Marshini was on the faculty in the Computer Science Department at Princeton University and the College of Information Studies at the University of Maryland, College Park. Her work has won best paper awards at SOUPS, CHI, and CSCW and has been funded by the National Science Foundation, the National Security Agency, Intel, Microsoft, Facebook, and multiple Google Faculty Research Awards.

Arvind Narayanan is an Associate Professor of Computer Science at Princeton. He leads the Princeton Web Transparency and Accountability Project to uncover how companies collect and use our personal information. Narayanan is the lead author of a textbook on Bitcoin and cryptocurrency technologies which has been used in over 150 courses around the world. His doctoral research showed the fundamental limits of de-identification, for which he received the Privacy Enhancing Technologies Award. His 2017 paper in Science was among the first to show how machine learning reflects cultural stereotypes, including racial and gender biases. Narayanan is a recipient of the Presidential Early Career Award for Scientists and Engineers (PECASE).


The Many Revolutions of Carpenter

by Paul Ohm, Georgetown University Law Center

Paul Ohm is a Professor of Law and the Associate Dean for Academic Affairs at the Georgetown University Law Center, where he also serves as a faculty director for the Center on Privacy & Technology and the Institute for Technology Law & Policy. His writing and teaching focus on information privacy, computer crime law, intellectual property, and criminal procedure. A computer programmer and computer scientist as well as a lawyer, Professor Ohm tries to build new interdisciplinary bridges between law and computer science, and much of his scholarship focuses on how evolving technology disrupts individual privacy.

Professor Ohm began his academic career on the faculty of the University of Colorado Law School, where he also served as Associate Dean and Faculty Director for the Silicon Flatirons Center. From 2012 to 2013, Professor Ohm served as Senior Policy Advisor to the Federal Trade Commission. Before becoming a professor, he worked as an Honors Program trial attorney in the U.S. Department of Justice’s Computer Crime and Intellectual Property Section and as a law clerk to Judge Betty Fletcher of the United States Court of Appeals for the Ninth Circuit and Judge Mariana Pfaelzer of the United States District Court for the Central District of California. He is a graduate of the UCLA School of Law.


The Finalist Judges also selected three papers for Honorable Mention on the basis of their uniformly strong reviews from the Advisory Board.

The 2019 PPPM Honorable Mentions are:

Additionally, the 2019 Student Paper award goes to:


The winning authors have been invited to join FPF and Honorary Co-Hosts Senator Edward J. Markey and Congresswoman Diana DeGette to present their work at the U.S. Senate to policymakers, academics, and industry privacy professionals. This annual event will be held on February 6, 2020. FPF will subsequently publish a printed digest of summaries of the winning papers for distribution to policymakers, privacy professionals, and the public. RSVP here to join us.

Award-Winning Paper: "Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites"

For the tenth year, FPF’s annual Privacy Papers for Policymakers program is presenting award-winning research to lawmakers and regulators. Among the papers to be honored at an event at the Hart Senate Office Building on February 6, 2020 is Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites by Arunesh Mathur, Gunes Acar, Michael Friedman, Elena Lucherini, Jonathan Mayer, Marshini Chetty, and Arvind Narayanan. Mathur and his co-authors present an analysis of deceptive user interface designs across 11,000 shopping websites to create a taxonomy of “dark pattern” characteristics that harm user decision-making.


Dark patterns are user interface design choices that benefit an online service by coercing or deceiving users into making decisions they might not make if fully informed and able to select alternatives. In Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites, Arunesh Mathur and his co-authors present a new, large-scale analysis of the presence of dark patterns across 11,000 shopping websites, informing our understanding of the prevalence of these patterns and their influence on users. Mathur observes that “at best, dark patterns annoy and frustrate users. At worst, they can mislead and deceive users, e.g., by causing financial loss, tricking users into giving up vast amounts of personal data, or inducing compulsive and addictive behavior in adults and children.” In the context of shopping websites, dark patterns can trick users into signing up for recurring subscriptions and making unwanted purchases, resulting in “concrete financial loss.”

The authors contribute a taxonomy that offers precise terminology to characterize how each type of dark pattern functions and exploits users’ cognitive biases. They identify five characteristics a dark pattern can exhibit: asymmetric, covert, deceptive, information-hiding, and restrictive. Of these, the authors state that “the majority of [dark patterns] are covert, deceptive, and information hiding in nature.” Additionally, the authors observe user interface designs that exploit the anchoring effect, the bandwagon effect, the default effect, the framing effect, scarcity bias, and the sunk cost fallacy to manipulate users’ decision-making.

Through their analysis of roughly 53,000 product pages drawn from about 11,000 shopping websites, the authors discover 1,818 dark pattern instances, spanning multiple types and categories. Interestingly, the authors observe that “shopping websites that were more popular, according to Alexa rankings, were more likely to feature dark patterns.” Based on their findings, the authors suggest that future work should focus on empirically evaluating the effects of dark patterns on user behavior in order to develop better countermeasures and ensure that users can enjoy a fair and transparent shopping experience.
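
For a concrete, if simplified, sense of what flagging candidate dark patterns on a product page can look like, here is a minimal Python sketch that scans a page’s visible text for common scarcity and urgency phrasing. It is only an illustration of the idea, not the authors’ pipeline; the URL and the regular expressions below are hypothetical.

# Illustrative sketch only: flag candidate scarcity/urgency messages on a
# product page. This is NOT the authors' detection pipeline.
import re
import requests
from bs4 import BeautifulSoup

# Hypothetical regular expressions for two dark pattern types discussed in
# the paper: low-stock messages (scarcity) and countdown timers (urgency).
CANDIDATE_PATTERNS = {
    "low-stock message": re.compile(r"only\s+\d+\s+left", re.IGNORECASE),
    "countdown timer": re.compile(r"(offer|sale)\s+ends\s+in", re.IGNORECASE),
}

def flag_candidates(url):
    """Fetch a page and return (category, text snippet) pairs worth a closer look."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    hits = []
    # Walk the page's visible text nodes and test each against the patterns.
    for text in soup.stripped_strings:
        for category, pattern in CANDIDATE_PATTERNS.items():
            if pattern.search(text):
                hits.append((category, text))
    return hits

if __name__ == "__main__":
    # Hypothetical product page URL, used purely for illustration.
    for category, snippet in flag_candidates("https://shop.example.com/product/123"):
        print(f"{category}: {snippet}")

A measurement at the paper’s scale of course involves much more than this, including an automated, browser-based crawl of product pages and analysis of the collected text segments.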

If you’re interested in learning more about how dark patterns in user interface design influence users’ behavior, you’ll want to read Mathur’s paper.


The Privacy Papers for Policymakers project’s goal is to put diverse academic perspectives in front of policymakers to inform the development of privacy legislation. You can view all of this year’s award-winning papers on the FPF website.

CPDP2020 Panel: The Future Is Now: Autonomous Vehicles, Trolley Problem(s) and How to Deal with Them

Last week, FPF brought together a panel of technology, legal, regulatory, and business voices to discuss “The Future is Now: Autonomous Vehicles, Trolley Problem(s) and How to Deal with Them” at the 13th annual Computers, Privacy, and Data Protection conference.

The premise of the panel was that autonomous and highly automated vehicles are likely the first products to bring AI to the masses in a life-changing way. They rely on AI for a variety of uses, from mapping, perception, and prediction to self-driving technologies. Their promise is great: increasing the safety and convenience of our cities and roads. But so are the challenges that come with them, from solving life-and-death questions to putting in place a framework that protects the fundamental rights of drivers, passengers, and everyone physically around them. The panel of experts discussed connected and automated vehicle technology, law, and policy, and took an EU-US comparative perspective on essential questions. The panel was moderated by Trevor Hughes (CEO, IAPP), and the panelists were Sophie Nerbonne (Director, CNIL), Andreea Lisievici (Head of Global Data Protection Office, Volvo Cars), Mikko Niva (Group Privacy Officer, Vodafone), and Chelsey Colbert (Policy Counsel, Mobility and Location, FPF). 

The speakers answered many questions, including: How much data and what type of data runs through all systems of an autonomous vehicle? What are the benefits of autonomous vehicles and what are the risks to individual rights? How can they be balanced? They also discussed the infamous thought experiment “the Trolley Problem” and its application to connected and automated vehicles in the real world. 

Andreea Lisievici (Head of Global Data Protection Office, Volvo Cars) opened the panel by demystifying what we mean when we talk about “connected and autonomous cars.” She gave an overview of the levels of driving automation: Level 0 – no automation; Level 1 – driver assistance; Level 2 – partial driving automation; Level 3 – conditional driving automation; Level 4 – high driving automation; and Level 5 – full driving automation. Commercial vehicles currently on the market are considered Level 2 (or “2+” or 3), while some other companies testing AVs are reportedly at Level 4. 
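
These levels track the widely used SAE J3016 driving automation scale. As a small illustrative sketch (not something presented on the panel), the scale can be captured as a simple enumeration:

# Illustrative only: the SAE J3016 driving automation levels summarized above.
from enum import IntEnum

class DrivingAutomationLevel(IntEnum):
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_DRIVING_AUTOMATION = 2
    CONDITIONAL_DRIVING_AUTOMATION = 3
    HIGH_DRIVING_AUTOMATION = 4
    FULL_DRIVING_AUTOMATION = 5

# Most commercial vehicles on the market today sit at Level 2.
print(DrivingAutomationLevel.PARTIAL_DRIVING_AUTOMATION)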

Mikko Niva (Group Privacy Officer, Vodafone) commented on the vast ecosystem of parties in the connected and automated car ecosystem. 

Sophie Nerbonne (Director, CNIL) reminded everyone that most of the data in this complex ecosystem is personal data. She recounted that when the CNIL began working with French OEMs a couple of years ago, the manufacturers were not fully aware of how much “technical data” was in fact personal data. 

Indeed, the CAV ecosystem is vast and interconnected; we must think beyond the individual car and consider the broader ecosystem, which will include city infrastructure such as streetlights, pedestrians, other vehicles, and even other objects, such as delivery robots. These “V2X” (vehicle-to-everything) technologies, which include V2V (vehicle-to-vehicle), V2I (vehicle-to-infrastructure), and V2P (vehicle-to-pedestrian), bring in parties such as car manufacturers, telecom providers, third-party apps and services, and local governments. This ecosystem presents challenges and opportunities not just for personal car ownership, but also for rental companies, rideshare and ride-hailing companies, delivery robots, micro-mobility such as scooters, and modes of transportation or freight delivery underground and in the air. 

The majority of this information is likely to be personal data, or capable of being linked to a person, and there are many players and data flows for organizations to consider, including drivers, passengers, pedestrians, and employees. [See here for FPF infographic about data and the connected car]. Data protection impact assessments are an important tool available to organizations, and the speakers agreed that while privacy and ethics by design is important, operationalizing this can be a challenge. Entities must look beyond legal obligations and consider how they will earn and maintain consumer trust.

As for the Trolley Problem, the speakers agreed that… it is not the right problem, since it does not ask the right question. Real-life scenarios in which connected and autonomous vehicles need to “make decisions” involve many more parameters and many more options than the Trolley Problem proposes. Watch the full recording of the panel by following this link.

Data and the Connected Car Infographic