FPF Testifies on Automated Decision System Legislation in California

Last week, on April 8, 2021, FPF’s Dr. Sara Jordan testified before the California Assembly Committee on Privacy and Consumer Protection on AB-13 (Public contracts: automated decision systems). The legislation passed out of committee (9 Ayes, 0 Noes) and was re-referred to the Committee on Appropriations. The bill would regulate state procurement, use, and development of high-risk automated decision systems by requiring prospective contractors to conduct automated decision system impact assessments.

At the hearing, Dr. Jordan commented as an expert witness alongside Vinhcent Le, who represented The Greenlining Institute. Dr. Jordan commended the sponsors for amending the definition of “automated decisionmaking” to account for the wide range of technical complexity in automated systems. In addition, Dr. Jordan testified that the government contract stage is an appropriate stage for the introduction of algorithmic impact assessments for high-risk applications of automated decisionmaking. This would allow authorities in California to evaluate technology before it is implemented using transparent and actionable assessment criteria.

Emerging Patchwork or Laboratories of Democracy? Privacy Legislation in Virginia and Other States

Stacey Gray, Pollyanna Sanderson & Samuel Adams

In the absence of federal privacy legislation, U.S. states are weighing in. In Virginia, the “Consumer Data Protection Act” (“CDPA”) (HB 2307 / SB 1392) could be signed into law within weeks and, if passed, would take effect on Jan. 1, 2023. If enacted, it would become the second comprehensive (non-sectoral) data protection law in the United States, making it a potential model for other states and federal legislation.

At present, the Virginia CDPA is about 50% of the way through Virginia’s bicameral, citizen legislature. Both bills have passed in their own chambers (in the House on Jan. 29, 89-9, and in the Senate on Feb. 5, 36-0). Either bill must now pass in the other chamber, a process that will likely involve additional hearings and opportunity for debate. Assuming either the House or Senate version passes in the other chamber without further amendment, it would then be sent to the Governor for veto, signature, or amendment. In light of the bills’ rapid progress and near-unanimous legislative support, businesses, law firms, and privacy advocates alike are beginning to pay close attention.

We provide a summary below of the key features of Virginia’s CDPA, including (1) the scope of covered entities & covered data; (2) consumer rights & pseudonymous data; (3) sensitive data and risk assessments; (4) consent standard and use limitations; (5) non-discrimination; (6) controllers, processors & third parties; (7) limitations and commercial research; and (8) enforcement.

Overall, we observe many similarities to the structure and provisions of the Washington Privacy Act (SB 5062), although there are notable differences, in particular in the scope of covered entities and personal data. Both likely go further than the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA), which do not require opt-in consent for sensitive data or mandated risk assessments – although California has made notable changes through Attorney General rulemaking. We also note some differences between these leading state legislative efforts and the EU General Data Protection Regulation (GDPR), the most notable being that US proposals rely primarily on downstream limitations on sale, sharing, and use (including through opt-outs), while the GDPR restricts all data processing by governments, companies, and non-profits unless one of six lawful bases is available.

Finally, we note the broader landscape of emerging state privacy legislation, including Washington State, Oklahoma, New York, Connecticut, Minnesota, and others. As more and more state models emerge, the pressure will continue to increase on Congress to pass a federal comprehensive baseline privacy law.

1. Scope of Covered Entities & Covered Data 

The Consumer Data Protection Act (CDPA) would apply to businesses “that conduct business in the Commonwealth or produce products or services that are targeted to residents of the Commonwealth” and that process “personal data” of (i) at least 100,000 consumers per year, or (ii) at least 25,000 consumers while deriving over 50% of gross revenue from the sale of personal data. It would create rights for residents of Virginia acting only in an individual or household context, and not acting in a commercial or employment context.

The CDPA has a broad definition of personal data: “any information that is linked or reasonably linkable to an identified or identifiable natural person.” This excludes “de-identified data,” as well as “publicly available information,” a broad term that encompasses information “lawfully made available through federal, state, or local government records” and “information that a business has a reasonable basis to believe is lawfully made available to the general public through widely distributed media, by the consumer, or by a person to whom the consumer has disclosed the information, unless the consumer has restricted the information to a specific audience.” 

In addition, a number of exemptions are currently drafted in the bill, including for:

As a result, the Virginia bill’s jurisdictional scope of “covered entities” and “personal data” is both similar to and in some ways narrower than other US laws and the GDPR. It would apply to a broad definition of “personal data,” similar to leading US and EU law, but to a narrower scope of covered entities. For example, while the WPA and CCPA contain similar exemptions (for data governed by the Fair Credit Reporting Act), they do not totally exclude “entities” governed by HIPAA or GLBA. Similarly, unlike the WPA, Virginia’s bill would not apply to non-profits. The exclusions for publicly available information also differ – the WPA contains a narrower exclusion limited to lawfully available government records, in contrast to Virginia’s broader exclusion for information made available to the general public through widely distributed media. Given the variety of existing sectoral privacy regulatory environments in the United States (such as HIPAA and GLBA), all leading US laws and proposals so far have been narrower in scope than the GDPR – which applies broadly to all legal persons that process personal data, including non-profits, employers, and government entities.

2. Consumer Rights & Pseudonymous Data

The CDPA would grant consumers in Virginia the rights to request (1) access to their personal data; (2) correction; (3) deletion; (4) a copy of their data in a “portable” and “readily usable” format; and (5) an opt-out of sale, targeted advertising, and “profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.” Profiling is defined as: “any form of automated processing performed on personal data to evaluate, analyze, or predict personal aspects related to an identified or identifiable natural person’s economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”

The CDPA provides a narrow limitation for “pseudonymous data,” for which companies would be required to comply with the bulk of the bill’s requirements – including opt-out rights – but not access, correction, deletion, or portability. Pseudonymous data is defined as “data that cannot be attributed to a specific natural person without the use of additional information, provided that such additional information is kept separately and is subject to appropriate technical and organizational measures to ensure that the personal data is not attributed to an identified or identifiable natural person.” Providing flexibility for “pseudonymous data” is a shared feature of the CDPA and the WPA. The approach recognizes challenges involving authentication and verification of consumer requests involving pseudonymous data, and creates an incentive for covered entities to maintain personal information in less readily identifiable formats. The treatment of pseudonymous data also has some similarities with the GDPR’s flexibility for data that cannot be identified (Article 11), recognition of pseudonymization as a risk minimizing safeguard (Recital 28), and inclusion in Privacy by Design requirements (Article 25). 

When combined with the opt-in requirement for sensitive data (discussed below), this approach goes further than the California Consumer Privacy Act, which currently only requires that consumers be able to opt out of “sale.” When the California Privacy Rights Act (CPRA) goes into effect in 2023, it will add the right of correction, clarify that the opt-out of “sale” applies to all targeted advertising, and add an opt-out for limiting certain uses of sensitive data. On the other hand, Attorney General rulemaking in California has bolstered existing law by requiring compliance with “user-enabled global privacy controls” (including, arguably, new tools such as the Global Privacy Control). Other states, including Washington and Virginia, have not adopted such a “global opt-out” requirement.

3. Sensitive Personal Data & Risk Assessments

The CDPA would require “freely given, specific, informed, and unambiguous” consent (a standard discussed below) for controllers to collect or process sensitive data, defined as: “(1) personal data revealing racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, or citizenship or immigration status; (2) the processing of genetic or biometric data for the purpose of uniquely identifying a natural person; (3) the personal data collected from a known child; and (4) precise geolocation data.” “Child” means a person younger than 13, which aligns with COPPA. 

This definition roughly aligns with the WPA, with a few notable differences, such as the WPA’s inclusion of “health condition” in addition to “diagnosis.” The CDPA’s definition also roughly aligns with the definition of sensitive data in the California Privacy Rights Act (CPRA), which will create an “opt-out” for sensitive data uses when it comes into effect in 2023. In contrast to California, however, both the WPA and CDPA would require affirmative opt-in consent prior to any collection of sensitive data – a much higher standard than the CPRA’s right to opt out of, or limit, the use and disclosure of sensitive personal information.

In addition, the CDPA (like the WPA) would create a new requirement that controllers conduct data protection assessments if engaged in any of the following:

These data protection assessments would be required to be made available to the Attorney General upon request, pursuant to a civil investigative demand.

4. Consent Standard & Use Limitations 

The CDPA would define consent as “freely given, specific, informed, and unambiguous,” a strong opt-in standard that aligns with the GDPR and WPA. The CDPA does lack the “dark patterns” language found in the current WPA, which would specifically prohibit controllers and processors from using deceptive user interfaces to obtain consent from individuals. We note that many of these dark patterns may already be illegal under state and federal consumer protection law, as the University of Chicago’s Lior Strahilevitz observes in a recent paper, “Shining a Light on Dark Patterns.”

The CDPA would require controllers to obtain opt-in consent to process personal data for incompatible secondary uses, “as disclosed to the consumer.” This is likely to be less restrictive in practice for businesses than the current WPA, which removed this language in the version of the bill introduced in 2021. Under the WPA, collection of personal data must be adequate, relevant, and limited to what is reasonably necessary in relation to the purposes for which the data is processed. In comparison, the GDPR’s principle of purpose limitation requires all data collection to be only for a specified, explicit, and legitimate purpose, which includes compatible purposes.

5. Non-Discrimination

The CDPA would prohibit a controller from discriminating against a consumer for exercising any of their consumer privacy rights, including denying goods or services, charging different rates, or providing a different level of quality of goods and services. However, similar to California law, it provides a broad exception for “voluntary participation in a bona fide loyalty, rewards, premium features, discounts, or club card program.” In contrast, the current WPA contains a narrower exemption for such programs that would require additional disclosures and limits the sale and secondary uses of personal information.

In addition, the CDPA would prohibit controllers from processing personal data in violation of state and federal laws that prohibit unlawful discrimination against consumers. A similar requirement is in the WPA. In comparison, while the GDPR does not contain a specific provision stating that a data subject must not be discriminated against on the basis of their choices to exercise rights, other principles of the GDPR require that individuals must be protected from discrimination on these grounds (Article 5, Article 13, Article 22, and elements of “freely given” consent and fair processing).

6. Controllers, Processors & Third Parties 

The CDPA follows the GDPR and WPA structure of dividing responsibilities between “controllers” and “processors,” rather than using the CCPA/CPRA terminology of “businesses” and “service providers”:

“Third party” is defined as “a natural or legal person, public authority, agency, or body other than the consumer, controller, processor, or an affiliate of the processor or the controller.” Controllers would be required to provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes the categories of personal data that the controller shares with third parties, if any, and the categories of third parties, if any, with whom the controller shares personal data. If a controller sells personal data to third parties or processes personal data for targeted advertising, the controller would be required to “clearly and conspicuously” disclose such processing, as well as the manner in which a consumer may exercise the right to opt out of it. Of note, if a third-party recipient processes data in violation of the Act, controllers and processors would not be liable to the extent that they did not have “actual knowledge” that the recipient intended to commit a violation.

7. Limitations and Commercial Research 

The CDPA would not limit a controller or processor’s ability to provide a product or service specifically requested by a consumer; perform a contract to which the consumer is a party; comply with existing laws; cooperate with civil, criminal, or regulatory investigations; cooperate with law enforcement agencies; defend legal claims; protect an interest that is essential to the life or physical safety of the consumer or another natural person; or protect against fraud, theft, or harassment.

In addition, the CDPA would not restrict the ability of controllers and processors to engage in public or peer-reviewed scientific or statistical research in the public interest that is approved, monitored, and governed by an institutional review board (IRB) or a “similar independent oversight entity.” This provides greater flexibility for commercial research than the CCPA, and aligns with broader trends in U.S. privacy legislation, including the WPA, Sen. Cantwell’s (D-WA) COPRA, and Sen. Wicker’s (R-MS) SAFE DATA Act.

The requirements of the CDPA would also not limit the ability of controllers and processors to “collect, use, or retain data” to: 1) conduct internal research to “develop, improve, or repair products, services, or technology”; 2) effectuate product recalls, or identify and repair technical errors; or 3) perform internal operations that are reasonably aligned with the “reasonable expectations” of the consumer or “reasonably anticipated” based on the consumer’s existing relationship with the controller.

8. Enforcement 

The CDPA, which contains a 30-day cure period, would be enforced by the Attorney General, with civil fines capped at $7,500 per violation. The legislation would establish a Consumer Privacy Fund within the Office of the Attorney General to support enforcement in future years.

A right-to-cure period is a current feature of the CCPA that will be removed by the California Privacy Rights Act, and is currently being debated in Washington State. In Virginia, several stakeholders have testified that the cure period would promote faster and less costly results for consumers, and may be useful as businesses adapt to compliance. Others, such as Consumer Reports, have advocated for it to be removed as unduly limiting on enforcement. At a recent hearing on the WPA in the Senate Ways & Means Committee, the Washington Attorney General’s Office suggested sunsetting the cure period once affected entities had time to adjust their data practices.

Recognizing the Broader Landscape of Emerging State Laws

Virginia is not alone – many, if not most, U.S. states are considering or introducing consumer privacy legislation in 2021. The CDPA shares a similar framework with the WPA and the Minnesota Consumer Data Privacy Act (HF3936). Some align more closely with the CCPA, e.g., Michigan (HB6457), Connecticut (SB156), New York (Data Accountability and Transparency Act). These bills generally place a greater focus on privacy self-management, business-consumer relationships, and enabling consumers to “opt-out” of having their personal data “sold.” In contrast, Oklahoma (OK 1602) is actively considering a proposal that would provide individuals with a right to “opt-in” to sales of their personal data. 

If passed, the CDPA would be the second comprehensive (non-sectoral) data protection law in the United States after the CCPA. As more and more states weigh in, however, significant issues are already beginning to arise about interoperability for companies handling data across state lines. For now, legislation in Virginia, Washington, and Minnesota is in some ways interoperable with the GDPR and the CCPA, but will clearly require companies to customize practices to match key differences in each state. As additional laws pass, and if further conflicts emerge, the pressure on Congress to pass a federal comprehensive baseline privacy law will certainly increase. At a Nov. 24 hearing, Virginia legislators generally expressed a preference for federal legislation, but recognized that in its absence, Virginia residents deserve data protection rights already enjoyed in California, by everyone in Europe, and by an increasing number of people around the world.