Uniform Law Commission Finalizes Model State Privacy Law

This month, the Uniform Law Commission (ULC) voted to approve the Uniform Personal Data Protection Act (UPDPA), a model bill designed to provide a template for uniform state privacy legislation. After some final amendments, it will be ready to be introduced in state legislatures in January 2022. 

The ULC has been engaged in an effort to draft a model state privacy law since 2019, with the input of advisors and observers, including the Future of Privacy Forum. First established in 1892, the ULC has a mission of “providing states with non-partisan, well-conceived and well-drafted legislation that brings clarity and stability to critical areas of state statutory law.” Over time, many of its legislative efforts have been highly influential and have been enacted into law across the United States; for instance, the ULC drafted the Uniform Commercial Code in 1952. More recently, the ULC drafted the Uniform Fiduciary Access to Digital Assets Act (2014-15), which has been adopted in at least 46 states.

The UPDPA departs in form and substance from existing privacy and data protection laws in the U.S., and indeed internationally. The law would provide individuals with fewer, and more limited, rights to access and otherwise control data, with broad exemptions for pseudonymized data. Further, narrowing the scope of application, UPDPA only applies to personal data “maintained” in a “system of records” used to retrieve records about individual data subjects for the purpose of individualized communication or decisional treatment. The Prefatory Note of a late-stage draft of the UPDPA notes that it seeks to avoid “the compliance and regulatory costs associated with the California and Virginia regimes.” 

Central to the framework, however, is a useful distinction between “compatible,” “incompatible,” and “prohibited” data practices, based on the likelihood that a data practice will benefit or harm a data subject, which moves beyond a purely consent-based model. We also find that the model law’s treatment of Voluntary Consensus Standards offers a unique approach to implementation that is context- and sector-specific. Overall, we think the ULC model bill offers an interesting alternative for privacy regulation. However, because it departs significantly from existing frameworks, it could be slow to be adopted in states that are concerned with interoperability with recent laws passed in California, Virginia, and Colorado.

The summary below provides an overview of the key features of the model law.


Scope 

UPDPA applies to controllers and processors that “conduct business” or “produce products or provide services purposefully directed to residents” in the state of enactment. Government entities are excluded from the scope of the Act. 

To be covered, businesses must meet one of the following thresholds:

  1. during a calendar year maintains personal data about more than [50,000] data subjects who are residents of the state, excluding data subjects whose data is collected or maintained solely to complete a payment transaction;
  2. earns more than [50] percent of its gross annual revenue during a calendar year from maintaining personal data from data subjects as a controller or processor;
  3. is a processor acting on behalf of a controller the processor knows or has reason to know satisfies paragraph (1) or (2); or
  4. maintains personal data, unless it processes the personal data solely using compatible data practices.

The effect of threshold (4) is that UPDPA applies to smaller firms that maintain personal data, but relieves them of compliance obligations as long as they use the personal data only for “compatible” purposes. 
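
To make the interaction of the four thresholds concrete, here is a minimal sketch, in Python, of one way to read them as boolean logic. The function, parameter names, and constants are our own illustration, not statutory text; the bracketed figures in the model act are placeholders for enacting states to fill in.

```python
# Illustrative only: one reading of UPDPA's applicability thresholds.
# The numeric values mirror the model act's bracketed placeholders.

SUBJECT_THRESHOLD = 50_000       # placeholder: [50,000] resident data subjects
REVENUE_SHARE_THRESHOLD = 0.50   # placeholder: [50] percent of gross revenue

def updpa_applies(
    resident_subjects: int,           # excludes payment-transaction-only subjects
    revenue_share_from_data: float,   # share of gross annual revenue from maintaining personal data
    processor_for_covered_controller: bool,
    maintains_personal_data: bool,
    only_compatible_practices: bool,
) -> bool:
    """Return True if any of the four thresholds in the model act is met."""
    if resident_subjects > SUBJECT_THRESHOLD:               # threshold (1)
        return True
    if revenue_share_from_data > REVENUE_SHARE_THRESHOLD:   # threshold (2)
        return True
    if processor_for_covered_controller:                    # threshold (3)
        return True
    # Threshold (4): maintaining personal data at all triggers the Act,
    # unless the firm processes it solely via compatible data practices.
    return maintains_personal_data and not only_compatible_practices

# A small firm using personal data only for compatible purposes falls outside the Act:
assert not updpa_applies(1_000, 0.05, False, True, True)
```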

Maintained Personal Data 

UPDPA applies to “personal data,” which includes (1) records that identify or describe a data subject by a direct identifier, or (2) pseudonymized data. The term does not include “deidentified data.” UPDPA also does not apply to information processed or maintained in the course of employment or application for employment, or to “publicly available information,” defined as information lawfully made available from a government record or available to the general public in “widely distributed media.”

Narrowing the scope of application, UPDPA only applies to personal data “maintained” in a “system of records” used to retrieve records about individual data subjects for the purpose of individualized communication or decisional treatment. The committee has commented that the definition of “maintains” is pivotal to understanding the scope of UPDPA. To the extent that data collected by businesses related to individuals is not maintained as a system of records for the purpose and function of making individualized assessments, decisions, or communications, it would not be within the scope of the Act (for instance, if it were maintained in the form of emails or personal photographs). According to the committee, the definition of “maintains” is modeled after the federal Privacy Act’s definitions of “maintains” and “system of records”. 5 U.S.C. §552a(a)(3), (a)(5).

Rights of Access and Correction

Access and Correction Rights: UPDPA grants data subjects the rights to access and correct personal data, excluding personal data that is pseudonymized and not maintained with sensitive data (as described below). Controllers are only required to comply with authenticated data subject requests. “Data subject” is defined as an individual who is identified or described by personal data. According to the committee, the access and correction rights extend not only to personal information provided by a data subject, but also commingled personal data collected by the controller from other sources, such as public sources, and from other firms. 

Non-Discrimination: UPDPA prohibits controllers from denying a good or service, charging a different rate, or providing a different level of quality to a data subject in retaliation for exercising one of these rights. However, controllers may still make a data subject ineligible to participate in a program if the corrected information requested by them makes them ineligible, as specified by the program’s terms of service. 

No Deletion Right: Notably, UPDPA does not grant individuals the right to delete their personal data. The ULC committee has enumerated various reasons for taking this approach, including: (1) the wide range of legitimate interests for controllers to retain personal data, (2) difficulties associated with ensuring that data is deleted, given how it is currently stored and processed, and (3) compatibility with the First Amendment of the U.S. Constitution (free speech). The committee has also stated that UPDPA’s restrictions on processing for compatible uses or incompatible uses with consent should provide sufficient protection.

Pseudonymized Data 

“Pseudonymized data” is defined as “personal data without a direct identifier that can be reasonably linked to a data subject’s identity or is maintained to allow individualized communication with, or treatment of, the data subject. The term includes a record without a direct identifier if the record contains an internet protocol address, a browser, software, or hardware identification code, a persistent unique code, or other data related to a particular device.”

Pseudonymized data is subject to fewer restrictions than more identifiable forms of personal data. Generally, consumer rights contained in UPDPA (access and correction) do not apply to pseudonymized data. However, these rights do still apply to “sensitive” pseudonymized data to the extent that it is maintained in a way that renders the data retrievable for individualized communications and treatment. 

“Sensitive data” includes personal data that reveals: (A) racial or ethnic origin, religious belief, gender, sexual orientation, citizenship, or immigration status; (B) credentials sufficient to access an account remotely; (C) a credit or debit card number or financial account number; (D) a Social Security number, tax-identification number, driver’s license number, military identification number, or an identifying number on a government-issued identification; (E) geolocation in real time; (F) a criminal record; (G) income; (H) diagnosis or treatment for a disease or health condition; (I) genetic sequencing information; or (J) information about a data subject the controller knows or has reason to know is under 13 years of age. 

In practice, the ULC committee has stated that a collecting controller that stores user credentials and customer profiles can avoid the access and correction obligations if it segregates its data into a key code and a pseudonymized database so that the data fields are stored with a unique code and no identifiers. The separate key allows the controller to reidentify a user’s data when necessary or relevant for its interactions with customers. Likewise, a collecting controller that creates a dataset for its own research use (without maintaining it in a way that allows for reassociation with the data subject) will not have to provide access or correction rights even if the pseudonymized data includes sensitive information. Additionally, a retailer that collects and transmits credit card data to the issuer of the credit card in order to facilitate a one-time credit card transaction is not maintaining this sensitive pseudonymized data.
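
As an illustration of the key-code segregation the committee describes, here is a minimal Python sketch with names and structure of our own invention (the model act does not prescribe any implementation): the data fields live in a pseudonymized store under a random code with no direct identifiers, while a separate key table permits re-identification when the controller interacts with the customer.

```python
import secrets

key_table = {}         # direct identifier (e.g., email) -> unique code
pseudonymized_db = {}  # unique code -> data fields, no direct identifiers

def store_record(email: str, fields: dict) -> str:
    """Store the fields under a random code; keep the identifier only in the key table."""
    code = secrets.token_hex(16)
    key_table[email] = code
    pseudonymized_db[code] = fields
    return code

def reidentify(email: str) -> dict:
    """Use the separate key table to re-associate data with the customer."""
    return pseudonymized_db[key_table[email]]

store_record("alice@example.com", {"plan": "premium", "zip": "80203"})
print(reidentify("alice@example.com"))  # {'plan': 'premium', 'zip': '80203'}
```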

Compatible, Incompatible, and Prohibited Data Practices 

UPDPA distinguishes between “compatible,” “incompatible,” and “prohibited” data practices. Compatible data practices are per se permissible, so controllers and processors may engage in these practices without obtaining consent from the data subject. Incompatible data practices are permitted for non-sensitive data if the data subject is given notice and an opportunity to withdraw consent (an opt-out right); however, opt-in consent is required for a controller to engage in incompatible data processing of “sensitive” personal data. Prohibited data practices are not permitted under any circumstances.
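
Read as a decision rule, the consent tiers might be sketched as follows; this is a simplification of our own, not the act’s text.

```python
from enum import Enum

class Practice(Enum):
    COMPATIBLE = "compatible"
    INCOMPATIBLE = "incompatible"
    PROHIBITED = "prohibited"

def consent_requirement(practice: Practice, sensitive: bool) -> str:
    """Map a data practice and data sensitivity to UPDPA's consent tier."""
    if practice is Practice.COMPATIBLE:
        return "permitted without consent"
    if practice is Practice.INCOMPATIBLE:
        # Sensitive data requires opt-in consent; non-sensitive data
        # requires notice plus an opportunity to opt out.
        return "opt-in consent required" if sensitive else "notice + opt-out"
    return "never permitted"

print(consent_requirement(Practice.INCOMPATIBLE, sensitive=True))  # opt-in consent required
```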

UPDPA’s distinctions between “compatible,” “incompatible,” and “prohibited” data practices are based on the likelihood that the data practice may benefit or harm a data subject:

Compatible data practices: A controller or processor engages in a compatible data practice if the processing is “consistent with the ordinary expectations of data subjects or is likely to benefit data subjects substantially.” 

Incompatible data practices: A controller or processor engages in an incompatible data practice if the processing: 

Prohibited data practices: Processing personal data is a prohibited data practice if the processing is likely to: 

Responsibilities of Collecting Controllers, Third-Party Controllers, and Processors 

UPDPA creates different obligations for “controllers” and “processors,” and further distinguishes between “collecting controllers” and “third party controllers.”

If a person with access to personal data engages in processing that is not at the direction and request of a controller, that person becomes a controller rather than a processor, and is therefore subject to the obligations and constraints of a controller.

Aside from complying with access and correction requests, controllers have a number of additional responsibilities, such as notice and transparency obligations, obtaining consent for incompatible data practices, abstaining from engaging in a prohibited data practice, and conducting and maintaining data privacy and security risk assessments. Processors are also required to conduct and maintain data privacy and security risk assessments. 

Voluntary Consensus Standards 

UPDPA enables stakeholders representing a diverse range of industry, consumer, and public interest groups to negotiate and develop voluntary consensus standards (VCSs) to comply with UPDPA in specific contexts. VCSs may be developed to comply with any provision of UPDPA, such as the identification of compatible practices for an industry, the procedure for securing consent for an incompatible data practice, or practices that provide reasonable security for data. Once a VCS is established and recognized by a state Attorney General, any controller or processor can explicitly adopt and comply with it. It is also worth noting that compliance with a similar privacy or data protection law, such as the GDPR or CCPA, is sufficient to constitute compliance with UPDPA.

Enforcement and Rulemaking 

UPDPA grants State Attorneys General rulemaking and enforcement authority. The “enforcement authority, remedies, and penalties provided by the [state consumer protection act] apply to a violation” of UPDPA. However, notwithstanding this provision, “a private cause of action is not authorized for a violation of this Act or under the consumer protection statute for violations of this Act.” 

What’s Next for the UPDPA?

The future of the UPDPA is, as yet, unclear. The drafting committee is currently developing a legislative strategy for submitting the Act to state legislatures on a state-by-state basis. It remains to be seen whether state legislators will have an appetite to introduce, consider, and possibly pass the UPDPA during the legislative session of 2022 and beyond. As an alternative, legislators may wish to adapt certain elements of the model law, such as the voluntary consensus standards (VCS), flexibility for research, or the concept of “compatible,” “incompatible,” or “prohibited” data practices based on the likelihood of substantial benefit or harm to the data subject, rather than purely notice and consent.

Colorado Privacy Act Passes Legislature: Growing Inconsistencies Ramp Up Pressure for Federal Privacy Law

Today, the Colorado Senate approved the House version of the Colorado Privacy Act (SB21-190), which passed the House yesterday, June 7. If Governor Jared Polis signs the bill, Colorado will follow Virginia and California as the third U.S. state to establish baseline legal protections for consumer privacy.

“Although the Colorado Privacy Act contains notable advances that build on California and Virginia — in particular, formalizing a global privacy control, and applying to non-profit organizations — there continues to be an urgent need for Congress to set federal standards that create baseline nationwide protections for all.”

Statement by Polly Sanderson, Policy Counsel, Future of Privacy Forum

Colorado’s law features elements of both Virginia and California’s consumer privacy laws, as well as some elements unique to Colorado. The law is the first in the U.S. to apply to non-profit entities in addition to commercial entities. It contains a strong consent standard to process personal data for incompatible secondary uses and to process sensitive data such as health information, race, ethnicity, and other sensitive categories. The bill prohibits controllers from employing so-called “dark patterns” to obtain consent and allows consumers to exercise their opt-out rights via authorized agents. Consumers will be able to express their intent to opt-out of sales and targeted advertising via a universal opt-out mechanism established by the Colorado Attorney General, who is also granted authority to issue opinion letters and interpretive guidance on what constitutes a violation of the Act. 

Similar to Virginia’s recently passed Consumer Data Protection Act, Colorado’s law requires controllers to conduct data protection assessments for processing activities that present a heightened risk of harm to a consumer. This, along with FIPPs-inspired data minimization and purpose specification provisions, promotes organizational accountability and moves beyond a notice and consent framework. By excluding de-identified data from the scope of personal data and excluding pseudonymous data from the rights of access, correction, deletion, and portability, the law follows existing standards and incentivizes covered entities to maintain data in less identifiable formats. 

As a growing number of states begin to pass their own consumer privacy laws, concerns about interoperability may begin to emerge. For instance, definitional differences regarding what constitutes sensitive data, pseudonymous data, and biometric data may present operational challenges for businesses. Similarly, the scope of access, deletion, and other consumer rights differ between Colorado, Virginia, and California, creating potential implementation challenges. Finally, the research exemptions of each of these laws differ in their flexibility, consent, and oversight requirements.

Media Inquiries: Polly Sanderson, Senior Counsel at [email protected]

Privacy Trends: Four State Bills to Watch that Diverge from California and Washington Models

During 2021, state lawmakers have proposed a range of models to regulate consumer privacy and data protection. 

As the first state to pass consumer privacy legislation in 2018, California established a highly influential model with the California Consumer Privacy Act. In the years since, other states have introduced dozens of nearly identical CCPA-like state bills. In 2019, the Washington Privacy Act emerged as an alternative model, and large numbers of nearly identical WPA-like bills were introduced in other states throughout 2019-2021. In February 2021, the passage of the Virginia Consumer Data Protection Act cemented the Washington model as an influential alternative framework.

In 2021, however, numerous divergent frameworks have begun to emerge, with the potential to establish strong consumer protections, conflict with other states’ laws, and influence federal privacy law. These proposals diverge from the California and Washington models in key ways, and are worth examining because they show ongoing cross-pollination, reveal lawmakers’ concerns about the inadequacy of notice-and-choice frameworks, and offer novel approaches for lawmakers and other stakeholders to discuss, debate, and consider.

The California Model 

As the first state to enact consumer privacy legislation in 2018, California has a distinct and highly influential model for consumer privacy law. Since the passage of the California Consumer Privacy Act (CCPA), a proliferation of state proposals have adopted a similar framework, scope, and terminology. This reflects a general desire among state legislators to provide their constituents with at least the same privacy rights as those afforded to Californians, but in 2018, many hadn’t yet conceptualized alternative frameworks of their own. 

California-style proposals adopt “business-service provider” terminology, focus on consumer-business relationships, and are characterized by their focus on providing consumers with greater transparency and control over their personal data. They feature a bundle of privacy rights, including the right for consumers to “opt-out” of sales (or sharing) of personal data, and require businesses to post “Do Not Sell” links on their website homepages. Often, California-style proposals also include provisions which aim to make it easier for consumers to exercise their opt-out rights, such as authorized agent and universal opt-out provisions.

Though none have passed into law, the California model has influenced many state proposals over the past three years, such as Alaska’s failed HB 159 / SB 116, Florida’s failed HB 969 / SB 1734, and New York Governor Cuomo’s failed Data Accountability and Transparency Act incorporated into Budget Legislation. Oklahoma’s failed HB 1602 also adopted a similar framework, though it would require businesses to obtain “opt-in” consent to sell personal data, rather than “opt-out.” 

The Washington Model 

The Washington Privacy Act (WPA – SB 5062), sponsored by Sen. Reuven Carlyle (D), recently failed for the third consecutive legislative session. However, in February 2021, Virginia passed legislation that follows the general framework of the WPA. Virginia’s Consumer Data Protection Act (VA-CDPA), sponsored by Delegate Cliff Hayes (D) and Sen. David Marsden (D), will become effective on January 1, 2023.

The framework includes (1) processor/controller terminology; (2) heightened privacy protections for sensitive data; (3) individual rights of access, deletion, correction, portability, and the right to opt out of sales, targeted advertising, and profiling in furtherance of legal (or similarly significant) decisions; (4) differential treatment of pseudonymous data; (5) data protection impact assessments for high-risk data processing activities; (6) flexibility for research; and (7) enforcement by the Attorney General.

Numerous other active state bills adopt this framework, such as Colorado SB21-190, Connecticut SB 893, and failed proposals in Utah, Minnesota, and elsewhere. The Colorado and Connecticut proposals are both on the Senate floor calendars in their respective states. Of course, each WPA-type bill contains important differences. For instance, the Colorado and Connecticut proposals both broadly exclude pseudonymous data from all consumer rights, including opt-out rights. The Colorado proposal also features a global/universal opt-out provision for sales and targeted advertising, an opt-out standard for the processing of sensitive data (rather than opt-in), a prescriptive HIPAA de-identification standard (rather than the FTC’s 3-part test), and public research exemptions that do not incorporate provisions mandating oversight by an institutional review board (IRB) or a similar oversight body. 

Growing Divergence and Cross-Pollination 

In the three years since the passage of the CCPA, legislative divergence has increased as more and more states have convened task forces to study consumer privacy issues, and held hearings, roundtables, and one-on-ones with diverse experts from academia, the advocacy community, and industry. In other words, the laboratories of democracy have been experimenting, a trend that will likely continue in 2022 and beyond as legislators’ views on consumer privacy become more sophisticated and nuanced.

State bills in 2021, as compared to 2019-2020, are increasingly focused on bolstering notice and choice regimes (including a shift towards more “opt-in” rather than “opt-out” requirements), are borrowing more features from other laws (such as the GDPR’s “legitimate interests” framework), and in some cases experimenting with novel approaches (such as fiduciary duties, or “data streams”).   

For example, some state bills would require businesses to provide two-tiered short-form and long-form disclosures, and would authorize a government agency to develop a uniform logo or button to promote individual awareness of the short-form notice. Numerous proposals would generally require opt-in consent for all data processing, would prohibit manipulative user interfaces to obtain consent, and would designate user-enabled privacy controls as a valid mechanism for an individual to communicate their privacy preferences. Some proposals feature additional rights, such as the right not to be subject to “surreptitious surveillance,” the right not to be subject to a solely automated decision, and the right to object to processing. 

There is also a trend among proposals towards moving beyond a notice and choice framework, with the aim of moving the burden of privacy management away from individuals. For instance, many include strong purpose specification and data minimization requirements, and some include outright prohibitions on discriminatory data processing. At least one state (NJ A. 3283, discussed below) has taken inspiration from the EU’s General Data Protection Regulation (GDPR) by recognizing “legitimate interests” along with other lawful bases for data processing. 

Many proposals are taking novel or unique approaches to privacy legislation. For example, a Texas proposal leans towards conceptualizing personal data as property by enabling an individual to exchange their “data stream” as consideration for a contract with a business. Meanwhile, various proposals contain duties of loyalty, care, and confidentiality. These trust-based duties were first introduced into US legislation in 2018, when Sen. Brian Schatz (D-HI) introduced the Data Care Act (S. 2961). At that time, it wasn’t clear whether trust-based duties would become influential in the US; the fact that they have demonstrates the potential for cross-pollination between federal and state proposals.

Four Notable Models to Watch 

Amidst such a large volume of California- and Washington-like bills, it may be easy to miss the handful of states where legislators are taking a different approach to baseline or comprehensive privacy legislation. Even if they do not pass, these bills are worth examining because they could eventually influence federal privacy law. Additionally, they can provide insights into some of the most pressing questions facing policymakers: whether (and how) to regulate automated decision-making, including profiling; whether a framework should be based on privacy self-management, relationships of trust, civil rights, or personal data as property; and how personal data should be defined, and whether it should be subcategorized according to sensitivity, identifiability, source (first party, third party, derived), or something else. Answering these types of questions is not straightforward, and there are many reasonable philosophical positions for stakeholders to take. Close attention to legislative proposals can help to promote nuanced dialogue and debate about the relative merits and drawbacks of different approaches.

Four active bills that are worth watching are (1) the New York Privacy Act (NYPA – S. 6701), (2) the New Jersey Disclosure and Accountability Transparency Act (NJ DaTA – A. 3283), (3) the Massachusetts Information Privacy Act (MIPA – S.46), and (4) Texas’s HB 3741.

  1. New York Privacy Act (NYPA)

The New York Privacy Act (NYPA) (S. 6701), introduced by Sen. Kevin Thomas in May 2021, has several distinctive features, such as an opt-in consent framework, duties of loyalty and care, heightened protections for certain types of consequential automated decision-making, and a data broker registry. The proposal passed out of the Consumer Protection Committee on May 18 and is now on the floor calendar. The legislature adjourns June 10.

  2. New Jersey Disclosure and Accountability Transparency Act (NJ DaTA)

The New Jersey Disclosure and Accountability Transparency Act (NJ DaTA – A. 3283), introduced by Assemblyman Andrew Zwicker (D), was heard before the Assembly Science, Innovation and Technology Committee on March 15, 2021. The legislature will remain in session through 2021. The framework includes six lawful bases for data processing, affirmative data processing duties, the right for an individual to object to processing, and heightened requirements surrounding automated decision-making.

  3. Massachusetts Information Privacy Act (MIPA)

The Massachusetts Information Privacy Act (MIPA – S.46) was introduced by Sen. Cynthia Stone Creem (D) in March 2021. The legislature will remain in session through 2021. MIPA is based on a framework of notice and consent, with additional trust-based obligations for covered entities. Heightened protections apply to biometric data and location data, and “surreptitious surveillance” is prohibited.

  4. Texas HB 3741

HB 3741, introduced by Rep. Capriglione (R), was referred to the Business & Industry Committee on Mar. 22. Texas’s legislative session is scheduled to end May 31, 2021. The proposal has numerous unique features: it would enable a consumer to provide their “data stream” as consideration under a contract, impose different restrictions on three defined subcategories of personal data, and require opt-in consent for geotracking. In addition, businesses would be required to maintain accurate personal data.

FPF Testifies on Automated Decision System Legislation in California

Last week, on April 8, 2021, FPF’s Dr. Sara Jordan testified before the California Assembly Committee on Privacy and Consumer Protection on AB-13 (Public contracts: automated decision systems). The legislation passed out of committee (9 Ayes, 0 Noes) and was re-referred to the Committee on Appropriations. The bill would regulate state procurement, use, and development of high-risk automated decision systems by requiring prospective contractors to conduct automated decision system impact assessments.

At the hearing, Dr. Jordan commented as an expert witness alongside Vinhcent Le, who represented The Greenlining Institute. Dr. Jordan commended the sponsors for amending the definition of “automated decisionmaking” to account for the wide range of technical complexity in automated systems. In addition, Dr. Jordan testified that the government contract stage is an appropriate stage for the introduction of algorithmic impact assessments for high-risk applications of automated decisionmaking. This would allow authorities in California to evaluate technology before it is implemented using transparent and actionable assessment criteria.

Emerging Patchwork or Laboratories of Democracy? Privacy Legislation in Virginia and Other States

Stacey Gray, Pollyanna Sanderson & Samuel Adams

In the absence of federal privacy legislation, U.S. states are weighing in. In Virginia, the “Consumer Data Protection Act” (“CDPA”) (HB 2307 / SB 1392) could be signed into law within weeks and, if passed, would take effect on Jan. 1, 2023. It would become the second comprehensive (non-sectoral) data protection law in the United States, making it a potential model for other states and federal legislation.

At present, the Virginia CDPA is about halfway through Virginia’s bicameral, citizen legislature. Both bills have passed in their own chambers (in the House on Jan. 29, 89-9, and in the Senate on Feb. 5, 36-0). Each bill must now pass in the other chamber, a process that will likely involve additional hearings and opportunities for debate. Assuming either the House or Senate version passes in the other chamber without further amendment, it would then be sent to the Governor for veto, signature, or amendment. In light of the bill’s rapid progress and near-unanimous legislative support, businesses, law firms, and privacy advocates alike are beginning to pay close attention.

We provide a summary below of the key features of Virginia’s CDPA, including (1) the scope of covered entities & covered data; (2) consumer rights & pseudonymised data; (3) sensitive data and risk assessments; (4) consent standard and use limitations; (5) non-discrimination; (6) controllers, processors & third parties; (7) limitations and commercial research; and (8) enforcement.

Overall, we observe many similarities to the structure and provisions of the Washington Privacy Act (SB 5062), although there are notable differences, in particular in the scope of covered entities and personal data. Both likely go further than the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA), which do not require opt-in consent for sensitive data or mandated risk assessments, although California has made notable changes through Attorney General rulemaking. We also note some differences between these leading state legislative efforts and the EU General Data Protection Regulation (GDPR), the most notable being that US proposals rely primarily on downstream limitations on sale, sharing, and use (including through opt-outs), while the GDPR restricts all data processing by governments, companies, and non-profits unless one of six lawful bases is available.

Finally, we note the broader landscape of emerging state privacy legislation, including Washington State, Oklahoma, New York, Connecticut, Minnesota, and others. As more and more state models emerge, the pressure will continue to increase on Congress to pass a federal comprehensive baseline privacy law.

1. Scope of Covered Entities & Covered Data 

The Consumer Data Protection Act (CDPA) would apply to businesses “that conduct business in the Commonwealth or produce products or services that are targeted to residents of the Commonwealth” and that process “personal data” of (i) at least 100,000 consumers per year, or (ii) at least 25,000 consumers while deriving over 50% of gross revenue from the sale of personal data. It would create rights for residents of Virginia acting only in an individual or household context, and not acting in a commercial or employment context.

The CDPA has a broad definition of personal data: “any information that is linked or reasonably linkable to an identified or identifiable natural person.” This excludes “de-identified data,” as well as “publicly available information,” a broad term that encompasses information “lawfully made available through federal, state, or local government records” and “information that a business has a reasonable basis to believe is lawfully made available to the general public through widely distributed media, by the consumer, or by a person to whom the consumer has disclosed the information, unless the consumer has restricted the information to a specific audience.” 

In addition, the bill as currently drafted contains a number of entity-level and data-level exemptions, several of which are discussed below.

As a result, the Virginia bill’s jurisdictional scope of “covered entities” and “personal data” is both similar to and in some ways narrower than other US laws and the GDPR. It would apply to a broad definition of “personal data,” similar to leading US and EU law, but to a narrower scope of covered entities. For example, while the WPA and CCPA contain similar exemptions (for data governed by the Fair Credit Reporting Act), they do not totally exclude “entities” governed by HIPAA or GLBA. Similarly, unlike the WPA, Virginia’s bill would not apply to non-profits. The exclusions for publicly available information also differ: the WPA contains a narrower exclusion for lawfully available government records, in contrast to the CDPA’s broader exclusion for information made available to the general public through widely distributed media. Given the variety of existing sectoral privacy regulatory environments in the United States (such as HIPAA and GLBA), all leading US laws and proposals so far have been narrower in scope than the GDPR, which applies broadly to all legal persons that process personal data, including non-profits, employers, and government entities.

2. Consumer Rights & Pseudonymised Data

The CDPA would grant consumers in Virginia the rights to request (1) access to their personal data; (2) correction, (3) deletion, (4) the obtainment of a copy in a “portable” and “readily usable” format; and (5) an opt-out of sale, targeted advertising, and “profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.” Profiling is defined as: “any form of automated processing performed on personal data to evaluate, analyze, or predict personal aspects related to an identified or identifiable natural person’s economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”

The CDPA provides a narrow limitation for “pseudonymous data,” for which companies would be required to comply with the bulk of the bill’s requirements – including opt-out rights – but not access, correction, deletion, or portability. Pseudonymous data is defined as “data that cannot be attributed to a specific natural person without the use of additional information, provided that such additional information is kept separately and is subject to appropriate technical and organizational measures to ensure that the personal data is not attributed to an identified or identifiable natural person.” Providing flexibility for “pseudonymous data” is a shared feature of the CDPA and the WPA. The approach recognizes challenges involving authentication and verification of consumer requests involving pseudonymous data, and creates an incentive for covered entities to maintain personal information in less readily identifiable formats. The treatment of pseudonymous data also has some similarities with the GDPR’s flexibility for data that cannot be identified (Article 11), recognition of pseudonymization as a risk minimizing safeguard (Recital 28), and inclusion in Privacy by Design requirements (Article 25). 

When combined with the opt-in requirement for sensitive data (discussed below), this approach goes further than the California Consumer Privacy Act, which currently only requires that consumers be able to opt out of “sale.” When the California Privacy Rights Act (CPRA) goes into effect in 2023, it will add the right of correction, clarify that the opt-out of “sale” applies to all targeted advertising, and add an opt-out for limiting certain uses of sensitive data. On the other hand, Attorney General rulemaking in California has bolstered existing law by requiring compliance with “user-enabled global privacy controls” (including, arguably, new tools such as the Global Privacy Control). Other states, including Washington and Virginia, have not adopted such a “global opt-out” requirement.
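
For a sense of what honoring such a signal involves in practice: the Global Privacy Control proposal transmits the user’s preference as a `Sec-GPC: 1` HTTP request header (and exposes it to scripts as `navigator.globalPrivacyControl`). The following is a minimal server-side sketch in Python; the Flask route and the `record_opt_out` helper are our own hypothetical illustration, not part of any statute or regulation.

```python
from flask import Flask, request

app = Flask(__name__)

def record_opt_out(session_id: str) -> None:
    """Hypothetical helper: persist an opt-out of 'sale'/sharing for this visitor."""
    ...

@app.route("/")
def index():
    # The GPC proposal defines the request header `Sec-GPC` with value "1".
    if request.headers.get("Sec-GPC") == "1":
        # Treat the signal as a valid opt-out request, as California's
        # regulations arguably require for user-enabled global controls.
        record_opt_out(session_id=request.cookies.get("session", "anonymous"))
    return "ok"
```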

3. Sensitive Personal Data & Risk Assessments

The CDPA would require “freely given, specific, informed, and unambiguous” consent (a standard discussed below) for controllers to collect or process sensitive data, defined as: “(1) personal data revealing racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, or citizenship or immigration status; (2) the processing of genetic or biometric data for the purpose of uniquely identifying a natural person; (3) the personal data collected from a known child; and (4) precise geolocation data.” “Child” means a person younger than 13, which aligns with COPPA. 

This definition roughly aligns with the WPA, with a few notable differences, such as the WPA’s inclusion of “health condition” in addition to “diagnosis.” The CDPA’s definition also roughly aligns with the definition of sensitive data in the California Privacy Rights Act (CPRA), which will create an “opt-out” for sensitive data uses when it comes into effect in 2023. In contrast to California, however, both the WPA and CDPA would require affirmative opt-in consent prior to any collection of sensitive data, a much higher standard than the CPRA’s right to opt out of or limit the use and disclosure of sensitive personal information.

In addition, the CDPA (like the WPA) would create a new requirement that controllers conduct data protection assessments if engaged in processing for targeted advertising, the sale of personal data, certain profiling, the processing of sensitive data, or other processing activities that present a heightened risk of harm to consumers.

These data protection assessments would be required to be made available to the Attorney General upon request, pursuant to an investigative civil demand.

4. Consent Standard & Use Limitations 

The CDPA would define consent as “freely given, specific, informed, and unambiguous,” a strong opt-in standard that aligns with the GDPR and WPA. The CDPA does lack the “dark patterns” language found in the current WPA, which would specifically prohibit controllers and processors from using deceptive user interfaces to obtain consent from individuals. We note that many of these dark patterns may already be illegal under state and federal consumer protection law, as the University of Chicago’s Lior Strahilevitz observes in a recent paper, “Shining a Light on Dark Patterns.”

The CDPA would require controllers to obtain opt-in consent to process personal data for incompatible secondary uses, “as disclosed to the consumer.” This is likely to be less restrictive in practice for businesses than the current WPA, which removed this language in the version of the bill introduced in 2021. Under the WPA, collection of personal data must be limited to what is reasonably necessary in relation to the purposes for which the data is processed, and processing must be adequate, relevant, and limited to what is reasonably necessary in relation to those purposes. In comparison, the GDPR’s principle of purpose limitation requires all data collection to be only for a specified, explicit, and legitimate purpose, which includes compatible purposes.

5. Non-Discrimination

The CDPA would prohibit a controller from discriminating against a consumer for exercising any of their consumer privacy rights, including denying goods or services, charging different rates, or providing a different level of quality of goods and services. However, similar to California law, it provides a broad exception for “voluntary participation in a bona fide loyalty, rewards, premium features, discounts, or club card program.” In contrast, the current WPA contains a narrower exemption for such programs that would require additional disclosures and limits the sale and secondary uses of personal information.

In addition, the CDPA would prohibit controllers from processing personal data in violation of state and federal laws that prohibit unlawful discrimination against consumers. A similar requirement is in the WPA. In comparison, while the GDPR does not contain a specific provision stating that a data subject must not be discriminated against on the basis of their choices to exercise rights, other principles of the GDPR require that individuals must be protected from discrimination on these grounds (Article 5, Article 13, Article 22, and elements of “freely given” consent and fair processing).

6. Controllers, Processors & Third Parties 

The CDPA follows the GDPR and WPA structure of dividing responsibilities between “controllers” and “processors,” rather than using the CCPA/CPRA terminology of “businesses” and “service providers.”

“Third party” is defined as “a natural or legal person, public authority, agency, or body other than the consumer, controller, processor, or an affiliate of the processor or the controller.” Controllers would be required to provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes the categories of personal data that the controller shares with third parties, if any, and the categories of third parties, if any, with whom the controller shares personal data. If a controller sells personal data to third parties or processes personal data for targeted advertising, it would be required to “clearly and conspicuously” disclose such processing, as well as the manner in which a consumer may exercise the right to opt out of such processing. Of note, if a third-party recipient processes data in violation of the Act, controllers and processors would not be liable to the extent that they did not have “actual knowledge” that the recipient intended to commit a violation.

7. Limitations and Commercial Research 

The CDPA would not limit a controller or processor’s ability to provide a product or service specifically requested by a consumer; perform a contract to which the consumer is a party; comply with existing laws; cooperate with civil, criminal, or regulatory investigations; cooperate with law enforcement agencies; defend legal claims; protect an interest that is essential to the life or physical safety of the consumer or another natural person; or protect against fraud, theft, or harassment.

In addition, the CDPA would not restrict the ability of controllers and processors to engage in public or peer-reviewed scientific or statistical research in the public interest that is approved, monitored, and governed by an institutional review board (IRB) or a “similar independent oversight entity.” This provides greater flexibility for commercial research than the CCPA, and aligns with broader trends in U.S. privacy legislation, including the WPA, Sen. Cantwell’s (D-WA) COPRA, and Sen. Wicker’s (R-MS) SAFE DATA Act.

The requirements of the CDPA would also not limit the ability of controllers and processors to “collect, use, or retain data” to: 1) conduct internal research to “develop, improve, or repair products, services, or technology”; 2) effectuate product recalls or identify and repair technical errors; or 3) perform internal operations that are reasonably aligned with the “reasonable expectations” of the consumer or “reasonably anticipated” based on the consumer’s existing relationship with the controller.

8. Enforcement 

The CDPA, which contains a 30-day cure period, would be enforced by the Attorney General, with civil fines capped at $7,500 per violation. The legislation would establish a Consumer Privacy Fund within the Office of the Attorney General to support enforcement in future years.

A right-to-cure period is a current feature of the CCPA that will be removed by the California Privacy Rights Act, and is currently being debated in Washington State. In Virginia, several stakeholders have testified that the cure period would promote faster and less costly outcomes for consumers, and may be useful as businesses adapt to compliance. Others, such as Consumer Reports, have advocated for its removal as unduly limiting enforcement. At a recent hearing on the WPA in the Senate Ways & Means Committee, the Washington Attorney General’s Office suggested sunsetting the cure period once affected entities had time to adjust their data practices.

Recognizing the Broader Landscape of Emerging State Laws

Virginia is not alone – many, if not most, U.S. states are considering or introducing consumer privacy legislation in 2021. The CDPA shares a similar framework with the WPA and the Minnesota Consumer Data Privacy Act (HF3936). Some align more closely with the CCPA, e.g., Michigan (HB6457), Connecticut (SB156), New York (Data Accountability and Transparency Act). These bills generally place a greater focus on privacy self-management, business-consumer relationships, and enabling consumers to “opt-out” of having their personal data “sold.” In contrast, Oklahoma (OK 1602) is actively considering a proposal that would provide individuals with a right to “opt-in” to sales of their personal data. 

If passed, the CDPA would be the second comprehensive (non-sectoral) data protection law in the United States after the CCPA. As more and more states weigh in, however, significant questions are already beginning to arise about interoperability for companies handling data across state lines. For now, legislation in Virginia, Washington, and Minnesota is in some ways interoperable with the GDPR and the CCPA, but will clearly require companies to customize practices to match key differences in each state. As additional laws pass, and if further conflicts emerge, the pressure on Congress to pass a federal comprehensive baseline privacy law will only increase. At a Nov. 24 hearing, Virginia legislators generally expressed a preference for federal legislation, but recognized that in its absence, Virginia residents deserve data protection rights already enjoyed in California, by everyone in Europe, and by an increasing number of people around the world.