Privacy Trends: Four State Bills to Watch that Diverge from California and Washington Models
During 2021, state lawmakers have proposed a range of models to regulate consumer privacy and data protection.
As the first state to pass consumer privacy legislation in 2018, California established a highly influential model with the California Consumer Privacy Act. In the years since, other states have introduced dozens of nearly identical CCPA-like state bills. In 2019, the Washington Privacy Act emerged as an alternative model, and large numbers of nearly identical WPA-like bills were introduced in other states throughout 2019-2021. In February 2021, the passage of the Virginia Consumer Data Protection Act cemented the Washington model as an influential alternative framework.
In 2021, however, numerous divergent frameworks have begun to emerge, with the potential to establish strong consumer protections, conflict with other states, and influence federal privacy law. These proposals diverge from the California and Washington models in key ways, and are worth examining because they show ongoing cross-pollination, reveal lawmakers' concerns about the inadequacy of notice and choice frameworks, and offer novel approaches for lawmakers and other stakeholders to discuss, debate, and consider.
The California Model
As the first state to enact consumer privacy legislation in 2018, California has a distinct and highly influential model for consumer privacy law. Since the passage of the California Consumer Privacy Act (CCPA), a proliferation of state proposals has adopted a similar framework, scope, and terminology. This reflects a general desire among state legislators to provide their constituents with at least the same privacy rights as those afforded to Californians; in 2018, many legislators had not yet conceptualized alternative frameworks of their own.
California-style proposals adopt “business-service provider” terminology, focus on consumer-business relationships, and are characterized by their focus on providing consumers with greater transparency and control over their personal data. They feature a bundle of privacy rights, including the right for consumers to “opt-out” of sales (or sharing) of personal data, and require businesses to post “Do Not Sell” links on their website homepages. Often, California-style proposals also include provisions which aim to make it easier for consumers to exercise their opt-out rights, such as authorized agent and universal opt-out provisions.
The California model has influenced many state proposals over the past three years, though none have passed into law, such as Alaska's failed HB 159 / SB 116, Florida's failed HB 969 / SB 1734, and New York Governor Cuomo's failed Data Accountability and Transparency Act, which was incorporated into budget legislation. Oklahoma's failed HB 1602 also adopted a similar framework, though it would have required businesses to obtain "opt-in" consent to sell personal data, rather than "opt-out."
The Washington Model
The Washington Privacy Act (WPA – SB 5062), sponsored by Sen. Reuven Carlyle (D), recently failed for the third consecutive legislative session. However, in February 2021, Virginia passed legislation that follows the general framework of the WPA. Virginia's Consumer Data Protection Act (VA-CDPA), sponsored by Delegate Cliff Hayes (D) and Sen. David Marsden (D), will become effective on January 1, 2023.
The framework includes (1) processor/controller terminology; (2) heightened privacy protections for sensitive data; (3) individual rights of access, deletion, correction, and portability, as well as the right to opt out of sales, targeted advertising, and profiling in furtherance of decisions that produce legal (or similarly significant) effects; (4) differential treatment of pseudonymous data; (5) data protection impact assessments for high-risk data processing activities; (6) flexibility for research; and (7) enforcement by the Attorney General.
Numerous other active state bills adopt this framework, such as Colorado SB21-190, Connecticut SB 893, and failed proposals in Utah, Minnesota, and elsewhere. The Colorado and Connecticut proposals are both on the Senate floor calendars in their respective states. Of course, each WPA-type bill contains important differences. For instance, the Colorado and Connecticut proposals both broadly exclude pseudonymous data from all consumer rights, including opt-out rights. The Colorado proposal also features a global/universal opt-out provision for sales and targeted advertising, an opt-out standard for the processing of sensitive data (rather than opt-in), a prescriptive HIPAA de-identification standard (rather than the FTC’s 3-part test), and public research exemptions that do not incorporate provisions mandating oversight by an institutional review board (IRB) or a similar oversight body.
Growing Divergence and Cross-Pollination
In the three years since the passage of the CCPA, legislative divergence has increased as more and more states have convened task forces to study consumer privacy issues, and held hearings, roundtables, and one-on-one meetings with diverse experts from academia, the advocacy community, and industry. In other words, the laboratories of democracy have been experimenting — a trend that will likely continue in 2022 and beyond as legislators' views on consumer privacy continue to become more sophisticated and nuanced.
State bills in 2021, as compared to 2019-2020, are increasingly focused on bolstering notice and choice regimes (including a shift towards more “opt-in” rather than “opt-out” requirements), are borrowing more features from other laws (such as the GDPR’s “legitimate interests” framework), and in some cases experimenting with novel approaches (such as fiduciary duties, or “data streams”).
For example, some state bills would require businesses to provide two-tiered short-form and long-form disclosures, and would authorize a government agency to develop a uniform logo or button to promote individual awareness of the short-form notice. Numerous proposals would generally require opt-in consent for all data processing, would prohibit manipulative user interfaces to obtain consent, and would designate user-enabled privacy controls as a valid mechanism for an individual to communicate their privacy preferences. Some proposals feature additional rights, such as the right not to be subject to “surreptitious surveillance,” the right not to be subject to a solely automated decision, and the right to object to processing.
There is also a trend among proposals towards moving beyond a notice and choice framework, with the aim of moving the burden of privacy management away from individuals. For instance, many include strong purpose specification and data minimization requirements, and some include outright prohibitions on discriminatory data processing. At least one state (NJ A. 3283, discussed below) has taken inspiration from the EU’s General Data Protection Regulation (GDPR) by recognizing “legitimate interests” along with other lawful bases for data processing.
Many proposals are taking novel or unique approaches to privacy legislation. For example, a Texas proposal leans towards conceptualizing personal data as property by enabling an individual to exchange their “data stream” as consideration for a contract with a business. Meanwhile, various proposals contain duties of loyalty, care, and confidentiality. These trust-based duties were first introduced into US legislation in 2018, when Sen. Brian Schatz (D-HI) introduced the Data Care Act (S. 2961). At that time, it wasn’t clear whether trust-based duties would become influential in the US. The fact that they have demonstrates the potential for cross-pollination between federal and state proposals.
Four Notable Models to Watch
Amidst such a large volume of California- and Washington-like bills, it may be easy to miss the handful of states where legislators are taking a different approach to baseline or comprehensive privacy legislation. Even if they do not pass, these bills are worth examining because they could eventually influence federal privacy law. Additionally, they can provide insights into some of the most pressing questions facing policymakers: whether (and how) to regulate automated decision-making, including profiling; whether a framework should be based on privacy self-management, relationships of trust, civil rights, or personal data as property; and how personal data should be defined, including whether it should be subcategorized according to sensitivity, identifiability, source (first party, third party, derived), or something else. Answering these types of questions is not straightforward, and there are many reasonable philosophical positions for stakeholders to take. Close attention to legislative proposals can help to promote nuanced dialogue and debate about the relative merits and drawbacks of different approaches.
Four active bills that are worth watching are (1) the New York Privacy Act (NYPA – S. 6701), (2) the New Jersey Disclosure and Accountability Transparency Act (NJ DaTA – A. 3283), (3) the Massachusetts Information Privacy Act (MIPA – S.46), and (4) Texas’s HB 3741.
- New York Privacy Act (NYPA)
The New York Privacy Act (NYPA) (S. 6701), introduced by Sen. Kevin Thomas in May 2021, has several distinctive features, such as an opt-in consent framework, duties of loyalty and care, heightened protections for certain types of consequential automated decision-making, and a data broker registry. The proposal passed out of the Consumer Protection Committee on May 18, and is now on the floor calendar. The legislature adjourns June 10.
- Opt-In Consent & Global Consent Mechanism: The requirement for controllers to obtain opt-in consent to process a consumer’s personal data is accompanied by a prohibition on manipulative user interface design choices in order to obtain consent. Controllers would also be required to treat user-enabled privacy controls (e.g., browser plug-ins, device settings, or other mechanisms) that communicate or signal the consumer’s choice not to be subject to targeted advertising or the sale of their personal data as a denial of consent.
- Duty of Loyalty & Care: The “duty of loyalty” would require a controller to obtain consent to process data in ways that are reasonably foreseeable to be against a consumer’s physical, financial, psychological, or reputational interests. The proposal also contains a “duty of care,” requiring controllers to conduct and document annual, nonpublic risk assessments for all processing of personal data. Of note, the “duty of loyalty” is distinct from the “fiduciary duty” contained in an earlier 2019 version of the NYPA (S. 5642).
- Automated Decision-Making: Annual public impact assessments would be required for a controller or processor “engaged in” consequential automated decisions (such as financial, housing, and employment decisions). Whenever a solely automated decision “results in” a denial of financial or lending services, housing, public accommodation, insurance, health care services, or access to basic necessities, such as food and water, the controller must: (i) disclose that the decision was made by a solely automated process; (ii) provide an avenue for the affected consumer to appeal the decision, including by allowing the affected consumer to (a) express their point of view, (b) contest the decision, and (c) obtain meaningful human review; and (iii) explain how to appeal the decision.
- Data Broker Registry: The NYPA would establish a data broker registry and prohibit controllers from sharing personal data with data brokers that fail to register.
- New Jersey Disclosure and Accountability Transparency Act (NJ DaTA)
The New Jersey Disclosure and Accountability Transparency Act (NJ DaTA – A. 3283) introduced by Assemblyman Andrew Zwicker (D) was heard before the Assembly Science, Innovation and Technology Committee on March 15, 2021. The legislature will remain in session through 2021. The framework includes six lawful bases for data processing, affirmative data processing duties, the right for an individual to object to processing, and heightened requirements surrounding automated decision-making.
- Six Lawful Grounds for Processing: The framework creates six lawful grounds for processing: (1) the “legitimate interests” pursued by the controller; (2) affirmative consent; (3) contractual necessity; (4) protection of the vital interests of a person; (5) compliance with a legal obligation; and (6) necessity for the performance of a task.
- Data Processing Duties: NJ DaTA would require that: (1) all data collection be for a specified, explicit, and legitimate purpose; (2) data be processed lawfully, fairly, and transparently; (3) data collection and processing be adequate, relevant, and limited to what is necessary; (4) data be accurate and kept up to date; (5) data be kept in a form that permits identification of consumers for no longer than is necessary for the processing purposes; and (6) data be processed securely. Data protection impact assessments would also be required prior to processing personal data.
- Right to Object to Processing: In addition to the rights of access, correction, deletion, and portability, the proposal would grant individuals the right to object to the processing of personal data. For a controller to continue to process personal data in this circumstance, the controller must demonstrate “compelling legitimate grounds” for processing which override the interests, rights, and freedoms of the consumer. The proposal would also grant individuals a right to object to processing of personal data for the purpose of direct marketing, including profiling. When a consumer objects to processing for this purpose, the controller must stop processing.
- Automated Decision-Making: The proposal would create the right for a consumer not to be subject to a decision based on solely automated decision making, including profiling, which produces legal effects concerning the consumer. NJ DaTA would require controllers to provide specific notice to consumers at the time of collection of personal data regarding the existence of automated decision making, including profiling, meaningful information concerning the logic involved, and significance and potential consequences for the consumer.
- New Data Protection Office: NJ DaTA would establish an Office of Data Protection and Responsible Uses in the Division of Consumer Affairs to oversee compliance with the Act.
- Massachusetts Information Privacy Act (MIPA)
The Massachusetts Information Privacy Act (MIPA – S.46) was introduced by Sen. Cynthia Stone Creem (D) in March 2021. The legislature will remain in session through 2021. MIPA is based on a notice-and-consent framework, with additional trust-based obligations for covered entities. Heightened protections apply to biometric and location data, and “surreptitious surveillance” is prohibited.
- Opt-In Consent: Opt-in consent would be required annually to collect or process personal data.
- Prohibition of Surreptitious Surveillance: MIPA would prohibit “surreptitious surveillance,” meaning that a covered entity would be required to obtain opt-in consent every 180 days in order to “activate the microphone, camera, or any other sensor” on individuals’ connected devices.
- Heightened Protections for Biometric and Location Data: MIPA would require covered entities to obtain specific opt-in consent annually to collect and process location or biometric data, and additional consent to disclose it to third parties. Covered entities would also be required to establish a retention schedule and guidelines for permanently destroying biometric or location data.
- Data Processing Duties: MIPA contains strict purpose limitations, and duties of loyalty, care, and confidentiality. It would also require covered entities to process personal data and use automated decision systems “discreetly and honestly,” to be “protective” of personal data, “loyal” to individuals, and “honest” about processing risks. The duty of loyalty would require covered entities to not use personal data or information derived from personal data in ways that: (1) benefit themselves to the detriment of an individual, (2) result in reasonably foreseeable and material physical or financial harm to an individual, or (3) would be unexpected and highly offensive to a reasonable individual.
- Discrimination Prohibition: MIPA would prohibit a covered entity from engaging in acts or practices that directly result in discrimination against or otherwise make an opportunity, or a public accommodation, unavailable on the basis of an individual’s or group’s actual or perceived belonging to a protected class. This includes a prohibition on targeting advertisements on the basis of actual or perceived belonging to a protected class.
- Texas HB 3741
HB 3741, introduced by Rep. Capriglione (R), was referred to the Business & Industry Committee on Mar. 22. Texas’s legislative session is scheduled to end May 31, 2021. The proposal has numerous unique features. It would enable a consumer to provide their “data stream” as consideration under contract, would impose different restrictions on three defined subcategories of personal data, and would require opt-in consent for geolocation tracking. In addition, businesses would be required to maintain accurate personal data.
- Data Streams: The proposal would enable an individual to provide their “data stream” as consideration under contract. “Data stream” is defined as “the continuous transmission of an individual’s personal identifying information through online activity or with a device connected to the Internet that can be used by the business to provide for the monetization of the information, customer relationship management, or continuous identification of an individual for commercial purposes.”
- Three Categories of Personal Data: HB 3741 divides personal data into three subcategories of “personally identifiable information.”
- Category 1 information includes personal data “that an individual may use in a personal, civic, or business setting,” including an SSN, a driver’s license number, passport number, unique biometric information, physical or mental health information, private communications, etc.
- Category 2 information includes personal data that may present a “privacy risk” to an individual, including members of a constitutionally protected class. It includes information such as racial or ethnic origin, religion, age, precise geolocation, or physical or mental impairment. “Privacy risk” is defined broadly. Businesses would be prohibited from selling, transferring, or communicating category 2 information to a third party.
- Category 3 information includes time of birth and political party or association. Businesses would be prohibited from collecting or processing category 3 information.
- Opt-In Consent for Geolocation Tracking: Consent would be required to perform geolocation tracking of an individual, and to sell geolocation information. Individuals would also have the rights of access, correction, deletion, and portability.
- Duty to Maintain Accurate Information: Businesses would be required to maintain accurate information, and to protect and properly secure personal data.