The “Colorado Effect”? Status Check on Colorado’s Privacy Rulemaking
Colorado is set to formally enter a rulemaking process that may establish de facto interpretations of privacy protections across the United States. With the passage of the Colorado Privacy Act (CPA) in 2021, Colorado, along with Virginia, Utah, and Connecticut, joined an emerging group of states adopting privacy laws that share a similar framework and many core definitions with a legislative model developed (though never enacted) in Washington State. However, while the CPA’s general framework resembles these recently enacted state privacy laws, it stands alone in granting the state Attorney General authority to issue implementing regulations.
Because no other similar state law provides for this type of interpretive authority, regulations issued by the Colorado Attorney General could have far-reaching implications for how both businesses and regulators in other jurisdictions come to interpret key state privacy rights and protections. Colorado’s pre-rulemaking process recently concluded, revealing a range of possible directions that formal rulemaking could take. Below, we assess key priorities and areas of significant divergence brought into focus through both public comments from stakeholders and questions posed by the Attorney General.
The Rulemaking Process
The CPA grants broad discretionary rulemaking authority to the Colorado Attorney General to issue regulations to help implement the Act. In April 2022, Colorado Attorney General Phil Weiser released a set of pre-rulemaking considerations containing a series of questions for public comment. This document offered the first hints as to the specific topics that the Colorado Department of Law (“the Department”) is considering addressing beyond opt-out mechanisms. It includes targeted questions on the CPA’s consent requirements, restrictions on so-called “dark patterns,” standards for data protection assessments, and consumers’ right to opt out of certain automated profiling decisions. The Department’s questionnaire received 44 comments from a range of stakeholders, including business groups, non-profits, civil society organizations, and think tanks (including the Future of Privacy Forum). We provide a non-comprehensive summary of significant issues addressed across these public comments below.
1. Universal Opt-Out Mechanisms
Colorado holds the distinction of being the first state to clearly require that businesses allow consumers to exercise certain privacy rights on an automated basis through technological signals (such as browser settings or plug-ins). Notably, opt-out mechanisms are the only topic on which the CPA requires rulemaking, directing the Attorney General to establish “technical specifications” for signal mechanisms that will: (1) prohibit signal providers from unfairly disadvantaging other businesses, (2) ensure that signals represent a consumer’s freely given choice to opt out, and (3) permit covered entities to authenticate that a signal is sent by a resident of the state and represents a legitimate request to opt out. The Department’s questionnaire addressed these issues and sought additional input on how signal mechanisms should apply to data collected offline.
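Although the CPA leaves the technical specifications to rulemaking, existing proposals give a sense of what a qualifying signal might look like in practice. The sketch below assumes the Global Privacy Control (GPC) proposal, which conveys an opt-out preference through the `Sec-GPC` request header; it is illustrative only, the Department has not specified which mechanisms will qualify, and the function and variable names are hypothetical.

```python
# Minimal sketch, not the Department's specification: detecting a browser-based
# opt-out preference on an incoming HTTP request. The Global Privacy Control
# (GPC) proposal, which sends the request header "Sec-GPC: 1", is assumed here
# purely for illustration of what a "technological signal" could look like.

def has_opt_out_signal(headers: dict[str, str]) -> bool:
    """Return True if the request carries a GPC-style opt-out preference."""
    return headers.get("Sec-GPC", "").strip() == "1"

# Example: a request sent by a browser or plug-in with the signal enabled.
incoming_headers = {"User-Agent": "ExampleBrowser/1.0", "Sec-GPC": "1"}
if has_opt_out_signal(incoming_headers):
    print("Treat this interaction as an opt-out of targeted advertising and sale.")
```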
Default Signal Settings: The CPA prohibits opt-out mechanisms that are a “default setting” and instead requires signals to represent a consumer’s “affirmative, freely given, and unambiguous” choice to opt out. The Department’s questionnaire sought feedback on whether a consumer’s selection of a tool marketed for its privacy features, without any additional action, would satisfy this requirement of user intent (an approach that regulators in California appear to have endorsed). This inquiry generated a broad range of responses. For example, a Wesleyan University professor asserted that the selection of “privacy-preserving products” including Firefox, Brave, and DuckDuckGo Privacy Essentials can unambiguously reflect an intent to opt out of targeted advertising and other forms of data monetization without requiring a user to take additional steps. Industry groups such as the Colorado Chamber of Commerce typically rejected this view, arguing that “any mechanism involving a default or pre-selected opt-out choice in effect would be an opt-in, rather than the opt-out required by the statute.” The Future of Privacy Forum called for a context-specific approach, arguing that while the installation of a single-purpose plug-in may reflect an unambiguous consumer choice to opt out, the use of a multi-feature product such as a web browser would be unlikely to satisfy the CPA’s statutory requirements.
Opt-Out Signal Authentication: Under the CPA, opt-out mechanisms must allow recipient organizations to authenticate a signal’s user as a Colorado resident and to determine that the signal represents a legitimate opt-out request. Numerous commenters expressed concern that strict authentication procedures could frustrate consumers’ intent in exercising their privacy rights, and suggested regulatory workarounds. For example, the Colorado Privacy Policy Commission suggested a standard under which opt-out signal authentication may require no more than three steps to complete. Separately, several organizations, including Consumer Reports and the Network Advertising Initiative (NAI), suggested that regulations could permit authenticating residency with a user’s IP address. However, the State Privacy and Security Coalition (SPSC) and TechNet raised concerns about VPNs and other technologies that can make determining location by IP address unreliable, and further posited that the CPA may raise constitutional concerns if enforcement of opt-out mechanisms extends beyond authenticated Colorado residents.
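As a rough illustration of the IP-based approach floated by Consumer Reports and NAI, and of the reliability concerns raised by SPSC and TechNet, the sketch below geolocates a request’s IP address using MaxMind’s GeoLite2 City database via the geoip2 library. The library, database path, and fallback behavior are assumptions on our part; nothing in the CPA or the questionnaire prescribes this method.

```python
# Illustrative sketch only: approximating whether a request appears to come
# from Colorado based on its IP address. Assumes MaxMind's GeoLite2 City
# database and the geoip2 library; results are best-effort because VPNs,
# proxies, and mobile networks can make IP-based location unreliable.
import geoip2.database
import geoip2.errors

def appears_to_be_colorado(ip_address: str, db_path: str = "GeoLite2-City.mmdb") -> bool:
    """Best-effort check that a request appears to originate in Colorado."""
    try:
        with geoip2.database.Reader(db_path) as reader:
            response = reader.city(ip_address)
    except (geoip2.errors.AddressNotFoundError, ValueError):
        # Unknown or malformed address: fall back to other authentication steps.
        return False
    return (
        response.country.iso_code == "US"
        and response.subdivisions.most_specific.iso_code == "CO"
    )
```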
Signal Scope: A significant technical and policy challenge for the use of opt-out mechanisms is whether a signal can and should apply to data collected outside of the signal’s medium. For example, can a browser-based signal be used to exercise consumer rights over information that was previously collected at a brick-and-mortar retail store? Consumer Reports argued that while regulations should not require the collection of additional information in order to process opt-out signals, a signal should apply beyond its present interaction “if the user is authenticated to the service by an identifier that applies in other contexts.” In contrast, business groups highlighted technical limitations of opt-out signals as they presently exist; for example, the Computer and Communications Industry Association (CCIA) posited that “if only browser extensions can serve as [opt out signals], the requirement to honor [opt out signals] should only extend to browsers.”
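A minimal sketch of the account-linking approach Consumer Reports describes: when the person behind a browser signal is authenticated by an identifier that also applies in other contexts (for example, a loyalty account used in stores), the opt-out is recorded against that identifier rather than the browser session alone. The data model below is hypothetical and not drawn from any commenter’s proposal.

```python
# Hypothetical data model illustrating how a browser-based opt-out signal
# could be extended beyond its medium when the user is authenticated by a
# cross-context identifier (e.g., a loyalty account also used in stores).
from dataclasses import dataclass, field

@dataclass
class ConsumerRecord:
    account_id: str                      # identifier shared across web and offline channels
    opted_out_of_sale: bool = False
    linked_channels: set = field(default_factory=lambda: {"web", "in-store"})

def apply_signal(record: ConsumerRecord, signal_present: bool) -> None:
    """Record a browser opt-out signal against a linked account."""
    if signal_present:
        # Because the account links other contexts, the opt-out covers data
        # collected through those channels as well, not just the current visit.
        record.opted_out_of_sale = True

# Example: an authenticated web visit carrying an opt-out signal.
record = ConsumerRecord(account_id="loyalty-12345")
apply_signal(record, signal_present=True)
```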
2. Consent
The CPA requires covered entities to obtain individual consent in certain circumstances, including for the processing of sensitive personal data and for incompatible secondary uses of information. The Act requires that consent be “freely given, specific, informed, and unambiguous,” closely matching the definition in other state laws and modeled on European privacy law. The Department sought information about each of these elements of consent as well as existing consent mechanisms.
Revoking Consent: Multiple organizations pointed to the lack of an explicit right to “revoke” consent as a potential gap in the statute that rulemaking could address. The Electronic Privacy Information Center (EPIC) and the Samuelson-Glushko Technology Law & Policy Clinic at Colorado Law (TLPC) explained that while the CPA requires that withdrawing consent be as easy as providing it in the case of consent that overrides a universal opt-out signal, the Act contains no explicit right to revoke consent for other instances of data processing. The Future of Privacy Forum pointed to broader rights of revocation in the GDPR and the Connecticut Data Privacy Act as potential models, recommending that forthcoming regulations follow an approach similar to Connecticut’s “by providing that consumers may, at any time, withdraw previously provided consent.” Law firm Husch Blackwell also highlighted model rights of revocation in other privacy regimes, further noting that “although it could be argued that the right to revoke consent is implicit in the CPA, it is not clear that Colorado law supports this position based on analogizing from existing court decisions.”
Implied Consent: Industry and advocacy groups alike also weighed in on when, if at all, implied consent could meet the statutory requirements of the CPA. CCIA contended that an “affirmative act” where a consumer purposefully provides personal data should not require additional consent procedures: “For instance, a consumer who intentionally submits sensitive demographic data (such as citizenship status or religious affiliation) while completing an online form should be deemed to have consented to the collection and processing of that demographic data.” On the other hand, EPIC and Consumer Reports sought stricter standards for obtaining consent. Consumer Reports proposed mandating that any request for consent include a “dedicated prompt” that “clearly and prominently describes the processing for which the company seeks to obtain consent,” while EPIC argued that consent should not be implied when a consumer exits a pop-up window that asks for consent.
3. Dark Patterns
The Colorado Privacy Act states that a consumer’s consent is not valid if obtained through the use of “dark patterns,” defined as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.” This language originated in the proposed DETOUR Act introduced by U.S. Senators Warner (D-VA) and Fischer (R-NE) in 2019. In the context of the CPA, the concept of dark patterns is a subset of the Act’s approach to individual consent. Nevertheless, the Department posed several specific questions on dark patterns, including whether the rules should outline specific types of dark patterns and what standards or principles could best guide design choices that avoid them.
Dark Patterns Definition and Scope: Several business groups raised concerns with the CPA’s definition of “dark patterns”; CTIA, for example, argued that the term is “vague” and leaves the door open to confusion for both consumers and businesses. Numerous industry commenters encouraged the Department to avoid a prescriptive approach and to instead focus on practices that amount to consumer deception or fraud, pointing to a long line of Federal Trade Commission enforcement actions in this realm. In contrast, some advocacy groups called for an expansive interpretation and application of the term “dark patterns” in order to protect consumers beyond the context of the CPA’s “consent” requirements. For example, Common Sense Media recommended “prohibiting asymmetric platform design practices that limit users’ ability to change user settings, delete personal data, or delete their account.” Colorado Public Interest Research Group (CoPIRG) went a step further, recommending rules that “prohibit platforms from using dark patterns in any consumer interaction.” However, it is unclear whether the Attorney General would have the statutory authority to issue expansive new restrictions on user interface designs along these lines.
4. Data Protection Assessments
Data Protection Assessments (“Assessments”) are an increasingly common requirement in privacy and data protection regimes around the globe. The CPA is no exception, requiring an assessment for processing that “presents a heightened risk of harm to a consumer.” Assessments must weigh the risks and benefits of the processing activity and must be made available to the Attorney General upon request, though they are exempt from disclosure under the Colorado Open Records Act. The Department’s questions on this topic sought to clarify under what circumstances it may request an assessment and what requirements should govern an assessment’s form and content.
Parameters for Requesting Assessments: TLPC recommended treating assessments as an ongoing process, with consistent feedback and input from affected consumers, controllers, and the Department of Law. In contrast, industry groups, including NAI, CCIA, CTIA, SPSC, and the Denver Metro Chamber of Commerce, asked that the Department establish specific parameters for when it may ask for an assessment to be conducted or disclosed. For example, the Alliance for Automotive Innovation (AAI) discouraged a regular cadence for iterating on assessments, instead proposing that controllers be required to “update them only when there is a material change in processing activities that is likely to have an impact on consumer privacy.”
Form and Content of Assessments: In general, privacy advocates sought to establish more detailed parameters for the form and content of assessments, while industry representatives such as NAI, AAI, and various Chambers of Commerce sought more flexibility. For instance, while EPIC provided a list of preferred mandatory requirements, the Colorado Chamber of Commerce suggested that the Department “publish a set of voluntary factors that the controller could consider as they undertake a data protection assessment.”
5. Profiling
The CPA creates a new right to opt out of profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer. Once again, this right is common to many emerging state privacy laws and is based on language that originated in the European Union. The Department raised numerous topics concerning profiling, including what disclosures about automated processing are necessary for consumers to make informed opt-out decisions, whether the rules should address specific legal or civil rights concerns or specific applications of profiling, whether immediately implementing a request to opt out of profiling could have negative impacts, and how the statute should apply to “partial” automated decisions.
Transparency: The application of the CPA’s transparency requirements to automated decision making systems was a significant focus for commenters. Industry comments typically sought limitations on disclosures, with the Denver Metro Chamber of Commerce arguing that “requiring granular visibility into each rapidly changing processing activity could cripple business.” CCIA further called for “explicit protections for intellectual property, trade secrets, and other legal rights of the business in question.” In contrast, EPIC called for broader disclosures about profiling activities such as “the sources and life cycle of the data processed by the system, including any brokers or other third-party sources involved in the data life cycle; and how the system has been evaluated for accuracy and fairness, including links to any audits, validation studies, or impact assessments.”
Opt-Out Rights: Commenters also engaged on the range of profiling activities that should be subject to the consumer opt-out right. Industry groups highlighted beneficial processing operations that could be disrupted by a broad reading of the language, including processing necessary for vehicle safety systems, fraud prevention, maintaining system integrity and security, and ad measurement and reporting. Many of these groups also called for regulations to limit the opt-out right to “solely” automated decisions (i.e., decisions that lack any human oversight), as Connecticut lawmakers have done. On this point, the Future of Privacy Forum recommended that consumer opt-out rights still apply in situations where the human review of a profiling decision amounts to little more than a “rubber stamp.”
6. Miscellaneous Topics
Given the Attorney General’s broad rulemaking authority, any CPA topic is theoretically on the table for rulemaking, even if not specifically addressed in the questionnaire. Commenters sought regulatory tweaks and clarifications on many additional topics including:
- Biometric Data: While the Colorado Privacy Act designates “biometric” data as inherently “sensitive” and subject to consent requirements, it does not define the term. Because there is no general consensus on the precise scope of what “biometric” data entails, several organizations requested clarification through rules. Commenters recommended following either the definition used by Connecticut (broader) or those used by Virginia and Utah (narrower).
- Publicly Available Information (PAI): Unlike other privacy laws, the CPA does not explicitly include “widely distributed media” in the category of publicly available information exempt from coverage under the Act. Multiple organizations, including RELX and the Software and Information Industry Association (SIIA), encouraged the Department to broaden the scope of the Act’s definition of PAI to match existing privacy regimes. Journalistic organizations, including Denver TV station KMGH and various Colorado public radio stations, also weighed in to express concern about the lack of carve-outs for news-gathering activities and to highlight potential First Amendment issues with this omission.
- Deletion Requests: RELX and TechNet raised operational challenges that B2B companies face in complying with deletion requests, suggesting instead that these companies be permitted to achieve compliance by processing deletion requests as opt-out requests; the Virginia legislature passed an amendment to its privacy law to this effect earlier this year. The People Search Services Coalition also explained that, for companies that routinely pull data from public sources, the constant refreshing of that data renders compliance with any deletion request temporary and illusory.
- Non-Profit Organizations: Of the five comprehensive state privacy laws, the Colorado Privacy Act is alone in applying to non-profit organizations. In response, numerous non-profits involved in activities ranging from fraud detection to higher education filed comments seeking either special regulatory consideration or entity-level carve-outs from the CPA’s obligations.
- Non-Retaliation Right: The CPA specifies that controllers may not “increase the cost of, or decrease the availability of” a product or service “based solely on the exercise of” a privacy right. NAI urged the Department “to clarify that it is within a business’ duty, particularly for web and app publishers, to charge a reasonable fee for services, related to the value of a consumer’s data, if consumers choose not to share their data.” Consumer Reports, in contrast, urged rules that would prohibit “differential treatment or pricing based on a consumer’s choosing to exercise a privacy choice.”
Next Steps
The Attorney General has announced a goal of issuing draft regulations in the fall of 2022 (note: AG Weiser is on the ballot in Colorado’s general election in November, the outcome of which may influence this timeline). Pursuant to the Colorado Administrative Procedure Act, the publication of draft regulations will begin a formal notice-and-comment phase, which will include at least one formal hearing. Given the importance of Colorado’s rulemaking to the U.S. privacy landscape and the range of directions the Attorney General could take (in both scope and substance), stakeholders can be expected to remain actively engaged in this process.