Uniform Law Commission Finalizes Model State Privacy Law
This month, the Uniform Law Commission (ULC) voted to approve the Uniform Personal Data Protection Act (UPDPA), a model bill designed to provide a template for uniform state privacy legislation. After some final amendments, it will be ready to be introduced in state legislatures in January 2022.
The ULC has been engaged in an effort to draft a model state privacy law since 2019, with the input of advisors and observers, including the Future of Privacy Forum. First established in 1892, the ULC has a mission of “providing states with non-partisan, well-conceived and well-drafted legislation that brings clarity and stability to critical areas of state statutory law.” Over time, many of its legislative efforts have been highly influential and have become law in the United States — for instance, the ULC drafted the Uniform Commercial Code in 1952. More recently, the ULC drafted the Uniform Fiduciary Access to Digital Assets Act (2014-15), which has been adopted in at least 46 states.
The UPDPA departs in form and substance from existing privacy and data protection laws in the U.S., and indeed internationally. The law would provide individuals with fewer, and more limited, rights to access and otherwise control data, with broad exemptions for pseudonymized data. Further, narrowing the scope of application, UPDPA only applies to personal data “maintained” in a “system of records” used to retrieve records about individual data subjects for the purpose of individualized communication or decisional treatment. The Prefatory Note of a late-stage draft of the UPDPA notes that it seeks to avoid “the compliance and regulatory costs associated with the California and Virginia regimes.”
Central to the framework, however, is a useful distinction between “compatible,” “incompatible,” and “prohibited” data practices, based on the likelihood that a data practice may benefit or harm a data subject, which moves beyond a purely consent-based model. We also find that the model law’s treatment of Voluntary Consensus Standards offers a unique approach towards implementation that is context- and sector-specific. Overall, we think the ULC model bill offers an interesting alternative for privacy regulation. However, because it departs significantly from existing frameworks, it could be slow to be adopted in states that are concerned with interoperability with recent laws passed in California, Virginia, and Colorado.
The summary below provides an overview of the key features of the model law, including:
- Scope
- Maintained Personal Data
- Rights of Access and Correction
- Pseudonymized Data
- Compatible, Incompatible, and Prohibited Data Practices
- Responsibilities of Collecting Controllers, Third-Party Controllers, and Processors
- Voluntary Consensus Standards
- Enforcement and Rulemaking
Read More:
- Read the approved Uniform Personal Data Protection Act here.
- Read the amendments made by the ULC here.
- Read comments from stakeholders on drafts since 2019 here.
- Read the Prefatory Note here.
Scope
UPDPA applies to controllers and processors that “conduct business” or “produce products or provide services purposefully directed to residents” in the state of enactment. Government entities are excluded from the scope of the Act.
To be covered, businesses must meet one of the following thresholds:
- (1) during a calendar year maintains personal data about more than [50,000] data subjects who are residents of the state, excluding data subjects whose data is collected or maintained solely to complete a payment transaction;
- (2) earns more than [50] percent of its gross annual revenue during a calendar year from maintaining personal data from data subjects as a controller or processor;
- (3) is a processor acting on behalf of a controller the processor knows or has reason to know satisfies paragraph (1) or (2); or
- (4) maintains personal data, unless it processes the personal data solely using compatible data practices.
The effect of threshold (4) is that UPDPA applies to smaller firms that maintain personal data, but relieves them of compliance obligations as long as they use the personal data only for “compatible” purposes.
Maintained Personal Data
UPDPA applies to “personal data,” which includes (1) records that identify or describe a data subject by a direct identifier, or (2) pseudonymized data. The term does not include “deidentified data.” UPDPA also does not apply to information processed or maintained in the course of employment or application for employment, nor to “publicly available information,” defined as information lawfully made available from a government record or available to the general public in “widely distributed media.”
- “Direct identifier” is defined as “information that is commonly used to identify a data subject, including name, physical address, email address, recognizable photograph, telephone number, and Social Security number.”
- “Deidentified data” is defined as “personal data that is modified to remove all direct identifiers and to reasonably ensure that the record cannot be linked to an identified data subject by a person that does not have personal knowledge or special access to the data subject’s information.”
Narrowing the scope of application, UPDPA only applies to personal data “maintained” in a “system of records” used to retrieve records about individual data subjects for the purpose of individualized communication or decisional treatment. The committee has commented that the definition of “maintains” is pivotal to understanding the scope of UPDPA. To the extent that data about individuals collected by businesses is not maintained as a system of records for the purpose of making individualized assessments, decisions, or communications, it would not be within the scope of the Act (for instance, if it were maintained in the form of emails or personal photographs). According to the committee, the definition of “maintains” is modeled after the federal Privacy Act’s definitions of “maintains” and “system of records”. 5 U.S.C. §552a(a)(3), (a)(5).
- “Maintains” with respect to personal data, means “to retain, hold, store, or preserve personal data as a system of records used to retrieve records about individual data subjects for the purpose of individualized communication or decisional treatment.”
- “Record” is defined as information: (A) inscribed on a tangible medium; or (B) stored in an electronic or other medium and retrievable in perceivable form.
Rights of Access and Correction
Access and Correction Rights: UPDPA grants data subjects the rights to access and correct personal data, excluding personal data that is pseudonymized and not maintained with sensitive data (as described below). Controllers are only required to comply with authenticated data subject requests. “Data subject” is defined as an individual who is identified or described by personal data. According to the committee, the access and correction rights extend not only to personal information provided by a data subject, but also commingled personal data collected by the controller from other sources, such as public sources, and from other firms.
Non-Discrimination: UPDPA prohibits controllers from denying a good or service, charging a different rate, or providing a different level of quality to a data subject in retaliation for exercising one of these rights. However, controllers may still make a data subject ineligible to participate in a program if the correction the data subject requests renders them ineligible under the program’s terms of service.
No Deletion Right: Notably, UPDPA does not grant individuals the right to delete their personal data. The ULC committee has enumerated various reasons for taking this approach, including: (1) the wide range of legitimate interests for controllers to retain personal data, (2) difficulties associated with ensuring that data is deleted, given how it is currently stored and processed, and (3) compatibility with the First Amendment of the U.S. Constitution (free speech). The committee has also stated that UPDPA’s restrictions on processing for compatible uses or incompatible uses with consent should provide sufficient protection.
Pseudonymized Data
“Pseudonymized data” is defined as “personal data without a direct identifier that can be reasonably linked to a data subject’s identity or is maintained to allow individualized communication with, or treatment of, the data subject. The term includes a record without a direct identifier if the record contains an internet protocol address, a browser, software, or hardware identification code, a persistent unique code, or other data related to a particular device.”
Pseudonymized data is subject to fewer restrictions than more identifiable forms of personal data. Generally, consumer rights contained in UPDPA (access and correction) do not apply to pseudonymized data. However, these rights do still apply to “sensitive” pseudonymized data to the extent that it is maintained in a way that renders the data retrievable for individualized communications and treatment.
“Sensitive data” includes personal data that reveals: (A) racial or ethnic origin, religious belief, gender, sexual orientation, citizenship, or immigration status; (B) credentials sufficient to access an account remotely; (C) a credit or debit card number or financial account number; (D) a Social Security number, tax-identification number, driver’s license number, military identification number, or an identifying number on a government-issued identification; (E) geolocation in real time; (F) a criminal record; (G) income; (H) diagnosis or treatment for a disease or health condition; (I) genetic sequencing information; or (J) information about a data subject the controller knows or has reason to know is under 13 years of age.
In practice, the ULC committee has stated that a collecting controller that stores user credentials and customer profiles can avoid the access and correction obligations if it segregates its data into a key code and a pseudonymized database so that the data fields are stored with a unique code and no identifiers. The separate key allows the controller to reidentify a user’s data when necessary or relevant for its interactions with customers. Likewise, a collecting controller that creates a dataset for its own research use (without maintaining it in a way that allows for reassociation with the data subject) will not have to provide access or correction rights even if the pseudonymized data includes sensitive information. Additionally, a retailer that collects and transmits credit card data to the issuer of the credit card in order to facilitate a one-time credit card transaction is not maintaining this sensitive pseudonymized data.
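The key-coding arrangement the committee describes can be illustrated with a short sketch: direct identifiers live only in a segregated key table, while the working database holds records under opaque codes. This is a hypothetical illustration, not language from the Act; all names and structures here are our own assumptions.

```python
import secrets

class PseudonymizedStore:
    """Illustrative sketch of key-coded pseudonymization: the working
    dataset holds no direct identifiers, which are kept in a separate
    key table used only for reidentification."""

    def __init__(self):
        self._key_table = {}  # code -> direct identifier (segregated)
        self._records = {}    # code -> record fields, no identifiers

    def add(self, identifier, record):
        # Assign a random opaque code; the identifier goes only into
        # the segregated key table.
        code = secrets.token_hex(8)
        self._key_table[code] = identifier
        self._records[code] = record
        return code

    def record(self, code):
        # The working dataset exposes only the pseudonymized record.
        return self._records[code]

    def reidentify(self, code):
        # Re-linking requires access to the separate key table, e.g.
        # when needed for interactions with the customer.
        return self._key_table[code]

store = PseudonymizedStore()
code = store.add("alice@example.com", {"purchases": 3, "segment": "B"})
print(store.record(code))      # pseudonymized view: no direct identifier
print(store.reidentify(code))  # requires the segregated key table
```

The design choice mirrors the committee’s point: whether the access and correction obligations attach turns on whether the records, as maintained, can be retrieved for individualized treatment without consulting the separate key.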
Compatible, Incompatible, and Prohibited Data Practices
UPDPA distinguishes between “compatible,” “incompatible,” and “prohibited” data practices. Compatible data practices are per se permissible, so controllers and processors may engage in these practices without obtaining consent from the data subject. Incompatible data practices are permitted for non-sensitive data if the data subject is given notice and an opportunity to withdraw consent (an opt-out right). However, opt-in consent is required for a controller to engage in incompatible data processing of “sensitive” personal data. Prohibited data practices are impermissible outright.
UPDPA’s distinctions between “compatible,” “incompatible,” and “prohibited” data practices are based on the likelihood that the data practice may benefit or harm a data subject:
Compatible data practices: A controller or processor engages in a compatible data practice if the processing is “consistent with the ordinary expectations of data subjects or is likely to benefit data subjects substantially.”
- Per se compatible practices: UPDPA includes a list of 9 data practices that are per se compatible, such as the effectuation of a transaction to which the data subject is a participant, compliance with legal obligations, research, and processing that is “reasonably necessary to create pseudonymized or deidentified data.”
- Factors: The list of compatible data practices is not exhaustive, and UPDPA also contains 6 factors to determine whether a processing activity is a compatible data practice, such as the data subject’s relationship with the controller, the type and nature of the personal data to be processed, and the risk of negative consequences to the data subject associated with the use or disclosure of the personal data. This catch-all provision aims to allow controllers and processors to create innovative data practices that are unanticipated and unconventional, so long as data subjects substantially benefit from the practice.
- Targeted advertising: The Act specifies that a controller may use personal data, or disclose pseudonymized data to a third-party controller “to deliver targeted advertising and other purely expressive content to a data subject.” However, the Act distinguishes between “purely expressive content” and “differential treatment,” the latter of which does not constitute a compatible data practice (as it is unanticipated, and does not substantially benefit the data subject). For instance, it excludes the use of personal data to offer terms relating to price or quality to a data subject that are different from those generally offered.
Incompatible data practices: A controller or processor engages in an incompatible data practice if the processing:
- (1) is not a compatible data practice and is not a prohibited data practice; or
- (2) is otherwise a compatible data practice but is inconsistent with a controller or processor’s privacy policy.
- In other words, an incompatible data practice is an unanticipated use of data that is likely to cause neither substantial harm nor substantial benefit to the data subject. The ULC committee has stated that an example of an incompatible data practice is an app developer selling user data to third-party fintech firms for the purpose of creating novel credit scores or employability scores.
Prohibited data practices: Processing personal data is a prohibited data practice if the processing is likely to:
- (1) subject a data subject to specific and significant: (A) financial, physical, or reputational harm; (B) embarrassment, ridicule, intimidation, or harassment; or (C) physical or other intrusion on solitude or seclusion if the intrusion would be highly offensive to a reasonable person;
- (2) result in misappropriation of personal data to assume another’s identity;
- (3) constitute a violation of other law, including federal or state law against discrimination;
- (4) fail to provide reasonable data-security measures, including appropriate administrative, technical, and physical safeguards to prevent unauthorized access; or
- (5) process personal data without consent in a manner that is an incompatible data practice.
- The collection or creation of personal data by reidentifying or causing the reidentification of pseudonymized or de-identified data is considered to be a “prohibited data practice,” unless: (a) the reidentification is performed by a controller or processor that previously pseudonymized or de-identified the personal data, (b) the data subject expects the personal data to be maintained in identified form by the controller performing the reidentification, or (c) the purpose of the reidentification is to assess the privacy risk of deidentified data and the person performing the reidentification does not use or disclose reidentified personal data except to demonstrate a privacy vulnerability to the controller or processor that created the de-identified data.
Responsibilities of Collecting Controllers, Third-Party Controllers, and Processors
UPDPA creates different obligations for “controllers” and “processors,” and further distinguishes between “collecting controllers” and “third party controllers.”
- “Controller” is defined as a person that, alone or with others, determines the purpose and means of processing.
- “Collecting controller” is defined as “a controller that collects personal data directly from a data subject.” Collecting controllers are responsible for providing means for data subjects to access and correct their personal data, including establishing a procedure to authenticate the identity of the data subject. They are also required to transmit credible requests to downstream third-party controllers and processors who have access to the personal data, and to make reasonable efforts to ensure that these third parties have made the requested change.
- “Third party controller” is defined as “a controller that receives from another controller authorized access to personal data or pseudonymized data and determines the purpose and means of additional processing.” Third-party controllers are under most of the same obligations as collecting controllers. However, they are not obliged to respond to access or correction requests directly, but rather to make a reasonable effort to assist the collecting controller to satisfy a data subject request. Any third-party controller that receives a request from a collecting controller must transmit the request to any processor or other third-party controller that it has engaged so that the entire chain of custody of personal data is corrected.
- “Processor” is defined as “a person that processes personal data on behalf of a controller.” Processors are responsible for providing the controller, upon their request, with a data subject’s personal data or enabling them to access the personal data, correct inaccuracies in a data subject’s personal data upon request of the controller, and abstain from processing personal data for a purpose other than the one requested by the controller.
If a person with access to personal data engages in processing that is not at the direction and request of a controller, that person becomes a controller rather than a processor, and is therefore subject to the obligations and constraints of a controller.
Aside from complying with access and correction requests, controllers have a number of additional responsibilities, such as notice and transparency obligations, obtaining consent for incompatible data practices, abstaining from engaging in a prohibited data practice, and conducting and maintaining data privacy and security risk assessments. Processors are also required to conduct and maintain data privacy and security risk assessments.
Voluntary Consensus Standards
UPDPA enables stakeholders representing a diverse range of industry, consumer, and public interest groups to negotiate and develop voluntary consensus standards (VCSs) to comply with UPDPA in specific contexts. VCSs may be developed to comply with any provision of UPDPA, such as the identification of compatible practices for an industry, the procedure for securing consent for an incompatible data practice, or practices that provide reasonable security for data. Once established and recognized by a state Attorney General, a VCS can be explicitly adopted and complied with by any controller or processor. It is also worth noting that compliance with a similar privacy or data protection law, such as the GDPR or CCPA, is sufficient to constitute compliance with UPDPA.
Enforcement and Rulemaking
UPDPA grants State Attorneys General rulemaking and enforcement authority. The “enforcement authority, remedies, and penalties provided by the [state consumer protection act] apply to a violation” of UPDPA. However, notwithstanding this provision, “a private cause of action is not authorized for a violation of this Act or under the consumer protection statute for violations of this Act.”
What’s Next for the UPDPA?
The future of the UPDPA is, as yet, unclear. The drafting committee is currently developing a legislative strategy for submitting the Act to state legislatures on a state-by-state basis. It remains to be seen whether state legislators will have an appetite to introduce, consider, and possibly pass the UPDPA during the legislative session of 2022 and beyond. As an alternative, legislators may wish to adapt certain elements of the model law, such as the voluntary consensus standards (VCS), flexibility for research, or the concept of “compatible,” “incompatible,” or “prohibited” data practices based on the likelihood of substantial benefit or harm to the data subject, rather than purely notice and consent.