The Garden State Joins the Comprehensive Privacy Grove
On January 16, 2024, Governor Murphy signed S332 into law, making New Jersey the thirteenth U.S. state to adopt a comprehensive privacy law governing the collection, use, and transfer of personal data. S332 endured a long and circuitous route to enactment, having been introduced in January 2022 and amended six times before being passed by both chambers during the waning hours of New Jersey’s legislative session. The law will take effect on January 15, 2025. S332 bears a strong resemblance to other laws following the Washington Privacy Act (WPA) framework, particularly those passed in Delaware, Oregon, and Colorado. Nevertheless, S332 diverges from existing privacy frameworks in several significant ways. In this blog, we highlight eight unique, ambiguous, or otherwise notable provisions that set S332 apart in the U.S. privacy landscape.
1. Private Right of Action Confusion
One ongoing controversy regarding S332 is whether the law could provide the basis for a private right of action. S332 specifies that the New Jersey Attorney General has “sole and exclusive authority” to enforce a violation of S332 and that nothing in the law shall be construed as providing the basis for a private right of action for violations of S332. A late amendment removed language stating that S332 should not be construed as providing the basis for a private right of action “under any other law.” Industry members raised concerns that the removal of this language opens up the possibility of private lawsuits by tying alleged violations of the law to causes of action under other laws. In his signing statement, Governor Murphy attempted to assuage industry fears by noting that “nothing in this bill expressly establishes such a private right of action” and “this bill does not create a private right of action under this law or under any other law.” Some industry members remain unconvinced, however, and continue to advocate for clarifying amendments.
2. Data Protection Assessments Prior to Processing
New Jersey joins the majority of state privacy laws in requiring that controllers conduct a data protection assessment (DPA) for any data processing activity that “presents a heightened risk of harm to a consumer.” New Jersey is notable, however, for explicitly requiring that the DPA occur before initiating any such high-risk processing activities. Prior to New Jersey, only the Colorado Privacy Act’s implementing regulations required that DPAs occur prior to initiating processing. Following the NetChoice v. Bonta litigation, which saw California’s Age-Appropriate Design Code Act preliminarily enjoined, this requirement could raise First Amendment concerns if it is interpreted as a prior restraint on speech.
3. Thresholds for Applicability
S332 is notable for not including a revenue threshold in its applicability provisions. The law applies to controllers that control or process the personal data of either (a) at least 100,000 New Jersey residents annually, or (b) at least 25,000 New Jersey residents annually and the controller derives revenue from the sale of personal data. Prong (b) differs from the majority of existing privacy frameworks, which tend to require that the controller derive at least a certain percentage of revenue from personal data sales (e.g., 25%) to be covered. This is another similarity between S332 and the Colorado Privacy Act, which sets the same thresholds.
The carve-outs in S332 are similar to those in the Delaware Personal Data Privacy Act. S332 includes data-level exemptions for protected health information subject to the Health Insurance Portability and Accountability Act (HIPAA) and “personal data collected, processed, sold, or disclosed by a consumer reporting agency” insofar as those processing activities are compliant with the Fair Credit Reporting Act (FCRA). With respect to the financial industry, S332 joins the majority of states by providing entity-level and data-level exemptions for financial institutions and their affiliates subject to Title V of the Gramm-Leach-Bliley Act (GLBA). Notably, however, S332 does not contain exemptions for nonprofits, higher education institutions, or personal data regulated by the Family Educational Rights and Privacy Act (FERPA).
4. Rulemaking
New Jersey becomes just the third state, after California and Colorado, to provide for rulemaking in its comprehensive privacy law. The Act charges the Director of the Division of Consumer Affairs in the Department of Law and Public Safety with promulgating rules and regulations necessary to effectuate the purposes of S332. This provision includes no details on the timeframe or substance of rulemaking, other than that the New Jersey Administrative Procedure Act applies. As the rulemaking process unfolds, this could be a valuable opportunity for stakeholders to seek clarity on some of S332’s ambiguous provisions.
5. Ambiguity on Authorized Agents and UOOMs
New Jersey joins Colorado, Connecticut, Delaware, Montana, Oregon, and Texas in allowing an individual to designate an authorized agent to exercise the individual’s right to opt out of processing for certain purposes. S332’s authorized agent provision has two ambiguities. First, subsection 8(a) specifies that an individual can designate an authorized agent to “act on the consumer’s behalf to opt out of the processing and sale of the consumer’s personal data.” (Emphasis added.) As written, this provision would create a broad opt-out right with respect to all processing, distinct from the opt-out rights explicitly established elsewhere in the bill. It is more likely that this provision is intended to be limited to opting out of processing for the purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that produce legal or similarly significant effects. The second ambiguity is the qualifier that an individual can use an authorized agent designated using technology to opt out of profiling only “when such technology exists.” It is not clear who or what determines the availability of such technology.
S332 also joins California, Colorado, Connecticut, Montana, Oregon, and Delaware in requiring that controllers allow individuals to opt out of the processing of personal data for targeted advertising or the sale of personal data on a default basis through a universal opt-out mechanism (UOOM). Designed to reduce the burden on individuals attempting to exercise opt-out rights, UOOMs encompass a range of tools that allow individuals to configure their devices to automatically exercise opt-out rights through a preference signal when interacting with a controller through a desktop or mobile application. S332’s statutory requirements for a UOOM, however, are ambiguous and inconsistent with those in existing privacy frameworks. Specifically, one requirement is that a UOOM cannot “make use of a default setting that opts-in a consumer to the processing or sale of personal data.” (Emphasis added.) This is clearly inconsistent with the purpose of a universal opt-out mechanism, which is to opt individuals out of such processing.
6. Adolescent Privacy
S332 continues and builds upon a trend of increased privacy protections for adolescents (while legislating around the existing, largely preemptive COPPA regime for individuals 12 and under). Where a controller actually knows, or willfully disregards, that an individual is between 13 and 16 years old, the controller must obtain the teen’s consent before processing their personal data for the purposes of targeted advertising, sale, or profiling in furtherance of decisions that produce legal or similarly significant effects. Several states have iterated on adolescent privacy protection in recent years by requiring consent for these processing purposes. Delaware raised the bar by requiring such consent for individuals aged 13 through 17, but it did not extend the opt-in consent requirement to profiling. Oregon was the first state to include profiling in the opt-in consent requirement, but its age range was slightly narrower, at 13 through 15. New Jersey is unique and arguably goes the furthest by extending the opt-in consent requirement both to individuals aged 13 through 16 and to profiling in furtherance of decisions that produce legal or similarly significant effects.
7. Expansive Definitions of Sensitive Data and Biometric Data
S332’s definitions of sensitive data and biometric data (which require opt-in consent to process) continue and build upon trends seen in stronger iterations of the WPA framework. S332’s definition of sensitive data includes additional categories seen in a minority of existing privacy frameworks, such as “status as transgender or non-binary” and “sex life.”
S332’s definition of sensitive data also goes beyond the other WPA-style laws in two ways. First, the coverage of health data is slightly expanded to include mental or physical health treatment (in addition to condition or diagnosis). Second, sensitive data also includes “financial information,” which the law specifies “shall include a consumer’s account number, account log-in, financial account, or credit or debit card number, in combination with any required security code, access code, or password that would permit access to a consumer’s financial account.” S332 is the first of the non-California laws to include this category.
The definition of biometric data is also broader than in most of the WPA-style laws, which consistently define biometric data as “data generated by automatic measurements of an individual’s biological characteristics.” S332, in contrast, defines biometric data as “data generated by automatic or technological processing, measurements, or analysis of an individual’s biological, physical, or behavioral characteristics,” and it explicitly includes facial mapping, facial geometry, and facial templates in its list of examples. This language is similar to the definitions of biometric data and biometric identifiers in the Colorado Privacy Act Rules.
8. Expanded Right to Delete
Finally, S332 provides an expanded right to delete with respect to third-party data, first seen in Delaware. When a controller has lawfully obtained an individual’s personal data from a third party and the individual submits a deletion request, the controller must either (a) retain a record of the deletion request and the “minimum data necessary” to ensure that the individual’s personal data remains deleted, without using that retained information for any other purpose, or (b) delete such data. This differs from the majority of states, which instead allow a controller that obtains personal data from third-party sources to respond to a deletion request by retaining such data but opting the individual out of processing activities that are not subject to a statutory exemption (such as fraud prevention or cybersecurity monitoring).