Shining a Light on the Florida Digital Bill of Rights
On May 4, 2023, the Florida ‘Digital Bill of Rights’ (SB 262) cleared the state legislature and now heads to the Governor’s desk for signature. SB 262 bears many similarities to the Washington Privacy Act and its progeny (specifically the Texas Data Privacy and Security Act). However, SB 262 is unique given the narrow scope of businesses it regulates and other significant deviations from current trends in U.S. state privacy legislation, as well as its inclusion of a section in the style of Age-Appropriate Design Code (AADC) regulations that applies more broadly than the “comprehensive” parts of the bill. This blog highlights five unique and key features of the Florida Digital Bill of Rights:
1) SB 262 includes a section on “Protection of Children in Online Spaces”, which draws inspiration from the California AADC but diverges in many key aspects.
2) The scope of the comprehensive privacy provisions of SB 262 covers only businesses making $1 billion in revenue and meeting other threshold requirements.
3) SB 262 creates both familiar and novel consumer rights surrounding sensitive data and targeted advertising, raising compliance questions.
4) Under SB 262, controllers and processors will have new responsibilities, including creating retention schedules and complying with disclosure obligations for the sale of sensitive or biometric data.
5) Businesses regulated under SB 262 that utilize voice or face recognition, or have video or audio features in devices, will be subject to heightened restrictions for data collected through these services, regardless of whether the device can identify an individual.
Additionally, FPF is releasing a chart to help stakeholders assess how SB 262’s “Protections for Children in Online Spaces” compares to the California Age-Appropriate Design Code Act (California AADC).
1. The “Protection of Children in Online Spaces” Section Draws Inspiration from the California AADC but Diverges in Many Key Aspects
Many amendments were added to SB 262 at the eleventh hour, including several provisions on the ‘Protection of Children in Online Spaces’ (“Section 2”). FPF’s comparison chart assesses each requirement of Section 2 against the California AADC. Section 2 will govern a far broader set of covered entities than the bulk of SB 262’s provisions on privacy, and while it clearly incorporates language and concepts from the California AADC, it contains significant deviations in both scope and substance.
Scope of covered entities
The scope of entities subject to Section 2 is both broader and narrower than that of the California AADC. While the California AADC broadly applies to all online products, services, and features that are “likely to be accessed by children” under age 18, Section 2 applies only to “online platforms,” covering social media and online gaming platforms. The definition of “social media platform” includes “a form of electronic communication through which users create online communities or groups to share information, ideas, personal messages, and other content” and does not list any exemptions. The term “online gaming platform” is undefined. While seemingly narrower in scope than the California AADC, Section 2 contains no minimum revenue or user applicability thresholds, meaning that smaller businesses not subject to California’s law may be within scope. Additionally, the scope of “social media platform” could encompass a number of non-obvious organizations, depending on how broadly the definition is construed.
No explicit DPIA or age estimation requirements
Section 2 does not require a data protection impact assessment (DPIA) as the California AADC does; instead, it places the burden of proof on online platforms to demonstrate that their processing of personal information does not violate any of the law’s prohibitions. Covered platforms may therefore ultimately need to conduct a DPIA or similar assessment to meet this burden.
Like the California AADC, Section 2 defines a child as an individual under 18, though, unlike the AADC, Section 2 does not affirmatively require age estimation. Section 2 also modifies the California AADC’s “likely to be accessed by children” standard to one of being predominantly likely to be accessed by children, but does not lay out any factors for assessing whether a service meets that standard.
Prohibitions
Two key points on which Section 2 of SB 262 diverges from the California AADC are in the restrictions on processing and profiling.
Under Section 2, covered services may not process the personal information of a person under 18 if they have actual knowledge, or willfully disregard, that the processing may result in “substantial harm or privacy risk to children.” The absence of affirmative age estimation requirements and the inclusion of an “actual knowledge or willful disregard” knowledge standard could be a response to First Amendment objections raised in the NetChoice v. Bonta litigation seeking to strike down the California AADC. The “substantial harm or privacy risk” language is reminiscent of the California AADC’s prohibition on processing children’s data in a materially detrimental manner. However, while “material detriment” is undefined in the California AADC, Section 2 defines “substantial harm or privacy risk” to include: mental health disorders; addictive behaviors; physical violence, online bullying, and harassment; sexual exploitation; the promotion and marketing of tobacco, gambling, alcohol, or narcotic drugs; and predatory, unfair, or deceptive marketing practices or other financial harms.
Both the California AADC and Section 2 limit the profiling of people under 18 except in certain circumstances. While both contain an exception for profiling that is necessary to provide an online service, product, or feature, the California AADC contains an exemption if the business can demonstrate a “compelling reason that profiling is in the best interests of children.” In contrast, Section 2 contains an exemption if an online platform can demonstrate a compelling reason that profiling does not “pose a substantial harm or privacy risk to children.” The affirmative showing required by the California AADC may be a higher threshold to meet than that of Section 2, especially given that the “best interests of children” standard is undefined and is not an established U.S. legal standard outside of the family law context. Furthermore, profiling is defined more broadly in Section 2 to include “any form of automated processing performed on personal information to evaluate, analyze, or predict personal aspects relating to the economic situation, health, personal preferences, interests, reliability, behavior, location, or movements of a child,” rather than “any form of automated processing of personal information to evaluate aspects relating to a person.”
2. The Digital Bill of Rights’ ‘Comprehensive’ Privacy Provisions Will Cover Very Few Businesses
The types of entities subject to the remaining bulk of SB 262’s ‘comprehensive’ privacy provisions outside of Section 2 are much narrower than under comparable U.S. state privacy laws, even the more limited ones. Florida SB 262 will only apply to a handful of companies that meet a threshold annual gross revenue requirement of $1 billion and either (1) make over 50% of revenue from targeted advertising, (2) operate a “consumer smart speaker and voice command component,” or (3) operate an app store with at least 250,000 software applications. This can be compared to recently enacted privacy laws in Iowa and Indiana, which will apply to businesses that either process personal data of at least 100,000 state residents or derive 50% of gross revenue from the sale of personal data of at least 25,000 consumers. Though the terms “targeted advertising” and “consumer smart speaker” in SB 262 could be construed liberally, the revenue requirement means that Floridians will not receive new rights or protections with respect to the vast majority of businesses that collect their personal data in the Sunshine State.
3. The Bill Creates a Complex Stack of Both Familiar and Novel Consumer Rights
SB 262 will establish many rights that are now familiar from U.S. state privacy laws, including confirmation of processing, correction of inaccuracies, deletion, obtaining a copy of a person’s personal data in a portable format, and the ability to opt out of “solely” automated profiling in furtherance of decisions that produce legal or similarly significant effects. However, there are a number of new and unique provisions in the consumer rights sections:
- Processing Sensitive Data Requires Opt-In Consent and Is Subject to an Opt-Out Right: SB 262 provides that consumers have a right to opt out of the collection or processing of sensitive data. Notably, controllers are separately prohibited from processing sensitive data without first obtaining consumer “consent.” This diverges from other common frameworks that either require consent for processing sensitive data, such as the Connecticut Data Privacy Act, or create only an opt-out right, as Iowa’s law does.
- Sale of Sensitive Data Also Requires Opt-In Consent and Is Subject to an Opt-Out Right: Similar to Texas HB 4, SB 262 will require all businesses (including those that do not meet SB 262’s revenue thresholds) to obtain consent prior to the sale of sensitive data, while also providing consumers the general right to opt out of the sale of any personal data. “Sale” is defined broadly to include transfers of personal data for “monetary or other valuable consideration” and lacks common exceptions, such as transfers as part of a merger or acquisition.
- Targeted Advertising: SB 262 establishes a right to opt out of the processing of personal data for the purpose of targeted advertising. However, targeted advertising has a unique definition that includes personal data obtained over time across both affiliated and unaffiliated websites and online applications. In addition to permitting consumers to opt out of targeted advertising that relies on demonstrably ‘pseudonymous’ personal information, the Digital Bill of Rights governs data collected across a company’s affiliated properties, and so may apply more broadly than most other state privacy laws, which focus on third-party advertisements.
- Parents Exercising Rights on Behalf of Teens: SB 262 defines “child” as an individual under 18, diverging from other state consumer privacy laws and the federal Children’s Online Privacy Protection Act (COPPA). Unlike the California and Connecticut privacy laws, which created heightened opt-in rights for teens aged 13 to 15, SB 262 does not include any further distinctions between young children and teens, suggesting that parents or guardians will have sole responsibility to exercise consumer rights on behalf of youth up to age 18.
4. Controllers and Processors Will Have New Responsibilities for Purging Data and Disclosing Certain Practices
Unlike existing comprehensive state privacy laws, SB 262 would require that covered businesses and their processors implement a retention schedule for the deletion of personal data. The text of this provision appears influenced by the Illinois Biometric Information Privacy Act (BIPA). Under SB 262, controllers or processors may only retain personal data until (1) the initial purpose for collection has been satisfied; (2) the contract for which the data was collected or obtained expires or is terminated; or (3) two years after the consumer’s last interaction with the regulated business (subject to exceptions). However, unlike BIPA, SB 262 would not require that the retention schedule be made publicly available and would permit retention necessary to prevent or detect security incidents.
Further, in addition to the typical privacy notices required by state comprehensive laws, SB 262 creates two distinct disclosure requirements. First, again similar to Texas HB 4, if a controller sells sensitive or biometric data, it must provide the following notice: “NOTICE: This website may sell your [sensitive and/or biometric] personal data.” Second, a controller that operates a search engine is required to disclose the main parameters used in ranking results, “including the prioritization or deprioritization of political partisanship or political ideology” in search results.
5. Businesses that Utilize Voice or Face Recognition, or Have Video or Audio Features in Devices, Have Particular but Perplexing Obligations
Finally, one of SB 262’s most distinctive provisions is a requirement that covered businesses may not provide consumer devices that engage in “surveillance” when not in active use unless “expressly authorized” by the consumer. Though “surveillance” and “active use” are not defined, the prohibition applies to devices that have any of the following features: voice recognition, facial recognition, video recording, audio recording, “or other electronic, visual, thermal, or olfactory feature” that collects data. SB 262 also fails to define “express authorization,” raising questions as to whether it is analogous to “consent” under the bill or whether a higher standard will be required, such as that under the recently enacted Washington State “My Health, My Data” Act.
SB 262 further provides consumers with the right to opt out of personal data collected by voice or face recognition systems. Voice recognition is broadly defined as collecting, storing, analyzing, transmitting, and interpreting spoken words or other sounds, seemingly encompassing almost all audio-based consumer-facing systems. Facial recognition and the other features are not defined, though one can infer they would be construed as broadly as voice recognition. As a result, despite SB 262’s requirement that “biometric data” be used for unique identification of an individual in order to be subject to the legislation’s requirements for sensitive data, most general voice and face systems unrelated to identification will still need to provide consumers the ability to opt out under these provisions. These restrictions and requirements may complicate the functionality of some products that rely on these features, such as accessibility features that use natural language processing to transcribe spoken words. Moreover, despite SB 262’s revenue threshold, these prohibitions and restrictions will likely flow down, through contractual agreements and requirements, to any other entity that utilizes (or provides a software plug-in to) voice assistant devices like Amazon Echo or Apple Siri for customer service, customer ordering, or other forms of user engagement.
Conclusion
Given that many of the consumer rights and business obligations of SB 262 will directly apply to very few businesses, it is understandable why the Florida Digital Bill of Rights may have flown under the radar thus far. However, SB 262 is worth a close read, particularly the short-but-impactful section on “Protection of Children in Online Spaces” and the provisions creating novel consumer rights. Given Governor DeSantis’ public support for the legislation, we can anticipate that the Digital Bill of Rights will be enacted shortly and will go into effect on July 1, 2024, giving stakeholders just over a year to understand their compliance obligations. We note, however, that the specific consumer rights and business obligations under SB 262 may evolve, as the State Attorney General’s office is granted both mandatory and permissive rulemaking authority.