Starting Point for Negotiation: An Analysis of Senate Democratic Leadership’s Landmark Comprehensive Privacy Bill
Today, Senate Commerce Committee Ranking Member Maria Cantwell (D-WA), joined by top Democrats on the Senate Commerce Committee – Senators Markey, Schatz, and Klobuchar – introduced a new comprehensive federal privacy bill, the Consumer Online Privacy Rights Act (COPRA). The bill is consistent with the Senate Democratic leadership positions announced last week and comes in advance of a December 4th Senate Commerce Committee hearing convened by Senator Wicker (R-Miss.), “Examining Legislative Proposals to Protect Consumer Data Privacy.”
In substance, the bill primarily emphasizes individual control, codifying strong rights for individuals to be informed of data processing, and to be able to access, delete, correct, and port their data. The definition of covered data is broad, aligning with the GDPR and most other US privacy bills to date (data that “identifies, or is linked or reasonably linkable to an individual or a consumer device, including derived data”), although it excludes “de-identified data.” The FTC is tasked with rulemaking to enable centralized opt-outs for non-sensitive data, while “sensitive data” requires opt-in consent.
Notably, the bill contains a nuanced exception to support ethical commercial research if approved, monitored, and governed by an Institutional Review Board (IRB) or an IRB-like oversight entity that meets standards promulgated by the FTC. Such oversight would provide stronger legal protections for “scientific, historical, or statistical research in the public interest” in situations where informed consent is impractical, such as commercial research conducted on Big Data or other large, less readily identifiable datasets.
Below are FPF’s highlights of COPRA’s other provisions.
- Read Senator Cantwell’s Press Release (11.26.19); Read the Text of the Bill; Read FPF CEO Jules Polonetsky’s Statement (11.26.19); See Witnesses and Details for December 4th Hearing
1. Jurisdictional Scope
- “Covered entity” is defined in the bill as “any entity or person that is subject to the Federal Trade Commission Act (15 U.S.C. 41 et seq.) other than a small business.”
- Excludes small businesses, defined as entities that do not: maintain an average annual revenue of over $25,000,000; annually process covered data of an average of 50,000 individuals, households, or devices; or derive half or more of their annual revenue from transferring covered data.
- Unlike other federal proposals, this bill would not govern non-profit organizations, political campaigns, or any other entity not already subject to the FTC’s jurisdiction.
2. Data Minimization and Data Security
- The bill features a general right to data minimization, a provision that may place significant limits on collection of data. Businesses may not process or transfer data beyond what is “reasonably necessary, proportionate, and limited” to (1) carry out the specific processing purposes and transfers described in the privacy policy made available by the covered entity; (2) carry out a specific processing purpose or transfer for which the covered entity has obtained affirmative express consent; or (3) for a purpose specifically permitted by the Act.
- Covered entities must maintain “reasonable data security practices,” including specific obligations to assess vulnerabilities, take preventative and corrective actions, implement policies for information retention and disposal, and conduct employee training.
3. Sensitive Data (and Opt-Outs for Non-Sensitive Data)
- The bill requires “prior, affirmative, express consent” to collect or process sensitive data, which includes (among other things): biometric data, information that reveals past, present, or future physical or mental health, disability, or diagnosis of an individual, precise geolocation, details of private communications, phone or text logs, race or ethnicity, union membership, sexual orientation or sexual behavior, and intimate images (a notable inclusion that may intersect with Section 230 and bills like the ENOUGH Act). The FTC can also determine other data to be sensitive through rulemaking.
- Sensitive data also includes “email address,” “phone number,” and “browsing history,” which are commonly collected for online and mobile advertising and marketing purposes under an “opt out” standard promoted by industry self-regulation bodies.
- Prior, affirmative opt-in consent is required for both processing and transferring of sensitive data. For all other data (“non-sensitive”), the bill provides a right to opt-out of the transfer of personal information, with the FTC tasked with promulgating rules for how such opt-outs should be governed, including the extent to which they may be centralized in ways similar to the FTC’s Do Not Call Registry.
4. Third Parties and Service Providers
- Third parties and service providers are defined similarly to how these terms are defined in the California Consumer Privacy Act:
- “Service providers” are entities that process or transfer covered data “in the course of performing a service or function on behalf of, and at the direction of, another covered entity,” but only to the extent that such processing or transferral (i) relates to the performance of such service or function; or (ii) is necessary to comply with a legal obligation or to establish, exercise, or defend legal claims.
- “Third parties” include any other non-service provider entity that processes or transfers covered data and that does not share common ownership, corporate control, and branding with the covered entity.
- Businesses must identify “third parties” with whom data is shared, and must allow individuals to opt out of or object to transfers of data to third parties under rules promulgated by the FTC. Service providers may not transfer data to a third party without express affirmative consent from the linkable individual. Meanwhile, “third parties” are prohibited from processing covered data in a manner “inconsistent with the expectations of a reasonable individual.”
- The bill also requires the FTC to promulgate strict rules limiting processing and restricting transfers of “biometric data” to third parties.
5. Interaction with State and Federal Laws
- This law preempts only “directly conflicting” state laws, and then only to the extent of the conflict. It does not preempt any aspects of state laws that afford “a greater level of protection to individuals protected under this Act.”
- Similarly, it would not preempt or displace any “common law rights or remedies, or any statute creating a remedy for civil relief, including any cause of action for personal injury, wrongful death, property damage, or other financial, physical, reputational, or psychological injury based in negligence, strict liability, products liability, failure to warn, an objectively offensive intrusion into the private affairs or concerns of the individual, or any other legal theory of liability under any Federal or State common law, or any State statutory law.”
- The bill likewise does not supersede any existing federal sectoral laws. However, covered entities that are already required to comply with privacy requirements of federal sectoral laws — including Gramm-Leach-Bliley, the Fair Credit Reporting Act (FCRA), the Health Information Technology for Economic and Clinical Health Act, the Family Educational Rights and Privacy Act (FERPA), or the Health Insurance Portability and Accountability Act (HIPAA) — will be “deemed to be in compliance” with the related requirements of the Act (with the exception of the data security requirements in Section 107).
6. Algorithmic Discrimination and Civil Rights
- The bill prohibits covered entities from processing or transferring covered data on the basis of an individual’s or class of individuals’ actual or perceived race, color, ethnicity, religion, national origin, sex, gender, gender identity, sexual orientation, familial status, biometric information, lawful source of income, or disability – (A) for the purpose of advertising, marketing, soliciting, offering, selling, leasing, licensing, renting, or otherwise commercially contracting for a housing, employment, credit, or education opportunity, in a manner that unlawfully discriminates against or otherwise makes the opportunity unavailable to the individual or class of individuals; or (B) in a manner that unlawfully segregates, discriminates against, or otherwise makes unavailable, to an individual or class of individuals, goods, services, facilities, privileges, advantages, or accommodations of any place of public accommodation.
- Covered entities must designate privacy and security officers (see below) who, among other things, are required to conduct annual algorithmic decision-making impact assessments. These assessments must describe and evaluate the development of the entity’s algorithmic decision-making processes (including the design and training data used to develop them), how they were tested for accuracy, fairness, bias, and discrimination, and whether the system produces discriminatory results on the basis of protected characteristics. A covered entity may use external, independent auditors or researchers to conduct such assessments.
- Three years after enactment, the bill provides that the Commission shall publish a report containing the results of a study examining the use of algorithms for the purposes of processing or transferring covered data to make or facilitate advertising for housing, education, employment, or credit opportunities, or determining access to any place of public accommodation.
- The bill prohibits covered entities from charging individuals fees to exercise their rights, and from conditioning the provision of a service or product on an individual’s waiver of privacy rights.
7. Enforcement, Accountability, and Whistleblower Protections
- Within two years, the bill calls for the Federal Trade Commission to establish a new Bureau within the Commission. The purpose of the new Bureau would be to assist the Commission in exercising its authority under the Act and other federal laws addressing privacy, data security, and related issues. Violations of the Act are treated as unfair or deceptive acts under the FTC Act, and are enforceable by the Federal Trade Commission. State Attorneys General are also able to bring actions under the Act. The bill also gives the Federal Trade Commission broad rulemaking authority to promulgate regulations under the Act.
- The individual rights granted by the bill are complemented by a private right of action, and the bill provides that any violation of the Act, or of a regulation promulgated under it, should be considered an “injury in fact.” The bill also renders pre-dispute arbitration agreements and joint action waivers invalid or unenforceable with respect to privacy or data security disputes.
- The bill also mandates that the chief executive officer, chief information security officer (CISO), and chief privacy officer of “large data holder[s]” must annually certify to the Commission that the entity maintains adequate internal controls to comply with the Act. “Large data holder” is defined as an entity that, in the last calendar year, processed or transferred the covered data of more than 5,000,000 individuals, devices, or households; or processed or transferred the sensitive data of over 100,000 individuals, devices, or households.
- The bill prohibits covered entities from discharging, demoting, suspending, threatening, harassing, or in any other manner discriminating against a covered individual of the entity for, inter alia, taking a lawful action in providing the Federal Government or State Attorney General with information relating to acts or omissions they believe to be violations of the Act or regulations promulgated under the Act. Various forms of relief are available, including reinstatement, back pay, financial damages, and attorneys’ fees.
- The bill calls for the designation of qualified employees as privacy and data security officers. Their responsibilities include facilitating the covered entity’s ongoing compliance with the Act; implementing comprehensive privacy and data security programs; conducting employee training and risk assessments; and conducting the annual algorithmic decision-making impact assessments described above.