Future of Privacy Forum Releases Analysis of Washington Privacy Act

FPF CEO: “Most comprehensive state privacy legislation proposed to date”

WASHINGTON, DC – January 13, 2020 – The Future of Privacy Forum today released an in-depth analysis of the Washington Privacy Act (Washington State Senate Bill 6281), as well as the following statement by Future of Privacy Forum CEO Jules Polonetsky about the bill:

“The Washington Privacy Act is the most comprehensive state privacy legislation proposed to date. The bill addresses concerns raised last year and proposes strong consumer protections that go beyond the California Consumer Privacy Act. It includes provisions on data minimization, purpose limitations, privacy risk assessments, anti-discrimination requirements, and limits on automated profiling that other state laws do not.”

According to the FPF analysis, the Act would be a holistic, GDPR-like comprehensive law that: (1) provides protections for residents of Washington State; (2) grants individuals core rights to access, correct, delete, and port data; (3) creates rights to opt out of sale, profiling, and targeted advertising; (4) imposes obligations to perform risk assessments; (5) requires opt-in consent for the processing of sensitive data; and (6) creates collection and use limitations. In addition, the Act contains provisions for controllers and processors utilizing facial recognition services.

READ THE FPF ANALYSIS OF THE WASHINGTON PRIVACY ACT.

It’s Raining Privacy Bills: An Overview of the Washington State Privacy Act and other Introduced Bills

By Pollyanna Sanderson (Policy Counsel), Katelyn Ringrose (Christopher Wolf Diversity Law Fellow) & Stacey Gray (Senior Policy Counsel)

 

Today, on the first day of a rapid-fire 2020 legislative session in the state of Washington, State Senator Carlyle introduced a new version of the Washington Privacy Act (WPA). Legislators revealed the Act during a live press conference on January 13, 2020 at 2:00pm PST. Meanwhile, Representatives Hudgins and Smith introduced nine other privacy-related bills in the House.

If passed, the Washington Privacy Act would enact a comprehensive data protection framework for Washington residents that includes individual rights that mirror and go beyond the rights in the California Consumer Privacy Act (CCPA), as well as a range of other obligations on businesses that do not yet exist in any U.S. privacy law.

“The Washington Privacy Act is the most comprehensive state privacy legislation proposed to date,” said Jules Polonetsky, CEO of the Future of Privacy Forum. “The bill addresses concerns raised last year and proposes strong consumer protections that go beyond the California Consumer Privacy Act. It includes provisions on data minimization, purpose limitations, privacy risk assessments, anti-discrimination requirements, and limits on automated profiling that other state laws do not.”

Earlier Senate and House versions of the Washington Privacy Act narrowly failed to pass last year in the 2019 legislative session. Read FPF’s comments on last year’s proposal. The version introduced today contains strong provisions that largely align with the EU’s General Data Protection Regulation (GDPR), and commercial facial recognition provisions that start with a legal default of affirmative consent. Nonetheless, legislators must work within a remarkably short time-frame to pass a law that can be embraced by both House and Senate within the next six weeks of Washington’s legislative session.

Below, FPF summarizes the core provisions of the bill, which if passed would go into effect on July 31, 2021. The Act would be a holistic, GDPR-like comprehensive law that: (1) provides protections for residents of Washington State; (2) grants individuals core rights to access, correct, delete, and port data; (3) creates rights to opt out of sale, profiling, and targeted advertising; (4) creates a nuanced approach to pseudonymised data; (5) imposes obligations on processors and controllers to perform risk assessments; (6) creates collection, processing, and use obligations; and (7) requires opt-in consent for the processing of sensitive data. In addition, the Act contains provisions for controllers and processors utilizing facial recognition services. 

Read the Bill Text HERE. The nine other bills introduced today are listed at the end of this blog post.

Update (1/21/20): A substitute bill was released on January 20 by Senator Carlyle and cosponsors (see PSSB 6281). At 10:00am on January 23, the Senate committee on Environment, Energy & Technology will hold a hearing on this and other bills.

1. Jurisdictional and Material Scope

The Act would provide comprehensive data protections to Washington State residents, and would apply to entities that 1) conduct business in Washington or 2) produce products or services targeted to Washington residents. Such entities must control or process data of at least 100,000 consumers; or derive 50% of gross revenue from the sale of personal data and process or control personal data of at least 25,000 consumers (with “consumers” defined as natural persons who are Washington residents, acting in an individual or household context). The Act would not apply to state and local governments or municipal corporations.

The Act would regulate companies that process "personal data," defined broadly as "any information that is linked or reasonably linkable to an identified or identifiable natural person." The definition excludes de-identified data and publicly available information, meaning "information that is lawfully made available from federal, state, or local government records," and the Act contains specific provisions for pseudonymous data (see below, Obligations for De-identified and Pseudonymous Data).

2. Individual Rights to Access, Correct, Delete, Port, and Opt-Out of Data Processing

The Act would require companies to comply with basic individual rights to request access to their data, correct or amend that data, delete their data, and access it in portable format (“portable and, to the extent technically feasible, readily usable format that allows the consumer to transmit the data… without hindrance, where the processing is carried out by automated means”). These rights would not be permitted to be waived in contracts or terms of service, and would be subject to certain limitations (for example, retaining data for anti-fraud or security purposes). 

Along with these core rights, the Act would also grant consumers the right to explicitly opt out of the processing of their personal data for the purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that produce legal, or similarly significant, effects. Such effects include the denial of financial and lending services, housing, insurance, education enrollment, employment opportunities, health care services, and more. Unlike the CCPA, the Act would not prescribe specific opt out methods (like a “Do Not Sell My Information” button on websites), but instead require that opt-out methods be “clear and conspicuous.” It would also commission a government study on the development of technology, such as a browser setting, browser extension, or global device setting, for consumers to express their intent to opt out. 

For all of these individual rights, companies would be required to take action free of charge, up to twice per year, within 45-90 days (except in cases where requests cannot be authenticated or are "manifestly unfounded or excessive"). Importantly, the law would also require that companies establish a "conspicuously available" and "easy to use" internal appeals process for refusals to take action. With the consumer's consent, the company must submit the appeal to the Washington Attorney General, along with whether any action has been taken and a written explanation of the outcome. The Attorney General must make such information publicly available on its website. When consumers make correction, deletion, or opt-out requests, the Act would oblige controllers to take "reasonable steps" to notify third parties to whom they have disclosed the personal data within the preceding year.

Finally, the Act would prohibit companies from discriminating against consumers for exercising these individual rights. Such discrimination could include the denial of goods or services, charging different prices or rates for goods or services, or providing a different level of quality of goods and services.

3. Obligations for De-identified and Pseudonymous Data

Under the Act, companies processing “pseudonymous data” would not be required to comply with the bulk of the core individual rights (access, correction, deletion, and portability) when they are “not in a position” to identify the consumer, subject to reasonable oversight. Notably, the Act defines pseudonymous data consistently with the GDPR’s definition of pseudonymization, as “personal data that cannot be attributed to a specific natural person without the use of additional information, provided that such additional information is kept separately and is subject to appropriate technical and organizational measures to [protect against identification].” This is also consistent with the Future of Privacy Forum’s Guide to Practical Data De-Identification. Pseudonymous data is often harder to authenticate or link to individuals, and can carry lessened privacy risks. For example, unique pseudonyms are frequently used in scientific research (e.g., in a HIPAA Limited Dataset, John Doe = 5L7T LX619Z). 
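For intuition, here is a minimal sketch of how a controller might generate pseudonyms of this kind, assuming a secret key stored separately from the pseudonymized records; the names and key in the example are hypothetical, not drawn from the bill.

```python
import hmac
import hashlib

# Hypothetical secret key: under the Act's (and GDPR's) definition, this
# "additional information" must be kept separately from the pseudonymized
# records and protected by technical and organizational measures.
PSEUDONYM_KEY = b"stored-in-a-separate-key-management-system"

def pseudonymize(direct_identifier: str) -> str:
    """Derive a stable pseudonym (e.g., for a research dataset) from a
    direct identifier, without storing the identifier itself."""
    digest = hmac.new(PSEUDONYM_KEY, direct_identifier.encode("utf-8"),
                      hashlib.sha256).hexdigest()
    return digest[:12].upper()

record = {"subject": pseudonymize("John Doe"), "measurement": 42}
# Without access to PSEUDONYM_KEY, the controller is "not in a position"
# to link the pseudonym back to John Doe, so access, correction, or
# deletion requests generally could not be honored for this record.
print(record)
```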

In addition, companies may refuse to comply with requests to access, correct, delete, or port data if the company: (A) is not reasonably capable of associating the request with the personal data, or it would be unreasonably burdensome to associate the request with the personal data; (B) does not use the personal data to recognize or respond to the data subject, or associate the personal data with other data about the same specific consumer; and (C) does not sell personal data to any third party or otherwise voluntarily disclose the personal data to any third party other than a processor (service provider). 

Importantly, other requirements of the overall bill, including Data Protection Assessments (below), and the right to Opt Out of data processing for targeted advertising, sale, and profiling (above) would still be operational for pseudonymous data.

Finally, the Act would not apply to de-identified data, defined as “data that cannot reasonably be used to infer information about, or otherwise be linked to, an identified or identifiable natural person, or a device linked to such person,” subject to taking reasonable measures to protect against re-identification, including contractual and public commitments. This definition aligns with the FTC’s longstanding approach to de-identification. 

Legislators revealing the Act during a live press conference on January 13, 2020 at 2:00pm PST.

4. Obligations of Processors (Service Providers)

In a structure that parallels the GDPR, the Act distinguishes between data "controllers" and data "processors," establishing different obligations for each. Almost all of the Act's obligations apply to controllers, defined as a "natural or legal person which, alone or jointly with others, determines the purposes and means of the processing of personal data."

Data processors, on the other hand – a "natural or legal person who processes personal data on behalf of a controller" – must adhere (as service providers) to controllers' instructions and help them meet their obligations. Notwithstanding controller instructions, processors must maintain security procedures that take into account the context in which personal data is processed; ensure that individuals processing the data understand their duty of confidentiality; and may engage a subcontractor only after the controller has had the chance to object. At the request of the controller, processors must delete or return personal data. Processors must also aid in the creation of data protection assessments.

5. Transparency (Privacy Policies)

The Act would require companies to provide a Privacy Policy to consumers that is “reasonably accessible, clear, and meaningful,” including making the following disclosures:

Additionally, if a controller sells personal data to third parties or processes data for certain purposes (i.e. targeted advertising), they would be required to clearly and conspicuously disclose such processing, as well as how consumers may exercise their right to opt out of such processing. 

6. Data Protection Assessments

Companies would be required under the Act to conduct confidential Data Protection Assessments for all processing activities involving personal data, and again any time there are processing changes that materially increase risks to consumers. In contrast, the GDPR requires Data Protection Impact Assessments only when profiling leads to automated decision-making having a legal or significant effect upon an individual (such as credit approval), when profiling is used for evaluation or scoring based on aspects concerning an individual’s economic situation, health, personal preferences or interests, reliability or behavior, location or movements, or when it is conducted at large-scale on datasets containing sensitive personal data.

Under the WPA, in weighing benefits against the risks, controllers must take into account factors such as reasonable consumer expectations, whether data is deidentified, the context of the processing, and the relationship between the controller and the consumer. If the potential risks of privacy harm to consumers are substantial and outweigh other interests, then the controller would only be able to engage in processing with the affirmative consent of the consumer (unless another exemption applies, such as anti-fraud measures and research). 

7. Sensitive Data 

Companies must obtain affirmative, opt-in consent to process any “sensitive” personal data, defined as personal data revealing:

Although the Act requires consent to process data from a “known child,” an undefined term, it notably also exempts data covered by the Family Educational Rights and Privacy Act (FERPA) and entities that are compliant with the Children’s Online Privacy Protection Act (COPPA). The Act defines a child as a natural person under age thirteen, meaning it does not follow the approach of CCPA and other bills around the country that extend child privacy protections to teenagers. 

8. Collection, Processing, and Use Limitations 

In addition to consumer controls and individual rights, the Act would create additional obligations on companies that align with the GDPR:

The obligations imposed by the Act would not restrict processing personal data for a number of specified purposes. Those exemptions include cooperating with law enforcement agencies, performing contracts, providing requested products or services to consumers, processing personal data for research, consumer protection purposes, and more. If processing falls within an enumerated exception, that processing must be “necessary, reasonable, and proportionate” in relation to a specified purpose. Controllers and processors are also not restricted from collecting, using, or retaining data for specific purposes such as conducting internal product research, improving product and service functionality, or performing internal operations reasonably aligned with consumer expectations. 

9. Enforcement

The Act would not grant consumers a private right of action. Instead, it would give the Attorney General exclusive authority to enforce the Act. The Act would cap civil penalties for controllers and processors in violation of the Act at $7,500 per violation. A “Consumer Privacy Account,” in the state treasury, would contain funds received from the imposition of civil penalties. Those funds would be used for the sole purpose of the office of privacy and data protection. The Attorney General would also be tasked with compiling a report evaluating the effectiveness of enforcement actions, and any recommendations for changes. 

10. Commercial Facial Recognition 

In addition to its baseline requirements, the Act contains provisions specifically regulating commercial uses of facial recognition. The Act would require affirmative, opt-in consent as a default, and would place heightened obligations on both controllers and processors of commercial facial recognition services, particularly with respect to accuracy and auditing, with a focus on preventing unfair performance impacts. A limited exception is provided for uses such as tracking the unique number of users in a space, provided the data is not maintained for more than 48 hours and users are not explicitly identified.

Definitions

The Act provides a number of core definitions that are relevant only to the facial recognition provisions (Section 18, the final section of the bill). Given the standalone nature of this section of the overall bill, the definitions can be very impactful. The term “facial recognition service” is defined as technology that analyzes facial features and is used for identification, verification, or persistent tracking of consumers in still or video images. 

Additional definitions are as follows: 

Additional Duties on “Processors” and “Controllers” of Facial Recognition Services

The Act would place affirmative duties on processors, or service providers (see above for definitions of controller and processor under the Act), when they provide facial recognition services. Those duties include enforcing current provisions against illegal discrimination, as well as providing an API or other means for controllers and third parties to conduct fairness and accuracy tests. If such tests reveal unfair performance differences (e.g. bias based on a protected characteristic), the processor must develop and implement a plan to address those differences.

Controllers must also take affirmative steps to post notice in public spaces where facial recognition services are deployed; obtain consent from consumers prior to enrollment in a service operating in physical premises open to the public; ensure meaningful review for potentially harmful uses of the service; test the service and take reasonable steps to ensure quality standards; and engage in staff training. Conspicuous public notice includes, at a minimum, the purpose for which the technology is deployed and information about where consumers can obtain additional information (e.g. a link for consumers to exercise their rights). 

Consent would not be required for enrolling images for security or safety purposes, but the consumer must have engaged in or be suspected of engaging in criminal activity (e.g. shoplifting); the controller must review the safety/security database no less than biannually and remove templates from individuals no longer under suspicion or who have been in the database for more than three years; and, finally, the controller must have an internal process whereby a consumer may correct or challenge enrollment. Furthermore, controllers must ensure that decisions which could pose legal or significant harms (e.g. the loss of employment opportunities, housing, etc.) are subject to meaningful human review. 

Finally, the Act would prohibit controllers from disclosing personal data obtained from a facial recognition service to law enforcement, unless: (1) the disclosure is required by law in response to a warrant, subpoena, or other legal order; (2) the controller has a good-faith belief that the disclosure is necessary to prevent or respond to an emergency involving danger of death or serious physical injury to any person; or (3) the disclosure is made to the national center for missing and exploited children. In addition to these duties, controllers must also comply with consumer requests outlined elsewhere in the Act.

Insight: Senator Nguyen (jointly with Senator Carlyle and others) has introduced a separate bill regulating state and local government agency uses of facial recognition technologies. In a recent news article, he stated that he did so in order to avoid getting "caught up in any potential political fight."

OTHER WASHINGTON STATE HOUSE BILLS INTRODUCED TODAY

Washington legislators have been busy drafting a number of other consumer privacy bills. The following nine House Bills, filed by Representatives Smith (D) and Hudgins (D), were also introduced on January 13, 2020 and are intended to accompany the WPA. These bills would:

Did we miss anything? Let us know at [email protected] as we continue tracking developments in Washington State.

Statement by Future of Privacy Forum CEO Jules Polonetsky on the Washington Privacy Act

WASHINGTON, DC – January 13, 2020 – Statement by Future of Privacy Forum CEO Jules Polonetsky regarding the introduction of the Washington Privacy Act (Washington State Senate Bill 6281):

“The Washington Privacy Act is the most comprehensive state privacy legislation proposed to date. The bill addresses concerns raised last year and proposes strong consumer protections that go beyond the California Consumer Privacy Act. It includes provisions on data minimization, purpose limitations, privacy risk assessments, anti-discrimination requirements, and limits on automated profiling that other state laws do not.”

READ THE FPF ANALYSIS OF THE WASHINGTON PRIVACY ACT.

# # #

ICYMI: National PTA, Future of Privacy Forum Host Student Privacy Briefing for Parents

On December 12th, the Future of Privacy Forum (FPF) and National PTA co-hosted a webinar for parents to learn more about the critical importance of safeguarding their child's data privacy at school. FPF Director of Youth & Education Privacy Amelia Vance led the discussion about key student privacy laws and trends.

As school districts across the country cope with a wave of new privacy requirements – 130+ new laws specific to student privacy have passed in 41 states since 2013 – Vance highlighted the ongoing but critical challenge for states to balance access to student data with privacy and security, noting several instances where new laws have inadvertently created limitations on students' educational opportunities.

“Oftentimes legislatures have to go back and fix the laws… because they didn’t check in with teachers, they didn’t check in with administrators, they didn’t talk to parents about what are the most important things to them,” Vance said. “[Not only about] what are the privacy protections that are important, but also what are the services being provided to your kids that are most important?”

Specifically, Vance cited examples in New Hampshire and Louisiana where strict new privacy laws raised questions about whether schools were permitted to conduct routine activities such as hanging student artwork in hallways, sharing classroom recordings with special education or homebound students, or even producing a yearbook.

Parents play a critical role in bringing these types of issues to light, and the webinar included a review of key questions for parents to ask schools to better understand their privacy policies. Additionally, parents received tips for navigating practical scenarios such as keeping photos of a child off of social media. The webinar highlighted FPF and National PTA’s Parent’s Guide to Student Data Privacy with additional tools and resources for parents.

Vance observed that policymakers’ recent focus on new restrictions for districts, schools, and edtech companies is shifting. “We now see fewer bills imposing more privacy protections, and more bills adding data sharing or ways of surveilling students that could potentially cause harm down the road,” Vance noted.

During the presentation, Vance pointed to a particularly concerning example of overly broad data sharing: Florida’s new “School Safety Portal.” In an effort to prevent school violence, the controversial portal allows school threat assessment teams to access students’ personal information, including whether a child is in foster care or has been bullied due to their disability or sexual orientation.

However, Vance cautioned: “We know that data is not relevant to whether or not a student is a threat, and so it has the potential to cause additional bias and harm students, instead of pointing out actual threats.” FPF and 32 other disability, privacy, education, and civil rights groups first sounded the alarm about this database in a July 2019 letter to Florida Governor Ron DeSantis.

Click here to watch the full webinar, and access additional student privacy resources for parents here.

To learn more about the Future of Privacy Forum, visit www.fpf.org and subscribe to FPF’s student privacy newsletter.

CONTACT

[email protected]

Award-Winning Paper: "Privacy's Constitutional Moment and the Limits of Data Protection"

For the tenth year, FPF’s annual Privacy Papers for Policymakers program is presenting to lawmakers and regulators award-winning research representing a diversity of perspectives. Among the papers to be honored at an event at the Hart Senate Office Building on February 6, 2020 is Privacy’s Constitutional Moment and the Limits of Data Protection by Woodrow Hartzog of Northeastern University School of Law and Neil Richards of the Washington University School of Law. Whatever your perspective on potential federal privacy legislation, you’ll find this paper to be thought-provoking.


The authors present a case for national privacy legislation that looks beyond data protection and fair information practice (FIP) principles – the central elements of the EU's General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA).

They argue that privacy faces a “constitutional moment” that presents an opportunity to define the structure of our budding digital society. Their position is that while data-protection-focused legislation represents an important step, data protection alone is insufficient. Instead, they suggest an approach to privacy that addresses corporate matters, trustworthy relationships, data collection and processing, and personal data’s externalities.

Corporate Privacy Matters

Hartzog and Richards argue that a framework for privacy legislation should address market power and corporate structures. For example, they write: “Privacy law should be concerned with a number of corporate matters, including limiting how the corporate form is used to shield bad actors from personal liability.” By empowering privacy officers within the corporate structure with more decision-making abilities and protection from executive pushback, policymakers could promote improved corporate responsibility. Additionally, the paper proposes increased antitrust enforcement to broadly limit the power of corporations.

Relational Privacy

The authors recommend passing legislation designed to protect the trust that people place in companies when they share personal information. Legislation could foster discretion, honesty, and loyalty to create trusting relationships between data collectors and individuals. Hartzog and Richards explain that “If companies are to keep the trust they have been given, it is not enough to be merely passively ‘open’ or ‘transparent.’ Trust requires an affirmative obligation of honesty to correct misinterpretations and to actively dispel notions of mistaken trust.”

Information Collection and Processing

Though the authors argue that the GDPR alone would be inadequate in the United States, they recognize the importance of data protection. They recommend stricter limits on data collection, rigid mandatory deletion requirements, and the prioritization of obscurity to improve upon the data protection foundations present in the GDPR.

The External Impacts of Data Collection

The authors claim that the effects of data collection go far beyond the individual, impacting the environment, mental health, civil rights, and democratic values – concerns that could be addressed in privacy legislation.

If you’re interested in “outside-the-box” thinking about the foundations of potential privacy legislation, you’ll want to read the full paper.


The Privacy Papers for Policymakers project’s goal is to put diverse academic perspectives in front of policymakers to inform the development of privacy legislation. You can view all of this year’s award-winning papers on the FPF website. For more information or to RSVP, please visit this page.

Examining Industry Approaches to CCPA “Do Not Sell” Compliance

By Christy Harris and Charlotte Kress

Over the past year, the online advertising (“ad tech”) industry has grappled with the practical challenges of complying with the new California Consumer Privacy Act (CCPA). Once the new law — the first of its kind in the United States — goes into effect on January 1, 2020, businesses operating in California will be required by law to provide California residents (“consumers”) with “explicit notice” and the opportunity to opt-out of the sale of their personal information, thus establishing powerful individual rights that represent a major step forward in US privacy legislation.

Practically speaking, however, the law's notice and "Do Not Sell" obligations present unique structural challenges for ad tech companies, many of which operate as intermediaries, lack a direct relationship with users, and may or may not have formal contractual relationships with data supply chain partners, including the publisher properties where personal information and insights about user activity are used to power data-driven advertising. In light of the imminent effective date of the CCPA, and with an aim to address these challenges, several key ad tech players have developed approaches to complying with specific CCPA requirements that reflect a variety of perspectives on viable compliance solutions.

In late November, the Digital Advertising Alliance (DAA) announced the release of new guidelines and a "Do Not Sell Tool" for publishers and third parties, including a new icon that, when clicked, provides users with access to a page where they can opt out of the "sale" of their personal information by participating companies, and access more information about how consumers can exercise their other CCPA rights with those companies. The IAB also introduced its CCPA Compliance Framework for Publishers and Technology Companies, which includes a master Limited Service Provider Agreement and technical specifications for passing CCPA-related signals from publishers to supply chain partners. Google recently unveiled its CCPA "Restricted Data Processing" mechanism, which, when enabled by a business, restricts Google's data processing to activities permitted for service providers under the CCPA.

In addition to these solutions, the Network Advertising Initiative (NAI) published an analysis to aid companies in determining whether a business activity may, or may not, constitute a “sale” under the CCPA.

In this blog post, FPF summarizes and compares these industry tools and approaches to advertising within the CCPA’s requirements.

Digital Advertising Alliance (DAA):

The DAA announced new voluntary guidelines and technological specifications for implementing a new icon, giving participating businesses a mechanism to provide notice to users, as well as certain options, for CCPA purposes. The guidance and tools are separate and distinct from the DAA's existing AdChoices program – an established initiative that relies on blue icons and links to inform users about third parties' use of data for advertising, and allows users to opt out of receiving targeted ads. The new icon, which is green but otherwise resembles the DAA's existing AdChoices and political ad icons, can be implemented by businesses to take users to a page where they can access a new DAA CCPA Opt Out tool, which they can use to opt out of "sales" of personal information for CCPA purposes. Businesses using the icon should use accompanying language such as "CA Do Not Sell My Info" (recommended by the DAA), or other language that would comply with the CCPA.

According to the DAA, the new tools “will allow users to opt out of the sale of their personal information by any or all of the participating companies…including third parties collecting and selling personal information through the publisher’s site or app.” The DAA’s guidance and opt-out tools differ from other CCPA compliance approaches in several notable respects. First, the DAA’s CCPA opt-out will apply to participating third party companies’ activity across all publishers on which those companies operate, as opposed to strictly limiting the “sale” of consumers’ personal information at the individual publisher or business level. Further, participating third party companies would not be permitted to serve targeted ads to an opted-out user, even using data the company obtained prior to a CCPA opt out request. 

In addition, the DAA’s existing Self-Regulatory Principles and Guidelines require companies to commit to not processing “sensitive data” without consent, including health and financial data, regardless of users’ opt-out status, a restriction which does not yet exist in the current California law (although a feature of a proposed 2020 ballot initiative may change that).

Interactive Advertising Bureau (IAB):

In early December, the IAB unveiled the first official version of its CCPA Compliance Framework for Publishers & Technology Companies, which aims to help digital publishers and their downstream ad tech partners in the programmatic advertising environment address the challenges of the CCPA’s “Do Not Sell” obligation. The framework has an accompanying master contract, called the Limited Service Provider Agreement (LSPA), that binds supply chain partners to specific behaviors to meet the law’s provisions, and a set of corresponding technical specifications that guide companies on how to technologically implement the contract in their operations.

The LSPA requires publishers to include a "Do Not Sell My Personal Information" link or icon on their digital property (e.g., webpage) and defines what IAB framework participants (including websites, apps, and advertising partners) must do when a consumer clicks a "Do Not Sell My Personal Information" link ("DNSMPI link"). The agreement creates "service provider" relationships among publishers and third parties when California consumers opt out, thereby restricting data use to only those specific and limited business purposes that are permitted under the CCPA. Publishers' service providers are prohibited from augmenting an existing consumer profile or creating a new profile where one did not previously exist for those consumers who have opted out. In addition, service providers are prohibited from making ad-buying decisions based on the personal information of a consumer who has clicked a DNSMPI link. Importantly, however, this prohibition does not extend to personal information that (1) was available about the consumer before that consumer clicked the link (i.e., available before the 90-day look-back period mentioned in the draft Regulations), (2) is or was sold to the service provider from another property where the consumer has not opted out, or (3) constitutes "Aggregate Consumer Information" or "Deidentified" information, as defined in the CCPA. Put another way, consumers who opt out via publishers using the IAB Framework may continue to receive targeted ads from companies participating in the Framework in some circumstances. The Framework also requires publishers that "sell" personal information through programmatic ad delivery to provide consumers with "explicit" notice of their rights under the CCPA, to explain in clear terms what will happen to their data, and to communicate to downstream participants in an auditable manner that such disclosures were given.

The technical specifications detail a series of signals to be sent from publishers to downstream recipients that will indicate whether: (1) “explicit notice” and the opportunity to opt out has been provided (i.e., CCPA applies and proper notice was given), (2) the user has opted-out of the sale of their personal information, and (3) the publisher is a signatory to the IAB Limited Service Provider Agreement (LSPA) and the publisher declares that the transaction is covered as a “Covered Opt Out Transaction” or a “Non Opt Out Transaction” as defined in the agreement. Together, these specifications provide a common baseline for those who choose to participate in the IAB Framework for communicating between consumers, publishers, advertisers, and other ad tech companies.
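As a rough illustration of how a downstream recipient might consume such signals, here is a minimal sketch that parses a compact publisher string carrying the three indicators described above. The four-character field layout (version, explicit notice, opt-out of sale, LSPA coverage) is an assumption for illustration, not a restatement of the IAB's authoritative specification.

```python
from dataclasses import dataclass

@dataclass
class CcpaSignal:
    version: str
    explicit_notice: str     # 'Y', 'N', or '-' (publisher determined CCPA does not apply)
    opted_out_of_sale: str   # 'Y', 'N', or '-'
    lspa_covered: str        # publisher is an LSPA signatory / covered transaction

def parse_signal(raw: str) -> CcpaSignal:
    """Parse a compact four-character publisher signal, e.g. '1YNY'.
    The field layout here is an assumption for illustration."""
    if len(raw) != 4:
        raise ValueError(f"unexpected signal length: {raw!r}")
    return CcpaSignal(raw[0], raw[1], raw[2], raw[3])

def may_sell_downstream(raw: str) -> bool:
    """A downstream recipient would treat data as saleable only when
    explicit notice was given and the user has not opted out."""
    sig = parse_signal(raw)
    if "-" in (sig.explicit_notice, sig.opted_out_of_sale):
        return True   # publisher has signaled that CCPA does not apply
    return sig.explicit_notice == "Y" and sig.opted_out_of_sale == "N"

print(may_sell_downstream("1YNY"))  # True: notice given, no opt-out
print(may_sell_downstream("1YYY"))  # False: the user opted out of "sale"
```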

In contrast to the DAA's CCPA Opt Out Tool, which will allow consumers to opt out of a participating third party's "sale" of their personal information across all properties where the third party operates, the opt-out limitations of the IAB's LSPA are instead applicable at the publisher level on a site-specific basis. Under the IAB Framework, a third party may use the personal information of a user provided by a publisher when the user has not opted out. In instances where a user has opted out, the third party becomes a "service provider" to that publisher and is permitted to use personal information subject to certain limitations: specifically, personal information that (a) was made available by the publisher more than 90 days prior to the user's opt-out, or (b) is received from other publishers (or digital properties) where the user has not opted out.

Google:

Google recently expanded its “restricted data processing” setting to enable websites and apps using its advertising services to comply with the CCPA. At a publisher’s discretion, restricted data processing may be implemented to apply to all users in California or on a per-user basis when a user clicks a “Do Not Sell My Information” link. When enabled, Google will act as the publisher’s “service provider,” meaning that Google will restrict its use of the personal information it receives to only certain permissible business purposes as enumerated in the CCPA. As a result, certain features of Google Ads, including ad retargeting (or “remarketing”) and adding users to audience seed lists, will not be available to advertisers when the data comes from publishers that have enabled restricted data processing. Per an addendum to its Data Processing Terms, Google Analytics will also act as a service provider for affected businesses when they have disabled sharing with Google products and services.
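As a sketch of the publisher-side decision this paragraph describes (apply restricted data processing to all California users, or only per user after a "Do Not Sell" click), consider the following illustration. The function and parameter names are hypothetical and are not Google's API; a real integration would set the corresponding restricted-data-processing flag on the ad tag or SDK call when the helper returns True.

```python
def restrict_data_processing(user_region: str,
                             clicked_do_not_sell: bool,
                             apply_to_all_california_users: bool) -> bool:
    """Return True when the publisher should flag this user's ad requests
    for restricted data processing (service-provider mode).

    Hypothetical helper for illustration only; it is not Google's API.
    """
    if user_region != "US-CA":
        return False
    if apply_to_all_california_users:
        return True              # blanket approach: every California user
    return clicked_do_not_sell   # per-user approach: only after an opt-out

# Example: per-user configuration, California user who clicked the link.
print(restrict_data_processing("US-CA", clicked_do_not_sell=True,
                               apply_to_all_california_users=False))  # True
```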

Similar to the DAA and IAB solutions, the Google solution will not impact certain post-opt-out data uses that are permitted under the CCPA, including ad delivery, reporting, measurement, security and fraud detection, debugging, and product improvement.

Google's solution also will not apply to "the sending or disclosure of data to third parties" that advertisers, publishers, or partners may have enabled in Google's products and services. This means that other third-party ad tracking or serving (such as data sharing or other uses integrated with, but not provided by, Google) will not be affected when restricted data processing is enabled, unless the publisher disables those integrations separately. Instead, ads will continue to be served on the Google Display Network and other networks. Businesses will need to independently review these practices to ensure compliance with CCPA obligations. Google will not respond to bid requests for cross-exchange display retargeting (remarketing) ads when a publisher sends an opt-out signal.

Google will also integrate certain technical components of the IAB’s CCPA Compliance Framework. Specifically, restricted data processing will be applied in response to the IAB CCPA Framework opt-out signals in certain Google advertising services. When an IAB signal indicates an opt out in AdSense, AdMob and Ad Manager, Google will not pass the bid request on to any third parties via real time bidding. When Google’s DV360 receives an IAB Framework opt-out signal as part of a bid request from third party exchanges, Google will not place a bid.

Network Advertising Initiative (NAI):

While the organizations described previously have announced various frameworks, guidance, and specific tools that companies can use to comply with CCPA requirements, the NAI has developed a high-level analysis. This resource aims to assist ad tech companies in determining whether or not a business activity may be classified as a “sale” under the CCPA, prior to determining if, or which, mechanisms (including any of the available frameworks and tools) should be employed. 

The CCPA defines a “sale” as “selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information by the business to another business or a third party for monetary or other valuable consideration.” The NAI’s analysis explains that the definition of “sale” may be broken down into three main elements, which, when satisfied, the NAI states, would make an ad tech use case a “sale.” 

  1. The use case must involve “personal information.” 
  2. The use case must involve the movement of personal information from one business to another business or third party. 
  3. The use case must involve the exchange of monetary or other valuable consideration for the personal information.

To satisfy the third element identified by the NAI, the requisite monetary or other valuable consideration “must be provided [by the recipient of the personal information] specifically for the purpose of obtaining” personal information, as opposed to having received the personal information incidental to another purpose. For that reason, the NAI points out, it would be difficult to broadly categorize all business activities involving the transfer of personal information as “sales,” because the purpose of a transaction is determined by the intent of the parties to the transaction, and not entirely by any data flows. However, the NAI cautions that whenever a company receives personal information and uses it for purposes that could be monetized independently, such as for profiling or segmenting, that would likely be seen as evidence that the purpose of the transaction was at least partly for the personal information, which could render the exchange a “sale.”
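To make the NAI's three-element test concrete, the sketch below encodes it as a simple screening helper. It is an illustrative simplification of a legal analysis, not a compliance determination, and all field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class UseCase:
    involves_personal_information: bool        # element 1
    data_moves_to_another_business: bool       # element 2
    consideration_exchanged_for_the_data: bool # element 3: value provided
                                               # specifically for the personal
                                               # information, not incidental to it

def likely_a_sale(uc: UseCase) -> bool:
    """Rough screening of the NAI's three elements: per the analysis
    summarized above, a 'sale' under the CCPA requires all three."""
    return (uc.involves_personal_information
            and uc.data_moves_to_another_business
            and uc.consideration_exchanged_for_the_data)

# Example: a measurement vendor receives personal information only to count
# impressions, and no payment is made specifically for the data itself.
print(likely_a_sale(UseCase(True, True, False)))  # False under this screening
```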

Comparison

The table in the original post compares the approaches above along three dimensions: the impact of a CCPA opt-out on retargeting, the impact of a CCPA opt-out on analytics, and the scope of each "Do Not Sell My Personal Information" option.

New White Paper Provides Guidance on Embedding Data Protection Principles in Machine Learning

Immuta and the Future of Privacy Forum (FPF) today released a working white paper, Data Protection by Process: How to Operationalise Data Protection by Design for Machine Learning, that provides guidance on embedding data protection principles within the life cycle of a machine learning model. 

Data Protection by Design (DPbD) is a core data protection requirement introduced in Article 25 of The General Data Protection Regulation (GDPR). In the machine learning context, this obligation requires engineers to integrate data protection and privacy measures from the very beginning of a new ML model’s life cycle and then take them into account at every stage throughout the process. The requirement has frequently been criticized for being vague and difficult to implement in practice. 
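As a flavor of what taking data protection into account from the start of an ML life cycle can look like in code, here is a minimal, assumption-laden sketch of a pre-training step that applies minimization and pseudonymization before any model sees the data. It is not the paper's framework, only an illustration of the kind of controls it discusses; the column names and salt are hypothetical.

```python
import hashlib
import pandas as pd

FEATURES = ["age_band", "region", "purchase_count"]  # only what the model needs

def prepare_training_data(raw: pd.DataFrame) -> pd.DataFrame:
    """Illustrative pre-training stage: pseudonymize the identifier, drop
    direct identifiers, and keep only the minimized feature set."""
    df = raw.copy()
    # Pseudonymize the user identifier (in practice, the salt or key would be
    # stored separately under appropriate safeguards).
    df["user_pseudonym"] = df["user_id"].apply(
        lambda uid: hashlib.sha256(f"separate-salt:{uid}".encode()).hexdigest()[:12])
    # Data minimization: retain only the features needed for the stated purpose;
    # direct identifiers such as the email column are dropped here.
    return df[["user_pseudonym"] + FEATURES]

raw = pd.DataFrame([{"user_id": "u-001", "email": "jane@example.com",
                     "age_band": "25-34", "region": "WA", "purchase_count": 3}])
print(prepare_training_data(raw))
```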

The paper, co-authored by Sophie Stalla-Bourdillon of Immuta, Alfred Rossi of Immuta, and Gabriela Zanfir-Fortuna of FPF, provides clear instructions on how to fulfill the DPbD obligation and how to build a DPbD strategy in line with data protection principles.

“The GDPR has been criticised by many for being too high-level or outdated and therefore impossible to implement in practice,” said Stalla-Bourdillon, senior privacy counsel and legal engineer at Immuta. “Our work aims to bridge the gap between theory and practice, to make it possible for data scientists to seriously take into account data protection and privacy requirements as early as possible. Working closely with engineers, we have built a framework to operationalise data protection by design, which should be seen by all as the backbone of the GDPR.” 

The authors have released this as a working paper and welcome your comments. Please share your comments by sending an email to [email protected]. The ultimate goal of the paper is to begin shaping a framework for leveraging DPbD when developing and deploying ML models. 

READ THE FULL REPORT

Comparing Privacy Laws: GDPR v. CCPA

In November 2018, OneTrust DataGuidance and FPF partnered to publish a guide to the key differences between the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act of 2018 (CCPA). 

Since then, a series of bills, signed by the California Governor on 11 October 2019, amended the CCPA to exempt from its application certain categories of data and to provide different requirements for submission of consumer requests, among other things. The Guide has been updated to take into account these amendments.

Finally, the California Attorney General issued, on 10 October 2019, Proposed Regulations under the CCPA which are intended to provide practical guidance to consumers and businesses. The Proposed Regulations were open for public consultation until 6 December 2019 and, when finalised, will provide an additional layer of regulatory requirements that companies will have to comply with.

This updated Guide aims to assist organisations in understanding and comparing the relevant provisions of the GDPR and the CCPA, to ensure compliance with both laws.

READ THE GUIDE

State Legislators Prioritizing Privacy

Last week, the Council of State Governments (CSG) held its annual conference in Puerto Rico, bringing together bipartisan state lawmakers from across the country to engage in thoughtful discourse and learn about issues impacting their constituents. Policy Fellow Jeremy Greenberg shares key privacy themes that resonated throughout the event:

Greenberg also shared takeaways from the three panels he spoke on: 

To help policy experts be better informed on data privacy legislation in 2020, FPF recently launched its Privacy Legislation Series. The series explores specific legislative aspects of comprehensive privacy laws, giving U.S. lawmakers and other policy experts the resources and tools they need to evaluate the range of federal and state efforts in 2020.

Future of Privacy Forum Submits Comments to FTC on the Children’s Online Privacy Protection Act Rule

WASHINGTON, D.C. – Yesterday, the Future of Privacy Forum (FPF), one of the nation’s leading nonprofit organizations focused on privacy leadership and scholarship, submitted comments to the Federal Trade Commission (FTC) regarding the Children’s Online Privacy Protection Act (COPPA) in response to the agency’s ongoing review of the federal statute.

“As COPPA enters its third decade, we believe it is important for the FTC to be conducting this rule review. As more technology is adopted in both the classroom and the home, the FTC has a responsibility to ensure that COPPA is keeping pace,” says Amelia Vance, Director of FPF’s Youth and Education Privacy Project.

In the letter, FPF urges the FTC to modernize COPPA in three key areas to ensure the law can continue to adapt to the rapidly-changing technology landscape and respond to the evolving ways children use the Internet:

  1. Clarify Policies Related to Voice-Enabled Technologies. The prevalence and accuracy of voice-enabled technologies have increased rapidly as a result of powerful machine learning on large datasets. FPF recommends additional guidance regarding how COPPA's existing privacy protections apply to voice-enabled technologies, recognizing important distinctions concerning the use of voice data. FPF also recommends codifying the existing nonenforcement policy for operators that do not obtain verifiable parental consent before collecting an audio file of a child's voice, provided the file is collected solely to perform a verbal instruction or request and is deleted immediately after that purpose is fulfilled.
  2. Develop Guidance on COPPA's "Actual Knowledge" Definition. Creating an internet that is safe and welcoming for children can conflict with preserving an internet that is useful and responsive for adults. The "actual knowledge" standard can be a pragmatic way to balance those interests. However, an influx of technology that analyzes large data sets from general audience websites and services raises questions about the meaning of "actual knowledge." FPF recommends the FTC develop guidance regarding COPPA's definition of "actual knowledge" that provides greater clarity for businesses and parents in line with reasonable public policy goals.
  3. Encourage Greater FERPA Alignment. While the primary federal student privacy law, FERPA, has fairly clear requirements for school relationships with education technology (edtech) providers, the requirements of COPPA when edtech providers collect students' personal information from schools are far less clear. Schools need to know when they may exclusively exercise COPPA's rights regarding the access and deletion of children's data. FPF recommends that the FTC promptly clarify the circumstances in which schools may exclusively exercise COPPA rights regarding student data.

To read the FPF’s full comments to the FTC, click here. For a new “Myth Busters” blog from FPF that explains common misunderstandings about COPPA, click here.

For more information and resources from the Future of Privacy Forum, visit www.fpf.org.