New FPF Study: More Than 200 European Companies are Participating in Key EU-US Data Transfer Mechanism

Co-Authored by: Daniel Neally & Jeremy Greenberg

EU Companies’ Participation in Privacy Shield Grew by Nearly One-Third Over the Past Year

EU-US Privacy Shield Essential to Leading European Companies

From Major Employers such as Aldi and Dr. Oetker to Leading Technology Firms like CRISPR Therapeutics and Workwave, European Companies Depend on the EU-US Agreement

Privacy Shield Program Supports European Employment While Adding to Employee Data Protections – Nearly One-Third of Privacy Shield Companies Rely on the Framework to Transfer HR Information of European Staff

The Future of Privacy Forum conducted a study of the companies enrolled in the EU-US Privacy Shield program and determined that 202 European-headquartered companies are active Privacy Shield participants. This is a 32% increase over last year’s total of 152 EU companies in the cross-border data transfer framework. These European firms rely on the program to transfer data to their US subsidiaries or to essential vendors that support their business needs. Nearly one-third of Privacy Shield companies use the mechanism to process human resources data – information that is crucial to employing, paying, and providing benefits to workers.
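The growth figure works out as follows – a quick arithmetic check, using the participant counts quoted above:

```python
# Year-over-year growth in EU-headquartered Privacy Shield participants,
# using the counts reported in the FPF study above.
previous_total = 152
current_total = 202

growth = (current_total - previous_total) / previous_total
print(f"{growth:.1%}")  # 32.9% -- the 32% increase (nearly one-third) cited above
```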

Thousands of major companies, many of which are headquartered or have offices in Europe, rely on the protections granted under the data transfer agreement. With dozens of new companies joining each week to retain and pay their employees or create new job opportunities in Europe, the Privacy Shield has become an integral data protection mechanism for European companies and the European marketplace as a whole.

Overall, FPF found that more than 5,000 companies have signed up for Privacy Shield since the program’s inception – more than 1,300 participants joined in the last year.

Leading EU companies that rely on Privacy Shield include:

–       ALDI, German grocery market chain

–       Alter Domus S.a.r.l., Luxembourg corporate and management services company

–       CRISPR Therapeutics, Swiss gene editing technology company

–       Dr. August Oetker KG, German food, drink, household goods and industrial firm

–       EuroNext, European stock exchange

–       International Drug Development Institute SA, Belgian biostatistical and eClinical services company

–       LVMH (also known as Louis Vuitton), French luxury goods maker

–       Modern Times Group, Swedish digital media entertainment company

–       Omni Partners, British hedge fund

–       Randstad, Dutch human resources consultancy

–       Workwave, Swedish software developer

FPF research also determined that more than 1,580 companies, nearly one-third of the total number analyzed, joined Privacy Shield to transfer their human resources data.

The research identified 202 Privacy Shield companies headquartered or co-headquartered in Europe. This is a conservative estimate of companies that rely on the Privacy Shield framework – FPF staff did not include global companies that have major European offices but are headquartered elsewhere. The 202 companies include some of Europe’s largest and most innovative employers, doing business across a wide range of industries and countries. EU-headquartered firms and major EU offices of global firms depend on the Privacy Shield program so that their related US entities can effectively exchange data to conduct research, improve products, pay employees, and serve customers.

The Privacy Shield is a living and growing instrument, guaranteeing protection of the personal data of European consumers and employees as the backbone of fundamental rights protection within transatlantic commercial exchanges. Given the importance of this mechanism to supporting protection while enabling commerce on both sides of the Atlantic, we encourage careful review of the agreement this year by the European Commission and continued oversight and enforcement of the framework by authorities on both sides of the Atlantic.

The conclusions follow previous FPF studies, which highlighted similar increases in participation and reliance by EU firms on the Privacy Shield program.

Methodology:

For the full list of European companies in the Privacy Shield program, or to schedule an interview with Jeremy Greenberg, John Verdi, or Jules Polonetsky, email [email protected].

FTC Reaches Landmark Settlement Regarding Kids’ Privacy, Clarifies Platforms’ and Video Creators’ COPPA Obligations for Child-Directed Content

By Sara Collins

Last week, the Federal Trade Commission (FTC) released details of a settlement with YouTube under the Children’s Online Privacy Protection Act (COPPA). Although notable for its landmark monetary penalty, the settlement is probably more important for the other requirements that it places on YouTube and content creators; some of YouTube’s obligations under the settlement exceed COPPA’s statutory and regulatory requirements.

According to a public statement, YouTube will disable personalized advertising (i.e., behavioral advertising), commenting, and public playlist sharing on content identified as “child-directed.” While not required by the settlement, YouTube will also deploy a machine-learning algorithm to identify videos that are in fact child-directed but have not been self-designated as such by content creators.
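To make the practical effect concrete, here is a minimal, hypothetical sketch of how a platform might gate features on a child-directed flag; the `Video` type, field names, and feature list are illustrative assumptions, not YouTube’s actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    # Set by the creator's self-designation or by an automated classifier.
    child_directed: bool

def feature_flags(video: Video) -> dict:
    """Return which features are enabled for a video.

    On child-directed content, personalized (behavioral) advertising,
    commenting, and public playlist sharing are all disabled, mirroring
    the changes described in YouTube's public statement.
    """
    restricted = video.child_directed
    return {
        "personalized_ads": not restricted,  # contextual ads may still run
        "comments": not restricted,
        "public_playlists": not restricted,
    }

print(feature_flags(Video(title="Toy unboxing", child_directed=True)))
# {'personalized_ads': False, 'comments': False, 'public_playlists': False}
```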

Basis for the Settlement

The FTC concluded that YouTube had actual knowledge that many channels hosting content on its platform are “child-directed” and therefore trigger COPPA obligations for the company. Google and its subsidiary YouTube will pay a record $170 million to settle allegations by the Commission and the New York Attorney General that the YouTube video sharing service illegally collected personal information from children without their parents’ consent. The settlement includes a monetary penalty of $136 million to the FTC and $34 million to New York State. The $136 million penalty is by far the largest amount the FTC has ever obtained in a COPPA case since Congress enacted the law in 1998; the previous record was $5.7 million against Musical.ly (now TikTok).

Actual Knowledge

The settlement hinges on COPPA’s “actual knowledge” standard. COPPA imposes obligations on online services that have actual knowledge that they are collecting, using, or disclosing personal information from children under 13. Services with actual knowledge of such data collection must implement privacy safeguards, including: obtaining verifiable parental consent before collecting personal information from children; providing parents with access to children’s data retained by the company; and providing enhanced transparency regarding data practices.

The settlement characterizes YouTube as a third-party operator, rather than a platform. According to the FTC’s COPPA FAQ, third-party operators are deemed to have actual knowledge that children are using a service if (see the sketch after this list):

  1. a child-directed content provider (which is strictly liable for any collection) directly communicates the child-directed nature of its content to the third-party operator; or

  2. a representative of the third-party operator’s ad network recognizes the child-directed nature of the content.
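Logically, the FAQ’s test is a simple disjunction: either prong alone suffices. A minimal sketch, with argument names invented for illustration:

```python
def third_party_has_actual_knowledge(
    provider_communicated_child_directed: bool,
    ad_rep_recognized_child_directed: bool,
) -> bool:
    """Either prong of the FTC's COPPA FAQ test is sufficient on its own."""
    return (
        provider_communicated_child_directed
        or ad_rep_recognized_child_directed
    )

# Per the facts in the complaint, both prongs are satisfied for YouTube.
print(third_party_has_actual_knowledge(True, True))  # True
```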

YouTube meets both prongs of this test. The complaint details Google’s knowledge that the YouTube platform hosts numerous child-directed channels. YouTube marketed itself as a top destination for kids in presentations to the makers of popular children’s products and brands. For example, Google and YouTube told Mattel, maker of Barbie and Monster High toys, that “YouTube is today’s leader in reaching children age 6-11 against top TV channels” and told Hasbro, which makes My Little Pony and Play-Doh, that YouTube is the “#1 website regularly visited by kids.” Several channel owners also told YouTube and Google that their channels’ content was directed to children. In other instances, YouTube’s own content rating system identified content as directed to children. 

This set of facts, combined with those from the Musical.ly settlement, puts companies on notice that their interactions with the public, board members, potential advertising partners, and content creators are all relevant factors when the FTC determines whether platforms have actual knowledge that they collect information from children under the age of thirteen.

Content Creators = Operators

Chairman Simons’ and Commissioner Wilson’s statement notes that individual channels on a general audience platform are “website[s] or online service[s]” and, therefore, each individual creator who operates a channel is “on notice that [the Commission] consider[s] them to be standalone ‘operators’ under COPPA.” This means that YouTube creators may face strict liability for COPPA non-compliance; this is likely a shock to most creators, who do not control the information YouTube collects, do not contract with advertisers, and cannot negotiate how revenue is split. However, because creators can choose whether or not to allow targeted advertising on their channels, the Commission construes YouTube as collecting personal information from children on behalf of individual creators. This raises economic and compliance challenges for creators, because YouTube warns that opting out of targeted advertising “may significantly reduce [the] channel’s revenue.” Channel owners who publish child-directed content will likely face a choice: either eschew targeted advertising and the associated data collection, or obtain verifiable parental consent before collecting personal information from viewers under thirteen.

The statement also announces that the FTC will conduct sweeps of channels following YouTube’s implementation of the order’s provisions. Because COPPA applies to commercial entities (defined broadly to include anyone who makes money from an enterprise that is not a governmental body or non-profit), monetized channels will need to be aware of potential COPPA implications. 

YouTube also stated that the platform would voluntarily implement a machine-learning algorithm to identify child-directed content that is not self-designated by content creators. We expect that this will be similar to other algorithmic screeners, such as those used for DMCA compliance. Critics have charged that YouTube’s algorithms are biased, opaque, and inaccurate. It will be challenging for the platform to algorithmically determine what content is or is not child-directed; this analysis will necessarily be nuanced and complex. The analysis is also likely to be subjective. Some YouTube content is plainly child-directed, some channels are clearly not child-directed, and reasonable people can and will disagree about a diverse range of content in between. Any algorithm will be imperfect, creating the risk that the assessments will be both over- and under-inclusive when determining what is and is not child-directed content.
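To illustrate why any such screener will be both over- and under-inclusive, here is a minimal, hypothetical threshold classifier; the signals, weights, and threshold are invented for illustration and bear no relation to YouTube’s actual system:

```python
# Hypothetical metadata signals and weights for flagging child-directed
# content. None of this reflects YouTube's actual model; it only
# illustrates the over-/under-inclusion tradeoff discussed above.
SIGNAL_WEIGHTS = {
    "animated": 0.3,
    "toy_brand_in_title": 0.5,
    "nursery_rhyme_audio": 0.6,
    "mature_language": -0.8,
}

def child_directed_score(signals: dict) -> float:
    """Sum the weights of every signal present in the video's metadata."""
    return sum(w for name, w in SIGNAL_WEIGHTS.items() if signals.get(name))

def is_child_directed(signals: dict, threshold: float = 0.5) -> bool:
    # Lowering the threshold misses fewer truly child-directed videos
    # (fewer false negatives) but mislabels more general-audience content
    # (more false positives); raising it does the reverse.
    return child_directed_score(signals) >= threshold

# An adult-directed video about collectible toys trips two signals and is
# misclassified -- an over-inclusive (false positive) result.
adult_toy_review = {"animated": True, "toy_brand_in_title": True}
print(is_child_directed(adult_toy_review))  # True
```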

One probable issue: videos that are appropriate for children (but directed to adults or general audiences) can be inadvertently flagged as directed toward children. The Commission’s Complaint notes that YouTube videos were manually reviewed prior to being featured on the homepage of the YouTube Kids app — a separate service specifically designed for kids, and distinct from the general audience YouTube service. The Commission did not detail what the review entailed; however, one could imagine reviewers making determinations based on the content’s appropriateness for children rather than assessing the content to see if it was specifically directed toward children. For example, a David Attenborough video about sharks might be perfectly acceptable on the homepage of the YouTube Kids app, but on the general audience YouTube service, it probably would not be considered child-directed. Conflating child-appropriate content and child-directed content is common. Machine-learning algorithms may mitigate or exacerbate this problem. The implications for creators of children’s media (and child-adjacent media) are uncertain, but, at a minimum, creating content for kids will likely be less lucrative.

Targeted Advertising vs. Internal Operations

Several aspects of the settlement hinge on the Commission’s interpretation of COPPA’s “internal operations” exception – a provision that permits operators to collect persistent identifiers from children without verifiable parental consent, so long as the identifiers are used only for internal operations, e.g., personalizing content, protecting user security, or ensuring legal compliance. The settlement reinforces the FTC’s interpretation that, in the context of child-directed content, targeted advertising without verifiable parental consent is prohibited by COPPA. While this position is not necessarily controversial (the DMA, IAB, NAI, and most industry leaders agree), it is useful for the FTC to reiterate the point.
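A minimal, hypothetical sketch of how an operator might enforce this purpose limitation in code; the purpose names and the check itself are illustrative assumptions, not drawn from COPPA or the settlement:

```python
# Hypothetical purpose-limitation check for persistent identifiers
# collected from children without parental consent. The purpose names
# are illustrative; COPPA's "internal operations" exception is a legal
# category, not a fixed technical list.
INTERNAL_OPERATIONS = {
    "content_personalization",
    "security",
    "legal_compliance",
    "analytics",  # contested: see the civil penalty debate below
}

def use_permitted_without_consent(purpose: str) -> bool:
    """Persistent identifiers may be used without verifiable parental
    consent only for internal operations."""
    return purpose in INTERNAL_OPERATIONS

print(use_permitted_without_consent("targeted_advertising"))  # False
print(use_permitted_without_consent("security"))              # True
```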

The “internal operations” exception also drives the Commissioners’ civil penalty analysis. Commissioner Chopra notes that YouTube used viewing data to enhance the effectiveness of its recommendation engines, and that the value of that enhancement should have been considered when negotiating the settlement’s ultimate monetary penalty. Chairman Simons and Commissioner Wilson disagree, characterizing recommendation engine data as being used within the bounds of the “internal operations” exception to obtaining parental consent. They argue that “obtaining penalties in this matter based on the argument that enhancement of Google’s other products and services through analytics such as page views, time spent on a video, or algorithms for recommending videos is ill-gotten, is highly speculative.” The FTC has requested comment regarding the appropriate boundaries of the “internal operations” exception, and the subject is likely to be a focus of the upcoming FTC COPPA workshop.