The Right to be Let a Lone Star State: Texas Passes Comprehensive Privacy Bill

Over Memorial Day weekend, Texas lawmakers passed the Texas Data Privacy and Security Act (TDPSA) with unanimous votes in both the State House and Senate. If signed by Governor Abbott, Texas will become the tenth U.S. state (and the fifth in 2023) to enact broad-based data privacy legislation governing the collection, use, and transfer of consumer data. TDPSA contains several drafting innovations that led backers of the bill to call it the “strongest data privacy law in the country.” While that claim is likely to be controversial (especially to regulators in states such as California, Colorado, and Connecticut), TDPSA’s novel provisions deserve close attention from stakeholders:

  1. Coverage thresholds are tied to the U.S. Small Business Administration’s standards;
  2. Small businesses must obtain consent to sell sensitive personal data;
  3. ‘Opt-Out Preference Signals’ are included (with caveats);
  4. Standalone disclosures are required for certain data sales;
  5. Pseudonymous data is explicitly treated as personal data under certain circumstances.

Despite these unique attributes, TDPSA shares a common underlying framework with every non-California state to enact comprehensive privacy legislation and contains many rights, obligations, and exceptions that will be familiar to stakeholders. For example, the bill includes consumer rights to access, correct, and delete personal information, opt-in requirements for the processing of sensitive data, and opt-out requirements for data sales (defined broadly), targeted advertising, and significant profiling decisions. TDPSA also contains routine business obligations including transparency, security, data protection assessments, non-retaliation, and contractual terms for service providers. Finally, TDPSA would be exclusively enforced by the Attorney General with a right-to-cure that does not expire.   

Below we examine the unique provisions of TDPSA in greater depth.

1. Coverage thresholds are tied to the U.S. Small Business Administration’s standards

To date, both state and federal data privacy framework laws and pending proposals have carved out certain small businesses from coverage. Typically, a business falls within the scope of a state privacy law based upon the number of in-state residents about whom it processes data, with thresholds ranging from 50,000 individuals (Montana) to 175,000 (Tennessee). TDPSA breaks from this trend by exempting companies that meet the United States Small Business Administration’s (SBA) definition of “small business.”

The SBA designates organizations as “small businesses” using an industry-specific model that incorporates both revenue and employee thresholds. While the typical small-business carve-outs in state privacy laws have long been recognized as inherently arbitrary and inconsistent between states of different populations, it is not clear whether TDPSA’s approach is inherently superior. For example, a company in a non-data-intensive line of business may have a large number of employees, whereas a startup or other organization with very few employees could still process massive amounts of sensitive consumer data for privacy-diminishing purposes.

According to the SBA, just over 20,000 U.S. firms do not meet its definition of “small business.” Given that the large population of Texas (nearly 30 million in 2021) would make it relatively easy for organizations to meet the typical consumer-data threshold if TDPSA used standard coverage provisions, the incorporation of the SBA definition means that TDPSA will likely apply to a far narrower range of organizations than it otherwise would.

2. Small businesses must obtain consent to sell sensitive personal data

Perhaps the most significant impact of TDPSA (which has already influenced Florida’s Digital Bill of Rights, passed on May 4) is a novel requirement that small businesses operating within the state obtain the prior consent of an individual before selling that individual’s sensitive personal data. TDPSA defines “sale of personal data” broadly to include transfers by a controller to a third party for monetary “or other valuable consideration,” likely implicating data transfers within the online advertising ecosystem. For ‘large businesses’ in scope of the full bill, TDPSA broadly requires consent for ‘processing’ sensitive personal data, consistent with several other state privacy laws.

3. ‘Opt-Out Preference Signals’ are included (with caveats)

TDPSA will be the fifth U.S. state privacy law to explicitly permit individuals to exercise certain rights on a default basis through technological signals, such as those sent by an Internet browser setting or extension. While these provisions should significantly ease the burdens of privacy self-management for individuals, TDPSA will also give businesses greater leeway than other states to disregard these signals if certain conditions are met. For example, TDPSA will not require a covered entity to respond to an otherwise valid signal if it “does not possess the ability to process the request” or “does not process similar or identical requests” for the purpose of complying with other state privacy laws.
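For readers curious about the mechanics behind such signals, the most widely deployed example today is the Global Privacy Control (GPC), which participating browsers and extensions transmit as a Sec-GPC: 1 HTTP request header (and expose to page scripts as navigator.globalPrivacyControl). The sketch below is a minimal, hypothetical illustration in Python of how a business might detect that header and record an opt-out of data sales and targeted advertising; the function names and storage layer are invented for illustration, and nothing here should be read as what TDPSA itself requires.

```python
# Minimal sketch: detecting a Global Privacy Control (GPC) signal from HTTP
# request headers and recording an opt-out. The handler and in-memory "store"
# are hypothetical; real obligations and exceptions depend on the statute.

from typing import Mapping


def carries_gpc_signal(headers: Mapping[str, str]) -> bool:
    """Return True if the request carries a GPC opt-out signal (Sec-GPC: 1)."""
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"


def apply_opt_out(user_id: str, headers: Mapping[str, str], store: dict) -> None:
    """Record an opt-out of data sales and targeted advertising for this user."""
    if carries_gpc_signal(headers):
        store[user_id] = {"sale_opt_out": True, "targeted_ads_opt_out": True}


if __name__ == "__main__":
    preferences: dict = {}
    apply_opt_out("consumer-123", {"Sec-GPC": "1"}, preferences)
    print(preferences)
    # {'consumer-123': {'sale_opt_out': True, 'targeted_ads_opt_out': True}}
```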

4. Standalone disclosures are required for certain data sales

Like all other state privacy laws, TDPSA will require controllers to post a privacy notice describing the data they collect, the purposes for which it is processed, and the circumstances under which it may be transferred to third parties. However, TDPSA is unique in requiring that businesses, where applicable, post the following disclaimers, verbatim, in the same location and in the same manner as their privacy notices: “NOTICE: We may sell your [sensitive personal data / biometric personal data].”

5. Pseudonymous data is explicitly treated as personal data under certain circumstances

The treatment of personal data that has been pseudonymized is a significant but commonly overlooked aspect of state privacy laws. As in Virginia, Connecticut, and some other states, TDPSA’s individual rights of access, correction, and deletion do not extend to data that is demonstrably pseudonymized (though consumer opt-out rights are not included in this carve-out). However, Texas is unique in explicitly including pseudonymous data in the definition of “personal data” where such data is used “in conjunction with additional information that reasonably links the data to an identified or identifiable individual.” In practice, whether this addition proves to be a meaningless tautology or expands consumer protections is likely to depend on evolving interpretations, business practices, and enforcement priorities in Texas and other states.

During Asian Pacific American Heritage Month, a look at how better data can benefit AANHPI individuals and communities

May is Asian Pacific American Heritage Month (APAHM), a celebration of Asian Americans, Native Hawaiians, and Pacific Islanders (AANHPIs) in the United States. However, there are challenges that this rapidly growing racial group experiences, specifically regarding the collection and use of AANHPI data. In honor of APAHM, we are highlighting the gaps in research and data collection around AANHPI populations that have led to, among other things, a loss of political representation and valuable resources, as well as concrete steps that can be taken today and into the future to ensure more equitable outcomes. 

Individuals from AANHPI communities hail from more than 40 countries and represent an even greater number of ethnic identities. The term captures a collection of people with diverse languages, histories, ethnicities, idiosyncrasies, and cultures, and it represents the fastest-growing racial and ethnic group in the U.S. However, errors in data collection and management mean that individuals from these communities are politically under-represented and economically and socially under-resourced. An analysis of data from the 2020 U.S. Census revealed that 68% of U.S. counties undercounted their Asian populations, and 55% undercounted the number of Native Hawaiian and Pacific Islander (NHPI) individuals.

Data plays a critical role in policymaking by providing empirical evidence and offering insights that inform decision-making processes, helping to ensure that policymakers’ choices are grounded in facts. The U.S. decennial census provides a wellspring of data from across the country every ten years. This data determines the number of seats each state receives in the U.S. House of Representatives and is used for redistricting at various levels of government, including state legislatures, county commissions, and city councils. Census data is also instrumental in allocating federal funding to communities for community and social services. More generally, census data provides valuable insight into the characteristics and needs of a community and can reveal disparities across different communities. Policymakers and government leaders can tailor policies and programs to their communities by using census data on race, income levels, education levels, and housing conditions; researchers and analysts may use that same data to recommend interventions to address economic, healthcare, housing, and educational needs.

In addition to contributing to a loss of resources and political representation, data quality issues related to AANHPI individuals may also perpetuate common themes of racism and stereotyping. For example, aggregated data for AANHPIs hides the health and economic struggles of some Asian subgroups, subjecting them to the false belief that all AANHPIs are successful and well-adapted when, in reality, their struggles are rendered invisible.

When it comes to ensuring proper representation, recognizing the problem is only the first step. The following recommendations are necessary to start the process of better accounting for and serving the diverse needs of all individuals in the U.S., including individuals from AANHPI communities:

1. Create pathways for privacy-protective data disaggregation: It is crucial to recognize the heterogeneity within the AANHPI population and avoid treating it as a monolithic group in data analysis and policy formulation. Disaggregated data breaks down population-level information by detailed sub-categories and can reveal details about sub-groups that are experiencing deprivations and inequalities that are not fully reflected in aggregated data. Civil rights groups have consistently advocated for disaggregated data to better understand the needs of the AANHPI community. While there are privacy risks with disaggregation that need to be recognized and accounted for, disaggregated data for the AANHPI population can also provide detailed information on specific ethnic subgroups within the population allowing for an understanding of disparities and tailoring of policies to address unique needs. For example, disaggregated data for AANHPI populations reveals that while Indian Americans have an average poverty rate of 6%, Mongolian Americans and Burmese Americans have a poverty rate of 25%. Disaggregated data also reveals that while 75% of Taiwanese Americans hold a bachelor’s degree, only 14% of Laotian Americans do.

2. Avoid stereotyping and generalization: Stereotypes and generalizations about AANHPIs can influence how data is collected, analyzed, and interpreted. Researchers conducting studies on certain populations may overlook conducting the same studies on AANHPI populations due to harmful assumptions, including the model minority myth. For instance, a 2017 report on perinatal morbidity and mortality of AANHPI women found that despite a higher socioeconomic status, AANHPI women “experience higher rates of maternal morbidity and mortality.” Additional studies on perinatal experiences of women during COVID-19 pointed out that AANHPI women had been “largely underrepresented in study samples” and emphasized a need for targeted research for culturally responsive care for perinatal women. Another similar report stated that data on AANHPI women was “limited due to scarcity of existing studies.”

3. Knock down language barriers: AANHPIs, particularly those with limited English proficiency or who are recent immigrants, may face barriers to accessing and participating in data collection processes. Language barriers and cultural differences can hinder accurate data collection and representation in datasets, leading to underreporting of important data. The methods used to collect data, such as surveys, should be available in multiple languages, especially the languages spoken by the communities being targeted. In 2020, the U.S. Census made strides in this area by ensuring respondents could answer questions directly in 12 different languages, up from six in 2010. In addition, surveys should be developed and reviewed by trusted community members to ensure translations are accurate and understandable.

4. Ensure accurate reporting: The COVID-19 pandemic and the rise in anti-Asian hate created a climate of fear and distrust toward the AANHPI community that resulted in horrific violence and discrimination. Each year the FBI releases its report on hate crime statistics, mandated under the Hate Crime Statistics Act of 1990. However, reporting of this information by law enforcement to the FBI is voluntary, meaning the figures likely under-represent what is actually happening. Civil rights organizations have criticized the voluntary nature of reporting, arguing that the resulting data is likely unreliable and incomplete. FBI Director Christopher Wray stated to Congress: “Some jurisdictions fail to report hate crime statistics, while others claim there are no hate crimes in their community.” In contrast, AAPI Data released a report that found that 10% of AANHPI adults experienced “hate crimes and hate incidents in 2021,” significantly higher than the national average. Only about 30% of AANHPI individuals report feeling comfortable reporting a hate crime to law enforcement; those who are hesitant cite fear of retaliation, lack of confidence that justice will be served, and concerns about undue attention to their families.

5. Where possible, adjust for bias and underrepresentation in data sets: If the aforementioned barriers to participation in data collection are not addressed, the result will be continued underrepresentation in datasets. As companies, organizations, and government agencies deploy algorithmic tools for a wider variety of tasks and programs, under-representation in the data those tools are trained on may lead to even more significant impacts on individuals from affected communities. Organizations developing such tools need to be aware of potential data quality issues and, where possible, appropriately and ethically adjust for bias and underrepresentation to ensure that decisions made by these tools do not exacerbate existing challenges for individuals and communities.

Addressing underrepresentation in data requires improved data collection methodologies that acknowledge the diversity within the AANHPI population, resources to support comprehensive data collection efforts, and a commitment to disaggregated reporting. Policymakers and researchers should work collaboratively with AANHPI communities to ensure accurate and representative data collection, analysis, and interpretation that informs policies addressing the unique needs and challenges faced by different AANHPI groups.

What to Expect from the Review of Australia’s Privacy Act

The author thanks Anna Johnston and Alex Kotova (Salinger Privacy) for their review and comments and Gabriela Zanfir-Fortuna, Josh Lee Kok Thong, Lee Matheson, and Isabella Perera (FPF) for their support with editing this post. 

On February 16, 2023, Australia’s Attorney-General’s Department (AGD) released a final report (Review Report) on its multi-year review of Australia’s main privacy law, the Privacy Act 1988 (Privacy Act). The Review Report presents over a hundred concrete proposals to reform the Privacy Act. 

A common trend across many of these proposals is the AGD’s desire to bring the protections in the Privacy Act, which was first enacted over 30 years ago, closer to those provided by other major data protection laws internationally, in particular, the GDPR. Notably, the Review Report proposes: 

Other notable proposals in the Review Report include heightened transparency requirements for automated decision-making (ADM) and the introduction of a direct right of action for breaches of the APPs and a separate statutory tort for serious invasions of privacy.

The Australian Government will likely translate at least some of the Review Report’s proposals into a Bill to amend the Privacy Act, which may be introduced in Parliament later in 2023.

Background

Australia’s Privacy Act, which was passed by the Australian Parliament in 1988 and took effect in 1989, was one of the world’s first data protection laws.

In the three decades since the Act was passed, it has been regularly updated through legislative amendments. While the Act originally only extended to federal government agencies, major reforms in 2000 expanded the scope of the Act to the private sector for the first time. Further amendments in 2014 introduced a unified set of 13 “Australian Privacy Principles” (APPs) applying to all “APP entities” – a broad term which comprises a wide range of public- and private-sector entities other than small businesses and registered political parties. 

The APPs establish rights and obligations regarding:

The latest round of amendments to the Privacy Act, which were passed in December 2022 in response to several high-profile data breaches, increased penalties under the Act and expanded the enforcement powers of the privacy authority, the Office of the Australian Information Commissioner (OAIC).

In 2019, the Australian Competition and Consumer Commission (ACCC) published a final report on its three-year “Digital Platforms Inquiry” (DPI Report), which focused on the impact of online search engines, social media platforms, and other digital aggregation platforms on competition in the advertising and media markets. Broadly, the DPI Report recommended amending the Privacy Act to increase protection of personal information, including strengthening notice and consent requirements, enabling individuals to request erasure of their personal information, and providing individuals with a right of action against entities that interfere with their privacy.   

In 2020, the AGD initiated a comprehensive review of the Privacy Act in response to the ACCC’s recommendations in the DPI Report. To that end, the AGD held two rounds of public consultation before releasing its final Review Report.

Overview of the Review Report’s Proposals

The Report makes 116 proposals to reform the Privacy Act in 28 key areas. Broadly, these proposals aim to strengthen the protection of personal information and individuals’ control over how their personal information is collected, used, and disclosed.

Several of the most notable proposals from the Review Report include:

  1. Amending the definition of “personal information” to broaden it and bring it in line with other modern data protection laws, like the EU’s General Data Protection Regulation (GDPR);
  2. New requirements around de-identified information;
  3. Removing the “small business exemption” for private-sector entities with an annual turnover below AU$3 million;
  4. Amending the Privacy Act’s notice, consent, and transparency requirements;
  5. A new “fair and reasonable” test for all forms of processing of personal information;
  6. A new requirement to conduct a PIA for high-risk activities;
  7. New requirements concerning children’s privacy;
  8. New rights for individuals, such as the right to erasure of one’s personal information, to be delisted from online search engines, and to opt out of certain forms of processing, including ADM;
  9. New transparency requirements for ADM;
  10. New protections against direct marketing, targeting, and trading in personal information;
  11. A new controller-processor distinction;
  12. Refining the Privacy Act’s cross-border data transfer mechanisms;
  13. Introducing a tiered penalty framework;
  14. Introducing a direct right of action for breaches of the APPs and a separate statutory tort for serious invasions of privacy; and
  15. Amending the Privacy Act’s data breach notification requirements.

1. The definition of “personal information” would be broadened (Proposals 4.1-4.4)

The Review Report proposes amending the definition of “personal information” in the Privacy Act so that the term would no longer refer to information “about” an individual but, rather, information that “relates to” an individual. This proposed change would bring the Privacy Act’s definition of “personal information” closer to the definitions of similar terms in other leading data protection laws, such as the EU’s General Data Protection Regulation (GDPR).

The Report also makes several other proposals to clarify the scope of the term “personal information,” including suggesting examples of personal information to be included in the Act and encouraging the OAIC to issue guidance on the relevant factors for determining whether information relates to an individual and how entities should assess whether an individual is “reasonably identifiable.”

2. De-identified information would be defined and included under some of the provisions of the Privacy Act (Proposals 4.5-4.8)

The Review Report proposes:

In particular, APP entities would be required to take reasonable steps to protect de-identified information from:

The Report proposes that, when disclosing de-identified information overseas, an APP entity would be required to take reasonable steps to ensure that the overseas recipient does not breach the APPs, including ensuring that the recipient does not re-identify the information or further disclose the information in a way that would undermine the effectiveness of de-identification.

The Report also proposes prohibiting re-identification of de-identified information, subject to exceptions, and holding further consultations on creating a criminal offense for malicious re-identification.

3. The small business exemption would be removed (Proposals 6.1-6.2)

The Privacy Act currently applies to private-sector entities only if they have an annual turnover of AU$3 million or above or if they undertake certain activities, such as providing a health service. The Review Report proposes removing the exception for small businesses whose annual turnover is below AU$3 million, based on community expectations that entities should protect personal information regardless of their annual turnover, and on the risk posed by serious data breaches.

However, the Review Report also proposes that before the exception is removed, an impact assessment should be conducted, and other measures should be undertaken, to ensure that small businesses are in a position to comply with the Privacy Act’s requirements.

4. The requirements for notice, consent, and transparency would be toughened (Proposals 11-12)

The current version of the Privacy Act generally requires an entity to provide individuals with a collection notice containing certain prescribed information when the entity collects personal information directly from those individuals (see APP 5). 

However, the Act generally only requires the entity to obtain the individuals’ consent in a limited set of circumstances, such as collection of sensitive information, and use or disclosure of personal information for a secondary purpose (see APP 6) or for direct marketing (see APP 7). 

The Review Report does not alter these fundamental requirements but instead proposes clarifying them by:

The Report also encourages the OAIC to develop guidance on how online services should design consent requests, and on standardized templates and layouts for privacy policies and collection notices, using standardized terminology and icons.

5. A requirement that collection, use, and disclosure of personal information must be “fair and reasonable in the circumstances” would be introduced (Proposal 12)

The Privacy Act currently requires that the collection of personal information must be done by lawful and fair means (APP 3.5). The Review Report proposes replacing this with a much broader requirement that any collection, use, or disclosure of personal information by an APP entity must be “fair and reasonable in the circumstances.” This requirement would apply regardless of whether the entity had obtained consent for collecting, using, or disclosing the personal information in question.

The Report proposes that the reasonableness of any given processing activity be assessed objectively based on some or all of the following factors:

6. A PIA would be required for high-risk activities (Proposal 13.1)

The Review Report proposes introducing a new requirement that entities conduct a PIA prior to commencing any activities that are “likely to have a significant impact on the privacy of individuals,” and provide the PIA to the OAIC on request.

The Report also recommends that the OAIC should issue guidance on the relevant factors for determining whether an activity is likely to have a significant impact on individuals’ privacy and provide examples of such activities, which may include:

7. The Review Report seeks to enhance children’s privacy (Proposals 16 and 20)

Proposal 16 of the Review Report would introduce several new provisions on children’s privacy into the Privacy Act, including a statutory definition of a “child” as an individual who has not reached 18 years of age.

On children’s capacity to consent to processing of their personal information, the Review Report proposes introducing an express provision in the Privacy Act stating consent is only valid if:

This provision would also cover circumstances in which it would be appropriate or inappropriate for an entity to obtain consent from a child’s parent or guardian.

In line with the Review Report’s proposal to grant the OAIC the power to determine codes of practice, the Report proposes that the OAIC should issue a “Children’s Online Privacy Code” for online services that are likely to be accessed by children. The Code would align with the UK’s Age Appropriate Design Code, and would provide guidance on how collection notices and privacy policies should be designed with children in mind.

Proposal 20 of the Report would also prohibit:

8. A right to erasure and a right to be delisted would be introduced  (Proposals 18, 19, and 20)

The Privacy Act currently provides individuals with the right to access personal information about them that is held by an APP entity (see APP 12) and, if the information is inaccurate, out-of-date, incomplete, irrelevant, or misleading, to require the APP entity to correct the information (see APP 13). 

The Review Report proposes establishing several new rights for individuals over their personal information in addition to those under APPs 12 and 13. The proposed new rights would include the rights to:

These rights would be subject to exceptions for countervailing public interest or legal interests, or where compliance would be impossible, unreasonable, frivolous, or vexatious.

The Review Report also proposes introducing new obligations for entities to help individuals to exercise their rights under the Privacy Act, including obligations to:

9. A right to explanation of substantially automated decision-making is proposed (Proposal 19)

The Review Report proposes introducing several new transparency requirements concerning the use of ADM in processing of personal information. In particular:

10. Absolute opt-out from direct marketing and targeting is included in the reform (Proposal 20)

The Review Report proposes expanding the Privacy Act’s provision on direct marketing to:

The Report proposes that the Privacy Act should grant individuals an unqualified right to opt out of the use and disclosure of their personal information for direct marketing purposes and out of receiving targeted advertising. Entities would also be required to obtain individuals’ consent before trading in individuals’ personal information and, more broadly, to provide information about targeting, including their use of algorithms and profiling to recommend content.

11. A controller-processor distinction would be made (Proposal 22)

Currently, the APPs apply to entities that “hold” personal information. This includes both entities that control such information and those that simply possess a record of it. During the review of the Privacy Act, the AGD received feedback that this scope of application presents compliance challenges for APP entities that hold an individual’s personal information but do not have a direct relationship with the individual (e.g., outsourced service providers).

The Review Report, therefore, proposes introducing the concepts of “controllers” and “processors” into the Privacy Act. This would expand the scope of the Privacy Act to non-APP entities that process personal information on behalf of APP entity controllers and would bring the Act closer to other data protection laws that recognize a controller-processor distinction, such as the GDPR and the data protection laws of Brazil, Hong Kong, Japan, New Zealand, Singapore, and South Korea.

If the AGD’s proposal is adopted in its current form, the Privacy Act would be amended to include the new concepts of:

12. New requirements around cross-border data transfers would be added (Proposal 23)

The Review Report recommends retaining the Privacy Act’s existing framework for cross-border data transfers (see APP 8) but proposes several additions to this framework, including:

Notably, in its 2021 discussion paper, the AGD proposed removing consent as a basis for transferring personal information out of Australia in situations where the entity has not taken reasonable steps to ensure the overseas recipient does not breach the APPs. 

However, as this proposal met resistance from numerous stakeholders, the Review Report proposed retaining consent as one of several options for transferring personal information out of Australia. At the same time, the AGD also proposed adding a requirement that entities seeking to rely on consent for this purpose consider, and specifically inform individuals of, any privacy risks that may result from cross-border transfers of their personal information.

13. Penalties would be restructured (Proposal 25)

The Review Report proposes replacing the Privacy Act’s existing penalty framework with a three-tiered framework on a scale of severity spanning:

As an alternative to issuing low-tier penalties, the Review Report proposes empowering the OAIC to issue infringement notices. The amount payable under an infringement notice is typically 20% or less of the maximum amount of the related civil penalty provision.

14. Direct right of action and statutory tort for invasion of privacy would be introduced (Proposals 26 and 27)

The Review Report proposes introducing:

The proposed direct right of action would be available to any individual or group of individuals (i.e., in a class action) who have suffered loss or damage due to a privacy interference by an APP entity. Loss or damage would need to be established within the existing meaning of the Act, including injury to the person’s feelings or humiliation.

To exercise the direct right of action, a claimant would first need to make a complaint to the OAIC and have their complaint assessed for conciliation by the OAIC or a recognized External Dispute Resolution scheme. If the complaint is deemed unsuitable for conciliation, or if conciliation is unlikely to resolve the dispute, the complainant would have the option to pursue the matter further in the Federal Court or the Federal Circuit and Family Court of Australia. Available remedies would be any order that the Court sees fit, including any amount of damages.

The proposed statutory tort for serious invasions of privacy would require a claimant to prove that there had been an intrusion into seclusion or misuse of the claimant’s private information that was committed intentionally or recklessly, in circumstances where the claimant otherwise had a reasonable expectation of privacy. The claimant would not need to prove that the invasion caused actual damage, as damages could be awarded for emotional distress. However, the claim would be subject to a “balancing exercise” in which the Court would need to be satisfied that the public interest in privacy outweighs any countervailing public interests.

Proposed defenses to the statutory tort would include:

15. Data breach notification requirements would be clarified (Proposal 28)

The Review Report also proposes amending Section 26WK(2)(b) of the Privacy Act to require an entity to prepare a statement regarding a suspected data breach and give a copy of the statement to the Information Commissioner within 72 hours.

The Report also proposes introducing new requirements that the statement must set out the steps that the entity has taken or intends to take in response to the breach, including, where appropriate, steps to reduce any adverse impacts on the individuals to whom the relevant information relates. This requirement would be subject to exceptions for any disclosure that would require the entity to reveal personal information or where the harm from disclosure would outweigh any benefit. 

Concluding Notes

While there is still some way to go before these proposals are reflected in actual legislation, several observations can be made. First, the proposed changes in the Review Report represent some of the most extensive proposed reforms to the Privacy Act since its enactment. Second, while several of these reforms bring some parts of Australia’s privacy regime closer in line with other global equivalents like the GDPR (such as in the case of the definition of “personal information” and the controller-processor distinction), they also continue to ensure that Australia’s privacy regime remains uniquely Australian (such as the “fair and reasonable” requirement for the processing of personal data). Third, these proposals come at a time when Australia has been rocked by major data breaches, such as the Optus and Medibank data breaches, and more recently, the Latitude Financial data breach in March 2023. These data breaches may supply additional political will to implement these changes to Australia’s privacy regime. With the government’s response to these proposals expected sometime in 2023 or 2024, the FPF APAC office will continue to track these developments closely.

Shining a Light on the Florida Digital Bill of Rights

On May 4, 2023, the Florida ‘Digital Bill of Rights’ (SB 262) cleared the state legislature and now heads to the Governor’s desk for signature. SB 262 bears many similarities to the Washington Privacy Act and its progeny (specifically the Texas Data Privacy and Security Act). However, SB 262 is unique in its narrow scope of regulated businesses, its other significant deviations from current trends in U.S. state privacy legislation, and its inclusion of a section in the style of Age-Appropriate Design Code (AADC) regulations that applies more broadly than the “comprehensive” parts of the bill. This blog highlights five unique and key features of the Florida Digital Bill of Rights: 

1) SB 262 includes a section on “Protection of Children in Online Spaces”, which draws inspiration from the California AADC but diverges in many key aspects.

2) The scope of the comprehensive privacy provisions of SB 262 covers only businesses making $1 billion in revenue and meeting other threshold requirements. 

3) SB 262 creates both familiar and novel consumer rights surrounding sensitive data and targeted advertising, raising compliance questions. 

4) Under SB 262, controllers and processors will have new responsibilities including creating retention schedules and disclosure obligations for the sale of sensitive or biometric data. 

5) Businesses regulated under SB 262 that utilize voice or face recognition, or have video or audio features in devices, will be subject to heightened restrictions for data collected through these services, regardless of whether the device can identify an individual.

Additionally, FPF is releasing a chart to help stakeholders assess how SB 262’s “Protection of Children in Online Spaces” section compares to the California Age-Appropriate Design Code Act (California AADC).

1. The “Protection of Children in Online Spaces” Section Draws Inspiration from the California AADC but Diverges in Many Key Aspects

Many amendments were added to SB 262 at the eleventh hour, including several provisions on the ‘Protection of Children in Online Spaces’ (“Section 2”). FPF’s comparison chart assesses each requirement of Section 2 against the California AADC. Section 2 will govern a far broader set of covered entities than the bulk of SB 262’s provisions on privacy, and while it clearly incorporates language and concepts from the California AADC, it contains significant deviations in both scope and substance.

Scope of covered entities

The scope of entities subject to Section 2 is both broader and narrower than that of the California AADC. While the California AADC broadly applies to all online products, services, and features that are “likely to be accessed by children” under age 18, Section 2 applies only to “online platforms,” covering social media and online gaming platforms. The definition of “social media platform” includes “a form of electronic communication through which users create online communities or groups to share information, ideas, personal messages, and other content” and does not list any exemptions; “online gaming platform” is undefined. While seemingly narrower in scope than the California AADC, Section 2 contains no minimum revenue or user applicability thresholds, meaning that smaller businesses not subject to California’s law may be within scope. Additionally, it is possible that the scope of “social media platform” could encompass a number of non-obvious organizations, depending on how broadly the definition is construed.

No explicit DPIA or age estimation requirements

Section 2 does not require a data protection impact assessment (DPIA) as the California AADC does; instead, it places a burden of proof on online platforms to demonstrate that processing personal information does not violate any of the law’s prohibitions. Covered platforms may therefore ultimately need to conduct a DPIA or similar assessment to meet this burden of proof.

Like the California AADC, Section 2 defines a child as an individual under 18, though, unlike the AADC, Section 2 does not affirmatively require age estimation. Section 2 also modifies the California AADC’s “likely to be accessed by children” standard to “predominantly” likely to be accessed by children, but does not lay out any factors for assessing whether a service meets that standard.

Prohibitions

Two key points on which Section 2 of SB 262 diverges from the California AADC are in the restrictions on processing and profiling.

Under Section 2, covered services may not process the personal information of a person under 18 if they have actual knowledge, or willfully disregard, that such processing may result in “substantial harm or privacy risk to children.” The absence of affirmative age estimation requirements and the inclusion of a knowledge standard (actual knowledge or willful disregard) could be a response to First Amendment objections raised in the NetChoice v. Bonta litigation seeking to strike down the California AADC. The “substantial harm or privacy risk” language is reminiscent of the California AADC’s prohibition on processing children’s data in a materially detrimental manner. However, while “material detriment” is undefined in the California AADC, Section 2 defines “substantial harm or privacy risk” to include: mental health disorders; addictive behaviors; physical violence, online bullying, and harassment; sexual exploitation; the promotion and marketing of tobacco, gambling, alcohol, or narcotic drugs; and predatory, unfair, or deceptive marketing practices or other financial harms.

Both the California AADC and Section 2 contain limits on profiling of people under 18 except in certain circumstances. While both contain an exception for when necessary to provide an online service, product, or feature, the California AADC contains an exemption if the business can demonstrate a “compelling reason that profiling is in the best interests of children.” In contrast, Section 2 contains an exemption if an online platform can demonstrate a compelling reason that profiling does not “pose a substantial harm or privacy risk to children.” It is possible that the affirmative showing required by the California AADC may be a higher threshold to meet than that of Section 2, especially given that the “best interests of children” standard is undefined and is not an established U.S. legal standard outside of the family law context. Furthermore, profiling is defined more broadly in Section 2 to include “any form of automated processing performed on personal information to evaluate, analyze, or predict personal aspects relating to the economic situation, health, personal preferences, interests, reliability, behavior, location, or movements of a child,” rather than “any form of automated processing of personal information to evaluate aspects relating to a person.”

2. The Digital Bill of Rights’ ‘Comprehensive’ Privacy Provisions Will Cover Very Few Businesses.

The types of entities subject to the remaining bulk of SB 262’s ‘comprehensive’ privacy provisions outside of Section 2 are much narrower than under comparable U.S. state privacy laws, even the more limited ones. Florida SB 262 will apply only to a handful of companies that meet a threshold annual gross revenue requirement of $1 billion and either (1) make over 50% of revenue from targeted advertising, (2) operate a “consumer smart speaker and voice command component,” or (3) operate an app store with at least 250,000 software applications. This can be compared to recently enacted privacy laws in Iowa and Indiana, which will apply to businesses that either process personal data of at least 100,000 state residents or derive 50% of gross revenue from the sale of personal data of at least 25,000 consumers. Though the terms “targeted advertising” and “consumer smart speaker” in SB 262 could be construed liberally, the revenue requirement means that Floridians will not receive new rights or protections with respect to the vast majority of businesses that collect their personal data in the Sunshine State.

3. The Bill Creates A Complex Stack of both Familiar and Novel Consumer Rights 

SB 262 will establish many rights that are now familiar from U.S. state privacy laws, including confirmation of processing, correction of inaccuracies, deletion, obtaining a copy of a person’s personal data in a portable format, and the ability to opt out of “solely” automated profiling in furtherance of decisions that produce legal or similarly significant effects. However, there are a number of new and unique provisions in the consumer rights sections: 

4. Controllers and Processors Will Have New Responsibilities for Purging Data and Disclosing Certain Practices

Unlike existing comprehensive state privacy laws, SB 262 would require that covered businesses and their processors implement a retention schedule for the deletion of personal data. The text of this provision appears influenced by the Illinois Biometric Information Privacy Act (BIPA). Under SB 262, controllers or processors may retain personal data only until (1) the initial purpose for the collection has been satisfied; (2) the contract for which the data was collected or obtained has expired or been terminated; or (3) two years after the consumer’s last interaction with the regulated business (subject to exceptions). However, unlike BIPA, SB 262 would not require that the retention schedule be made publicly available and would permit retention necessary to prevent or detect security incidents.

Further, in addition to the typical privacy notices required by state comprehensive laws, SB 262 creates two distinct disclosure requirements. First, again similar to Texas HB 4, if a controller sells sensitive or biometric data, it must provide the following notice: “NOTICE: This website may sell your [sensitive personal data and/or biometric personal data].” Second, a controller that operates a search engine is required to disclose the main parameters used in ranking results, “including the prioritization or deprioritization of political partisanship or political ideology” in search results.

5. Businesses that Utilize Voice or Face Recognition, or Have Video or Audio Features in Devices, Have Particular but Perplexing Obligations

Finally, one of SB 262’s most unique provisions is a requirement that covered businesses may not provide consumer devices that engage in “surveillance” when not in active use unless “expressly authorized” by the consumer. Though “surveillance” and “active use” are not defined, the prohibition applies to devices that have any of the following features: voice recognition, facial recognition, video recording, audio recording, “or other electronic, visual, thermal, or olfactory feature” that collects data. SB 262 also does not define “express authorization,” raising questions as to whether express authorization is analogous to “consent” under the bill or whether a higher standard will be required, such as that in the recently enacted Washington State “My Health, My Data” Act.

SB 262 further provides consumers with the right to opt out of personal data collection by voice or face recognition systems. Voice recognition is broadly defined as collecting, storing, analyzing, transmitting, and interpreting spoken words or other sounds, seemingly encompassing almost all audio-based consumer-facing systems. Facial recognition and the other features are not defined, though one can infer they would be construed as broadly as voice recognition. As a result, despite SB 262’s requirement that “biometric data” be used for the unique identification of an individual in order to be subject to the legislation’s requirements for sensitive data, most general voice and face systems unrelated to identification will still need to provide consumers the ability to opt out under these provisions. These restrictions and requirements may prove difficult for the functionality of some products that rely on these features, such as accessibility features that use natural language processing to transcribe spoken words. Moreover, despite SB 262’s revenue threshold, these prohibitions and restrictions will likely flow down, through contractual agreements and requirements, to any other entity that utilizes (or has a software plug-in to) voice assistant devices like Amazon Echo or Apple Siri for customer service, customer ordering, or other forms of user engagement.

Conclusion

Given that many of the consumer rights and business obligations of SB 262 will directly apply to very few businesses, it is understandable why the Florida Digital Bill of Rights may have flown under the radar thus far. However, SB 262 is worth a close read, particularly the short-but-impactful section on “Protection of Children in Online Spaces” and the provisions creating novel consumer rights. Given Governor DeSantis’ public support for the legislation, we anticipate that the Digital Bill of Rights will be signed into law shortly and will go into effect on July 1, 2024, giving stakeholders just over a year to understand their compliance obligations. We note, however, that the specific consumer rights and business obligations under SB 262 may evolve, as the State Attorney General’s office is granted both mandatory and permissive rulemaking authority. 

New FPF Report: Unlocking Data Protection by Design and by Default: Lessons from the Enforcement of Article 25 GDPR

On May 17, the Future of Privacy Forum launched a new report on enforcement of the EU’s GDPR Data Protection by Design and by Default (DPbD&bD) obligations, which are outlined in GDPR Article 25. The Report draws from more than 92 data protection authority (DPA) cases, court rulings, and guidelines from 16 EEA member states, the UK, and the EDPB to provide an analysis of enforcement trends regarding Article 25. The identified cases cover a spectrum of personal data processing activities, from accessing online services and platforms, to tools for educational and employment contexts, to “emotion recognition” AI systems for customer support, and many more.

The Report aims to explore the effectiveness of the DPbD&bD obligations in practice, informed by how DPAs and courts have enforced Article 25. For instance, we analyze whether DPAs and courts find breaches of Article 25 without links to other infringements of the Regulation, and which provisions enforcers most often apply together with Article 25, including the general data protection principles and the data security requirements under Article 32. We also look at which controls and controller behaviors are, and are not, deemed sufficient to comply with Article 25.

The GDPR’s DPbD&bD provisions in Article 25 oblige controllers to: 1) adopt technical and organizational measures (TOMs) that, by design, implement data protection principles into data processing and protect the rights of individuals whose personal data is processed; and 2) ensure that only personal data necessary for each specific purpose is processed. Given the breadth of these obligations, it has been argued that Article 25 makes the GDPR “stick” by bridging the gap between its legal text and practical implementation. GDPR’s DPbD&bD obligations are seen as a tool to enhance accountability for data controllers, implement data protection effectively, and add emphasis to the proactive implementation of data protection safeguards.

Our analysis of the enforcement, and ultimately the effectiveness, of Article 25 is all the more important given the increasing development and deployment of novel technologies involving very complex personal data processing, like generative AI, and rising data protection concerns. Understanding how Article 25 obligations manifest in practice and what DPbD&bD requires may prove essential for the next technological age.

This Report outlines and explores the key elements of GDPR Article 25, including the:

Additionally, we analyze the individual concepts of “by Design” and “by Default,” identify divergent enforcement trends, and explore three common applications of Article 25 (direct marketing, privacy preservation and Privacy Enhancing Technologies (PETs), and EdTech). This Report also includes a number of Annexes that seek to provide more information on the specific cases analyzed and a comparative overview of DPA enforcement actions. 

Our analysis determines that European DPAs diverge in how they interpret the preventive nature of Article 25 GDPR. Some are reluctant to find violations in cases of isolated incidents or where Article 5 GDPR principles are not violated, while others apply Article 25 preventively before further GDPR breaches or even planned data processing. Our research also finds that most DPAs are reluctant to specify appropriate protective measures and to explicitly outline the role of PETs. Ultimately, the Report shows that despite the novelty of Article 25, and the criticism surrounding its vague and abstract wording, it is a frequent source of some of the highest GDPR fines, highlighting the need for organizations to maintain a firm grasp over the concepts of DPbD&bD.

Vietnam’s Personal Data Protection Decree: Overview, Key Takeaways, and Context

Author: Kat MH Hille

The following is a guest post to the FPF blog from Kat MH Hille, an attorney with expertise in corporate, aviation, and data protection law. She graduated with a J.D. from the University of Iowa School of Law and has extensive experience practicing law in both the United States and Vietnam (contact: https://www.linkedin.com/in/katmhh/). This guest blog reflects the opinion of the author only. Guest blog posts do not necessarily reflect the views of FPF.

On April 17, 2023, the Vietnamese Government promulgated the Decree of Personal Data Protection (Decree), which was initially published as a draft on February 9, 2021 and went through several revisions. Before the Decree’s issuance, personal data protection in Vietnam was governed by 19 different laws and regulations, resulting in a fragmented legal framework. The Decree aims to fill these gaps and provide a comprehensive and uniform approach to personal data protection in Vietnam, extending safeguards for personal data to over 97 million people.

This post provides an overview of the Decree, including key dates, context, legal effects, and requirements, and how they compare with other comprehensive data protection regimes around the world. Building on this foundation, certain key provisions and notable features of the Decree warrant attention, including:

These provisions will be discussed in detail below.

1. Overview

The Decree is significant despite its lower status in Vietnam’s hierarchy of laws

As personal data protection is a new and developing area of law in Vietnam, Vietnam’s first legislative instrument on personal data protection takes the form of a “decree,” which ranks lower in Vietnam’s statutory hierarchy than a code or law and is the result of executive action. A benefit of enacting a decree is that it can be done more quickly, without the need for approval from the National Assembly. Nevertheless, the Vietnamese Government’s goal is ultimately to enact a comprehensive and robust law for effective and enforceable personal data protection in 2024, according to a Decision issued by the Prime Minister in January 2022.

However, the Decree’s status means that, in the event of conflicting regulations on the same issue, codes and laws would take precedence over the Decree. That said, the Decree remains the first comprehensive personal data protection regulation in Vietnam; despite its lower legal status, it carries significant weight in regulating personal data protection, and those who fail to comply with its provisions will still face legal consequences.

The Decree incorporates a unique blend of global standards and Vietnamese characteristics

Like other data protection laws inspired by the European Union (EU)’s General Data Protection Regulation (GDPR), the Decree sets out the responsibilities of organizations and individuals that process personal data, as well as the rights of  individuals over their personal data. 

However, the Decree also includes unique provisions that are specific to Vietnam’s context, such as a prohibition on the sale and purchase of personal data through any means, unless otherwise provided by law (Article 3.4), which may have significant consequences for data brokers and other businesses engaged in the commodification of personal data. Additionally, organizing the collection, transfer, purchase, or sale of personal data without the consent of the data subject, or establishing software systems or implementing technical measures for these purposes, constitutes a violation of the Decree.

The Decree introduces the concept of “Personal Data Controllers and Processors,” which are entities or individuals that function both as Personal Data Controllers and Personal Data Processors. This definition is unique to the Decree and distinguishes it from other data protection laws around the world that typically only recognize the separate categories of Personal Data Controllers and Personal Data Processors. While the inclusion of Personal Data Controllers and Processors is meant to provide greater clarity and precision in defining the roles and responsibilities of different actors involved in personal data processing, it may actually add unnecessary complexity to the already complex landscape of privacy laws. This is because a single entity could be classified as both a Personal Data Controller and a Personal Data Processor depending on the specific definition being used, making it difficult to navigate and comply with the requirements of different privacy laws across different jurisdictions.

Further, the enacted Decree does not include a specific fine structure for violation of the Decree (the 2021 draft of the Decree proposed specific fines for single violations of the Decree, including fines of up to 5% of a personal data processor’s revenue for the most serious violations). Rather, the enacted Decree outlines a general provision that violators may be subject to disciplinary action, administrative penalties, or criminal prosecution, depending on the seriousness of the offense. 

Furthermore, compared with the 2021 draft of the Decree, the final Decree does not provide for the establishment of a personal data protection commission to enforce the regulation. Rather, the Decree assigns responsibility for enforcing its requirements to an existing agency within the Ministry of Public Security (MPS), the Cybersecurity and High-Tech Crime Prevention Department (A05).

While MPS will need to clarify key provisions in subsequent regulations, the Decree creates the first comprehensive foundation to govern data processing activities in Vietnam. The Decree will take effect on July 1, 2023, giving organizations only two months to make the necessary adjustments to their business and operations in order to comply with the new regulations. Significant aspects of the Decree are explored below in greater detail.

2. The (extra)territorial scope introduces a nationality criterion for covered entities

The Decree applies to Vietnamese agencies, organizations, and individuals (whether based within or outside of Vietnam), and to foreign agencies, organizations, and individuals that are either based in Vietnam or that are based overseas and directly participate in or are otherwise involved in personal data processing activities in Vietnam. 

Note that “personal data processing” covers a wide range of activities in relation to personal data, including collection, recording, analysis, verification, storage, alteration, disclosure, combination, access, retrieval, erasure, encryption, decryption, copying, sharing, transmission, provision, transfer, and deletion, as well as other related actions (Article 2.7).

There is still ambiguity as to the distinction between being “involved in” and “directly participating in” personal data processing activities, as well as the level of involvement with such activities that would bring a party within the scope of the Decree. Clarity on these issues through further regulations or guidance would be useful, especially considering that many third-party service providers or software vendors may arguably have some involvement in processing personal data.

3. The Decree recognizes a slightly different set of covered actors than other data protection laws

The Decree covers four categories of parties who process personal data: Personal Data Controllers (PDCs), Personal Data Processors (PDPs), Personal Data Controllers and Processors (PDCPs), and Third Parties (TPs).

In recognizing a distinction between controllers and processors, the final Decree removes ambiguity that was present in the 2021 draft of the Decree, which only provided for two categories of actors: personal data processors and third parties.

4. New processing principles, such as “no sale and purchase of personal data by any means”

The Decree outlines eight principles that govern data processing activities, which are similar to those recognized by the GDPR, including lawfulness, transparency of processing, purpose limitation, data minimization, accuracy, storage limitation, and appropriate measures to ensure the security of personal data. However, there are some notable differences.

Sale or Purchase of Personal Data: The Decree takes a more stringent stance than the GDPR by explicitly prohibiting the sale and purchase of personal data in any form, unless otherwise permitted by law. However, another provision in the Decree states that the act of “setting up software systems, technical measures or organization of the … purchase and sale of personal data without the consent of the data subject” is a violation (Article 22). Read together, the two provisions appear to imply that purchases or sales made with the data subject’s consent could be permissible. Given this ambiguity, further clarification is needed.

This stringent prohibition is a direct response to the numerous cases of personal data misuse that have occurred in Vietnam in recent years, including identity theft, financial fraud, intrusive advertising, and the exploitation of vulnerable individuals. A report showed that in 2022 alone, more than 17 million pieces of personal data were illegally harvested and sold for fraud, and that each personal data entry was traded 987 times per day. However, the inclusion of a strict prohibition may conversely have a significant impact on industries that rely heavily on the use of personal data to drive innovation and business growth. It is possible that future circulars or guidelines may provide more clarity on this issue, including potential exceptions or allowances for certain use cases.

Notwithstanding this broad prohibition, PDCs and PDCPs may still share personal data with others if they obtain the data subject’s consent to do so, except when such sharing could harm national defense, national security, or public order and safety, or could affect the safety or physical or mental health of others (Article 14). However, business entities and individuals providing marketing, product launching, and advertising services may use their customers’ personal data collected through their own business activities for such services only if they obtain the data subject’s consent (Article 21).

Purpose Limitation: The Decree imposes a stricter purpose limitation compared to the GDPR, which allows for additional processing if it is compatible with the original purpose. Under the Decree, personal data can only be processed for the specific purposes that have been “registered” or “declared” by the PDC, PDP, PDCP, or TP. This requires these entities to ensure that their data processing activities do not deviate from or expand upon the registered and declared purposes. However, it is important to note that the Decree does not provide any guidance on how processing purposes are to be registered.

5. Covered data: broad definition of sensitive personal data, and stricter accountability rules for its processing

The Decree provides a broad definition of personal data, aligned with other comprehensive data protection laws. It defines personal data as any information expressed in the form of symbols, text, digits, images, sounds, or similar forms in an electronic environment that is associated with a particular natural person or helps identify a particular natural person. Personally identifiable information means any information that is formed from the activities of an individual and that, when used with other maintained data and information, can identify that particular natural person.

The Decree categorizes personal data into two groups: basic personal data and sensitive personal data, and includes an additional set of rules for the latter. 

Basic personal data includes the following forms of personal data:

Sensitive personal data is defined as personal data related to an individual’s privacy, a breach of which would directly affect the individual’s legitimate rights and interests. 

The Decree provides a non-exhaustive list of types of personal data that would be considered sensitive, including:

The list of sensitive personal data provided is more extensive than the GDPR’s definition of sensitive personal data. It includes types of data such as customer information from financial institutions and location data obtained through location services. As non-cash transactions and targeted advertising become increasingly prevalent in Vietnam, these types of data are frequently collected by most businesses. As a result, a wider range of entities, including small and medium businesses, may be subject to sensitive personal data protection requirements due to the broad scope of the list.

The Decree imposes more stringent protection measures for sensitive personal data than for basic personal data. For instance, regulated entities that process sensitive personal data must specifically notify data subjects of any processing of their sensitive personal data. Organizations that are covered by the Decree also must designate a department within their organization and appoint an officer who will be responsible for overseeing the protection of sensitive personal data and communicating with the A05.

Nevertheless, it is important to note that small, medium, and start-up enterprises are given a grace period of 2 years from their establishment to comply with these sensitive data requirements, unless such enterprises are directly engaged in processing personal data (Article 43). To qualify for the exemption, companies in the agriculture, forestry, aquaculture, industrial, and construction sectors must have fewer than 200 employees and annual revenue below 200 billion Vietnamese dong (equivalent to approximately 8.7 million USD) or total capital below 100 billion Vietnamese dong (approximately 4.3 million USD), while commercial and service sector companies must have fewer than 100 employees and annual revenue below 300 billion Vietnamese dong (approximately 13 million USD) or total capital below 100 billion Vietnamese dong (approximately 4.3 million USD), in accordance with Decree No. 80/2021/ND-CP (2021) on Elaboration of Articles of the Law on Provision of Assistance for Small and Medium Enterprises.
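To make the thresholds above easier to follow, here is a minimal sketch in Python of the eligibility logic as summarized in this post; the function name, sector labels, and inputs are our own hypothetical simplifications, and the sketch is an illustration rather than legal advice.

# Minimal sketch of the SME grace-period logic described above.
# Assumptions: the function, its inputs, and the sector labels are hypothetical;
# thresholds follow Article 43 and Decree No. 80/2021/ND-CP as summarized here.
# Not legal advice.

BILLION_VND = 1_000_000_000

def may_claim_grace_period(sector: str,
                           employees: int,
                           annual_revenue_vnd: float,
                           total_capital_vnd: float,
                           years_since_establishment: float,
                           directly_processes_personal_data: bool) -> bool:
    """Rough screen for the two-year grace period on sensitive-data requirements."""
    # The carve-out does not cover enterprises directly engaged in processing
    # personal data, and it runs only for two years from establishment.
    if directly_processes_personal_data or years_since_establishment >= 2:
        return False

    if sector in {"agriculture", "forestry", "aquaculture", "industrial", "construction"}:
        return (employees < 200
                and (annual_revenue_vnd < 200 * BILLION_VND
                     or total_capital_vnd < 100 * BILLION_VND))
    if sector in {"commercial", "service"}:
        return (employees < 100
                and (annual_revenue_vnd < 300 * BILLION_VND
                     or total_capital_vnd < 100 * BILLION_VND))
    return False  # other sectors would need case-by-case analysis

# Example: a one-year-old, 50-person service firm with 20 billion VND revenue.
print(may_claim_grace_period("service", 50, 20 * BILLION_VND, 10 * BILLION_VND, 1, False))  # True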

6. Legal bases for processing personal data: no “legitimate interests,” but introducing “publicly disclosed” personal data

The Decree recognizes six legal bases for processing personal data, namely:

Additionally, under Article 18 of the Decree, competent governmental agencies may obtain personal data from audio and video recording activities in public places without the consent of data subjects. However, when conducting recording activities, the authorized agencies and organizations are responsible for informing data subjects that they are being recorded.

Notably, the Decree does not provide a “legitimate interests” lawful ground like the GDPR. Nevertheless, legitimate interests are recognized in other provisions of the Decree. In particular, Article 8 stipulates “Prohibited Acts,” including processing personal data to create information that affects the “legitimate rights and interests of other organizations and individuals”.

As for “valid consent”, there are several conditions that must be met when obtaining it, pursuant to Article 11 of the Decree:

The given consent remains valid until it is withdrawn by the data subject or until a competent state agency requests otherwise in writing. PDCs and PDCPs bear the burden of proof in case of a dispute regarding the lack of consent from a data subject. 

Data subjects may request to withdraw their consent to processing of their personal data (Article 12). When a data subject does so, the PDC or PDCP must inform the data subject of any potential negative consequences or harms from the withdrawal of consent.

If the data subject still wishes to proceed, all parties involved in processing the personal data, including the PDC or PDCP and any PDPs or TPs, must cease processing the personal data. There is no set time frame for fulfilling this obligation, but it should be done within a reasonable period of time. 

The withdrawal of consent must be in a format that can be printed, copied in written form, or verified electronically. The withdrawal of consent shall not render unlawful any data processing activities that were lawfully performed based on the consent given prior to the withdrawal.

7. The rights of the data subject include transparency and control rights, but also rights to legal remedies

Article 9 of the Decree provides data subjects with 11 rights over their personal data, which are linked to corresponding obligations on entities that process personal data:

Note that all of these rights are subject to exceptions provided by the Decree or other relevant laws.

7.1. Transparency requirements include detailed notices and access rights on a tight deadline

According to Articles 11 and 13, before processing a data subject’s personal data, a PDC or PDCP must provide a notification to the data subject containing the following information:

However, such notification is not required when personal data is being processed by a competent state authority or if the data subjects have been fully informed of, and have given valid consent to, the processing of their personal data.

Data subjects have the right to request that PDCs and PDCPs provide them with a copy of their personal data or share a copy of their personal data to a third party acting on their behalf (Article 14). The PDC or PDCP must fulfill such a request within 72 hours of receiving it. 

The request must be submitted in the Vietnamese language and made in a standardized format as set out in the Appendix to the Decree. The request must include the requester’s full name, residential address, national identification number, citizen identification card number, or passport number; fax number, telephone number, and email address (if any); and the form of access and the reason and purpose for requesting the personal data. The data subject must also specify the name of the document, file, or record to which their request pertains (Article 14.6). This requirement can impose a significant burden on data subjects, as they may not always be fully aware of which documents or records contain their personal data. Additionally, the complexity of data processing can further complicate matters and make it difficult for the data subject to identify the relevant documents.

It is important to note that, unlike the GDPR, the Decree does not require a PDC or PDCP to provide data subjects with comprehensive information about the processing of their personal data in a concise, transparent, intelligible, and easily accessible form, using clear and plain language. 

Moreover, there are certain circumstances in which a PDC or PDCP is not required to provide the data subject with a copy of their personal data. These include where:

7.2. The Decree provides for an absolute right to object to processing, as well as correction and deletion rights

According to Article 15, a PDC or PDCP must promptly fulfill a data subject’s request to access their personal data, to correct it themselves, or to have it corrected.

The PDP and any third party shall be authorized to edit the personal data of the data subject only after obtaining written consent from the PDC or PDCP and ensuring that the data subject has given their consent.

If the PDC or PDCP is unable to fulfill the request due to technical or other reasons, the PDC or PDCP must notify the data subject within 72 hours. 

If a data subject requests that the processing of their personal data be restricted or otherwise objects to the processing of their personal data, the PDC or PDCP must respond to the request within 72 hours of receiving it (Article 9). 

One important difference between this requirement and the one in the GDPR is that the Decree does not provide any exceptions to this requirement. Under the GDPR, a controller may be able to demonstrate compelling legitimate grounds that override the interests, rights, and freedoms of the data subject, or may be able to claim that they need the data for the establishment, exercise, or defense of legal claims.

According to Article 16, the PDC or PDCP must delete personal data about a data subject within 72 hours of a request by the data subject, if:

Personal data shall be deleted irretrievably by the PDC, PDCP, PDP, and/or TP if it was processed for improper purposes, if the consented purpose(s) have been fulfilled, if storage is no longer necessary, or if the entity responsible for the data has dissolved or terminated business operations for legal reasons.

Like the GDPR, the Decree recognizes certain exceptions to the right to delete personal data, such as where:

However, unlike the GDPR, personal data that has been lawfully made available to the public is also exempt from the right to deletion (Article 18). As a result, the PDC or PDCP may reject a data subject’s request to delete personal data that has become public, regardless of whether there are any other lawful grounds for retaining such data. This differs from the GDPR, which does not provide exceptions based solely on the public availability of data.

8. Obligations of Controllers and Processors, from written processing agreements to data security and accountability obligations

PDPs are under an obligation to only receive personal data from a PDC after signing an agreement on data processing with the PDC and only process the data within the scope of that agreement (Article 39). The Decree also provides that personal data must be deleted or returned to the PDC upon completion of the data processing.

8.1. Data security and data breach notification requirements

The Decree has dedicated data security requirements for PDCs. For instance, Article 38 requires them to implement organizational and technical measures, as well as appropriate security and confidentiality measures, to ensure that personal data processing activities are conducted lawfully. They also need to review and update these measures as necessary, and record and store a log of the system’s personal data processing activities.

Appropriate security measures are also relevant in the PDC – PDP relationship, as PDCs must select a suitable PDP for specific tasks and only work with a PDP that has appropriate protection measures in place. Interestingly, both PDCs and PDPs have a distinct obligation to cooperate with the MPS and competent state agencies by providing information for the investigation and handling of any violations of the laws and regulations on personal data protection. Organizations and individuals involved in personal data processing must implement measures to protect personal data and prevent unauthorized collection of personal data from their systems and service devices. Article 22 of the Decree also prohibits the use of software systems, technical measures, or the organization of activities for the unauthorized collection, transfer, purchase, or sale of personal data without the consent of the data subject.

Under Article 23 of the Decree, in the event of a violation of personal data protection regulations, both the PDC and the PDP, or PDCP, are required to promptly inform the A05. The notification must be made no later than 72 hours after the violation occurred. If the notification is delayed, the reason for the delay must be provided. The current wording in the Decree is broad and without further clarifications and guidance it could be interpreted as meaning a notification is required for any violation of the Decree, not just for data breaches. 

The notification must include a detailed description of the violation, such as the time, location, act, organization or individual involved, types and amount of personal data affected, contact details of those responsible for protecting personal data, potential consequences and damages of the violation, and measures taken to resolve or minimize harm. If it is not feasible to provide a complete notification at once, it can be done incrementally or progressively.
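As a rough illustration of how the Article 23 notification elements described above might be organized internally, the following Python sketch groups the required items and checks the 72-hour window; the field names are our own and do not reflect any official template.

# Hypothetical sketch of the Article 23 notification elements described above.
# Field names are our own, not an official MPS/A05 form. Not legal advice.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

NOTIFICATION_WINDOW = timedelta(hours=72)

@dataclass
class ViolationNotification:
    occurred_at: datetime
    location: str
    act_description: str                 # the conduct constituting the violation
    parties_involved: List[str]          # organizations or individuals involved
    data_types_affected: List[str]
    records_affected: int
    protection_contact: str              # contact details of those responsible for protecting personal data
    potential_consequences: str
    mitigation_measures: str
    delay_reason: Optional[str] = None   # required if the notification is filed late

    def is_late(self, filed_at: datetime) -> bool:
        # True if the filing misses the 72-hour window after the violation occurred.
        return filed_at - self.occurred_at > NOTIFICATION_WINDOW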

However, Decree 13 does not provide a specific procedure for A05 to handle complaints related to personal data protection violations. Further guidance or clarifications may be issued in the future.

8.2. “Impact Assessment Reports” that have to be made available for inspection

Article 24 of the Decree requires PDCs and PDCPs to compile an impact assessment report (IAR) from the commencement of personal data processing and make the report available for inspection by the A05 within 60 days thereafter.

The IAR must contain:

PDPs are also required to compile an IAR. However, the required content is slightly different, reflecting the difference in roles between PDCs/PDCPs and PDPs. For instance, the Decree requires a PDP to provide a description of the processing activities and types of personal data processed, rather than stating the purpose(s) for processing the data.

9. Cross-Border Data Transfers have a legal definition and a registration requirement

Article 25 of the Decree defines a cross-border transfer of personal data as:

This definition includes the:

In the absence of further specification and relying on a literal reading of the wording in Article 25, a possible interpretation of this definition is that processing, outside of Vietnam, the personal data of Vietnamese citizens who live outside Vietnam would also qualify as a cross-border data transfer under the Decree. If this interpretation is correct, all foreign organizations or individuals processing personal data outside of Vietnam would be subject to the Decree’s “cross-border data transfer” requirements insofar as they process the personal data of Vietnamese citizens, even if the data never actually crosses Vietnam’s border. It should be noted, however, that the scope of the Decree, as stipulated in Article 1.2, only covers foreign agencies, organizations, and individuals that are in Vietnam or that directly participate in or are involved in personal data processing activities in Vietnam. This ambiguity may be clarified in a future guidance document.

Before a covered entity may transfer personal data out of Vietnam, the Decree requires that the entity must:

The DTA must contain the following information:

In light of the consent disclosure required as part of the DTA and in the absence of further regulatory guidance, it seems that consent is the only basis for cross-border transfers. In addition to all requirements for a valid consent, in the context of cross-border transfers, the consent shall include a clear explanation of the feedback mechanism and the available procedures for lodging complaints in the event of incidents or requests, ensuring a comprehensive understanding for the individuals involved.

The MPS will conduct inspection of the DTA annually unless a violation, data incident, or leakage occurs. The MPS may cease transfers in cases where:

It should be noted that data localization is separately governed under Decree No. 53/2022/ND-CP, which implements the Law on Cybersecurity. The decree applies to both domestic and foreign companies operating in Vietnam’s cyberspace, specifically those providing telecom, internet, and value-added services that collect, analyze, or process private information or data related to their service users. According to the decree, these companies must store the data locally and have a physical presence in Vietnam. They are also required to retain the data for a minimum of 24 months. The types of personal data subject to localization include “(i) personal information of cyberspace service users in Vietnam in the form of symbols, letters, numbers, images, sounds, or equivalences to identify an individual; (ii) data generated by cyberspace service users in Vietnam, including account names, service usage timestamps, credit card information, email addresses, IP addresses from the last login or logout session, and registered phone numbers linked to accounts or data; (iii) data concerning the relationships of cyberspace service users in Vietnam, such as friends and groups with whom these users have connected or interacted.” (Article 26, Decree 53). The governing authority responsible for these regulations is A05 as well.

However, it remains unclear whether personal data falling within the scope of Decree 53 may be transferred cross-border once all requirements are fulfilled, including obtaining valid consent from data subjects. It is possible that the regulations will be interpreted strictly to prohibit cross-border transfers of such data.

10. Specific Requirements for Children’s Personal Data

Like the GDPR, Article 20 of the Decree provides special protection for children’s personal data, with a focus on safeguarding their rights and best interests. However, the age threshold for obtaining valid consent differs between the two laws. In Vietnam, the Decree requires the consent of a parent or legal guardian and of children aged seven or older, while the GDPR allows only individuals aged 16 or older to consent independently to the processing of their personal data in the context of information society services (a threshold that Member States may lower to no less than 13).

It is important to note that in Vietnam, children under the age of 16 are not considered to have legal capacity, meaning that they cannot legally enter into contracts on their own behalf except in exceptional cases. As such, the effect of the child’s consent absent that of a parent or legal guardian is not entirely clear, although the requirement to obtain consent from the child was likely included in the Decree to reflect the child’s opinion on the processing of their personal data.

PDCs, PDPs, PDCPs, and TPs must verify the age of children before processing their personal data. However, the Decree does not explicitly provide an age verification process. Processing of children’s personal data must cease, and the personal data must be deleted irretrievably, where:

The Decree states that only the child’s parent or legal guardian can withdraw consent for the processing of the child’s data, leaving it unclear whether the child can revoke their consent and have their data deleted if they wish to do so.

Conclusion

Vietnam’s new Decree on Personal Data Protection marks a significant milestone in protecting personal data in the country. The Decree introduces key concepts and principles of personal data protection, and sets out specific requirements for data processors and controllers. It also establishes a regulatory framework for obtaining consent for data processing activities, cross-border data transfers, and children data protection, which can contribute to safeguarding the privacy and security of individuals’ personal data.

While the Decree addresses many of the current challenges facing personal data protection in Vietnam, there are still gaps that need to be addressed in forthcoming guiding documents, including the lack of a specific procedure for handling complaints related to personal data protection violations, the conflicting provisions on the sale of personal data, the need for clear guidelines and requirements for cross-border data transfers, and a more defined fine structure. Future guidance should also address automated processing and establish rules for biometric data. As Vietnam continues to develop its data protection laws, it will be important to address key issues such as automated personal data processing, biometrics and facial recognition, global baseline standards for data transfers, and the need to balance business development with data protection.

In conclusion, Vietnam’s commitment to personal data protection and privacy is a crucial development in the digital age. As the country continues to strengthen its data protection framework, it will be interesting to see how it aligns with, and contributes to, emerging frameworks in the region and around the world.

Editors: The success of this article would not have been possible without the dedicated efforts of Dominic Paulger, Josh Lee Kok Thong, and Isabella Perera, as well as the tremendous encouragement of Dr. Gabriela Zanfir-Fortuna from the Future of Privacy Forum.

Analysis of a Decade of Israeli Judicial Decisions Related to Data Protection (2012-2022)

Adv. Rivki Dvash with the assistance of Mr. Guy Zomer1

Background

The Future of Privacy Forum’s office in Tel Aviv (Israel Technology Policy Institute – ITPI) sought to examine the judicial decisions in civil actions under Israel’s Privacy Law, which includes rules that regulate data protection. We examined the extent to which the general public demands protection of the right to privacy through judicial proceedings. We also analyzed the privacy and data protection issues that concern the public enough to appeal to the court, as well as identified any patterns in the appeals.

It is important to note that there is a contradiction inherent in taking civil action to remedy privacy and data protection violations, since appealing to judicial bodies brings attention to and publicly catalogs the disputes. 2 As such, there is sometimes an interest in not pursuing these matters in order to prevent additional publication or exposure of information that could increase the harm of the initial violation of privacy. Accordingly, the data gathered in this analysis does not necessarily reflect the full extent of the public’s interest in protecting privacy, but rather the cases in which individuals chose to seek judicial remedy under the Privacy Law.

In order to examine all of these cases, we asked Mr. Guy Zomer of Octopus – Public Information for All (R.A.), an organization that works to make public information, including information related to judicial proceedings, accessible through the Tolaat Hamishpat, to compile all the rulings since 2012 that mention privacy violations and retrieve relevant metadata for our analysis.

The overview below highlights the information and insights gathered from the metadata.

Methodology

Collection of rulings from the Nevo website

In order to locate rulings related to privacy violations, we queried all published rulings issued from January 1, 2012, to December 31, 2022, to find those that included a reference to Section 2 of the Privacy Law, 5741-1981 (hereinafter “the Law”), which defines an invasion of privacy and what constitutes a civil tort (and a criminal offense). The dataset only includes rulings issued in ordinary courts (magistrate, district, and supreme), and not those issued in special courts such as the Family Court and the Labor Court.

Initial screening

Since we wanted to concentrate on civil proceedings to discover common patterns, we removed criminal judgments and appeal proceedings from the dataset. We also chose to examine and compare decisions related to class actions separately from other civil proceedings.

We identified a total of 293 judgments issued in civil lawsuits and 29 judgments in class actions that referred to privacy violations.

Data collection

The dataset of civil claim decisions related to privacy violations initially only contained primary data such as the opening and closing dates of proceedings and the amount of the claims. We then added the following secondary data:

  1. The additional grounds in the civil lawsuit (defamation, spam, etc.), if any;
  2. The specific grounds for which the claim was filed (in other words, which subsection of Section 2 is used), even if the court did not recognize the requested cause or all the grounds for which the claim was filed;
  3. The relationship between the plaintiff and the defendant (neighbor, employer-employee3, family, etc.);
  4. Whether the plaintiff claimed concrete damage or compensation without proof of damage;
  5. Whether the court recognized defense claims (this refers to the acceptance of defense claims in a judicial decision, and not to the fact that the defending party raised them);
  6. Who won the lawsuit;
  7. The amount of compensation mandated due to the violation of privacy;
  8. The amount of expenses that have been mandated; and
  9. The total amount of compensation that was mandated, including expenses or other grounds.

We examined class action cases separately from civil lawsuits since class actions focus more on potential harm to a group of people rather than an individual and the monetary compensation is structured differently with three components: individual winnings, group winnings, and lawyer fees, which are higher than is usually customary and serve as an incentive to file class actions.

Preliminary research findings

1) It should be noted that the data we examined relate only to published judgments. We have yet to learn the number of relevant claims in which the proceedings were halted for various reasons (such as a settlement, the plaintiff’s failure to pursue the proceedings, or a closed-door proceeding). Given that there is no labeling of privacy protection procedures in the Net HaMishpat (the computerized system for managing court cases in Israel), it is impossible to locate such information.

2) There is a small number of verdicts related to privacy violations, with only several dozen privacy cases each year. In comparison, in 2019, about 200,000 cases were closed in the Magistrates’ and District Courts. 4 Furthermore, in 2020, about 192,400 cases were closed in these courts. 5 In other words, judgments in matters of privacy in Israel constitute a negligible percentage of all civil proceedings.


3) We looked at the approximate weight of published privacy violation claims as a percentage of total published civil lawsuits over several years to see whether there are any patterns. Although this method is not statistically accurate, it is still useful to examine the variable ratio between all judgments and privacy judgments published in Nevo.

However, even in the test mentioned above, we could not locate or indicate a clear trend, as seen below.


Findings

Civil Lawsuits

1. In all the cases, except for one,7 the plaintiffs preferred to claim compensation without proof of damage under section 29A of the Law.

2. The most common issue in civil lawsuits is the photographing of a person and the placement of cameras in public, and sometimes even private, spheres, accounting for 5.1% of claims.

3. We did not find any civil lawsuits for torts arising from database-related privacy violations. The initial assumption was that such claims would be found in class actions (see below).

4. Civil lawsuits for privacy violations were generally connected to legal claims for other torts. Less than 20% of the claims filed for privacy violations were filed as a standalone privacy claim (17%).

5. 19.8% of plaintiffs chose to file their claim in “Small Claims Courts,” which allow for relatively quick and no-frills compensation in an amount of up to NIS 36,400 (roughly USD 10,000).

6. The main ground for civil lawsuits is the “spying on or tracing of a person,” or other harassment. This ground appears in 36.9% of civil court rulings. To illustrate how dominant this ground is, the second most common ground (photographing a person without their permission) is cited in only 16% of all judgments.


7. The most common relationship between plaintiffs and defendants is a consumer relationship (24%) or a neighbor’s dispute (21.8%). A citizen’s claims against the authorities account for 8.9% of all claims, with the leading cause of action for this type of relationship being a breach of the confidentiality obligation established by the Law (40%).

8. Although privacy violations from media exposure create significant harm due to their broad exposure of information, only a low percentage of filed claims are due to this type of violation (7.5%). Additionally, claims based on this type of violation are always accompanied by other claims, such as defamation. Generally, defamation claims appear alongside privacy violation claims (52%).

9. 9.9% of privacy claims also involved spam claims filed under Section 30A of the Communications Law. This finding is interesting because during the legislative process for spam regulations, it was determined that they should be incorporated into the Communications Law instead of the Privacy Law. Regardless, even in decisions that recognized both privacy and spam violations, the compensation amounts remained extremely low (no more than a few thousand shekels).

10. In most cases (57.3%), the plaintiff won the claim, compared to 34.4% of cases in which the defendant won (in the remaining claims, there was no definitive decision). However, a deeper examination of these claims shows that in only 46.7% of them was compensation awarded for the privacy violation. In other words, sometimes the plaintiff won the case, but not on the grounds of the privacy violation, or general compensation was awarded without specific reference to the privacy violation.


11. In almost a quarter of the rulings (24.5%), the court recognized legal defense protections under the Law. 9 The most commonly recognized defense (40.3%) is “legitimate personal interest” (section 18(2)(c)).

Class Actions

12. Class actions related to privacy violations (29 cases) account for a small number of all class actions (6493 cases). However, the relative share (4.5%) is larger than the ratio of civil privacy violation claims to all civil claims (about 0.09%). This larger relative share is even more significant given that privacy violation class actions in Israel are more limited tools than civil lawsuits, since class actions can only apply to the specific types of claims listed in the second addendum to the Class Actions Law, 5766-2006. 10

13. Most of the class actions that include grounds for privacy violations are also related to consumer protection.

14. Spam violations constitute the additional (or, more precisely, the primary) ground in a significant share of privacy violation class actions (69%). Four cases (15.4%) also mentioned the issue of registering the databases that are the subjects of the claims.11 Furthermore, in four cases (15.4%), it was claimed that the information security of the databases in question was compromised.

15. In 17.2% of privacy violation cases, the court rejected the motion to file a class action.

16. Of the 29 cases in which a judgment was given (including those in which the court rejected the motion to file a class action), the court approved a settlement in 41.4% of cases and granted the plaintiff’s motion for leave in 37.9% of cases.


17. 69.2% of claims ended in favor of the plaintiff, and only about 26.9% of the decisions favored the defendant, with plaintiffs liable for expenses in only four cases (15.4%).


Conclusion

Despite the difficulty in getting clear insights into privacy violation civil lawsuits and class actions due to the scarcity of rulings in this area, it is still necessary to examine these decisions.

The small number of claims in this area may indicate the public’s lack of interest in exercising its right to compensation when privacy violations occur. Part of this disinterest is likely due to the desire to prevent additional publication or exposure of information that could increase the harm from the initial privacy violation. Interestingly, the larger share of privacy violation class actions as a percentage of all class action lawsuits (compared to civil lawsuits) indicates that, given a larger financial incentive and a decreased risk of exposure of individuals’ personal information, the willingness to file lawsuits may increase. This tentative hypothesis is supported by the higher numbers of class action and civil lawsuits related to spam violations, both of which have high compensation potential and do not reveal additional personal information about plaintiffs. However, given the small absolute number of both class action and civil lawsuits related to privacy violations, more research is needed to fully examine the motivations of plaintiffs.

Even with the small number of claims, there are still several interesting findings, including clarity about the types of privacy violations that concern the public. For example, it is evident that plaintiffs mostly bring claims related to neighbor disputes and the placement of surveillance cameras in public spaces. The research also shows that, despite the higher potential for privacy violations by state authorities and the greater harm from violations of the database-related provisions of the Law, there are almost no lawsuits concerning these issues. One potential hypothesis for the lack of these claims is that power gaps between citizens and authorities, as well as between data subjects and database owners, disincentivize lawsuits. Although class actions can strengthen the power of the consumer, they still require proof of damage and cannot be filed against the state.

In conclusion, it is impossible to point to a change or a clear trend of citizens exercising their right to privacy in civil lawsuits over the past decade.

Editor: Isabella Perera

This text has been translated and adapted into English from the original report published on January 30, 2023, available in Hebrew following this link.


1 Thanks to Adv. Limor Shmerling-Magazanik, former Director of ITPI, for her comments on this report.

2 In Israel, the default is that legal proceedings are published stating the parties’ names.

3 It should be noted that even in civil proceedings in ordinary courts (not the Labor Court), we still found claims related to employee-employer relationships.

4 See Annual Report 2019 – Court Administration (in Hebrew), pp. 25 and 37. In the district courts, 8,278 civil cases were closed, and in magistrates’ courts, 191,444 such cases were closed.

5 See Annual Report 2020 – Court Administration (in Hebrew), pp. 25 and 37. In the district courts, 7,578 civil cases were closed, and in magistrates’ courts, 184,874 such cases were closed.

6 We did not include 2022 because there was a change in the classification of cases in civil lawsuits that altered how the selected group was sampled.

7 Civil Action (Magistrate court – Haifa) 54043-11-12 Naor v. Clal Pension and Provident Fund Ltd. (11/4/2014) (in Hebrew), in which the plaintiff lost.

8 As of January 2023.

9 Section 18 of the Privacy Law.

10 Such as dealers, banking corporations, financial services providers, etc.

11 In Israel, there is still an obligation to register databases.

A New Paradigm for Consumer Health Data Privacy in Washington State

The Washington ‘My Health, My Data’ Act (MHMD or the Act) establishes a fundamentally new legal framework within U.S. law to regulate the collection, use, and transfer of consumer health data. Signed into law by Governor Inslee on April 27, MHMD was introduced by request of the Washington Attorney General in response to the Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization (2022) (Dobbs).

While drafting quirks have caused uncertainty around the effective dates of some of MHMD’s provisions, in general the Washington Legislature seems to intend for MHMD’s substantive data privacy requirements to come into effect on March 31, 2024 (or June 30, 2024 for small businesses). Other provisions, including the Act’s sections on geofencing and enforcement, will take effect in 90 days’ time.

This post highlights six aspects of MHMD that could have paradigm-shifting consequences for data privacy regulation. For a more in-depth analysis of the Act, check out the Future of Privacy Forum’s MHMD Policy Brief.

1. ‘My Health, My Data’ applies to organizations that collect, process, or transfer covered data in any way that touches Washington State:

MHMD will impact a broad range of entities, both within and outside of Washington State. The Act imposes obligations on regulated entities that do business in Washington or that “target” products or services at Washington consumers. Such targeting likely includes actions as simple as making a business website available to access from within Washington or advertising in Washington. In addition, MHMD applies to businesses and nonprofit organizations of any size that collect, hold, or transfer consumer data that has “any operation” performed on it in the state at any point. Significantly, MHMD defines “consumer” as any natural person whose health data is processed in “any manner” within the state. Therefore, if consumer health data is at any point accessed in, travels through, or is stored in Washington State, MHMD is likely to apply. Unlike many other U.S. privacy laws, the Act does not exempt entities covered by other legal regimes, including the Health Insurance Portability and Accountability Act of 1996 (HIPAA), the Gramm-Leach-Bliley Act (GLBA), and the Family Educational Rights and Privacy Act of 1974 (FERPA), but instead exempts only the data regulated by those laws.
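The scoping questions described above can be read roughly as follows in a deliberately simplified, non-authoritative Python sketch; the function and its inputs are hypothetical, omit the nuance a real legal analysis requires, and are not legal advice.

# Deliberately simplified, non-authoritative sketch of the MHMD scoping
# questions discussed above. Hypothetical function and inputs; not legal advice.

def mhmd_may_apply(does_business_in_washington: bool,
                   targets_washington_consumers: bool,
                   health_data_touched_in_washington: bool) -> bool:
    """Rough screen: any one of these nexus points suggests MHMD may apply.

    health_data_touched_in_washington should be True if consumer health data
    is at any point accessed in, travels through, is stored in, or otherwise
    has "any operation" performed on it within Washington State.
    """
    return (does_business_in_washington
            or targets_washington_consumers
            or health_data_touched_in_washington)

# Example: an out-of-state retailer whose analytics vendor stores data in Washington.
print(mhmd_may_apply(False, False, True))  # True -> further MHMD analysis warranted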

2. ‘My Health, My Data’ defines “health data” far more broadly than any other U.S. privacy framework:

MHMD regulates collection and transfers of “consumer health data,” defined as any form of “personal information” that “identifies the consumer’s past, present, or future physical or mental health status.” The Act provides a non-exhaustive list of 13 categories of information that constitute de facto “health status” under the Act, including biometric data, “[p]recise location information that could reasonably indicate a consumer’s attempt to acquire or receive health services or supplies,” and health information that is inferred from non-health data. This definition of health data is far broader than the definitions established by other contemporary legal frameworks, and will encompass information that is not typically treated as health data. Any entity with a nexus to individually-identifying health information should assess potential operational impacts of MHMD.

While more expansive than other legislative frameworks, one significant aspect of MHMD’s definition of “consumer health data” aligns with the Federal Trade Commission’s (the Agency) approach to health information in its recent enforcement actions against GoodRx and BetterHelp. In its complaint against BetterHelp, the Agency alleged that the company wrongfully disclosed consumer information, including email addresses, IP addresses and unique advertising IDs, that revealed that consumers had accessed a website seeking mental health care services. Similarly, MHMD’s definition of “personal information” includes “data associated with a persistent unique identifier, such as a cookie ID, an IP address, a device identifier, or any other form of persistent unique identifier.” These definitions demonstrate an emerging regulatory attention to ways in which common online activities and user data can be processed to reveal sensitive health information.

3. ‘My Health, My Data’ establishes recurring notice and consent obligations for the collection, transfer, sale, and secondary use of health data:

MHMD requires businesses to make disclosures and obtain separate consumer consent for any collection and transfer of health data beyond what is necessary to provide a consumer-requested product or service. For the “sale” of health data, the Act requires regulated entities to obtain “valid authorization,” a more exacting form of consent that expires after one year. MHMD defines “sale” broadly to include exchanges for valuable consideration, and will likely implicate current digital advertising practices for covered entities. 

While MHMD’s opt-in framework will provide individuals with increased ability to control how their health data is collected and transferred, users will likely face a significant increase in the volume of notices and pop-ups when accessing many common products and services. Furthermore, since MHMD relies on a “notice and consent” framework rather than creating new baseline rules around how entities may collect, use and transfer covered health data, the efficacy of the Act’s framework will depend on whether users are able to successfully navigate this new menu of consent options while obtaining desired products and services. 

4. ‘My Health, My Data’ creates consumer rights of access and deletion that go beyond those established by other state privacy laws:

MHMD creates several consumer rights that have become standard in global privacy laws, including the right to know how an organization uses personal data, the right to access that data, and the right to have covered health data deleted. However, MHMD does not contain common exemptions for these rights such as for protecting trade secrets or for complying with legal obligations. 

Furthermore, the Act’s rights of access and of deletion are significantly different from those in comparable state laws, and will require modifications to organizations’ compliance programs. For example, MHMD’s right to access not only gives users the right to obtain a copy of their data, but also to procure a list of the names and email addresses of third parties with whom their data was shared or sold. The Act’s deletion right gives individuals the right to delete their health data from all records managed by a regulated entity, including from archived or backup systems and from within the records of processors, contractors, and other third parties, with no exception for data that is retained in order to comply with deletion requests on an ongoing basis.

5. ‘My Health, My Data’ places novel restrictions on the geofencing of a wide-ranging set of facilities that provide in-person “health care services:”

MHMD forbids both covered entities and individual actors from geofencing physical “health care facilities” in order to identify individuals, collect health data, or send health data or health-service related messages to consumers. This restriction may impact several common practices, including security operations and the use of push notifications for advertising consumer goods. Furthermore, MHMD’s far-reaching definition of “health care services” means these restrictions could include geofencing conducted in order to collect data from or advertise to individuals visiting gyms, complexes that include healthcare offices, and general consumer goods stores.

6. ‘My Health, My Data’ provides for enforcement through a private right of action:

MHMD gives the Washington Attorney General authority to enforce the Act and also creates a private right of action by establishing that a violation of the Act is an unfair or deceptive trade practice under the Washington Consumer Protection Act (WCPA). While MHMD’s inclusion of a private right of action sets it apart from many other state privacy laws, entities should note that MHMD does not provide for statutory damages. Instead, MHMD grants plaintiffs the right to sue to recover for any injury to their “business or property” caused by a violation of the Act, and gives courts the discretion to award treble damages up to $25,000. While the Washington Attorney General’s office can likely issue interpretive guidance, the opportunity for private litigation suggests that judges are likely to resolve drafting ambiguities.

Conclusion

MHMD will set new standards for the protection of non-HIPAA covered personal health data. The Act’s broad scope and exacting requirements could create compliance hurdles for a wide range of covered entities, and its private right of action provides a private enforcement mechanism not usually available under U.S. privacy laws. Organizations of all sizes, even those who operate outside of Washington State, should investigate whether they are, or could become, covered by the Act and understand MHMD’s requirements. Likewise, individuals should determine when their data is covered by MHMD and what rights they are afforded under the Act. Finally, policymakers working on these issues should consider not only the scope of new health privacy legislation, but also how new regulations will interact with existing frameworks, including the sensitive data protections established under the various state comprehensive privacy laws.

FPF Announces Recipients of the Third Annual Award for Research Data Stewardship

Today, the Future of Privacy Forum (FPF) — a global non-profit focused on data protection headquartered in Washington, D.C. — announced the winners of the third annual Award for Research Data Stewardship.

FPF is a long-standing advocate for privacy-protective data sharing by industry to the research community to advance scientific insights and drive progress in medicine, public health, education, social science, and many other fields. FPF established the Award for Research Data Stewardship in 2020 to recognize companies and academics that demonstrate innovative approaches and best practices for sharing private, corporate data to advance scientific knowledge. 

With the third annual Award for Research Data Stewardship, FPF honors two teams of researchers and corporate partners for their commitment to privacy and ethical uses of data in their efforts to help with emergencies related to diseases and natural disasters. The winning team is a collaboration between Mayo Clinic researchers led by Rozalina McCoy, MD, MS, and health services company Optum. The honorable mention is a collaboration between Assistant Professor Xilei Zhao, PhD, at the University of Florida and location intelligence company Gravy Analytics. These partnerships were awarded based on the strength of their research, adherence to privacy protection in the sharing process, and the companies’ commitment to supporting academic research.

“Our panel of judges were incredibly impressed reading through each meaningful and forward-thinking data-sharing partnership,” said Shea Swauger, FPF’s Senior Researcher for Data Sharing and Ethics. “Data plays a significant role in social progress. When companies share data responsibly with academic researchers, they can unlock new scientific insights, expand human knowledge and provide solutions to society’s most difficult challenges.”

Winner: Mayo Clinic and Optum:
“Predicting the Risk of Severe Hypoglycemic and Hyperglycemic Events in Adults with Diabetes”

Honorable Mention: University of Florida Transportation Institute Partnership with Gravy Analytics: “Using Location Analytics to Enhance Natural Disaster Emergency Response Planning and Management”

The Award is a part of FPF’s “Corporate Data Sharing for Research: Next Steps in a Changing Legal and Policy Landscape” project to accelerate the safe and responsible sharing of privacy-protected data between companies and academic researchers. This project is supported by the Alfred P. Sloan Foundation, a non-profit grantmaking institution whose mission is to enhance the welfare of all through the advancement of scientific knowledge.

FPF’s Award Ceremony will be held virtually on May 10, 2023, and is free for anyone interested in learning more about these winning programs and data sharing. Register for the event to RSVP.

FPF at the 2023 IAPP Global Privacy Summit

Earlier this month, IAPP held its annual Global Privacy Summit (GPS) in Washington, DC. FPF played a major role, bringing together a team of seven renowned privacy experts across 11 panel discussions and various peer-to-peer roundtables on topics ranging from U.S. privacy law to AI technology and regulation to regional contractual frameworks for data transfers. FPF remained active throughout these expert discussions and engaged with FPF members at networking events and meetings, as well as at our expo booth, during the three-day conference.


Most notably, our CEO Jules Polonetsky was the recipient of the 2023 IAPP Leadership Award, given to individuals who “demonstrate an ongoing commitment to furthering privacy policy, promoting recognition of privacy issues, and advancing the growth and visibility of the profession.” Jules has served as FPF’s CEO for the last 15 years.

“The Privacy Leadership Award is an incredible recognition, I am honored. I thank the team at IAPP for the award and my staff at FPF, who continue serving as global privacy leaders and publishing influential scholarship that is imperative to advancing privacy safeguards, protections, and policy.”

Jules Polonetsky, CEO, FPF


On the first day of the conference, FPF, in partnership with GW Law, hosted a reception featuring Chairperson Haksoo Ko of the Personal Information Protection Commission (PIPC) to welcome privacy professionals to Washington, D.C. In a packed room, Jules offered opening remarks, and Chairperson Ko delivered a keynote address to guests.


U.S. Privacy Law at a Crossroads: The Past, Present and Future

In an engaging conversation, FPF CEO Jules Polonetsky was joined by an expert panel of speakers, including Elliot Golding (Partner, McDermott Will & Emery), Alastair Mactaggart (Board Member, California Privacy Protection Agency; Board Chair, Founder, Californians for Consumer Privacy), and Lydia de la Torre (Board Member, California Privacy Protection Agency; Partner, Golden Data Law). GPS attendees heard the panel discuss relevant issues such as U.S. employment laws and data, state legislation from California and Utah (notably Utah’s social media bill), children’s privacy, and more.

“We need to get legislation done in the responsible ways that California did; otherwise we lean towards a poorer direction”

Jules Polonetsky, CEO, FPF

What Are the Long-term Implications of the Trans-Atlantic Data Privacy Framework?

Former FPF Senior Counsel Sebastião Barros Vale discussed the long-term implications of the Trans-Atlantic Data Privacy Framework (TADPF) alongside experts Paul Breitbarth (Senior Fellow, Maastricht University Faculty of Law; Data Protection Lead, Catawiki), Caitlin Fennessy (Vice President & Chief Knowledge Officer, IAPP), and Alexander Joel (Tech, Law & Security Program, American University Washington College of Law). In this discussion, they touched on how the TADPF is an important chapter in the ongoing story of trans-Atlantic data flows, why privacy professionals should seek to enhance mutual understanding among governments, companies, and the public to help lay the groundwork for potential solutions, and more. View the presentation.

Great Expectations: Will the EU’s Data Strategy Laws Change the Digital World?

This panel, moderated by FPF VP for Global Privacy Dr. Gabriela Zanfir-Fortuna, discussed the state of play in Brussels with regard to the EU’s new generation of data laws, such as the DMA, DSA, DGA, Data Act, and AI Act. She was joined by renowned global experts Brando Benifei (Member of the European Parliament and co-Rapporteur of the AI Act), Irene Roche Laguna (Deputy Head of Unit, Digital Services, European Commission), and Wojciech Wiewiórowski (European Data Protection Supervisor).

Attendees learned how the GDPR interacts with the EU’s new generation of data laws and how these data laws coming from Brussels may impact jurisdictions around the world.

“Law is as good as its enforcement is”

Dr. Gabriela Zanfir-Fortuna, VP for Global Privacy, FPF

Oh, the Places We Might Go: U.S. Privacy Law and Regulation

In this standing-room-only, Dr. Seuss-themed panel, FPF Senior Counsel Tatiana Rice discussed the Washington, D.C. data privacy and security landscape, reviewing the significant movement in privacy in 2022 and assessing developments from the FTC, Congress, the White House, the Supreme Court, and more. Tatiana was joined by D.C. privacy experts Brandon Pugh (Director and Senior Fellow, Cyber and Emerging Threats Team, R Street Institute), Divya Sridhar, Ph.D. (Director of Privacy Initiatives, BBB National Programs), and Cobun Zweifel-Keegan (Managing Director, D.C., IAPP).

The panel explored data minimization, a principle likely to appear in state and federal law and regulations, as well as federal agency action and enforcement trends. Notably, speakers discussed protecting vulnerable populations, specifically kids and teens, as attendees heard discussion surrounding age-appropriate design codes. View the presentation here.

The Tip of the AI Iceberg: Views on Bias, Digital Discrimination & Data Rights

On day three, attendees joined an early-morning panel with FPF Senior Policy Counsel Bertram Lee as he discussed views on bias, digital discrimination, and data rights with moderator Anupam Chander (Scott K. Ginsburg Professor of Law and Technology, Georgetown Law), Yvette Badu-Nimako (Interim Executive Director, VP, Policy, National Urban League, Washington Bureau), Travis Hall (Acting Deputy Associate Administrator, National Telecommunications and Information Administration), and Ben Winters (Senior Counsel, Electronic Privacy Information Center (EPIC)). 

Attendees heard Bertram and the expert panel explore AI systems’ risks to privacy, biased and discriminatory outcomes of algorithms, and responsible AI systems, bringing a local D.C. angle to the conversation by discussing how District housing authorities and law enforcement use AI systems that can be inherently biased and harmful to underserved areas of the city.

“I believe that AI will change the world for the better, but that doesn’t mean that it shouldn’t be accountable to the many communities, and particularly underserved communities, that are impacting their lives. It’s important for us to think about that in the privacy community – how do we mitigate those harms? How do we design responsibly to offset those harms?”

Bertram Lee, Senior Policy Counsel, FPF

Preparing for the Next Generation of AI Tech and Regulation as Privacy Pros

In FPF Senior Policy Counsel Bertram Lee’s second panel of the day, he was joined by Nia Castelly (Co-Founder, Legal Lead, Google), Che Chang (Deputy General Counsel, OpenAI), and Filippo Raso (Senior Associate, Hogan Lovells). In another standing-room-only session, Bertram and the panelists discussed the latest on AI research and development, recent developments on AI commercialization, AI regulatory policy developments, and implementing AI governance.

“AI regulation is not going anywhere. It’s only here to stay”

Bertram Lee, Senior Policy Counsel, FPF

Not-so-standard Contractual Clauses: Comparing Global Data Transfer Tools

An engaging discussion on trans-border data flows, moderated by FPF Senior Counsel for Global Privacy Lee Matheson, took place on day three of the conference. Lee was joined by Mariano Peruzzotti (Partner, Ojam Bullrich Flanzbaum), Isabelle Vereecken (Head of Secretariat, European Data Protection Board), and Yeong Zee Kin (Deputy Commissioner, Personal Data Protection Commission of Singapore). This global panel covered three regional model contractual frameworks for data transfers: the Ibero-American model clauses, the Association of Southeast Asian Nations (ASEAN) MCCs and other Southeast Asian national rules, and the EU’s SCCs.

When asked by an attendee about “the best scenario for what can be achieved in a dialogue between the EU and other regions,” the panelists offered differing perspectives. There may never be “one set of clauses to rule them all” because of cultural and legal differences, but the dialogue reveals that, at least in some ways, data protection principles related to transfers are moving towards convergence. There can be valid discussions about interoperability for different regional sets without having to agree on one set that will apply everywhere. View a recording of the session here.

A Conversation with the U.S. Ambassador for Cyberspace and Digital Policy

To close out this exciting conference, FPF CEO Jules Polonetsky sat down with Ambassador Nathaniel Fick, U.S. Ambassador for Cyberspace and Digital Policy, in a conversation highlighting several topics, including U.S. digital policy priorities globally, AI systems’ risks to privacy, biased and discriminatory outcomes of algorithms, responsible AI systems, and more.

“Data protection is increasingly becoming the law of everything.”

Jules Polonetsky, CEO, FPF

We hope you enjoyed this year’s IAPP Global Privacy Summit as much as we did! If you missed us at our booth, visit FPF.org for all our reports, publications, and infographics. Follow us on Twitter and LinkedIn, and subscribe to our newsletter for the latest.