New Report on Limits of “Consent” in the Philippines’ Data Protection Law
Introduction
Today, the Future of Privacy Forum (FPF) and Asian Business Law Institute (ABLI), as part of their ongoing joint research project: “From Consent-Centric Data Protection Frameworks to Responsible Data Practices and Privacy Accountability in Asia Pacific,” are publishing the sixth in a series of detailed jurisdiction reports on the status of “consent” and alternatives to consent as lawful bases for processing personal data in Asia Pacific (APAC).
This report provides a detailed overview of relevant laws and regulations in the Philippines, including:
notice and consent requirements for processing personal data;
the status of alternative legal bases for processing personal data which permit processing of personal data without consent if the data controller undertakes a risk impact assessment (e.g., legitimate interests); and
statutory bases for processing personal data without consent and exceptions or derogations from consent requirements in laws and regulations.
The findings of this report and others in the series will inform a forthcoming comparative review paper which will make detailed recommendations for legal convergence in APAC.
The Philippines’ Data Protection Landscape
The main personal data protection legislation in the Philippines is Republic Act No. 10173, better known as the Data Privacy Act of 2012 (DPA), which was passed in 2012 but only fully took effect in September 2017.
The DPA applies broadly to individuals and organizations that process the personal information of Philippine citizens, even if the individual or organization does not have a legal presence in the Philippines. For purposes of the DPA, “processing” refers to any operation(s) performed upon personal information and includes collection, use, and disclosure of personal information, among others. The DPA also provides a number of exceptions for the processing of personal information by public authorities for various purposes.
The stated policy aim of the DPA is to protect the fundamental human right to privacy of communication while ensuring the free flow of information to promote innovation and growth.
To that end, the DPA provides data subjects with a number of rights over their data, including rights to be informed about how their personal information is processed, to correct personal information about them, and to order the blocking, removal, or destruction of their personal information. Notably, the DPA was also the first data protection law in APAC to provide data subjects with an express right to data portability, which applies where personal information is processed by electronic means and in a structured and commonly used format.
The DPA also establishes the National Privacy Commission (NPC), an independent body that is responsible for administering and implementing the DPA. The NPC’s role as defined by the DPA is multifaceted and includes responsibilities to, among others: (1) advise the Government and the public and private sectors on personal data protection-related matters; (2) ensure that regulated entities comply with the DPA’s requirements, using enforcement measures if necessary; and (3) align the Philippine data protection framework with international standards and cooperate with peer regulators in other jurisdictions.
Since its establishment, the NPC has been active in issuing guidance on the DPA. One of the NPC’s first acts was to issue the Implementing Rules and Regulations to the DPA, which took effect in September 2016 and provided clarification as to how the DPA’s requirements apply in practice. Since then, the NPC has also provided further guidance in the form of circulars, advisories, and notably, 307 “advisory opinions” published on the NPC’s website, in which the Commissioner provides guidance on how the NPC would interpret and apply the DPA’s requirements in a wide range of situations, often in response to questions from businesses and members of the public.
Role and Status of Consent as a Basis for Processing Personal Data in the Philippines
Consent is one of several equivalent legal bases for processing personal information and sensitive personal information under the DPA. The alternative legal bases are similar to those under the GDPR and cover a range of situations where the processing of personal information is necessary for:
preparatory steps for, or fulfillment of, a contract;
vital interests of the data subject;
compliance with a legal obligation;
response to a national emergency; or
pursuit of the “legitimate interests” of the data controller or a third party, subject to a balancing test.
Alternative legal bases for processing sensitive personal information are also premised on necessity but are much stricter and generally only apply in narrow circumstances where either the data subject is incapable of giving consent (e.g., medical treatment or a threat to life and health), or where specific provisions of law stipulate that consent is not required but provide other safeguards for the sensitive personal information.
For purposes of the DPA, consent must be freely given, specific, and informed and must indicate that the data subject agrees to collection or processing of his/her personal information. The NPC has also clarified through an Advisory Opinion that it would not recognize implied, implicit, or negative forms of consent.
If an individual or organization wishes to rely on consent to process personal information, it must obtain consent from the data subject (or the data subject’s lawful representative) prior to collecting the personal information or, for non-sensitive personal information, either before or as soon as reasonably practicable after collection. Once obtained, the consent must also be recorded, whether by written, electronic, or other means.
Consent can also be withdrawn at any time, in which case processing of the personal information must cease unless the individual or organization can rely on an alternative legal basis for processing.
New Report on Limits of “Consent” in Australia’s Data Protection Law
Authors: Dominic Paulger and Elizabeth Santhosh
Elizabeth Santhosh is a current law student at Singapore Management University and an FPF Global Privacy intern.
Introduction
Today, the Future of Privacy Forum (FPF) and Asian Business Law Institute (ABLI), as part of their ongoing joint research project: “From Consent-Centric Data Protection Frameworks to Responsible Data Practices and Privacy Accountability in Asia Pacific,” are publishing the fifth in a series of detailed jurisdiction reports on the status of “consent” and alternatives to consent as lawful bases for processing personal data in Asia Pacific (APAC).
This report provides a detailed overview of relevant laws and regulations in Australia, including:
notice and consent requirements for processing personal data;
the status of alternative legal bases for processing personal data which permit processing of personal data without consent if the data controller undertakes a risk impact assessment (e.g., legitimate interests); and
statutory bases for processing personal data without consent and exceptions or derogations from consent requirements in laws and regulations.
The findings of this report and others in the series will inform a forthcoming comparative review paper which will make detailed recommendations for legal convergence in APAC.
Australia’s Data Protection Landscape
The cornerstone of Australia’s federal data protection framework is the Privacy Act 1988, which commenced in 1989 and gives effect to the Organisation for Economic Co-operation and Development’s (OECD) 1980 Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, as well as Australia’s obligations under international human rights law to protect privacy.
The Privacy Act originally applied only to the public sector, but subsequent amendments over its 33-year lifespan have extended the Act’s scope so that it now covers the public sector as well as private-sector organizations that either have an annual turnover of over AU$3 million or fall within certain prescribed industries.
Amendments made to the Privacy Act in 2010 established the Office of the Australian Information Commissioner (OAIC), which is responsible for, among other things, issuing guidance on how organizations can comply with the Privacy Act. The OAIC also investigates and resolves complaints concerning organizations’ personal information practices, including, where necessary, issuing formal decisions known as determinations.
Major reforms to the Act’s privacy protections in 2014 introduced a unified set of Australian Privacy Principles (APPs) applying to both the public and private sectors. Any organization that is covered by the Privacy Act must comply with the 13 APPs, which broadly establish rights and obligations for:
collection, use, and disclosure of personal information;
anonymous and pseudonymous personal information;
use of personal information for direct marketing;
cross-border transfer of personal information;
quality and security of personal information; and
access to and correction of personal information.
The latest amendments, in 2018, introduced a notifiable data breach scheme for organizations that are subject to security obligations under the Act.
In recent times, there has been significant discussion on the need to reform the Privacy Act, as well as Australia’s broader data protection framework, to respond to challenges to individuals’ privacy posed by the exponential growth in digital technologies, social media platforms, and the Internet of Things (IoT).
In 2019, the Australian Competition and Consumer Commission (ACCC) published the final report of its Digital Platforms Inquiry, which highlighted risks arising from the business models of Big Tech companies and recommended that the Australian government review the Privacy Act. This eventually led the Attorney-General’s Department (AGD) to release an issues paper in 2020 inviting public consultation on whether the Privacy Act and its enforcement mechanisms remain fit for purpose and on possible avenues for reform, followed by a discussion paper with more detailed proposals one year later.
Alongside public consultation on reform to the Privacy Act, the AGD has also held consultation on a new bill, the “Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021” (Online Privacy Bill) which, if passed, would complement the Privacy Act by introducing a binding online privacy code with which social media and other online platforms would have to comply, or face legal penalties. The status of the Online Privacy Bill is currently uncertain following recent federal elections in Australia.
Role and status of consent in the jurisdiction
Consent plays an important role in the Privacy Act and is relevant to the operation of a number of APPs.
Though consent is not required for all collection of personal information under the APPs, consent is required for the collection of certain prescribed categories of “sensitive” personal information, unless an exception applies.
Consent also functions as an exception that permits certain acts in relation to personal information that would otherwise be prohibited under the APPs – namely:
collection of personal information from a source other than the data subject;
use or disclosure of personal information for a different purpose than the purpose for which the personal information was originally collected;
use of personal information for direct marketing purposes; and
cross-border transfer of personal information, without taking reasonable steps to ensure that the overseas recipient of personal information does not breach the APPs in relation to that information.
The APPs also impose detailed notification and notice requirements which operate independently of consent requirements. Organizations that are subject to the Privacy Act are generally required to maintain a privacy policy providing information about the organization’s activities in relation to personal information as well as how individuals may exercise their rights under the APPs in relation to their information. Additionally, organizations are required by default to notify individuals of certain prescribed matters when the individual’s personal information is collected or as soon as reasonably possible after collection.
These existing consent and notification requirements have been the subject of much discussion during consultation on reform to the Privacy Act as there is widespread recognition that organizations over-rely on consent. The general direction of reform proposals seems to be in favor of strengthening the legal test for what constitutes valid consent, while at the same time reducing the frequency with which, or circumstances under which, individuals are asked to provide consent. However, there are signs stemming from the consultations on the potential reform of the Privacy Act that Australia may ultimately move away from a “privacy self-management” approach and towards an approach that places greater accountability on organizations by requiring that collection, use, or disclosure of personal information must be fair and reasonable in the circumstances. It remains to be seen how these proposals will evolve in the future.
California Age-Appropriate Design Code Aims to Address Growing Concern About Children’s Online Privacy and Safety
Authors: Chloe Altieri, Kewa Jiang
Kewa Jiang, CIPP/US, is a 2021 graduate of USC Gould School of Law and a Student Contractor with FPF’s Youth and Education Privacy team.
On May 26, 2022, AB-2273, the California Age-Appropriate Design Code Act (ADCA), unanimously passed the California Assembly and moved to the Senate for consideration. California Assembly Members Buffy Wicks (D-Oakland) and Jordan Cunningham (R-Templeton) proposed AB-2273 earlier this year. The bill is modeled after the United Kingdom’s Age Appropriate Design Code (AADC) and aims to regulate the collection, processing, storage, and transfer of children’s data. The bill takes a substantially different approach to youth privacy than the leading federal framework, extending heightened protections to individuals under age 18, covering entities that provide services likely to be accessed by a minor (even if the provider lacks actual knowledge that children use the service), and requiring age-appropriate language for privacy disclosures. The bill is moving through the California legislature at a time when young people are increasingly engaging with their peers online and using digital services for entertainment, educational, and other purposes. Given the rise in young users, there is growing international concern about children’s online data privacy and the effects online content can have on children’s mental and physical health.
President Joe Biden emphasized the increasing need for children’s online protections in his State of the Union Address on March 1, 2022. He called upon Congress to provide stronger privacy protections for children, ban targeted advertising to children, and hold social media companies accountable. The Federal Trade Commission (FTC) also emphasized the need for increased protections for youth in a policy statement released on May 19, 2022. The statement signaled the agency’s renewed focus on enforcing COPPA for all covered entities, including edtech vendors. Internationally, the United Kingdom enacted the Age Appropriate Design Code (AADC), which came into force in September 2021. The UK AADC informed the creation of the ADCA and many of the questions or amendments surrounding compliance have arisen from the complications of transplanting the UK’s European-style regulatory approach into the US legal framework.
Proponents of the California ADCA are partly motivated by concerns regarding youth safety, privacy, and mental health; the bill would significantly change the regulation of many online services in the US. The bill is scheduled for a Senate committee hearing on June 28, 2022. If the bill successfully passes the Senate and is signed into law by Governor Gavin Newsom, the law would come into effect on July 1, 2024. There remain numerous questions about compliance with, enforcement of, and the potential impact of the ADCA. The sections below provide an overview and analysis of the bill’s key provisions:
Covered Entities
Age Assurance and Knowledge Standard
Data Minimization
Default Privacy Settings
Enforcement and Regulation
Covered Entities
The ADCA would apply to businesses that provide “an online service, product, or feature likely to be accessed by a child.” This casts a wider net than COPPA’s “directed to children” standard. Under COPPA, under-age-13 users’ personal data is not typically afforded higher protection unless the service has actual knowledge that the user is a child or the service’s offerings are deemed child-directed through factors such as direct marketing, graphics, or music that appeals to children. Without additional guidance, the ADCA standard may be difficult to interpret or implement for covered entities that target a general audience and traditionally do not host content directed to minors. This uncertainty would likely be exacerbated by the ADCA’s applicability to teens; some organizations could struggle to distinguish between services likely to be accessed by 17-year-olds and services likely to be accessed only by those 18 and older.
Baroness Beeban Kidron, an architect of the UK AADC, argued during an April hearing on the California bill that the “likely to be accessed” standard is essential to the bill’s purpose because the standard incentivizes service providers to design a wide range of online products and services with youth protection in mind, not just services that know they have young users or that direct their services to kids and teens. The “likely to be accessed” standard could encompass online products and services that children regularly visit but that might otherwise not be covered under COPPA, like sites for video conferencing, online games, and social media.
Age Assurance and Knowledge Standard
The ADCA defines children as consumers under the age of 18, a higher age threshold than in any enacted child privacy law in the US. The ADCA definition differs from those of COPPA and the California Consumer Privacy Act (CCPA), which define children as under 13. The CCPA also requires covered entities to obtain affirmative opt-in authorization to sell the data of 13- to 16-year-old users. The ADCA would require covered entities to use age-appropriate language for privacy disclosures, and the increase in age aims to create age-appropriate regulations for all minors. However, grouping all children under 18 together may cause issues in implementation because the developmental needs and maturity of teenagers are vastly different from those of elementary school age children. The UK AADC implements age-appropriate protections through the use of age ranges. The California ADCA could benefit from a similar approach, which could improve the readability of privacy notices for young people while also incentivizing services to collect less identifiable data regarding user age.
Additionally, the ADCA would require covered entities to “establish the age of consumers with a reasonable level of certainty” appropriate to the risks or to apply the highest protections to all consumers. Age assurance and verification have been ongoing concerns as companies struggle with practical implementation. Some online services already engage in some form of age identification or inference, but some advocates have critiqued COPPA’s “actual knowledge” standard, arguing that it incentivizes websites for a general audience to simply not ask users’ ages. Others have criticized age-verification requirements, arguing that such mandates compel services to collect additional information about individuals.
The ADCA’s approach diverges from COPPA – the ADCA would essentially establish a “constructive knowledge” standard, creating liability if a service knew or should have known that children are likely to access its products or services given the circumstances, such as user data, online context, and marketing, but would not require services to collect additional personal information merely to infer user age. The ADCA standard is consistent with the UK AADC and may also be consistent with the CPRA’s knowledge standard. While the CPRA adopts an actual knowledge standard, it also states that a “business that willfully disregards the consumer’s age shall be deemed to have had actual knowledge of the consumer’s age.”
Data Minimization
The ADCA would establish strong data minimization requirements, prohibiting the collection, sale, sharing, or retention of personal information that is not necessary to provide the product or service. Children’s data that is necessarily collected may only be used for the reason for which it was collected. The ADCA would permit the collection of data solely for age verification purposes but would minimize the use of that data by prohibiting it from being used for any purpose other than verifying user age. Moreover, the ADCA states that “Age assurance shall be proportionate to the risks and data practice of a service, product, or feature.”
Default Privacy Settings
When a minor accesses digital services, the ADCA would require covered entities to configure “all default privacy settings offered by the online service, product, or feature to the settings that offer a high level of privacy protection offered by the business.” “Default” is defined by the ADCA to mean “a preselected option adopted by the business for the online service, product, or feature.” Examples of settings that would be disabled by default include features that profile children based on their behavior or browsing history, or that assume similarity to other children in order to offer detrimental material.
By default, the ADCA would bar a covered company from “collect[ing], sell[ing], or shar[ing] any precise geolocation information of children . . . unless the collection of that precise geolocation information is necessary.” Covered companies would be required to provide “an obvious sign to the child for the duration of that collection that precise geolocation information is being collected.” When any precise geolocation information is collected, it should only be for the limited time that the collection of the information is necessary.
Enforcement and Regulation
The ADCA would task the California Privacy Protection Agency (CPPA) with establishing the California Children’s Data Protection Taskforce (the “Taskforce”) and publishing privacy information, policies, and standards. The CPPA is the agency established by the CPRA to implement and enforce that law. Under the ADCA, the Taskforce would be responsible for adopting regulations by April 1, 2024 and providing compliance guidance. The Taskforce would be assembled by April 1, 2023 and consist of members appointed by the CPPA. Members would consist of “Californians with expertise in the areas of privacy, physical health, mental health, and well-being, technology, and children’s rights.” Companies would have three months to comply with regulations produced by the Taskforce before enforcement by the CPPA begins.
The ADCA would also require covered entities to undertake a Data Protection Impact Assessment (DPIA) for any product, service, or feature likely to be accessed by a child. This new requirement would be a large shift for entities, as they would also be required to submit the assessments to the CPPA for review every 24 months or before new features are offered to the public. DPIAs can strengthen enforcement of privacy laws, but they are resource-intensive both for the covered entities that prepare them and for regulators. The CPRA also requires risk assessments, but unlike the ADCA, it is not clear whether those assessments will be filed with the CPPA.
Looking Ahead
In proposing the ADCA, California expands the growing conversation on children’s online protection. Domestically and internationally, legislatures, policymakers, and advocates are attempting to balance the need to protect youth data and mitigate the negative effects of online content against the need for young minds to learn and explore. Federally, the FTC has also renewed its focus on children’s privacy. These efforts may further motivate Congress to push for more comprehensive federal children’s privacy protections and extend heightened protections to teens. Lawmakers in California and around the world have prioritized legislation that would establish individual protections, limit the potentially harmful consequences of large-scale data collection and processing, and curtail abuses of targeted advertising and automated decision making.
The ADCA is currently pending in the California Senate Judiciary Committee and will be discussed in a committee hearing on June 28, 2022. Although the bill passed unanimously through the Assembly, it is not yet clear how it will be received by the Senate and what revisions may be adopted. If the ADCA is passed and signed by the governor, it will shift the children’s data privacy landscape in the United States – many online services are based in California and nearly all have California users. For now, it remains to be seen whether the ADCA will emerge from the California Senate and, if it does, which provisions may be amended in the process.
FPF Statement on U.S. Supreme Court’s Decision to Overturn Roe v. Wade
FPF is profoundly disappointed in the U.S. Supreme Court’s decision to overturn Roe v. Wade – a long-held precedent that protected the rights of Americans to make personal decisions about their reproductive healthcare for nearly 50 years. The full scope of the decision’s impact has yet to be realized, but it will likely strip privacy protections even further. Every organization that collects personal data should immediately take a close look at how it can ensure the data it holds will not be used against those seeking reproductive health care.
New Report on Limits of “Consent” in New Zealand’s Data Protection Law
Authors: Elizabeth Santhosh and Dominic Paulger
Elizabeth Santhosh is a current law student at Singapore Management University and an FPF Global Privacy intern.
Introduction
Today, the Future of Privacy Forum (FPF) and Asian Business Law Institute (ABLI), as part of their ongoing joint research project: “From Consent-Centric Data Protection Frameworks to Responsible Data Practices and Privacy Accountability in Asia Pacific,” are publishing the fourth in a series of detailed jurisdiction reports on the status of “consent” and alternatives to consent as lawful bases for processing personal data in Asia Pacific (APAC).
This report provides a detailed overview of relevant laws and regulations in New Zealand, including:
notice and consent requirements for processing personal data in New Zealand’s data protection law;
the status of alternative legal bases for processing personal data which permit processing of personal data without consent if the data controller undertakes a risk impact assessment (e.g., legitimate interests); and
statutory bases for processing personal data without consent and exceptions or derogations from consent requirements in laws and regulations.
The findings of this report and others in the series will inform a forthcoming comparative review paper which will make detailed recommendations for legal convergence in APAC.
New Zealand’s Data Protection Landscape
New Zealand is one of the few jurisdictions in APAC that, together with Hong Kong and Australia, passed comprehensive data protection legislation before the turn of the millennium.
The Privacy Act, which was initially passed in 1993 and repealed and re-enacted in substantially updated form in 2020, provides the default rules for the processing of personal information under New Zealand law. These are articulated through the 13 Information Privacy Principles (IPPs), which provide, broadly, for collection, use, and disclosure of personal information, as well as storage and security, access, correction, and retention of personal information and the use of unique identifiers.
This kind of “principles-based” data protection law is also seen in the data protection laws of Australia and Hong Kong, which all draw on principles from the Organisation for Economic Co-operation and Development’s (OECD) 1980 Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, including collection limitation, data quality, purpose specification, use limitation, security, openness, and individual participation.
Beyond the IPPs, the Privacy Act also contains detailed provisions which establish the Office of the Privacy Commissioner to administer and enforce the Act. The Act also empowers the Commissioner to, firstly, investigate complaints regarding entities’ privacy practices, resolve disputes, and issue binding compliance notices, and secondly, issue binding codes of practice in relation to specific sectors or classes of personal information.
In 2012, New Zealand also became one of the few jurisdictions in APAC that have received an “adequacy decision” from the European Commission. This decision recognizes that New Zealand’s data protection laws provide an adequate level of data protection compared with that provided by European law for purposes of cross-border data transfers.
Role and status of consent in New Zealand
Consent – which the Privacy Act calls “authorisation” – plays a number of roles in the Privacy Act but, unlike in other major data protection laws internationally, is not a standalone legal basis for collecting, using, or disclosing personal information.
The default position under the IPPs is that collection of personal information must be: (1) by lawful, fair, and non-intrusive means; (2) from the individual data subject, rather than a third party; and (3) necessary for a purpose which is connected with the organization’s functions or activities.
Subject to exceptions, organizations must also notify individuals when their personal data is collected by providing certain information, including the purpose for collecting the personal information.
Once organizations have collected personal information, they may use and disclose the information for the purpose of collection, or another purpose that is related to it, without having to obtain consent.
Authorization functions as one of several exceptions to the default rules in the Privacy Act.
Firstly, an organization may collect personal information from a third party or use or disclose personal information for a purpose that is unrelated to the purpose of collection, if the organization reasonably believes that the individual concerned has “authorized” the collection, use, or disclosure.
Secondly, authorization functions as one of several legal bases for cross-border transfer of personal information under IPP 12.
Read more about the role of consent in New Zealand’s Data Protection Law in the full report.
Report Analyzes the Role of Data Protection in Safeguarding Sexual Orientation and Gender Identity Information
While digital technology has empowered LGBTQ+ individuals to find community and access services, the increasing availability and use of connected devices have also created new privacy risks for LGBTQ+ communities.
Today, the Future of Privacy Forum (FPF), a global non-profit focused on data privacy and protection, and experts from LGBT Tech — a national, nonpartisan group of LGBTQ+ organizations, academics, and technology organizations — released a report analyzing the role of data protection in safeguarding sexual orientation and gender identity information (SOGI).
LGBTQ+ communities have historically been some of the earliest adopters of technology, but they are also more likely to experience severe harms. The report encourages policymakers and organizations to learn from the history of privacy and LGBTQ+ communities to shape what data privacy could look like today, while continuing the critical work of reducing bias and risk to mitigate or even avoid individual and collective harms.
“The processing of data about an individual’s sexual orientation and gender identity can carry unique risks for LGBTQ+ individuals and communities,” said Amie Stepanovich, Vice President of U.S. Policy at FPF, who is a co-author of the report. “Organizations need to understand the impacts of processing this data on traditionally marginalized communities and to provide heightened protections, with respect for past and present context, to protect against potential harms.”
FPF and LGBT Tech’s analysis shows that while individuals within the United States population are becoming more likely to accept and identify as LGBTQ+, civil rights protections — including the right to privacy — are under attack and still lag when it comes to protecting LGBTQ+ individuals.
Moreover, FPF and LGBT Tech found that LGBTQ+ individuals are disproportionately impacted by privacy violations online. Today, LGBTQ+ communities still face significant barriers and prejudices from violence and discrimination, harming their right to equality and dignity.
“Ninety-seven percent of LGBTQ+ youth have seen content online that could be described as ‘homophobic, biphobic or transphobic,’” said Christopher Wood, Executive Director of LGBT Tech. Wood was one of the co-authors of the report. “For much of the LGBTQ+ youth, the Internet is the only place they feel safe to express their sexuality and connect with other LGBTQ+ youth. Potential violations can lead to privacy harms in the form of online outings and harassment.”
On July 20, co-authors Amie Stepanovich, FPF’s Vice President of U.S. Policy, Chris Wood, Executive Director & Co-Founder of LGBT Tech, and Katelyn Ringrose, Policy Lead for Law Enforcement and Government Access at Google, dove deeper into the report and discussed considerations for policymakers, risks and harms of SOGI data, and more. Watch the conversation by clicking here.
FPF Testifies Before House Subcommittee on Energy and Commerce, Supporting Congress’s Efforts on the “American Data Privacy and Protection Act”
This week, FPF’s Senior Policy Counsel Bertram Lee testified before the U.S. House Energy and Commerce Subcommittee on Consumer Protection and Commerce at its hearing, “Protecting America’s Consumers: Bipartisan Legislation to Strengthen Data Privacy and Security,” regarding the bipartisan, bicameral privacy discussion draft bill, the “American Data Privacy and Protection Act” (ADPPA). FPF has a history of supporting the passage of a comprehensive federal consumer privacy law, which would provide businesses and consumers alike with the benefit of clear national standards and protections.
Lee’s testimony opened by applauding the Committee for its efforts towards comprehensive federal privacy legislation and emphasized that the “time is now” for its passage. As written, the ADPPA would address gaps in the sectoral approach to consumer privacy, establish strong national civil rights protections, and establish new rights and safeguards for the protection of sensitive personal information.
“The ADPPA is more comprehensive in scope, inclusive of civil rights protections, and provides individuals with more varied enforcement mechanisms in comparison to some states’ current privacy regimes,” Lee said in his testimony. “It also includes corporate accountability mechanisms, such as requiring privacy designations, data security offices, and executive certifications showing compliance, which is missing from current states’ laws. Notably, the ADPPA also requires ‘short-form’ privacy notices to aid consumers of how their data will be used by companies and their rights — a provision that is not found in any state law.”
Lee’s testimony also provided four recommendations to strengthen the bill, which include:
Additional funding and resources for the FTC;
Developing a more iterative process to ensure that the bill can keep up with evolving technologies;
Clarifying the intersection of ADPPA with other federal privacy laws (COPPA, FERPA, HIPAA, etc.); and
Establishing clear definitions and distinctions between different types of covered entities, including service providers.
Many of the recommendations would ensure that the legislation gives individuals meaningful privacy rights and places clear obligations on businesses and other organizations that collect, use and share personal data. The legislation would expand civil rights protections for individuals and communities harmed by algorithmic discrimination as well as require algorithmic assessments and evaluations to better understand how these technologies can impact communities.
New Report on Limits of “Consent” in Hong Kong’s Data Protection Law
Today, the Future of Privacy Forum (FPF) and Asian Business Law Institute (ABLI) – as part of their ongoing joint research project: “From Consent-Centric Data Protection Frameworks to Responsible Data Practices and Privacy Accountability in Asia Pacific” – are publishing the third in a series of detailed jurisdiction reports on the status of “consent” and alternatives to consent as lawful bases for processing personal data in Asia Pacific (APAC).
This report provides a detailed overview of relevant laws and regulations in Hong Kong, a Special Administrative Region (SAR) of the People’s Republic of China, which has its own data protection law, the Personal Data (Privacy) Ordinance (PDPO). The report covers:
notice and consent requirements for processing personal data in Hong Kong’s data protection law;
the status of alternative legal bases for processing personal data which permit processing of personal data without consent if the data controller undertakes a risk impact assessment (e.g., legitimate interests); and
statutory bases for processing personal data without consent and exceptions or derogations from consent requirements in relevant laws and regulations.
The findings of this report and others in the series will inform a forthcoming comparative review paper which will make detailed recommendations for legal convergence in APAC.
Hong Kong’s Data Protection Landscape
The PDPO – which was passed in 1995 and took effect (except for certain provisions) in 1996 – is one of the longest-standing data protection laws both in APAC and globally.
The purpose of the PDPO is to protect the privacy of individuals in relation to their personal data. The main way in which the PDPO protects such data is by giving legal effect to the six “Data Protection Principles” (DPPs) in Schedule 1 of the PDPO, which cover:
the purpose and manner of personal data collection;
accuracy and duration of data retention;
use of personal data;
security of personal data;
openness and transparency around personal data practices; and
access to and correction of personal data.
This kind of “principles-based” data protection law is also seen in the data protection laws of Australia and New Zealand, which all draw on principles from the OECD’s 1980 Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, including collection limitation, data quality, purpose specification, use limitation, security, openness, and individual participation.
The PDPO’s principles are supplemented by other provisions of the PDPO which provide further protection for personal data in specific contexts. Specifically, substantial amendments to the PDPO in 2012 added, among others, a new Part 6A to the PDPO which governs processing of personal data for direct marketing and imposes strict penalties, including fines or imprisonment, on organizations that fail to obtain consent to use or disclose personal data for direct marketing purposes. Further amendments to the PDPO in 2021 added new provisions to combat “doxing” (i.e., publishing private personal information online to, among others, harass, harm, or damage the property of a person). These included new criminal offenses and powers to investigate and prosecute acts of doxing.
The PDPO also establishes the Privacy Commissioner for Personal Data (PCPD) – an independent data protection authority which serves advisory and enforcement functions with regard to the PDPO. In its advisory role, the PCPD is tasked with, among others, promoting public awareness and understanding of the PDPO. To that end, the PCPD has issued a number of guidelines on the application of the PDPO’s requirements to specific situations or sectors. In its enforcement role, the PCPD is empowered to investigate possible contraventions of the PDPO and issue recommendations or enforcement notices directing an organization to take remedial or protective actions. The PCPD is also empowered to investigate and prosecute criminal offenses under the PDPO.
Role and Status of Consent
The PDPO’s data protection framework is based primarily on notification rather than consent.
Generally, before an organization may collect personal data from a data subject, DPP 1 requires the organization to take all practicable steps to ensure that the data subject is explicitly informed of:
the purpose for which the data will be used,
any parties to whom the data may be transferred, and
whether it is obligatory or voluntary for the data subject to provide the data.
An organization that has provided a valid notification may use or disclose personal data collected from the data subject for the purpose stated in the notification or a purpose that is reasonably related to it without the need to obtain consent.
Consent plays a secondary role in the PDPO. DPP 3 requires an organization to obtain express opt-in consent from the data subject if the organization wishes to use personal data for a different purpose from the one stated in the notification.
Additionally, if the organization intends to use or disclose the data subject’s personal data for direct marketing purposes, Part 6A of the PDPO requires the organization to notify the data subject of its intention and provide certain prescribed information. The data subject must then give consent; however, for direct marketing purposes, it is sufficient if the data subject simply does not opt out of the use or disclosure of their data for direct marketing.
Non-compliance with the above requirements may constitute a criminal offense under the PDPO, punishable with a fine or even imprisonment.
However, the PDPO provides numerous exceptions to the consent requirement in DPP 3. An organization need not obtain consent to use or disclose personal data for a new purpose if that purpose falls within an exception, including, among others, preventing or detecting crime, engaging in legal proceedings, preparing research and statistics, or responding to an emergency involving the data subject.
FPF Releases Policy Brief Comparing Federal Child Privacy Bills
On Wednesday, July 27, 2022, the Senate Committee on Commerce, Science, and Transportation held a markup of two bills this resource highlights: The Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0). The Committee advanced both bills with significant amendments. Both bills garnered bipartisan support, with the Kids Online Safety Act receiving a unanimous roll call vote and COPPA 2.0 passing through a voice vote with limited opposition. This brief was last updated in September 2022 to reflect the changes to the two bills.
As children’s privacy continues to be a top priority and area of interest among lawmakers, companies, and the public, the Future of Privacy Forum (FPF) today released a new policy brief that compares the child-centric privacy bills that have been introduced in the 117th Congress. The resource compares four proposed bills against each other (with additional comparisons to current law) on key elements including the age group they seek to protect, enforcement mechanisms, covered entities, notice requirements, verifiable consent, restrictions on the use of personal information (PI), and more.
“Child privacy continues to receive a lot of attention from policymakers, companies, regulators, and families. In recent months, we’ve seen the FTC, state legislatures, federal policymakers, and even the President of the United States signal an interest in enhancing the consumer privacy rights and online protections afforded to children,” said Lauren Merk, Youth & Education Privacy Policy Counsel at FPF. “The four bills outlined in this resource stand to impact the child privacy landscape in the US either by directly changing the law or influencing future legislation at the federal or state levels. Case in point, the recently released discussion draft of the American Data Privacy and Protection Act includes provisions that mirror sections from some of the child-specific privacy bills.”
The four children’s privacy bills introduced in the 117th Congress are the Protecting the Information of our Vulnerable Children and Youth Act (“Kids PRIVCY Act”), the Children and Teens’ Online Privacy Protection Act (“COPPA 2.0”), the Kids Internet Design and Safety Act (“KIDS Act”), and the Kids Online Safety Act. Two of the bills – COPPA 2.0 and the Kids Online Safety Act – have bipartisan support, while the KIDS Act is the only bill of the four that has been introduced in both chambers of Congress this session.
While all four bills ultimately propose greater online privacy rights for kids, they vary in key respects such as covered age ranges of minor users, enforcement measures, and verifiable consent requirements. The brief’s two comparative tables highlight these and other elements to showcase the various approaches the bills take. Table 1 compares the two bills that seek to directly amend and update the already enacted Children’s Online Privacy Protection Act (COPPA) – COPPA 2.0 and the Kids PRIVCY Act – to each other as well as to the current COPPA language for reference. Table 2 examines the key elements of the KIDS Act and the Kids Online Safety Act, which work independently of COPPA.
View Table 1: Federal Child Privacy Bills That Seek to Directly Amend COPPA and Table 2: Federal Child Privacy Bills Independent of COPPA in the policy brief.
“Nearly everyone agrees that protecting kids’ privacy is important, but like many issues, the proverbial devil is in the details and each of these bills goes about it in different ways,” said Miles Light, Youth & Education Privacy Policy Counsel at FPF. “For example, most nonprofit organizations would continue to be exempt from COPPA under COPPA 2.0, but are considered a covered entity under the Kids PRIVCY Act and would have to comply. And while all four bills propose creating privacy protections for older minors – kids ages 13-17 who are not currently covered under COPPA – they vary as to whether they use under 16, under 17, or under 18 as their definition.”
“As child privacy discussions continue, we hope that this comparison can serve as a helpful resource to policymakers, staffers, advocates, and so many others who are closely tracking this issue and the various proposals,” added Light.
New Report on Limits of “Consent” in South Korea’s Data Protection Law
Today, the Future of Privacy Forum (FPF) and Asian Business Law Institute (ABLI) – as part of their ongoing joint research project: “From Consent-Centric Data Protection Frameworks to Responsible Data Practices and Privacy Accountability in Asia Pacific” – are publishing a second report in their series of detailed jurisdiction reports on the status of “consent” and alternatives to consent as lawful bases for processing personal data in Asia Pacific (APAC) – this time focusing on South Korea.
This report provides a detailed overview of relevant laws and regulations in South Korea, including:
notice and consent requirements for processing personal data;
the status of alternative legal bases for processing personal data which permit processing of personal data without consent if the data controller undertakes a risk impact assessment (e.g., legitimate interests); and
statutory bases for processing personal data without consent and exceptions or derogations from consent requirements in laws and regulations.
The first report focused on the People’s Republic of China and explained how the country’s data protection framework has evolved over the past few years from a consent-centric model to one which provides various alternatives to consent in a GDPR-type model.
The findings of this report and others in the series will inform a forthcoming comparative review paper which will make detailed recommendations for legal convergence in APAC.
South Korea’s Data Protection Landscape
South Korea’s data protection law is founded on similar principles to those of other major data protection laws internationally stemming from the Fair Information Practice Principles, including lawfulness, purpose specification, purpose limitation, data minimization, data accuracy, and security.
In fact, South Korea is one of the few jurisdictions in Asia Pacific that have received an EU adequacy decision, by which the European Commission determines that the level of personal data protection in a given jurisdiction is “essentially equivalent” to that under the GDPR. In June 2021, the European Commission published its draft adequacy decision for South Korea and transmitted it to the European Data Protection Board (EDPB) for consideration; the final decision was adopted in December 2021.
South Korea’s general law on personal data protection is the Personal Information Protection Act (PIPA), which went into effect on September 30, 2011 with the stated purpose of governing processing and protection of personal data to safeguard the rights and freedoms of individuals. The PIPA is complemented by an enforcement decree, various sector specific laws, including in the credit, telecommunications, and insurance sectors, and guidelines issued by South Korea’s data protection regulator, the Personal Information Protection Commission (PIPC).
The PIPA contains detailed provisions covering the lifecycle of personal data from the time that it is collected until it is deleted. The PIPA provides data subjects with rights over their personal data, including, notably, express rights to be informed and to decide whether and to what extent to consent when their personal data is processed. The PIPA also imposes numerous compliance obligations on data controllers including mandatory obligations to issue a privacy policy, appoint a data protection officer, and notify data subjects in the event of a breach of their personal data, with penalties for non-compliance, including specific offenses for collecting and using personal data without a legal basis, such as consent.
The latest major round of amendments to the PIPA was in 2020. These amendments introduced, among others:
specific obligations for organizations which provide commercial information and communication services online, including strict obligations to notify users of how their personal data will be processed and obtain users’ consent for processing of personal data, with very limited exceptions;
new provisions allowing pseudonymized data to be processed without consent for purposes of statistics, research, and public records;
a general principle that data controllers should endeavor to use anonymized data for processing wherever possible and, where it is not possible to achieve the purposes of processing using anonymized data, to use pseudonymized data as much as possible; and
reforms to the structure and powers of the PIPC to establish the organization as an independent, centralized data protection authority like its counterparts in the EU.
Role and Status of Consent in South Korea’s Personal Data Protection Law
Since the PIPA took effect in 2011, it has provided several equivalent legal bases for collecting personal data, including consent as well as alternatives to consent where collection of personal data is necessary for:
executing and performing a contract with the data subject;
complying with a legal obligation;
a public institution to carry out its legal duties;
protecting a person’s life, body, or property interests from immediate danger, where it is not feasible to obtain consent; and
fulfillment of a legitimate interest of the data controller, where this interest clearly supersedes the rights and interests of the data subject.
The PIPA’s legal bases are similar to those recognized by other major data protection laws internationally, including, notably, the GDPR. However, the PIPA’s requirements for relying on alternative bases to consent are often stricter than those under the GDPR, and there is also less guidance on the circumstances in which organizations can rely on alternative legal bases, compared with guidance on how to obtain consent. Organizations in South Korea therefore tend to rely on consent rather than other legal bases in practice.
If an organization seeks to rely on consent to process personal data, the consent must be explicit, opt-in, and informed. In the latter regard, the PIPA requires the organization to notify the data subject of certain information when seeking consent. This includes the purpose for processing the data, what data will be processed, and the data subject’s right to deny consent. If the data subject refuses to give consent after receiving this information, the controller is prohibited from denying provision of goods or services to the data subject on this basis.
Where personal data has been collected on the basis of consent, the controller may use personal data or disclose it to a third party if this use or disclosure is reasonably related to the original purpose for which the data was collected. If the processing is not reasonably related to this original purpose, then the controller must seek fresh consent. Data subjects also have a right to withdraw consent to processing of their personal data at any time.
Processing personal data without a valid legal basis, failing to provide required notifications, and processing personal data beyond the scope of the purpose of collection are all violations under the PIPA, and there have been several high-profile enforcement cases where penalties were imposed on organizations that failed to comply with the PIPA’s consent requirements. A notable example is ScatterLab, a company whose AI chatbot collected personal information from over 200,000 children under the age of 14 without obtaining parental consent.
Lastly, the PIPA imposes stricter obligations on the processing of certain personal data which falls within a category of “sensitive personal information” which, if revealed, would constitute a material breach of privacy. This category includes personal data regarding a person’s ideological and religious beliefs, trade union and political party membership, political views, health and genetics, sexual orientation, criminal records, and individual physiological or behavioral profile. Notice and consent are generally required for processing of this class of personal data, with limited exceptions.