The Spectrum of Artificial Intelligence – An Infographic Tool
UPDATED August 3, 2021, and June 2023. FPF released the white paper, The Spectrum of AI: Companion to the FPF AI Infographic, to provide additional detail and analysis for the use of this Infographic tool as an educational resource for policymakers and regulators.
AI is the computerized ability to perform tasks commonly associated with human intelligence, including reasoning, discovering patterns and meaning, generalizing knowledge across spheres of application, and learning from experience. The growth of AI-based systems in recent years has garnered much attention, particularly in the sphere of Machine Learning (ML). A subset of AI, ML systems “learn” from the success or accuracy of their outputs, and can adapt their programming over time, with minimal human intervention. But there are other types of AI that, alone or in combination, lie behind many of the real-world applications in common use.
General AI – truly human-level computational systems – does not yet exist. But Narrow AI exists in many fields and applications where computerized systems enhance human output or outperform humans at defined tasks. This chart is designed to identify and explain the main types of AI, show their relationships to one another, and provide specific examples of how they currently appear in our day-to-day lives. It also situates AI within the timeline of human knowledge and development: AI builds on philosophy, mathematics, physics, ethics, and logic, and more recently on statistical analytics and modeling, and the chart also reflects the foundational role of computer design and security.
There is no complete consensus on the labels for the various types of AI, but for our purposes we have adopted the following categories, which represent generally accepted terms in common use.
Symbolic AI, including subsets: Expert Systems, Search, and Planning & Scheduling
Rules Based
Robotics
Computer Sensing
Knowledge Engineering
Natural Language Processing, and of course,
Machine Learning, including Deep Learning, Neural Networks, Reinforcement Learning, and Generative Adversarial Networks (GANs); a brief illustrative sketch of machine learning appears below
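To make concrete what it means for an ML system to "learn" from data, the short sketch below (our illustrative addition, not part of the infographic) trains a simple classifier on labeled examples and then measures its accuracy on examples it has never seen. The dataset and library choices (scikit-learn's logistic regression on a toy dataset) are assumptions made only for demonstration.

```python
# Minimal, illustrative sketch (not from the infographic): a supervised
# ML model "learns" by fitting its parameters to labeled examples.
# Assumes scikit-learn is installed; the toy dataset is for demonstration only.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)            # features and labels
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=200)     # a simple, widely used learner
model.fit(X_train, y_train)                  # "learning": adjust parameters to the data

# Accuracy is measured on held-out examples, which is how the success
# of the model's outputs is typically assessed in practice.
print("held-out accuracy:", model.score(X_test, y_test))
```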
To aid in understanding these various types of AI programming, we have highlighted a specific use case in a number of broad topic areas.
Finance – Tax Compliance programs that allow you to fill out your tax forms and ensure your information is included and presented in a way that meets the current legal requirements of the tax code.
Healthcare – Ambient Charting, where conversations between doctors and patients are recorded and added to the patient record as they happen, with key words and follow-ups noted as appropriate.
Tracking – as used in Workplace Monitoring to provide both physical and digital accountability, while enforcing security policies.
Mobility and Transportation – Turn-by-Turn Navigation provides real-time guidance through traffic and around areas under construction.
Social Media – platforms use AI to face the challenges around appropriate and effective Speech or Content Moderation.
Forecasting – Supply Chain Management has reached new levels of efficiency and accuracy with AI-based modeling and predictions.
For further information or to provide comments or suggestions, please contact Brenda Leong ([email protected]) or Dr. Sara Jordan ([email protected]).
FPF and LGBT Tech Host Discussion in Honor of Human Rights Day
Yesterday, in honor of International Human Rights Day, a time when individuals and organizations around the world celebrate the 1948 adoption of the Universal Declaration of Human Rights, FPF and LGBT Tech hosted a discussion exploring the “LGBTQ+ Right to Privacy.”
The conversation featured tech privacy and policy experts from both FPF and LGBT Tech, including: Christopher Wood, Executive Director, LGBT Tech; Christopher Wolf, Founder and Board Chair, FPF; Carlos Gutierrez, Deputy Director & General Counsel, LGBT Tech; Dr. Sara Jordan, Artificial Intelligence and Ethics Policy Counsel, FPF; and Katelyn Ringrose, Christopher Wolf Diversity Law Fellow, FPF. The conversation was moderated by Jules Polonetsky, FPF CEO.
The conversation revolved around issues related to the United Nations 2020 theme — which is “Stand Up for Human Rights” — and the speakers tackled entrenched and systematic inequalities experienced by the LGBTQ+ community from a privacy and equity perspective.
Throughout the discussion, speakers noted the many beneficial uses of data pertaining to an individual’s sexual orientation, gender identity and sex life; while also discussing the many ways that such data can be used to perpetuate systems that discriminate against, oppress, or otherwise harm members of the LGBTQ+ community.
For example, while the collection of health data is important in the context of COVID-19 contact tracing, the LGBTQ+ community has not always benefited from inclusive data protection practices and may be less willing to have their data collected and used in the name of public health. According to Carlos Gutierrez, Deputy Director & General Counsel of LGBT Tech, “historically, HIV laws were weaponized against the LGBTQ community, and gay men. Getting an HIV positive test result could mean the loss of employment, dropped insurance, and more.”
And while certain uses of data may pose risk for the LGBTQ community, speakers also pointed out that inclusive data collection for marginalized communities tends to benefit society as a whole. For example, Dr. Sara Jordan, Artificial Intelligence and Ethics Policy Counsel for FPF, noted that existing definitions for sexual orientation and gender identity data (SOGI data) often do not address issues pertaining to the intersex community, and how identifying such gaps and leveraging data for good can allow the research community to “advance research, or identify ways to redirect existing research streams.” This, Dr. Jordan noted, will allow the benefits of using this data to “flow directly” back to individuals and communities in need of such advances.
Watch the full discussion on LGBTQ+ data privacy here:
For questions, comments, or to get involved in FPF and LGBT Tech’s joint efforts related to LGBTQ+ privacy, contact Katelyn Ringrose at [email protected] or Chris Wood at [email protected].
Legislative Findings: Brookings Builds on U.S. Privacy Legislation Report
Today, the Brookings Institution released model legislative findings for federal privacy legislation, intended to accompany the model privacy legislation it published in June 2020. The findings are designed to motivate discussion and to reconcile differences between two of the leading proposals: Sen. Maria Cantwell’s (D-WA) Consumer Online Privacy Rights Act and Sen. Roger Wicker’s (R-MS) SAFE DATA Act. The legislative findings also provide useful framing for the recommendations and options outlined in Brookings’ report, “Bridging the gaps: A path forward to federal privacy legislation.”
// WHY LEGISLATIVE FINDINGS MATTER
Many federal and state laws begin with a non-binding “legislative findings” section that outlines the various goals that legislators are trying to achieve through the legislation. The articulation of the foundational principles and aims of privacy legislation is powerful and important for a number of non-operational reasons. These include building support among the public and members of Congress, declaring American values to the world, informing judicial and agency interpretation, and clearly expressing grounds to uphold the legislation against constitutional challenges.
Despite these reasons, ten of the dozen proposals for comprehensive privacy legislation introduced in the 116th Congress excluded statements of legislative findings. In contrast, the EU’s General Data Protection Regulation (GDPR) is accompanied by 173 recitals that explain and offer commentary on the law.
// WHAT THE BROOKINGS PROPOSED LEGISLATIVE FINDINGS CONTAIN
The legislative findings proposed by the Brookings Institution are not intended to be set in stone, but instead to provide a comprehensive outline for Congress and stakeholders to consider as they continue to draft privacy legislation. They also provide a framing for the detailed recommendations outlined in the recent report, “Bridging the gaps: A path forward to federal privacy legislation,” which advocates for a risk-based approach, tailored preemption and enforcement, and individual rights aimed at recognized privacy harms.
In summary, the proposed legislative findings contain (1) a statement of the legal, moral, and historical foundations of privacy in America; (2) technology developments that underlie the need for legislation; (3) the effects of these developments that privacy legislation aims to address; (4) how privacy legislation aims to address these effects; and (5) a set of policy declarations that express key governmental objectives.
You can read the model legislation with legislative findings in full here.
You can read Brookings’ full release for the legislative findings here.
You can read the Brookings report from June 2020, here.
You can read FPF’s brief analysis of the key areas of debate contained in the Brookings report here.
This year, FPF has observed considerable progress among stakeholders towards developing nuanced and workable solutions in key remaining areas of debate — including enforcement and preemption. This progress has been promoted by important legislative and judicial developments in the states and abroad, as well as the accelerated adoption of technologies for remote learning, work, and leisure activities during the COVID-19 pandemic. With all this in mind, we are optimistic that the new Administration will create fertile ground for the enactment of bipartisan privacy legislation in the 117th Congress.
A Deep Dive into New Zealand’s New Privacy Law: Extraterritorial Effect, Cross-Border Data Transfer Restrictions and New Powers of the Privacy Commissioner
By Caroline Hopland, Hunter Dorwart and Gabriela Zanfir-Fortuna
Last week, on December 1st, New Zealand’s newly amended Privacy Act 2020 (Act) came into force. The Act was passed by the New Zealand Parliament on June 20, 2020 and made significant changes to the Privacy Act 1993. The amendments cover a broad range of topics, including the extraterritorial scope of the law, new mandatory data breach notification requirements, “compliance notices” as a key enforcement tool of the Office of the Privacy Commissioner, changes to data subject access requests, restrictions on cross-border transfers of personal information, and the enforcement regime overall.
One key feature of the Act, compared to other comprehensive privacy and data protection laws around the world, is how central the Privacy Commissioner is to shaping and enforcing the law, including serving as a necessary “stop” before a claim made by an individual or a representative action reaches the Human Rights Review Tribunal. The Act gives the Commissioner greater powers to ensure covered entities are complying with the law, to adopt “codes of practice”, and to issue sweeping compliance notices and prohibit cross-border transfers of personal information. Such broad discretion may initially create legal uncertainty regarding specific compliance requirements, as well as the scope of the rights of data subjects. For instance, the Commissioner may at any time issue clarifying guidance known as “codes of practice” that modifies the baseline obligations set out in the law and clarifies the obligations of data controllers.
The law mentions the OECD Guidelines in multiple sections as a baseline for compliance, and explicitly recognizes other data protection regimes, such as the European Union’s General Data Protection Regulation (GDPR), as providing a comparable level of protection.
However, the Act diverges from the GDPR and other data protection laws inspired by it in terms of its smaller set of rights of the data subject, its lukewarm penalty structure for non-compliance and its original framework for individual redress. Under the updated law, individuals do not have a right to erasure (“right to be forgotten”), a right to data portability or any specific rights, such as objection, in relation to automated decision-making. In fact, profiling and automated decision-making are not specifically addressed, and the Commissioner may only fine companies up to $10,000 New Zealand dollars (approximately 7,000 USD) for violations of the Act.
Notably, Commissioner John Edwards stated in a recent radio interview that the Act was deliberately designed to sit mid-range in the spectrum of privacy regulations around the world. He went on to note that in 2011 the New Zealand Law Commission considered whether the Commissioner should be able to impose much larger fines on entities, as in Europe. Instead, it decided to grant the Commissioner discretion to issue compliance notices to see whether behavior changes, with the effectiveness of this approach to be assessed in a few years.
It should be noted that New Zealand is one of the countries whose legal system received an “adequacy decision” from the European Commission allowing unrestricted transfers of personal data from the EU. New Zealand’s adequacy is set to be reassessed as part of the European Commission’s efforts to re-assess adequacy decisions in the light of the GDPR, and this assessment will be done on the basis of the new law (New Zealand’s adequacy was issued in 2012, under the former Data Protection Directive 95/46/EC).
Below we discuss some key changes of the Act, specifically 1) its broadened extraterritorial scope, 2) cross-border transfer restrictions, 3) the Information Privacy Principles and Codes of Practice that may detail and enhance them, 4) updated data access requests by data subjects, 5) the new data breach notification requirements, 6) “compliance notices” regarding breaches of the Act and the new penalties framework, 7) the private right of action enshrined by the law and possible class actions, and 8) rules on public sector data sharing, before reaching 9) conclusions.
1. Extraterritorial Scope of the Law
The Act expanded its scope to overseas organizations that carry out business in New Zealand, regardless of where they collect or hold data and where the data subjects are located. Under the Act, carrying on business in New Zealand extends beyond traditional commercial activities such as having a place of business in the country or receiving money for the supply of goods and services in the country. Therefore, the new scope of the law could in theory encompass a range of other potential overseas organizations, such as non-profits, as long as they carry out their activities in New Zealand.
In addition, the Act also applies to non-resident individuals if, while in New Zealand, they collect or hold personal information about anyone anywhere in the world, even if the individual previously collected the information while outside of New Zealand. Lastly, the Act applies to “agencies,” which include not only private companies and organizations but also certain public bodies such as government departments, both within and outside of New Zealand. “Agency” is a nuanced term in the Act, corresponding to both controllers and processors as defined under the GDPR (or “businesses” and “service providers” as defined by the California Consumer Privacy Act – “CCPA”). While the Act does not contain a separate chapter laying out specific obligations for the subsequent processing of data by an organization on behalf of another organization, it does extend liability to “agents” of the “principal agency,” which could encompass GDPR processors and CCPA service providers. This significantly expands the category of entities that have direct obligations under the Act.
2. Cross-Border Data Transfers
The Privacy Act includes a new Information Privacy Principle (IPP), IPP 12, which lays out rules for disclosing personal information outside New Zealand. According to this new principle, an agency may disclose personal information to a foreign entity only if one of six grounds is satisfied:
(a) express and informed consent of the individual in the cases where the exporter informs them that the importer may not be required to protect their personal information in a comparable manner with the protection afforded in the Privacy Act;
(b) in the course of the importer carrying out business in New Zealand and the exporter reasonably believing that the importer is subject to the Privacy Act;
(c) the exporter reasonably believing that the importer is subject to comparable privacy laws to the Privacy Act;
(d) the exporter reasonably believing that the importer is a participant to a “prescribed binding scheme”;
(e) the exporter reasonably believing that the importer is subject to privacy laws of a “prescribed country”; and
(f) the exporter reasonably believing that the importer is required to protect the information in a comparable way to the Privacy Act, such as for example pursuant to an agreement between the two. Both countries and binding schemes can be “prescribed” through action by the Governor-General by Order in Council.
It is interesting to note how New Zealand’s Privacy Act solved a couple of the big questions stemming from GDPR’s rules on international data transfers: consent of the individual may be considered a valid mechanism for cross-border transfers only where the data importer is not subject to similar obligations to those in the jurisdiction of the data exporter; if the privacy law of the exporter applies to the importer by virtue of its extraterritorial effect, then no additional safeguards are required for the personal information being transferred. Another point to note is that the GDPR “essential equivalence” standard has a correspondent in the seemingly more straightforward “comparable laws” standard under the Privacy Act.
The Privacy Act also gives the Commissioner broad authority to prohibit cross-border transfers of personal information outside of New Zealand, similar to the provisions in the old law. If the recipient country does not provide legal safeguards comparable to those covered in the Act and the transfer would likely contravene the basic principles set out in Part Two of the OECD Guidelines and in Schedule 8 of the Act, the Commissioner may issue a transfer prohibition notice to the company in question. This does not apply to transfers that receive authorization from the Commissioner or are otherwise authorised by “any enactment”, or that occur on the basis of an internationally binding convention.
The Commissioner will consider broad factors to determine whether to prohibit transfers. These include:
the likelihood the transfer would harm any individual;
the general desirability of facilitating the free flow of information; and
any existing or developing international guidelines relevant to cross-border data flows such as the OECD Guidelines and the GDPR.
Entities transferring data abroad must receive a transfer prohibition notice in order for the Commissioner to effectuate the prohibition. Before becoming effective, each notice must set out certain elements, such as the nature of the prohibition, the personal information the prohibition applies to, and the grounds for the prohibition. The Commissioner must reply within 20 days to any request to vary or cancel the notice and must provide a reason if the request is refused.
Organizations may appeal to the Human Rights Review Tribunal (Tribunal) against all or any part of a transfer prohibition notice issued by the Commissioner, or against the Commissioner’s refusal to vary or cancel the notice. The Tribunal must allow an appeal if it considers that the Commissioner’s decision violates the law or results from an inappropriate use of the Commissioner’s discretion. On appeal, the Tribunal may modify the notice to exclude any statement that it finds does not have effect.
The law imposes a penalty of up to $10,000 New Zealand dollars on any person who, without reasonable excuse, fails or refuses to comply with a transfer prohibition notice.
3. Information Privacy Principles and Codes of Practice
The Act sets forth IPPs that impose broad obligations on entities for their processing activities and serve as a benchmark for the Commissioner to implement further guidance through legally binding codes of practice. These principles relate to different dimensions of information processing such as purpose, manner of collection, storage and security, access, correction, accuracy, and limits on use and disclosure.
Many of the IPPs have not changed from the previous version of the law. For instance, IPP 3 specifies that entities collecting personal data must take reasonable measures to inform the individual of specific facts such as the purpose of collection and the intended recipients of the information. If the collection is authorized or required by a separate law, the entity must inform the individual of such law and explain whether the supply of the information is voluntary or required. IPP 3 also lists grounds that excuse an entity from providing such information to the individual.
Notable changes to the IPPs compared to the previous law include heightened fairness requirements for entities that collect information from children or young persons (IPP 4) and a purpose limitation principle (IPP 1) that requires entities to collect identifying information from people only when necessary. IPP 1 also specifies that information must be collected for a lawful purpose connected with the act of processing.
The Act does not explicitly specify lawful grounds for processing, nor does it provide for a general consent requirement for all processing activities. It does, however, impose limits on the disclosure of personal information without consent or another valid justification such as protecting public safety, upholding the legitimate and reasonable activities of law enforcement, or facilitating the execution of a contract as a going concern. In fact, the IPPs and, generally, the other provisions of the Act differentiate among “collection”, “use” and “disclosure” of personal information, setting out a different set of rules for each. Note that the Act does not specify what “lawful means” of collection are, only that entities collecting data must use them. The Commissioner can issue further modifications or guidance through a code of practice to impose additional requirements on these baseline provisions.
In addition, IPP 10 imposes limits on the use of personal data but provides some exceptions such as when the data has been de-identified or will be used for statistical or research purposes in a form that will not reasonably lead to the identification of an individual. Publicly available data, data used in the furtherance of law enforcement, or data used to prevent or lessen substantial injury to an individual or the public health, may also trigger an exception to the general purpose limitation rule.
Note that an agency does not breach the IPPs in relation to information held overseas if the action is required by law of any country other than New Zealand.
The Act acknowledges a range of situations in which certain IPPs do not apply such as a household exemption similar to the one provided in the GDPR, information collected before 1993, the activities of intelligence and security agencies, information gained during an investigation initiated by the Commissioner or an Ombudsman, and information collected by Statistics New Zealand. Under certain circumstances, the Commissioner may authorize the processing of personal information otherwise in breach of certain IPPs if the Commissioner determines that the public interest in granting authorization substantially outweighs the possibility of adverse effect on the individuals concerned.
As stated above, the Commissioner has broad discretion to issue codes of practice in relation to the IPPs to modify and clarify the application of the law. These codes of practice may prescribe a broad list of measures including:
how companies must comply with the IPPs;
specific requirements for types of information, businesses, industries or activities;
technical controls relating to specific processing activities;
guidelines and fee structures; and
review and monitoring mechanisms.
The Act specifies that failure to comply with a code of practice amounts to a breach of an IPP. The Commissioner may issue a compliance notice (see below) for agencies that violate any code of practice or any of the baseline IPPs set forth in the law.
4. Rights of Individuals: Access and Correction Requests
The IPPs also specify the rights of the individuals whose personal information is collected and used vis-a-vis processing entities, including the right to access and correct personal information. Individuals or representatives of individuals may issue an IPP 6 access or an IPP 7 correction request to which the entity receiving the request must promptly respond by either granting or refusing the request with reasons for the decision. If an entity does not hold the information and believes that another entity does, it must promptly transfer the request to the other entity unless good cause exists to believe that the data subject does not want the request to be transferred. Under this scenario, the entity must then notify the data subject of its decision.
The Act expanded the range of withholding grounds that entities may rely on to refuse an access request, compared to the old law. Notably, companies may now refuse access to protect an individual if disclosure would likely pose a serious threat to 1) the life, health (mental or physical), or safety of an individual, or 2) public health and public safety. Companies may also refuse to disclose information of an individual under 16 years old if they determine such disclosure would be contrary to the interests of the individual. Other withholding grounds include: 1) protection of security, 2) defense and international relations, 3) protection of trade secrets, 4) inability of the entity to locate the data, 5) the use of the data for law enforcement, or 6) rejection of frivolous requests.
The Act also specifies the means by which an entity can make information available to a data subject under an access request. Entities must make available information in a manner preferred by the requestor unless doing so would be overly burdensome or contrary to any legal duty of the entity.
For IPP 7 correction requests, the Act specifies that an individual may either request the entity to correct personal information or attach a statement of correction, but does not specify that an individual can request the entity to erase the personal information. The Act does not include a right to be forgotten nor a right to data portability.
Finally, the Act specifies that the Commissioner cannot modify or restrict the entitlements under IPP 6 or 7 for access and correction requests. The Act, however, does not facially preclude the Commissioner from expanding the scope of these rights or the obligations of entities in relation to these rights through codes of practices. Given the broad discretion the Act gives to the Commissioner, an interesting legal question arises as to whether the Commissioner could have the authority to require companies to provide for portability or erasure.
5. Data Breach Notifications
The Act introduces mandatory data breach notification requirements for organizations when a notifiable privacy breach affecting individuals has occurred. Entities are under a legal duty to notify the Commissioner and any affected individuals if the breach could cause serious harm to anyone; failure to do so is a criminal offense, punishable by a fine of up to $10,000 New Zealand dollars.
The Act defines a privacy breach, in relation to personal information held by an agency, as 1) an unauthorized or accidental access to, or disclosure, alteration, loss, or destruction of, the personal information; or 2) an action that prevents the agency from accessing the information on either a temporary or permanent basis; whether or not the breach is ongoing, was caused by a person inside or outside the agency, or is attributable in whole or in part to any action by the agency.
The Act defines a notifiable data breach as 1) a privacy breach that it is reasonable to believe has caused serious harm to an affected individual or individuals or is likely to do so; but 2) does not include a privacy breach if the personal information that is the subject of the breach is held by an entity who is an individual and the information is held solely for the purposes of, or in connection with, the individual’s personal or domestic affairs.
According to the Act, an affected individual is one whose personal information was the subject of a privacy breach. The individual may be inside or outside New Zealand, and may even be deceased if 1) a sector-specific code of practice applies to deceased persons, and 2) to the extent that the code of practice applies one or more IPPs to that information.
An organization must consider several elements when assessing whether a privacy breach is likely to cause serious harm in order to decide whether the breach is notifiable, such as any action it took to reduce the risk of harm following the breach or whether the personal information is sensitive. If unsure whether a breach is notifiable, an organization can use the Office of the Privacy Commissioner’s NotifyUs tool to determine whether it has a legal duty to report it.
Further, it must notify the Commissioner and affected individuals as soon as practicable after learning that a notifiable privacy breach occurred. Both the notification to the Commissioner and the notification to data subjects must contain a list of specific information defined by the law, including a description of steps that the organization took in response to the privacy breach.
The entity can also identify the person or body, if known, that has obtained or could obtain the affected individual’s personal information, if it reasonably believes that identification is necessary to prevent or lessen a serious threat to the life or health of the affected individual or another individual. It can also provide this information incrementally, as it becomes known, in order to comply with the requirements around providing any new or available information as soon as practicable. It must not include, however, any particulars about any other affected individuals.
Moreover, if it is not reasonably practicable to notify each affected individual, the organization must instead give public notice of the breach in a way that no affected individual is identified.
An entity is not required to notify an affected individual or give public notice under certain circumstances, such as if it believes it would endanger someone’s safety, reveal a trade secret, the affected individual is under the age of 16 and the entity believes that notice would be contrary to the individual’s interests, or if, after consultation with an individual’s health practitioner (where practicable), it believes that the notice would likely prejudice that individual’s health. Further, an organization may also delay notifying affected individuals or giving public notice of the breach under certain circumstances.
An entity who fails to notify the Commissioner of a notifiable data breach can be liable and fined up to $10,000 New Zealand dollars.
Specific employees, agents, and members of agencies who fail to comply with these procedures (notifying the Commissioner and affected individuals, or giving public notice) will not be personally liable, and their acts or knowledge of the breach will be treated as being done and known by the employer or entity.
6. Compliance Notices, Appeals and Fines
For breaches of the Act (including the IPPs), interference with the privacy of an individual under another Act, or breaches of any code of practice or code of conduct under another Act, the Commissioner may issue a “compliance notice” to the breaching entity. This grants the Commissioner greater discretion to require entities to act or refrain from certain behaviors. Compliance notices are enforced through the Tribunal, and a failure to comply is a criminal offense, punishable by a fine of up to $10,000 New Zealand dollars.
The Commissioner will consider a number of factors prior to issuing a compliance notice to an entity, such as the seriousness of the breach, the number of people affected, and the likely costs to the entity in order to comply with the notice. Prior to issuing a compliance notice, the Commissioner must also send the organization a written notice about the breach, any particular steps the organization should take to remedy it and dates to do so, and allow the organization a reasonable opportunity to respond.
A compliance notice then issued by the Commissioner will describe the breach, citing the relevant statutory provisions, require the entity to remedy it, and inform it of its appeal rights. The notice might also require the entity to take particular steps to remedy the breach, conditions that the Commissioner considers appropriate, dates the entity must remedy the breach or report to the Commissioner, or other useful information.
Once the compliance notice has been issued, the entity must, within a “practicable time”, take steps to comply with the notice and any particular steps specified within it, and remedy the breach by the date stated in the notice. If the Commissioner believes it is in the public interest, the Commissioner can publish 1) the entity’s identity, 2) details about the notice or the breach, or 3) a statement or comment about the breach.
6.1 Appeal of Compliance Notice
Businesses should be aware of the option to appeal a compliance notice, but also of the potential for enforcement proceedings or interim orders against them if they fail to comply with the notice. An entity may appeal to the Tribunal 1) all or part of a compliance notice issued against it, or 2) the Commissioner’s decision to amend or cancel the notice. The appeal must be filed within 15 working days of the day on which the compliance notice was issued or the notice of the decision was given to the entity.
However, the Tribunal cannot cancel or modify a notice simply because 1) the breach was unintentional or without negligence on the part of the agency, or 2) the organization took steps to remedy the breach, unless there is no further reasonable step it could take to do so.
6.2 Enforcement of Compliance Notice
The Commissioner may also bring enforcement proceedings to the Tribunal for an agency’s noncompliance with a notice. The Commissioner can bring enforcement proceedings if, after the statutory period, no appeal has been filed against a notice, and, if either 1) the Commissioner has reason to believe that the entity did not, or will not, remedy the breach (if applicable, by the date stated in the notice), or 2) it failed to report to the Commissioner the steps it took to remedy the breach by the date stated in the notice. The entity may object to enforcement of the notice only on the ground that it believes it has fully complied with the notice. The Tribunal can determine whether the notice has been fully complied with, or order the entity to comply with the notice or perform an act specified in the order.
6.3 Remedies, Costs, and Enforcement
In the enforcement proceedings brought by the Commissioner, the Tribunal may order the entity to 1) comply with the notice by a date specified in the order (which may vary from the original date stated in the notice) or 2) perform any act specified in the order by a specified date (for example, reporting to the Commissioner on progress in complying with the notice), and may award any costs it considers appropriate. In an appeal brought by the agency, the Tribunal may confirm, cancel, or modify the notice, confirm, overturn, or modify the decision, and award any costs it considers appropriate.
An award of costs may be enforced in the District Court as if it were an order of that Court. An entity that, without reasonable excuse, fails to comply with the Tribunal’s decisions above can be fined up to $10,000 New Zealand dollars. Lastly, the Commissioner is entitled to appear or be represented by a lawyer or an agent during these proceedings.
6.4 Liability and Penalties
The Act imposes a maximum fine of $10,000 New Zealand dollars on persons who fail to comply with any lawful requirement of the Commissioner, or who, without reasonable excuse, obstruct, hinder, or resist the Commissioner or any other person exercising authority under the law.
In addition, the Act also penalizes persons up to $10,000 New Zealand dollars who 1) knowingly make a false statement to the Commissioner or any other person exercising power under the Act; 2) misrepresent any authority they hold under the Act; 3) mislead companies by impersonating or pretending to be acting under the authority of an individual for the purpose of unlawfully obtaining access and using that individual’s information; or 4) destroy any document containing personal information knowing that it is subject to a data subject request.
Finally, the Act also sets out liability for companies for actions taken by their employees or agents. A company may be liable for any violation listed above carried out by its employees or agents, unless it did not know of the employee’s actions or did not give express or implied authority to its agents. The Act offers an affirmative defense for companies that take reasonably practicable steps to prevent their employees or agents from violating the law.
Note that the law does not contain offenses punishable by imprisonment for officers or individuals that contravene its provisions.
7. Limited Private Right of Action, Class Actions
As far as the IPPs are concerned, they “do not confer on any person any right that is enforceable in a court of law”, except for the right of access (IPP6(1)) and only in relation to public sector agencies (see Section 31). However, individuals may challenge before the Human Rights Review Tribunal a Commissioner’s decision or failure of the company to comply with such a decision.
An individual may bring multiple complaints to the Privacy Commissioner on behalf of multiple aggrieved parties for an “interference with the privacy of an individual.” The Act defines an interference broadly as a breach of the Act or a private contract that could cause damage to the individual, adversely affect the rights of the individual or result in significant humiliation of the individual. An interference also includes decisions by companies to improperly refuse an access request or otherwise fail to respond timely to such requests.
After receiving the complaint, the Commissioner must consider the complaint and decide to either not investigate, refer the complaint to another person or overseas privacy enforcement agency, explore the possibility of securing settlement between the parties, or conduct an investigation. The Commissioner has wide authority to decide between any of these routes, but must respond to the complainant as soon as practicable with the reason for decision. Notably, the Commissioner may also refer a complaint directly to the Director of Human Rights Proceedings, as appointed under the New Zealand Human Rights Act, without conducting an investigation if it is unable to secure a settlement or discovers parties to a settlement have failed to comply with the terms of the settlement.
The Act gives the Commissioner discretion to collect information in the course of an investigation and to regulate its own procedure. The Commissioner must conduct the investigation in a timely manner but has the authority to act upon the investigation in any manner the Commissioner finds satisfactory. After completing the investigation, the Commissioner may decide that the complaint has no merit and dismiss it.
When the complaint does have merit, the Commissioner must first attempt to secure a settlement or obtain assurances between the parties before acting otherwise. If this fails, the Commissioner has broad discretion to direct the company to take remedial action with respect to an access request, refer the complaint directly to the Director, or take any other action the Commissioner considers appropriate.
Aggrieved individuals may appeal to the Human Rights Review Tribunal a Commissioner’s decision or failure of the company to comply with such a decision. Appeals may be commenced by the individual, a representative of the aggrieved individual or a representative lawfully acting on behalf of a class of aggrieved individuals. On appeal, the Tribunal may impose broad remedies to correct the interference, including actual and expected damages as well as damages for humiliation suffered by the individual. Companies may also appeal decisions reached by the Commissioner including enforcement of access.
8. Public Sector Data Sharing
The Act also sets forth provisions governing data sharing agreements between public entities in New Zealand. While the amendments did not change this section of the law, it is worth noting that New Zealand has its own regime governing the sharing of information between its public sector institutions. Each agreement pursuant to the law must meet certain transparency requirements including notifying any individual of an adverse action against them resulting from the agreement. These agreements help agencies access information to facilitate the execution of their legal duties and obligations.
Finally, the Act establishes provisions governing information matching programs. These programs allow entities to compare personal information as long as the parties meet a set of requirements including heightened notice, reporting and transparency obligations.
9. Conclusions
The New Zealand Privacy Act of 2020 is an interesting framework on the global privacy map, characterized by the great responsibility placed in the hands of the Privacy Commissioner to breathe life into the law and keep it up to date in the digital era. While marginally inspired by GDPR provisions, and thus able to contribute to the interoperability of privacy regimes around the world, it stays true to New Zealand’s decades-old history of regulating privacy and data protection.
This blog is part of a series of overviews of new laws and bills around the world regulating the processing of personal data, coordinated by Gabriela Zanfir-Fortuna, Senior Counsel for Global Privacy ([email protected]).
Future of Privacy Forum, National Education Association Call for Review of Mandatory Video Policies in Online Learning
The Future of Privacy Forum (FPF) and National Education Association (NEA) today released new recommendations for the use of video conferencing platforms in online learning. The recommendations ask schools and districts to reconsider mandatory video requirements that create unique privacy and equity risks for students, including increased data collection, an implied lack of trust, and conflating students’ school and home lives.
“Mandating the use of video requires students to share more about their private home lives than they may want to, from their living situation to who they live with, creating potential social harm,” said Amelia Vance, FPF’s Director of Youth and Education Privacy. “It also risks deepening existing inequities, presenting additional challenges for students with disabilities, English language learners, and students with limited access to adequate wi-fi or video-supported devices. Video can be a helpful tool in online learning, but it shouldn’t be a mandatory one – or the only one – that educators use to engage with and assess students.”
“Video mandates during virtual class instruction coerce students to further blur the vanishing line between their home and school lives. When educators are required by districts to force video use, it violates the trust they’ve built with their students over countless hours of relationship-building through this pandemic and needlessly puts learning at risk in the pursuit of administrative oversight,” said Donna M. Harris-Aikens, Senior Director of Education Policy and Practice at the National Education Association.
Seventy-seven percent of students started the fall semester remotely, and as COVID-19 cases spike this winter, schools across the country are closing back down or delaying their in-person reopening plans. Recognizing that online learning, including video, is likely to remain essential through the remainder of the 2020-21 school year, FPF and NEA developed these recommendations to serve as a resource for educators that are continuing to navigate an evolving and unprecedented time in education.
Before mandating the use of video, the recommendations encourage educators to explore alternatives and considerations, including:
Considering alternatives to create and measure classroom engagement, such as quizzes at the end of a lesson, or encouraging students to respond and interact with material in different ways, including “reactions,” emojis, or the chat feature. Another option: using avatars protects student privacy while encouraging creativity and allows educators to feel as if they are teaching to more than a blank screen.
Considering privacy and equity throughout the process. There is no substitute to an in-person learning environment. While video may seem like the closest alternative, educators should carefully weigh the benefits and risks posed to privacy and equity, from increased data collection to the permanent documentation of deeply personal aspects of students’ home lives.
Teaching students about privacy and how to ingrain it into their online lives. The shift to online learning creates a key opportunity to teach students safe online practices and the role they play in protecting their own privacy. Students should understand the implications of sharing personal information, what data is being collected about them, and how to adjust settings within products and services to be more privacy-protective, including to minimize data collection, data sharing, and third-party tracking.
The 2020 Brussels Privacy Symposium is the fourth annual academic program jointly presented by the Brussels Privacy Hub of Vrije Universiteit Brussel (VUB) and the Future of Privacy Forum (FPF), and is a global convening of practical, applicable, substantive privacy research and scholarship.
On December 2, 2020, the fourth iteration of the Brussels Privacy Symposium, “Research and the Protection of Personal Data Under the GDPR”, will occur as a virtual international meeting where industry privacy leaders, academic researchers, and regulators will discuss the present and future of data protection in the context of scientific data-based research.
This is a special moment to bring together thought leaders and regulators in data protection and privacy and competition law.
The COVID-19 pandemic has brought to the fore the crucial role that data collection, analysis, sharing, and dissemination play for governments, academic institutions, and private sector businesses racing to advance research to help contain, mitigate and remedy the disease. It also illustrates that data protection safeguards are essential to build public trust for the swift adoption of data-based solutions that respect fundamental rights. But the effect of privacy and data protection laws on scientific research extends beyond the pandemic and healthcare. The interactions between data protection and research are complex, with privacy and data protection enhancing individuals’ trust and ensuring respect of fundamental rights and ethical standards, while at the same time creating friction for data sharing across organizations and borders.
The EU General Data Protection Regulation (GDPR) provides a tailored framework for processing personal data for research purposes, which allows for differences in implementation at the Member State level and presents questions about the interpretation and implementation of issues such as the scope of research; repurposing personal data, including sensitive data; researchers’ access to corporate databases; sharing data across international borders; and the use of data protection enhancing techniques such as key coding and pseudonymization. Other legal frameworks, including emerging US privacy laws, call for the creation of ethical review boards for data research in organizations, including businesses that have not traditionally adhered to ethical standards such as the Common Rule. In this year’s Brussels Privacy Symposium (which will take place online), leading ethicists, scientists, lawyers, and policymakers discuss the present and future of data protection in the context of scientific data-based research under the GDPR.
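As one illustration of the key-coding and pseudonymization techniques mentioned above, the sketch below (our own simplified example, not a GDPR compliance recipe or anything prescribed by the symposium) replaces a direct identifier with a keyed code, so that re-identification requires a secret held separately from the research dataset. The identifier names and values are hypothetical.

```python
# Simplified, illustrative sketch of key-coded pseudonymization (an assumption
# for explanation only): a direct identifier is replaced by a code derived with
# a secret key, which a data custodian keeps separately from the research data.
import hmac
import hashlib

SECRET_KEY = b"hold-this-key-separately"      # in practice, managed by a custodian

def pseudonymize(identifier: str) -> str:
    """Return a stable, keyed code for a direct identifier."""
    digest = hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]            # truncated code stored in the dataset

record = {"patient_id": "patient-12345", "age_band": "40-49", "diagnosis": "J45"}
research_record = {**record, "patient_id": pseudonymize(record["patient_id"])}
print(research_record)
```

Because the key allows the codes to be linked back to individuals, data treated this way generally remains personal data under the GDPR; pseudonymization reduces risk but does not anonymize.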
14:00-14:15 (CET) – Introduction from the co-hosts
Jules Polonetsky, FPF
Prof. Christopher Kuner, VUB
14:15-14:45 (CET) – Opening Keynote
Malte Breyer-Katzenberg, DG CONNECT, European Commission
Cornelia Kutterer, Microsoft Corporation
14:45-15:45 (CET) – Complex Interactions: the GDPR, Data Protection and Research
The GDPR provides safeguards and derogations relating to the processing of personal data for scientific research purposes. At the same time, the framework limits the collection of sensitive data and its sharing across organizations and national borders. In an era when EU institutions and international organizations advocate for data philanthropy and the sharing of personal data for compelling public interest grounds, legal frameworks must strike a delicate balance between public interests and individual rights. How do experts assess the interactions between scientific progress and the protection of rights under the GDPR? What have the effects of GDPR been on research in academic and corporate settings? Against the backdrop of the COVID-19 pandemic, this session explores whether GDPR has facilitated or hindered data research for healthcare purposes as well as in a broader context.
Moderator:
Gianclaudio Malgieri, Associate Professor EDHEC Augmented Law Institute (Lille); Affiliated Researcher LSTS VUB
Speakers:
Claire Gayrel, EDPS
Dara Hallinan, FIZ Karlsruhe – Leibniz Institute for Information Infrastructure
Ciara Staunton, School of Law, Middlesex University, London and Centre for Biomedicine, EURAC, Bolzano, Italy
Henrik Junklewitz, JRC, European Commission
16:00-17:00 (CET) – Using Sensitive Data in Research to Counter (Hidden) Bias and Discrimination
Scientific research often depends on the broad collection, use, and sharing of special categories of data. In the context of COVID-19, organizations may study not only individuals’ health, but also data about the geolocation, proximity, genetics, biometrics, and racial and ethnic origins of entire populations. While research often entails the processing of sensitive data, and hence presents privacy risks, it can also unearth covert bias and discrimination (for example, in the context of education, credit, housing, criminal justice, and more). In this session, experts discuss the scope of the definition of sensitive data as well as the rules that should apply to the processing of sensitive data in the research arena. How can researchers ensure data-based practices minimize privacy risks while at the same time not glossing over existing societal imbalances? What are the risks that methods such as differential privacy obfuscate underlying inequities? How will organizations use sensitive data to unearth and counter hidden bias and discrimination without abusing their access to such information? Where are the bounds of ethical data research in corporate environments, including healthcare, financial, advertising, and digital platforms?
Elettra Ronchi, Organisation for Economic Co-operation and Development
Paul Quinn, VUB
Heng Xu, American University
Knut Mager, Novartis
17:00 – 17:15 – Closing Keynote
Dr. Wojciech Wiewiórowski, European Data Protection Supervisor
In its January 2020 Preliminary Opinion on data protection and scientific research, the EDPS recommends intensifying the dialogue between DPAs and ethical review boards for a common understanding of which activities qualify as scientific research, on codes of conduct for scientific research, and on closer alignment between EU research framework programs and data protection standards. In his keynote comments, the EDPS discusses the report and reflects on the day’s sessions.
Schrems II Put Privacy Shield Program at Risk; More Than 250 European Companies are Participating
The July 16, 2020 Schrems II decision by the Court of Justice of the European Union (CJEU) ruled that the US/EU Privacy Shield framework is an insufficient mechanism for cross-border data transfers. How big of a deal is that? Just prior to the Schrems II decision, FPF conducted a study of the companies enrolled in the US/EU Privacy Shield program and determined that at least 259 European-headquartered companies are active participants.
EU-headquartered firms and major EU offices of global firms depend on the Privacy Shield program so that their related US entities can effectively exchange data for research, to improve products, to pay employees, and to serve customers. Nearly one-third of Privacy Shield companies use the mechanism to process human resources data—information that is crucial to employ, pay, and provide benefits to workers.
The FPF study has additional information about the scope of the Privacy Shield program.
Singapore’s Personal Data Protection Act Shifts Away From a Consent-Centric Framework
Authors: Caroline Hopland, Hunter Dorwart and Gabriela Zanfir-Fortuna
The Singapore Parliament passed amendments to its Personal Data Protection Act 2012 (PDPA) on November 2, 2020, the first comprehensive review and revision of the law since its enactment in 2012, as announced by the Ministry of Communications and Information (MCI) and the Personal Data Protection Commission (Commission) in Singapore.
Some of the key changes include:
a shift away from the consent-centric paradigm of the previous law by adding new exceptions to consent-based processing, including legitimate interests;
the introduction of a right to data portability;
new obligations to report data breaches; and
changes in the sanctions regime to increase penalties for individuals and organizations that breach the law, including prison sentences, and to enhance the enforcement powers of the Commission.
The Amended Act will only enter into force once the President assents to it and a notification is published in the Government Gazette. Experts expect it to come into force before the end of 2020.
Below we address the key changes of the Act, specifically (1) new definitions, including “derived personal data,” (2) new exceptions introduced to the rule that consent is required for collecting and otherwise processing personal data, including contractual necessity and legitimate interests, (3) how accountability was enhanced, (4) the introduction of a right to data portability, (5) new requirements to notify data breaches, and (6) the enhanced liability and sanctions regime, including now personal criminal liability for specific offenses, increased fines and an alternative dispute resolution option.
1. “Derived Personal Data”: Newly Defined and Exempted from Correction and Portability Requests
The Act was amended to include new definitions, such as “derived personal data,” and a set of definitions that are relevant in the context of the new right to data portability – “user activity data,” “user-provided data,” “data porting request,” and “ongoing relationship.”
“Derived personal data” is akin to “inferred personal data” as defined by the European Data Protection Board (EDPB)[1], and it refers to “personal data about an individual that is derived by an organization in the course of business from other personal data about that individual or another individual in the possession or under the control of the organization.” However, it excludes personal data “derived using any prescribed means or method, such as mathematical averaging and summation,” so further guidance may be needed to fully circumscribe this exception.
Note that there are two tailored rules for “derived personal data” in the amended Act – in particular, data subjects cannot obtain the correction of an error or omission if the request concerns derived data (see the amended Sixth Schedule redefining exemptions from Section 22 of the PDPA). In addition, similarly to the right to data portability under the EU’s General Data Protection Regulation (GDPR), a porting organization is not required to transmit any derived personal data following a data portability request (see the new Twelfth Schedule).
2. New Rules to Define “Deemed Consent” and to Shift from the Consent-Centric Framework of the PDPA
The amendments (2.1.) will allow organizations to disclose individuals’ personal data, without their express consent, to other organizations, when it relates to contractual necessity, and is not subject to contrary terms in the contract between the individual and the organization (see amended Section 15). An organization (2.2.) may also obtain “deemed consent” from an individual if it conducts a detailed risk assessment, it informs the individual about the intention to collect or use personal data, and if the individual does not notify the organization that the individual objects to the processing (see new Section 15A). In addition to expanding the meaning of “deemed consent,” the amended PDPA (2.3.) also adds “legitimate interests” and “business improvement purposes” as outright exceptions from obtaining consent for collection, disclosure, or use of personal data.
2.1. “Deemed Consent by Contractual Necessity” to Allow Data Sharing
Section 15 of the PDPA has been modified to introduce “deemed consent by contractual necessity,” whose purpose is to facilitate data sharing. According to the amended PDPA, an individual who provides personal data to an organization with a view to entering into a contract with that organization is deemed to consent to the following, where “reasonably necessary” for the conclusion of the contract between them:
1) the organization’s disclosure of that personal data to a second organization;
2) the collection and use of that personal data by the second organization; and
3) the second organization’s disclosure of that personal data to a third organization.
The third organization should apply the rules as if the original organization had disclosed the personal data provided by the individual to it directly. This allows the disclosure to, and the collection, use and disclosure by, successive organizations of the personal data provided by the individual, where reasonably necessary for the conclusion of the contract between the individual and the original organization.
This amendment applies retroactively. Data collected within this category prior to the entry into force of these amendments should be treated as if these sections were in force when the personal data was first provided, and had continued in force until the applicable date. This allows organizations to use and share personal data that was collected prior to the effective date of this Act.
2.2. “Deemed Consent by Notification” (and Risk Assessment)
“Deemed consent” is further expanded to cover a situation where an organization conducts a risk assessment of the impact of the processing of personal data on the rights of the individual and informs the individual about the processing that will take place and of the possibility to object to it (see Section 15A). If the individual does not notify the organization within a determined period of time that they do not consent, then they will have provided valid “deemed consent.”
In a test similar to the legitimate interests assessment and balancing exercise under the GDPR, the risk assessment for “deemed consent by notification” according to the amended PDPA must:
1) identify any adverse effect that the proposed collection, use or disclosure of the personal data for the purpose concerned is likely to have on the individual; and
2) identify and implement reasonable measures to eliminate the adverse effect, reduce the likelihood that the adverse effect will occur, or mitigate the adverse effect.
In addition, the organization must take reasonable steps to inform the individual about the 1) intention and purpose for which the personal data will be collected, used, or disclosed; and 2) the reasonable period and manner with which the individual can opt out and notify the organization that they do not consent to the proposed collection, use or disclosure of their personal data.
2.3 New Exceptions from Consent, Including Legitimate Interests & Business Improvement Purposes
The amendments carve out new exceptions for organizations regarding their collection, use, and disclosure practices based on vital interests of the individuals, public interest (from processing publicly available personal data, to processing for artistic or literary purposes, or for archival or historical purposes), legitimate interests, business access transactions, and business improvement purposes (see Section 17 and the new First Schedule). In addition, the amended Act provides for exceptions from consent specifically for using personal data – for example, for research purposes; and exceptions for disclosing personal data – for research purposes as well, or in the public interest (see the new Second Schedule).
Legitimate Interests: Organizations can collect, use, and disclose personal data without the consent of the individual when 1) it is in the legitimate interests of the organization or another person; and 2) the legitimate interests of the organization or other person outweigh any adverse effect on the individual. Before collecting, using or disclosing such personal data, the organization must conduct an assessment to identify any adverse effects that the proposed collection, use or disclosure is likely to have on the individual, and to implement reasonable measures to:
1) eliminate the adverse effect or reduce the likelihood that it will occur; or
2) mitigate the adverse effect; and
3) comply with any prescribed requirements.
Some legitimate interests are specifically enumerated in the First Schedule, such as recovering a debt from an individual or paying a debt to an individual. Processing of personal data in employment contexts is also specifically mentioned. The organization must provide the individual with reasonable access to information about its collection, use or disclosure of the individual’s personal data.
Business improvement purposes: Personal data about an individual can also be used by an organization without the individual’s consent for specifically defined business improvement purposes to:
1) improve or enhance any of the organization’s existing or developing goods or services it provides;
2) improve or enhance any of the organization’s existing or developing methods or processes for the organization’s operations;
3) learn about and understand the behavior and preferences of the individual or another individual in relation to the goods or services the organization provides; and
4) identify any goods or services provided by the organization that may be suitable for the individual or another individual, or to personalize or customize any such goods or services for that individual or another individual.
This exception is limited by data minimization requirements and by a reasonableness test. Specifically, it only applies if the purpose for which the organization uses personal data about the individual cannot reasonably be achieved without the use of the personal data in an individually identifiable form; as well as if a reasonable person would consider the use of personal data about the individual for that purpose to be appropriate in the circumstances.
3. Enhanced Accountability
The amendments aim to strengthen the accountability of organizations with respect to the processing of personal data. Part III of the PDPA, originally titled “General Rules With Respect to Protection of Personal Data,” is amended to: “General Rules With Respect to Protection of and Accountability For Personal Data.” Most notable, however, are the additional mandatory assessments for “deemed consent by notification,” legitimate interests, and data breaches, which create accountability measures for organizations to implement. Two other requirements further highlight the amendments’ aim to strengthen accountability:
Preservation of Copies of Personal Data: New Section 22A, which covers access to and correction of personal data, now requires an organization that refuses an individual’s request to provide the individual with their personal data that the organization possesses or controls, to preserve a copy of the personal data concerned. The organization must ensure that the copy of the personal data it preserves is a complete and accurate copy of the personal data concerned (see below).
Protection of Personal Data Extended: Section 24 is amended to extend an organization’s requirements to protect personal data in its possession or under its control. An organization not only must make reasonable security arrangements to prevent unauthorized access, collection, use, disclosure, copying, modification or disposal, or similar risks of personal data, but is now also required to make reasonable security arrangements to prevent the loss of any storage or medium device that stores personal data in its possession or under its control. This adds an additional layer of security requirements for organizations to ensure that security measures exist to protect physical devices storing personal data.
4. Introduction of a Right to Data Portability
The amended PDPA introduces a right to data portability and corresponding obligations (Sections 26F to 26J in the new Part VIB of the amended PDPA). The declared purpose of these obligations is to provide consumers with greater autonomy and control over their personal data and to make it easier for individuals to switch services within an innovative and competitive ecosystem (Section 26G). To this end, the amendments introduce a handful of terms such as “data porting request,” “porting organization,” and “receiving organization” to denote the various actors involved in the portability and transfer of data.
As an overarching matter, the portability requirements apply only to personal data that is in electronic form on the date of the porting request and collected by the porting organization on a date before receiving the porting request. The portability requirements will apply retroactively to data collected before the commencement of the amended Act.
An individual may request a porting organization to transmit applicable data about the individual directly to a receiving organization. Unlike under the GDPR and the California Consumer Privacy Act (CCPA), data portability under the PDPA does not include the possibility for the individual to obtain a copy of their personal data in a portable format. The porting organization must comply with the request if the organization has an ongoing relationship with the individual at the time of the request, the request satisfies “prescribed requirements,” and the receiving organization is either constituted under the law of Singapore or has a presence in Singapore, regardless of where it stores the data.
The amendments prohibit transfers of data pursuant to data porting requests if 1) the transmission would likely threaten the safety, or physical or mental health of the individual or a third-party or is otherwise contrary to national interest; 2) the receiving organization is excluded by further regulations; or 3) the Commission directs the porting organization not to transmit the data.
If a porting organization fails to transfer applicable data under the request, the organization must notify the individual of the refusal within a prescribed time and in accordance with prescribed requirements. These portability requirements do not affect the restrictions on disclosure of personal data under other written laws.
The amendments regulate instances where transferring applicable data about one individual results in the transmission of personal data about another individual. Under the Act, a porting organization may disclose personal data about a third-party without that person’s consent only if the individual requesting the transfer makes the request in her personal capacity and the request relates to her user activity data or user-provided data. A receiving organization in this context which receives any personal data about a third-party can only use that data for the purpose of providing goods or services to the individual requesting the transfer.
The amendments clarify that a porting organization that discloses personal information of a third-party through a porting request transfer should not breach any obligation under any written law or contract as to secrecy, other restrictions, or any rules of professional conduct.
In addition to general portability obligations, porting organizations must preserve a complete and accurate copy of any applicable data specified in a data porting request for a prescribed period of time. Such preservation must occur regardless of whether the organization refused to transmit data for any reason. The Commission may prescribe different periods for different porting organizations or circumstances.
Finally, the updated provisions stipulate that data portability obligations apply to applicable data regardless of whether a porting organization stores, processes, or transmits data in Singapore or a country or territory outside of Singapore.
5. Mandatory Data Breach Notification Requirements
Under the amended law, organizations will need to implement breach notification measures. New Part VIA requires an organization to assess data breaches affecting personal data in its possession or control, and to notify the Commission, as well as the affected individuals, of the occurrence of a notifiable data breach. Data breaches are defined as i) the unauthorized access, collection, use, disclosure, copying, modification or disposal of personal data; or ii) the loss of any storage medium or device which stores personal data in circumstances where the unauthorized access, collection, use, disclosure, copying, modification or disposal of the personal data is likely to occur.
A notifiable data breach occurs when the breach 1) results in, or is likely to result in, significant harm to an affected individual; or 2) is, or is likely to be, of a significant scale. Further regulatory guidance will be needed to identify thresholds for notification. According to the amended law, a data breach results in significant harm to an individual 1) if the data breach is in relation to any “prescribed personal data or class of personal data” relating to the individual; or 2) in other “prescribed circumstances”. A data breach is not notifiable when the breach relates to the unauthorized access, collection, use, disclosure, copying or modification of personal data within the organization only.
An organization must conduct a data breach assessment when it has reason to believe that a breach affecting personal data in its possession or control has occurred. The assessment must be conducted in a reasonable and expeditious manner, and should determine whether the data breach is notifiable. Data intermediaries, after conducting an assessment and determining that a notifiable data breach occurred, are also required to notify the organization or public agency for which they are processing the personal data.
Once the organization or data intermediary verifies that the breach is notifiable, it has three calendar days to notify the Commission of the breach. It must also notify each affected individual, in a reasonable manner, but only if the breach has caused, or is likely to cause, significant harm to that individual.
An organization does not need to notify affected individuals of a notifiable data breach if it 1) assesses that the breach is unlikely to cause significant harm to the affected individuals; or 2) has previously implemented any technological measure that renders it unlikely that the notifiable data breach will cause significant harm to the affected individuals. Finally, the Commission or a law enforcement agency can waive the requirement and instruct the organization not to notify individuals affected by the breach for any reason it sees fit.
6. Penalties and Enforcement: Increased Fines, Personal Criminal Liability and Alternative Dispute Resolution
The amended Act imposes new criminal penalties on individuals who mishandle personal information. Under the amendments, an individual may be criminally liable for three separate offenses, related in principle to security breaches and to re-identification of data sets:
knowing or reckless unauthorized disclosure of personal data in the possession of an organization or public agency to another person;
knowing or reckless unauthorized use of personal data in the possession of an organization or public agency that results in a gain for the individual or third party or causes harm to an individual; or
knowing or reckless unauthorized re-identification of anonymized personal data in the possession of an organization or public agency.
Individuals found guilty of each offense could face up to a SGD 5,000 fine or two years’ imprisonment, or both. A defense exists if an individual can prove that the data in question was publicly available at the time of disclosure, or that the handling of the information was required or authorized under another law or an order of the Court.
Apart from these offenses, the amendments increase the financial penalties on organizations for intentional or negligent breaches of the law. The new regime sets the maximum penalty at 10% of an organization’s gross annual turnover in Singapore, where that turnover exceeds SGD 10 million, or SGD 1 million, whichever is higher. The old law set a maximum cap of SGD 1 million for infringement.
In addition, the amendments authorize the Commission to establish alternative dispute resolution mechanisms to handle complaints brought by individuals against an organization through mediation. The Commission may order a dispute resolution without the consent of the individual or the organization. Individuals may also apply for the Commission to review an organization’s refusal or failure to provide access, its refusal or failure to transmit applicable data pursuant to a data porting request, or a fee imposed in relation to the data porting request. Additionally, the amended Act grants the Commission authority to order an organization or individual to stop collecting or using data in contravention of the Act, or to destroy such data.
Finally, under the original version of the PDPA, individuals harmed as a result of an entity violating any provision in Part IV, V or VI had a private right of action for relief in a civil proceeding. The new amendments retain the private right of action, but expand the scope of actionable violations to include Part VIA (data breach notification provisions), VIB (data portability provisions) and Division 3 of Part IX or Part IXA.
7. Conclusions
The changes brought to the Personal Data Protection Act of Singapore are marked by a shift from a consent-centric legal regime for collecting and processing personal data toward organizational accountability and risk-based processing. This shift, however, is complemented by increased individual control over personal data through the introduction of the new right to data portability, which is also a nod to the influence of the EU’s GDPR on data protection and privacy laws around the world.
This blog is part of a series of overviews of new laws and bills around the world regulating the processing of personal data, coordinated by Gabriela Zanfir-Fortuna, Senior Counsel for Global Privacy ([email protected]).
California’s Prop 24, the “California Privacy Rights Act,” Passed. What’s Next?
Authors: Stacey Gray, Senior Counsel, Katelyn Ringrose, Christopher Wolf Diversity Law Fellow at FPF, Polly Sanderson, Policy Counsel, and Veronica Alix, FPF Legal Intern
Despite a day of election uncertainty, November 3, 2020 produced an important moment for privacy legislation: California voters approved Proposition 24 (the California Privacy Rights Act) (CPRA) (full text here). Garnering 56.1% of the vote so far, the initiative will almost certainly meet the majority threshold to become the new law of the land in California.
The CPRA amends key portions of the 2018 California Consumer Privacy Act (CCPA), which went into effect earlier this year. The CPRA gives additional rights to consumers and places additional obligations on businesses. The new law provides additional protections for sensitive personal information, expands CCPA’s opt out rights to include new types of information sharing, and requires businesses to provide additional mechanisms for individuals to access, correct, or delete data, with a particular focus on information used by automated decision-making systems.
What’s next? The new law is scheduled to become operative in 2023, but preparations will occur over the next two years: a new California Privacy Protection Agency will be established, funded, and tasked with taking over rulemaking from the California Attorney General; and businesses will need to interpret (and build systems to comply with) the law’s additional consumer privacy rights. The establishment of a dedicated Privacy Protection Agency is a major milestone for privacy in the US, and we expect the passage of the CPRA to energize efforts to pass comprehensive federal privacy legislation.
NEXT STEPS FOR THE NEW CA AGENCY: FUNDING, RULEMAKING, AND ENFORCEMENT
The CPRA transfers all funding, rulemaking, and enforcement authority from the Attorney General to the new California Privacy Protection Agency (PPA). Primary enforcement responsibilities remain vested with the state agency (rather than in a private right of action), with a few limited but significant changes. Specifically, the CPRA triples penalties for violations regarding minors under the age of 16 and removes the 30-day cure period that businesses can currently utilize under the CCPA. CCPA’s narrow private right of action for security breaches remains intact.
Absent amendment by the California legislature, the timeline for funding, rulemaking, and enforcement for the PPA will be:
Certification that Proposition 24 Passed – Votes may continue to be received and counted as late as November 20, which is the deadline for the state to receive mail-in ballots postmarked by November 3rd. Analysts do not expect mail-in ballots to impact CPRA’s passage.
Funding and Establishment of the Agency (2020) – In accordance with Section 31 of the CPRA and Article II, Section 10(a) of the California Constitution, the Act becomes effective five days after the Secretary of State “files the statement of the vote for the election.” This timeline means that the funding and establishment of the new California PPA is likely to begin soon, as early as December 2020.
Adopting Regulations (2021-22) – According to Section 21 of the CPRA (amending Section 1798.185 of the Civil Code), the new PPA may begin exercising its rulemaking authority as early as July 1, 2021, or six months after the Agency provides notice to the Attorney General that it is prepared to begin rulemaking. The timeline for adopting final regulations required by the Act is set for July 1, 2022.
Obligations Become Operative (January 1, 2023) – Substantive obligations for businesses are scheduled to become operative on January 1, 2023. Obligations will apply to personal information collected by a business on or after January 1, 2022.
Enforcement (July 1, 2023) – The CPRA provides that all civil and administrative enforcement by the new Agency of the provisions in the CPRA shall not commence until July 1, 2023, and shall only apply to violations occurring on or after that date. Notably, there will be no gap between CCPA and CPRA enforcement – CPRA states that enforcement of CCPA provisions will continue “and shall be enforceable until the same provisions of [the CPRA] become enforceable.”
In the meantime, the California Attorney General has solicited broad public comments for the CCPA throughout 2019 and 2020, including as recently as October 2020 (in a third modified rulemaking). These rules will continue in effect and be supplemented by rules adopted by the new Agency.
ADDITIONAL CONSUMER PRIVACY RIGHTS AND BUSINESS OBLIGATIONS
In substance, the most significant changes in the CPRA are that the law expands the right to opt-out of sharing of information, and establishes new rights to limit businesses’ uses of “sensitive personal information,” a new term defined broadly to include, among other things: information about sexual orientation, race and ethnicity, precise geolocation, and health conditions.
Expanded Right to Opt-Out of Data “Sharing” (in Addition to Sale) — Under existing law, California residents can request to opt-out of the “sale” of their personal information. The CPRA expands this opt-out right to include both “sale” and “sharing,” including disclosing personal information to third parties “for cross-context behavioral advertising,” a clarification that brings greater certainty regarding how California law regulates online ad networks. Subject to interpretation and rulemaking by the new Privacy Protection Agency, businesses will likely be required to respect a global opt-out mechanism, or “opt-out preference signal sent with the consumer’s consent by a platform, technology, or mechanism, based on technical specifications set forth in regulations . . .” (1798.135). So far, at least one draft technical specification has emerged, the Global Privacy Control introduced by privacy-focused tech companies, nonprofits, and publishers.
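For illustration only (the statute does not prescribe an implementation), the draft Global Privacy Control specification conveys the opt-out preference through a “Sec-GPC: 1” request header, which browsers also expose to scripts as navigator.globalPrivacyControl. A minimal, hypothetical Flask sketch of how a business might detect and honor such a signal could look like the following; the helper functions are placeholders, not real APIs:

```python
from flask import Flask, request

app = Flask(__name__)

def contextual_response():
    # Hypothetical placeholder that avoids cross-context behavioral advertising.
    return "contextual content only"

def personalized_response():
    # Hypothetical placeholder for content involving data "sharing" with ad partners.
    return "personalized content"

@app.route("/page")
def page():
    # Per the draft GPC spec, browsers with the signal enabled send "Sec-GPC: 1".
    if request.headers.get("Sec-GPC") == "1":
        # Treat the signal as an opt-out of "sale" and "sharing" for this consumer.
        return contextual_response()
    return personalized_response()
```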
Expanded Right to Access — Under the existing CCPA right to access, California consumers can request access to all categories of personal information collected by companies over the previous 12 months. The CPRA will extend that 12-month window indefinitely (beginning January 1, 2022), requiring that businesses provide access to all categories of personal information collected “unless doing so proves impossible or would involve a disproportionate effort.”
Right to Correct Inaccurate Information – Under the CPRA, a consumer has the right to request a business to correct inaccurate personal information that a business maintains. Further, the business collecting this personal information must (1) disclose the consumer’s right to request a correction, and (2) “use commercially reasonable efforts” to correct the inaccurate personal information upon request.
Right to Limit Uses of Sensitive Information — The CPRA contains a new consumer right to limit the use and disclosure of sensitive personal information, including information concerning health, race and ethnicity, sexual orientation, precise geolocation, and more. Upon request, covered entities must not only stop selling or sharing sensitive information, but also limit any internal uses of such information. Service providers must also comply with this limitation if they receive an opt-out request or signal from a business associate, and have actual knowledge that the personal information they are using and/or processing is sensitive.
Data Minimization and Purpose Limitation — The CPRA establishes a new general obligation (1798.100) that a business’s collection, use, retention, and sharing of a consumer’s personal information “shall be reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed, or for another disclosed purpose that is compatible with the context in which the personal information was collected, and not further processed in a manner that is incompatible with those purposes.”
Additional Notification Obligations — Covered businesses that collect information must still, pursuant to the CCPA, inform consumers of the categories of personal information collected. Additionally, under the CPRA, covered businesses that collect information must inform consumers of the categories of sensitive personal information collected; for what purposes; if that information is sold or shared; and the length of time the businesses intend to keep each category of information.
Clarification on Loyalty Programs — Under existing law, companies cannot retaliate against consumers for exercising their privacy rights, but may offer differential pricing for digital services if the pricing is “reasonably related to the value provided to the business by the consumer’s data.” (1798.125(a)(2)). The CPRA further clarifies that the anti-discrimination provision “does not prohibit a business from offering loyalty, rewards, premium features, discounts, or club card programs.” (Sec. 11).
In scope, the CPRA retains the same basic structure as the CCPA, with minor changes to the kinds of businesses that are regulated. For example, the law doubles the CCPA’s threshold amount of personal information that must be processed for a business to be subject to the law, from 50,000 to 100,000 consumers or households. However, the law retains the CCPA’s applicability to for-profit businesses “doing business in California,” and the law’s exemption for the processing of “publicly available data.”
The CPRA also extends the California Legislature’s sunset provisions on rulemaking regarding employee and business-to-business obligations to January 1, 2023, and expands existing service provider obligations to contractors.
LOOKING AHEAD
The establishment of a dedicated Privacy Protection Agency is a major milestone for privacy in the US, a development that could even potentially lead to discussions with EU officials regarding the adequacy and interoperability of California privacy law with Europe’s General Data Protection Regulation (GDPR). The CPRA expands consumer rights for Californians in important ways, including extending rights to access and correct information, opt-out of sharing and sale, and limit uses of sensitive information.
Most importantly, we expect passage of the California Privacy Rights Act to energize efforts to pass comprehensive federal privacy legislation. Congress and the next Administration will have an opportunity to pass privacy legislation that establishes national protections for all US consumers and gives businesses clear obligations.
We look forward to working with the new California Privacy Protection Agency as it establishes the state’s approach to allowable uses of health data, de-identification practices, and other challenging questions.
Stacey Gray is a Senior Counsel at FPF and leads FPF’s legislative analysis and policymaker education team. Katelyn Ringrose is the Christopher Wolf Diversity Law Fellow at FPF, Polly Sanderson is a Policy Counsel at FPF working on U.S. federal and state privacy legislation, and Veronica Alix is a Fall 2020 Legal Policy Intern. Contact us at [email protected].
Understanding Blockchain: A Review of FPF’s Oct. 29th Digital Data Flows Masterclass
Authors: Hunter Dorwart, Stacey Gray, Brenda Leong, Jake van der Laan, Matthias Artzt, and Rob van Eijk
On 29 October 2020, Vrije Universiteit Brussel (VUB) and the Future of Privacy Forum (FPF) hosted the eighth Digital Data Flows Masterclass. The masterclass on blockchain technology completes the VUB-FPF Digital Data Flows Masterclass series.
The most recent masterclass explored the basics of how blockchain technologies work, including established and proposed use cases, which were then evaluated through the lens of privacy and data protection. How does blockchain technology work? Does blockchain present opportunities for a privacy-by-design approach? What is the legal compliance analysis of blockchain systems, particularly addressing the roles and responsibilities of various parties engaging with the technology?
FPF’s Brenda Leong and Dr. Rob van Eijk moderated the discussion following expert overviews from Jake van der Laan (Director Information Technology and Regulatory Informatics at the Financial and Consumer Services Commission of New Brunswick) and Dr. Matthias Artzt (Senior Counsel, Deutsche Bank).
The slides of the presenters can be accessed here and here.
Blockchain Technology – What is it and how does it function?
The term blockchain refers to a distributed ledger technology, composed of recorded transactions in set groupings, called “blocks,” that are linked to each other using cryptographic hashes. These linked blocks form the “chain” (Figure 1), and if any block is tampered with after it’s been added to the chain, that change would be immediately noticeable because the hashed link would be broken (would no longer match between the two blocks).
Figure 1. Analogy of a train for chaining together blocks of transactional information.
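To make the hash-linking concrete, here is a minimal, illustrative Python sketch (our own simplification, not taken from the masterclass materials) of blocks chained by SHA-256 hashes, showing how tampering with an earlier block breaks the hashed link:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically with SHA-256."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    """Append a new block that embeds the hash of the block before it."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def chain_is_valid(chain: list) -> bool:
    """Detect tampering: every stored prev_hash must match the recomputed hash."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
append_block(chain, ["Alice pays Bob 5"])
append_block(chain, ["Bob pays Carol 2"])
print(chain_is_valid(chain))                        # True
chain[0]["transactions"] = ["Alice pays Bob 500"]   # tamper with an earlier block
print(chain_is_valid(chain))                        # False: the hashed link is broken
```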
Additional blocks of information (e.g., recorded transactions) are added to the ledger once “miners” solve an accompanying mathematical challenge and other participants verify the solution. This verification process takes several minutes, but once 51% of the participating nodes have verified the result, the block is permanently added to the chain.
Because these transactions are simultaneously recorded on ledger systems in independent locations around the world (called “nodes”), all copies of which are updated at the same time when a new transaction is recorded, the information recorded on a blockchain is considered “immutable,” or unchangeable (Figure 2). This process was originally designed to manage a cryptocurrency system such as Bitcoin, where the point was to enable trust and reliability within a system without a central manager, like a bank or national financial system.
Figure 2. The information contained in the records is extremely resistant to alteration.
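In proof-of-work systems such as Bitcoin, the “mathematical challenge” that miners solve is a search for a nonce that gives the block a hash meeting a difficulty target, and other nodes can verify the solution with a single hash computation. The sketch below is a simplified illustration rather than a description of any particular network; the leading-zeros rule stands in for the real difficulty target:

```python
import hashlib
import json
from itertools import count

DIFFICULTY = 4  # required number of leading zeros; real networks tune this dynamically

def mine(block: dict) -> dict:
    """Search for a nonce that makes the block's hash meet the difficulty target."""
    for nonce in count():
        block["nonce"] = nonce
        digest = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
        if digest.startswith("0" * DIFFICULTY):
            block["hash"] = digest
            return block

candidate = {"prev_hash": "0" * 64, "transactions": ["Alice pays Bob 5"]}
mined = mine(candidate)

# Verification is cheap: strip the claimed hash and recompute it once.
check = dict(mined)
claimed = check.pop("hash")
recomputed = hashlib.sha256(json.dumps(check, sort_keys=True).encode()).hexdigest()
print(recomputed == claimed and claimed.startswith("0" * DIFFICULTY))  # True
```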
Expanding Use Cases
Although blockchain systems were initially used only for cryptocurrencies, the technology can potentially be used for other purposes. Blockchain was originally designed to alleviate the need for a manager or controlling entity, recording transactions for anonymous or pseudonymous users, with the expectation that all related information would be included in the transaction as stored on the chain. However, as the underlying technology has been considered for other uses, there have been a number of changes to that design.
Distributed ledgers can now be created in ways that are private, public, or hybrid in nature. Public ledgers operate as an open network that allows anyone to download the protocol and participate in the network. Private ledgers, by contrast, work on an invite-only basis with a single entity governing the network. Hybrid systems combine elements of these two approaches. Blockchains thus vary both in the type of ledger they create and in the activity the ledger specifically records.
Blockchain has in recent years been considered for applications such as supply-chain management; recording of property or real estate records; smart contracts; and even voting functions. One of the leading commercial providers of blockchain systems, Ethereum, is a global, open-source platform for many decentralized applications. The full usefulness of blockchain for such applications is not yet fully understood, and many use cases are still being developed or explored.
Legal Implications of Blockchain for Privacy and Data Protection – Roles and Responsibilities
Blockchain technology raises many privacy and legal implications that warrant awareness from industry professionals, policymakers, and the general public. When a blockchain ledger is used to manage or even directly store personal information, the processing of that personal information will fall under the auspices of most privacy and data protection laws, including the General Data Protection Regulation (GDPR).
Those managing or using a blockchain-based system could assume certain responsibilities depending on their role in processing the data. Under the GDPR, a “controller” means “the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data.” Art. 4(7). A “processor” is “a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller.” Art. 4(8).
Dr. Artzt described the roles and the responsibilities of blockchain participants under the GDPR and similar international data protection regimes.
Miners validate transactions by solving a mathematical puzzle defined by the applicable consensus mechanism used by the blockchain. Miners do not determine the specific purpose of any data processing activity nor do they carry out any specific services based on instructions of controllers. Therefore miners are not controllers or processors under the GDPR.
Nodes refer to the decentralized computers that each store a copy of the blockchain. As an IT operation, storing does not implicate decision-making processes. Therefore nodes are not controllers or processors. Indeed, it is difficult to interpret merely participating in the blockchain network as determining the means and purpose of a specific data processing activity.
Wallets store and manage asymmetric keys and addresses used for transactions at the application level. Wallets allow blockchain users to control their own private keys and interact with the network. As a software package, wallets merely pass data from users to miners and therefore do not by themselves determine the purpose or means of processing and are not controllers.
Users – Users participate in transactions on a blockchain network. Users could be controllers only if (i) personal information is submitted to the ledger and (ii) the processing activity has a professional or commercial background. However, if the processing of information occurs in the course of a “purely personal or household activity,” it may fall under the GDPR’s household exception. Art. 2(2)(c). In that case, the GDPR does not apply.
Smart Contract Developers create algorithms to use in “smart contract” programs stored on the blockchain which provide certain functionality to blockchain users. Notably, they do not operate the software they write and therefore do not determine the purpose and means of the processing activity.
Oracles refer to agents that allow the transfer of external data feeds to the blockchain to leverage smart contracts. Oracles are necessary to process external real-world events for the network and have a strong influence on the data processing operation. Therefore, oracles can qualify as controllers if they have a commercial interest in the outcome of the related data processing.
Governance Bodies monitor blockchain transactions and define the roles of the participants. Existing only in private blockchains, a governance body qualifies as a controller if it has control over the processing of personal data. A governance body can also designate one participant to act as the controller, and may itself become a processor or joint controller as a result.
Data Subject Rights, Minimization and Purpose Limitation
Because storage counts as data processing under the GDPR and new uses of blockchain technology involve some direct storage of data, the use of the technology raises other privacy implications and trade-offs. Clarifying the nature of these implications will be increasingly important not only for users and operators of blockchain but also for policymakers who must ask difficult questions about the appropriate scope of regulation. For this reason, our speakers recommended a best practice of storing personal data off-chain, or of removing personal data that is no longer needed (“legacy data”) from the blockchain, storing it in an external database off-chain, and linking the stored information to the recorded exchanges and transactions via on-chain hashes.
The discussion highlighted that data subjects exercising their rights – such as the right to deletion – under the GDPR may run into problems determining who should respond to such requests. Here, the distinction between public and private blockchains becomes critical. With private blockchains, a data subject may theoretically treat the governance body as the responsible controller of their information. But with public blockchains, the data subject faces the dual challenge of identifying the appropriate controller and ensuring the controller carries out its obligations.
What’s more, the nature and design of the technology itself compounds such challenges by making it nearly impossible to access or modify the information contained in the blockchain. For instance, a controller in a public blockchain won’t be able to comply with data subject access requests as a matter of feasibility. Any data on the public blockchain will be there to stay and can’t be deleted or rectified.
Similarly, the immutability of data in blockchains creates tension with other data privacy principles such as purpose limitation and data minimization. By nature, blockchain continuously processes data, forever recording information related to the full history of past transactions. This makes it practically impossible to minimize data use to that which is necessary for a particular transaction.
Techniques to Mitigate Data Protection Risks
In some cases, the data protection challenges of complying with data subject rights and other legal obligations can be mitigated through the use of private blockchains; off-chaining the data; and innovative encryption techniques, e.g. hashing.
Adopting a private blockchain resolves certain issues because the governance body has more control and is better equipped to respond to data subject requests. Such a solution, however, impacts the utility and scalability of the technology. The fundamental problem with the immutability of the blockchain is the perpetual storage of information.
Creating a mechanism for the use of legacy data may provide one legal solution to this issue. In one scenario, legacy data from a mutable private blockchain where the real-time processing takes place could be transferred to an immutable public blockchain through interoperating multi-layered blockchains.
The most straightforward and efficient technique for resolving the issue with legacy data is the hashing-out approach: Legacy data is removed (hashed-out) from the blockchain and put externally (off-chain). The personal data is replaced with a hash value which remains on the blockchain and points to the reference data stored externally. In the case of a deletion request raised by an individual, the off-chain personal information can be deleted.
Off-chain storage may resolve some of the issues around erasure requests and enable greater data minimization and purpose limitation. Once the controller deletes the corresponding personal data in the external database, the hash value remaining on the blockchain becomes a random string with no meaning. Because blockchain uses cryptographic hash functions, in most cases it is impossible to reverse engineer the original (reference) data from the hash. As a result, the hashed section is no longer tied to personal information, and therefore no longer subject to the same legal implications.
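As an illustration of the hashing-out approach described above (our own sketch, not taken from the presentations), personal data can be kept in an external store keyed by its hash while only the hash pointer stays on the ledger; deleting the off-chain record then leaves an on-chain hash that points to nothing. In practice, a salted hash or random identifier would typically be preferred to resist dictionary attacks on low-entropy personal data:

```python
import hashlib
import json

off_chain_store = {}  # hypothetical external database: hash pointer -> personal data
chain = []            # on-chain records hold only hash pointers, never the data itself

def record_transaction(tx: dict, personal_data: dict) -> None:
    """Store personal data off-chain and append only its hash pointer to the ledger."""
    pointer = hashlib.sha256(json.dumps(personal_data, sort_keys=True).encode()).hexdigest()
    off_chain_store[pointer] = personal_data
    chain.append({"tx": tx, "personal_data_hash": pointer})

def erase_personal_data(pointer: str) -> None:
    """Handle a deletion request by removing the off-chain record; the hash left
    on-chain can no longer be resolved to the original personal data."""
    off_chain_store.pop(pointer, None)

record_transaction({"amount": 5}, {"name": "Alice", "iban": "DE00 0000"})
pointer = chain[0]["personal_data_hash"]
erase_personal_data(pointer)
print(pointer in off_chain_store)  # False: only an unlinkable hash remains on-chain
```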
Under GDPR or similar legal frameworks, controllers will likely have to carry out a data privacy impact assessment (DPIA). Such an assessment provides an opportunity for controllers of a blockchain-based system to evaluate the appropriate technical and organizational measures they can adopt to minimize privacy risks. Utilizing various mitigation measures may offer controllers ways to apply privacy-preserving technical solutions while benefiting from potential innovative capabilities.
Because of the many unresolved privacy and regulatory questions around blockchain systems, policymakers and other stakeholders must be aware of the particular concerns and challenges involved with adopting this technology for new applications and use cases.
The slides of the presenters can be accessed here and here.