Singapore’s Personal Data Protection Act Shifts Away From a Consent-Centric Framework

Authors: Caroline Hopland, Hunter Dorwart and Gabriela Zanfir-Fortuna

The Singapore Parliament passed amendments to its Personal Data Protection Act 2012 (PDPA) on November 2, 2020, marking the first comprehensive review and revision of the law since its enactment in 2012, as announced by the Ministry of Communications and Information (MCI) and the Personal Data Protection Commission (Commission) in Singapore.

Some of the key changes include:

  1. a shift away from the consent-centric paradigm of the previous law by adding new exceptions to consent-based processing, including legitimate interests; 
  2. the introduction of a right to data portability; 
  3. new obligations to report data breaches; and 
  4. changes in the sanctions regime to increase penalties for individuals and organizations that breach the law, including prison sentences, and to enhance the enforcement powers of the Commission. 

The Amended Act will only enter into force once the President assents to it and a notification is published in the Government Gazette. Experts expect it to come into force before the end of 2020. 

Below we address the key changes of the Act, specifically (1) new definitions, including “derived personal data,” (2) new exceptions to the rule that consent is required for collecting and otherwise processing personal data, including contractual necessity and legitimate interests, (3) how accountability was enhanced, (4) the introduction of a right to data portability, (5) new requirements to notify data breaches, and (6) the enhanced liability and sanctions regime, which now includes personal criminal liability for specific offenses, increased fines, and an alternative dispute resolution option.

1. “Derived Personal Data”: Newly Defined and Exempted from Correction and Portability Requests

The Act was amended to include new definitions, such as “derived personal data,” and a set of definitions that are relevant in the context of the new right to data portability – “user activity data,” “user-provided data,” “data porting request,” and “ongoing relationship.”

“Derived personal data” is akin to “inferred personal data” as defined by the European Data Protection Board (EDPB)[1], and it refers to “personal data about an individual that is derived by an organization in the course of business from other personal data about that individual or another individual in the possession or under the control of the organization.” However, it excludes personal data “derived using any prescribed means or method, such as mathematical averaging and summation,” so further guidance may be needed to fully circumscribe this exception. 

Note that there are two tailored rules for “derived personal data” in the amended Act – in particular, data subjects cannot obtain the correction of an error or omission if the request concerns derived data (see the amended Sixth Schedule redefining exemptions from Section 22 of the PDPA). In addition, similarly to the right to data portability under the EU’s General Data Protection Regulation (GDPR), a porting organization is not required to transmit any derived personal data following a data portability request (see the new Twelfth Schedule).

2. New Rules to Define “Deemed Consent” and to Shift from the Consent-Centric Framework of the PDPA

The amendments (2.1.) will allow organizations to disclose individuals’ personal data, without their express consent, to other organizations where the disclosure relates to contractual necessity and is not subject to contrary terms in the contract between the individual and the organization (see amended Section 15). An organization (2.2.) may also obtain “deemed consent” from an individual if it conducts a detailed risk assessment, informs the individual of its intention to collect or use the personal data, and the individual does not notify the organization of an objection to the processing (see new Section 15A). In addition to expanding the meaning of “deemed consent,” the amended PDPA (2.3.) also adds “legitimate interests” and “business improvement purposes” as outright exceptions to obtaining consent for the collection, disclosure, or use of personal data.

2.1. “Deemed Consent by Contractual Necessity” to Allow Data Sharing

Section 15 of the PDPA has been modified to introduce “deemed consent by contractual necessity,” whose purpose is to facilitate data sharing. According to the amended PDPA, an individual who provides personal data to an organization with a view to entering into a contract with that organization is deemed to consent to the following, where “reasonably necessary” for the conclusion of the contract between them:

1) the organization’s disclosure of that personal data to a second organization; 

2) the collection and use of that personal data by the second organization; and 

3) the second organization’s disclosure of that personal data to a third organization. 

The third organization should apply the rules as if the original organization had disclosed the personal data provided by the individual to it directly. This allows the disclosure to, and the collection, use and disclosure by, successive organizations of the personal data provided by the individual, where reasonably necessary for the conclusion of the contract between the individual and the original organization. 

This amendment applies retroactively. Data collected within this category prior to the entry into force of these amendments should be treated as if these sections were in force when the personal data was first provided and had continued in force until the applicable date, allowing organizations to use and share personal data that was collected before the effective date of the Act.

2.2. “Deemed Consent by Notification” (and Risk Assessment)

“Deemed consent” is further expanded to cover situations where an organization conducts an assessment of the likely impact of the processing on the individual and informs the individual about the processing that will take place and of the possibility to object to it (see Section 15A). If the individual does not notify the organization within a determined period of time that they do not consent, they will have provided valid “deemed consent.”

In a test similar to the legitimate interests assessment and balancing exercise under the GDPR, the risk assessment for “deemed consent by notification” according to the amended PDPA must: 

1) identify any adverse effect that the proposed collection, use or disclosure of the personal data for the purpose concerned is likely to have on the individual; and

2) identify and implement reasonable measures to eliminate the adverse effect, reduce the likelihood that the adverse effect will occur, or mitigate the adverse effect.

In addition, the organization must take reasonable steps to inform the individual of 1) its intention to collect, use, or disclose the personal data and the purpose for doing so; and 2) the reasonable period within which, and the manner in which, the individual can opt out and notify the organization that they do not consent to the proposed collection, use or disclosure of their personal data.

2.3 New Exceptions from Consent, Including Legitimate Interests & Business Improvement Purposes

The amendments carve out new exceptions for organizations regarding their collection, use, and disclosure practices based on vital interests of the individuals, public interest (from processing publicly available personal data, to processing for artistic or literary purposes, or for archival or historical purposes), legitimate interests, business access transactions, and business improvement purposes (see Section 17 and the new First Schedule). In addition, the amended Act provides for exceptions from consent specifically for using personal data – for example, for research purposes; and exceptions for disclosing personal data – for research purposes as well, or in the public interest (see the new Second Schedule).

Legitimate Interests: Organizations can collect, use, and disclose personal data without the consent of the individual when 1) it is in the legitimate interests of the organization or another person; and 2) the legitimate interests of the organization or other person outweigh any adverse effect on the individual. Before collecting, using or disclosing such personal data, the organization must conduct an assessment to identify any adverse effects that the proposed collection, use or disclosure is likely to have on the individual, and to implement reasonable measures to: 

1) eliminate the adverse effect; 

2) reduce the likelihood that the adverse effect will occur; or 

3) mitigate the adverse effect. 

The organization must also comply with any prescribed requirements.

Some legitimate interests are specifically enumerated in the First Schedule, such as recovering a debt from an individual or paying a debt to an individual. Processing of personal data in employment contexts is also specifically mentioned. The organization must provide the individual with reasonable access to information about its collection, use or disclosure of the individual’s personal data.  

Business improvement purposes: Personal data about an individual can also be used by an organization without the individual’s consent for specifically defined business improvement purposes to: 

1) improve or enhance any of the organization’s existing or developing goods or services it provides; 

2) improve or enhance any of the organization’s existing or developing methods or processes for the organization’s operations; 

3) learn about and understand the behavior and preferences of the individual or another individual in relation to the goods or services the organization provides; and 

4) identify any goods or services provided by the organization that may be suitable for the individual or another individual, or to personalize or customize any such goods or services for that individual or another individual. 

This exception is limited by data minimization requirements and by a reasonableness test. Specifically, it only applies if the purpose for which the organization uses personal data about the individual cannot reasonably be achieved without the use of the personal data in an individually identifiable form; as well as if a reasonable person would consider the use of personal data about the individual for that purpose to be appropriate in the circumstances. 

3. Enhanced Accountability 

The amendments aim to strengthen the accountability of organizations with respect to the processing of personal data. Part III of the PDPA, originally titled “General Rules With Respect to Protection of Personal Data,” is retitled “General Rules With Respect to Protection of and Accountability For Personal Data.” Most notable, however, are the additional mandatory assessments for “deemed consent by notification,” legitimate interests, and data breaches, which create accountability measures for organizations to implement. Two other requirements further highlight the amendments’ aim to strengthen accountability: 

Preservation of Copies of Personal Data: New Section 22A now requires an organization that refuses an individual’s request to provide the individual with their personal data that the organization possesses or controls to preserve a copy of the personal data concerned. The organization must ensure that the copy it preserves is a complete and accurate copy of the personal data concerned.

Protection of Personal Data Extended: Section 24 is amended to extend an organization’s obligations to protect personal data in its possession or under its control. An organization must not only make reasonable security arrangements to prevent unauthorized access, collection, use, disclosure, copying, modification or disposal of personal data, or similar risks, but is now also required to make reasonable security arrangements to prevent the loss of any storage medium or device that stores personal data in its possession or under its control. This adds a layer of security requirements for organizations, ensuring that measures exist to protect physical devices storing personal data. 

4. Introduction of a Right to Data Portability 

The amended PDPA introduces a right to data portability and corresponding obligations (Sections 26F to 26J in the new Part VIB of the amended PDPA). The declared purpose of these obligations is to provide consumers with greater autonomy and control over their personal data and to facilitate individuals’ switching between services in an innovative and competitive ecosystem (Section 26G). To this end, the amendments introduce a handful of terms such as “data porting request,” “porting organization,” and “receiving organization” to denote the various actors involved in the portability and transfer of data. 

As an overarching matter, the portability requirements apply only to personal data that is in electronic form on the date of the porting request and collected by the porting organization on a date before receiving the porting request. The portability requirements will apply retroactively to data collected before the commencement of the amended Act. 

An individual may request a porting organization to directly transmit applicable data about the individual to a receiving organization. Unlike under the GDPR and California’s Consumer Privacy Act (CCPA), data portability does not include the possibility for the individual to obtain a copy of their personal data in a portable format. The porting organization must comply with the request if the organization has an ongoing relationship with the individual at the time of the request, the request satisfies “prescribed requirements,” and the receiving organization is either constituted under the law of Singapore or has a presence in Singapore, regardless of where it stores the data. 

The amendments prohibit transfers of data pursuant to data porting requests if 1) the transmission would likely threaten the safety, or physical or mental health of the individual or a third-party or is otherwise contrary to national interest; 2) the receiving organization is excluded by further regulations; or 3) the Commission directs the porting organization not to transmit the data. 

If a porting organization fails to transfer applicable data under the request, the organization must notify the individual of the refusal within a prescribed time and in accordance with prescribed requirements. These portability requirements do not affect the restrictions on disclosure of personal data under other written laws. 

The amendments regulate instances where transferring applicable data about one individual results in the transmission of personal data about another individual. Under the Act, a porting organization may disclose personal data about a third-party without that person’s consent only if the individual requesting the transfer makes the request in her personal capacity and the request relates to her user activity data or user-provided data. A receiving organization in this context which receives any personal data about a third-party can only use that data for the purpose of providing goods or services to the individual requesting the transfer. 

The amendments clarify that a porting organization that discloses personal information of a third-party through a porting request transfer should not breach any obligation under any written law or contract as to secrecy, other restrictions, or any rules of professional conduct. 

In addition to general portability obligations, porting organizations must preserve a complete and accurate copy of any applicable data specified in a data porting request for a prescribed period of time. Such preservation must occur regardless of whether the organization refused to transmit data for any reason. The Commission may prescribe different periods for different porting organizations or circumstances. 

Finally, the updated provisions stipulate that data portability obligations apply to applicable data regardless of whether a porting organization stores, processes, or transmits data in Singapore or a country or territory outside of Singapore. 

5. Mandatory Data Breach Notification Requirements

Under the amended law, organizations will need to implement breach notification measures. New Part VIA requires an organization to assess data breaches affecting personal data in its possession or control, and to notify the Commission, as well as the affected individuals, of the occurrence of a notifiable data breach. Data breaches are defined as i) the unauthorized access, collection, use, disclosure, copying, modification or disposal of personal data; or ii) the loss of any storage medium or device which stores personal data in circumstances where the unauthorized access, collection, use, disclosure, copying, modification or disposal of the personal data is likely to occur. 

A notifiable data breach occurs when the breach 1) results in, or is likely to result in, significant harm to an affected individual; or 2) is, or is likely to be, of a significant scale. Further regulatory guidance will be needed to identify thresholds for notification. According to the amended law, a data breach results in significant harm to an individual 1) if the data breach relates to any “prescribed personal data or class of personal data” relating to the individual; or 2) in other “prescribed circumstances.” A data breach is not notifiable when it relates to the unauthorized access, collection, use, disclosure, copying or modification of personal data within the organization only. 

An organization must conduct a data breach assessment when it has reason to believe that a breach affecting personal data in its possession or under its control has occurred. The assessment must be conducted in a reasonable and expeditious manner and should determine whether the data breach is notifiable. Data intermediaries, after conducting an assessment and determining that a notifiable data breach has occurred, are also required to notify the organization or public agency on whose behalf they are processing the personal data. 

Once the organization or data intermediary determines that the breach is notifiable, it has three calendar days to notify the Commission of the breach. It must also, in a reasonable manner, notify each affected individual, but only if the breach resulted in, or is likely to result in, significant harm to the individual. 

An organization does not need to notify affected individuals of a notifiable data breach if it 1) assesses that the breach is unlikely to cause significant harm to the affected individuals; or 2) had previously implemented any technological measure that renders it unlikely that the notifiable data breach will cause significant harm to the affected individuals. Finally, the Commission or a law enforcement agency can waive the requirement and instruct the organization not to notify individuals affected by the breach for any reason it sees fit. 

6. Penalties and Enforcement: Increased Fines, Personal Criminal Liability and Alternative Dispute Resolution 

The amended Act imposes new criminal penalties on individuals who mishandle personal information. Under the amendments, an individual may be criminally liable for three separate offenses, related in principle to security breaches and to re-identification of data sets:

  1. knowing or reckless unauthorized disclosure of personal data in the possession of an organization or public agency to another person;
  2. knowing or reckless unauthorized use of personal data in the possession of an organization or public agency that results in a gain for the individual or third party or causes harm to an individual; or
  3. knowing or reckless unauthorized re-identification of anonymized personal data in the possession of an organization or public agency.

Individuals found guilty of any of these offenses could face a fine of up to SGD 5,000, imprisonment for up to two years, or both. A defense exists if an individual can prove that the data in question was publicly available at the time of disclosure, or that the conduct in question was required or authorized under another law or an order of a court.

Apart from these offenses, the amendments increase the financial penalties on organizations for intentional or negligent breaches of the law. The new regime sets the maximum penalty at 10% of an organization’s gross annual turnover in Singapore, where that turnover exceeds SGD 10 million, or SGD 1 million, whichever is higher. For example, an organization with SGD 50 million in annual Singapore turnover could face a penalty of up to SGD 5 million. The old law set a maximum cap of SGD 1 million per infringement. 

In addition, the amendments authorize the Commission to establish alternative dispute resolution mechanisms to handle complaints brought by individuals against an organization through mediation. The Commission may order such dispute resolution without the consent of the individual or the organization. Individuals may also apply for the Commission to review an organization’s refusal or failure to provide access to or transmit applicable data pursuant to a data porting request, or a fee imposed in relation to such a request. Additionally, the amended Act grants the Commission authority to order an organization or individual to stop collecting or using data, or to destroy any data collected, in contravention of the Act. 

Finally, under the original version of the PDPA, individuals harmed as a result of an entity violating any provision in Part IV, V or VI had a private right of action for relief in a civil proceeding. The new amendments retain the private right of action, but expand the scope of actionable violations to include Part VIA (data breach notification provisions), VIB (data portability provisions) and Division 3 of Part IX or Part IXA.

7. Conclusions

The changes to Singapore’s Personal Data Protection Act are underpinned by a shift from a consent-centric legal regime for collecting and processing personal data toward organizational accountability and risk-based processing. This change, however, is complemented by measures that increase individuals’ control over their personal data, notably the introduction of the new right to data portability, which is also a nod to the influence of the EU’s GDPR on data protection and privacy laws around the world.

[1] See, for example, the European Data Protection Board Guidelines 8/2020 on the targeting of social media users.

This blog is part of a series of overviews of new laws and bills around the world regulating the processing of personal data, coordinated by Gabriela Zanfir-Fortuna, Senior Counsel for Global Privacy ([email protected]).

California’s Prop 24, the “California Privacy Rights Act,” Passed. What’s Next?

Authors: Stacey Gray, Senior Counsel, Katelyn Ringrose, Christopher Wolf Diversity Law Fellow at FPF, Polly Sanderson, Policy Counsel, and Veronica Alix, FPF Legal Intern


Despite a day of election uncertainty, November 3, 2020 produced an important moment for privacy legislation: California voters approved Proposition 24 (the California Privacy Rights Act) (CPRA) (full text here). Garnering 56.1% of the vote so far, the initiative will almost certainly meet the majority threshold to become the new law of the land in California. 

The CPRA amends key portions of the 2018 California Consumer Privacy Act (CCPA), which went into effect earlier this year. The CPRA gives additional rights to consumers and places additional obligations on businesses. The new law provides additional protections for sensitive personal information, expands CCPA’s opt out rights to include new types of information sharing, and requires businesses to provide additional mechanisms for individuals to access, correct, or delete data, with a particular focus on information used by automated decision-making systems.

What’s next? The new law is scheduled to become operative in 2023, but preparations will occur over the next two years: a new California Privacy Protection Agency will be established, funded, and tasked with taking over rulemaking from the California Attorney General; and businesses will need to interpret (and build systems to comply with) the law’s additional consumer privacy rights. The establishment of a dedicated Privacy Protection Agency is a major milestone for privacy in the US, and we expect the passage of the CPRA to energize efforts to pass comprehensive federal privacy legislation. 

NEXT STEPS FOR THE NEW CA AGENCY: FUNDING, RULEMAKING, AND ENFORCEMENT

The CPRA transfers all funding, rulemaking, and enforcement authority from the Attorney General to the new California Privacy Protection Agency (PPA). Primary enforcement responsibilities remain vested with the state agency (rather than in a private right of action), with some notable changes. Specifically, the CPRA triples penalties for violations regarding minors under the age of 16 and removes the 30-day cure period that businesses can currently utilize under the CCPA. The CCPA’s narrow private right of action for security breaches remains intact.

Absent amendment by the California legislature, the timeline for funding, rulemaking, and enforcement for the PPA will be:

In the meantime, the California Attorney General has solicited broad public comment on CCPA regulations throughout 2019 and 2020, including as recently as October 2020 (in a third set of proposed modifications to the regulations). These rules will continue in effect and be supplemented by rules adopted by the new Agency.

ADDITIONAL CONSUMER PRIVACY RIGHTS AND BUSINESS OBLIGATIONS

In substance, the most significant changes in the CPRA are that the law expands the right to opt-out of sharing of information, and establishes new rights to limit businesses’ uses of “sensitive personal information,” a new term defined broadly to include, among other things: information about sexual orientation, race and ethnicity, precise geolocation, and health conditions.

  1. Expanded Right to Opt-Out of Data “Sharing” (in Addition to Sale) — Under existing law, California residents can request to opt-out of the “sale” of their personal information. The CPRA expands this opt-out right to include both “sale” and “sharing,” including disclosing personal information to third parties “for cross-context behavioral advertising,” a clarification that brings greater certainty regarding how California law regulates online ad networks. Subject to interpretation and rulemaking by the new Privacy Protection Agency, businesses will likely be required to respect a global opt-out mechanism, or “opt-out preference signal sent with the consumer’s consent by a platform, technology, or mechanism, based on technical specifications set forth in regulations . . .” (1798.135). So far, at least one draft technical specification has emerged, the Global Privacy Control introduced by privacy-focused tech companies, nonprofits, and publishers (see the sketch after this list).
  2. Expanded Right to Access — Under the existing CCPA right to access, California consumers can request access to all categories of personal information collected by companies over the previous 12 months. The CPRA will extend that 12-month window indefinitely (beginning January 1, 2022), requiring that businesses provide access to all categories of personal information collected “unless doing so proves impossible or would involve a disproportionate effort.”  
  3. Right to Correct Inaccurate Information – Under the CPRA, a consumer has the right to request a business to correct inaccurate personal information that a business maintains. Further, the business collecting this personal information must (1) disclose the consumer’s right to request a correction, and (2) “use commercially reasonable efforts” to correct the inaccurate personal information upon request. 
  4. Right to Limit Uses of Sensitive Information — The CPRA contains a new consumer right to limit the use and disclosure of sensitive personal information, including information concerning health, race and ethnicity, sexual orientation, precise geolocation, and more. Upon request, covered entities must not only stop selling or sharing sensitive information, but also limit any internal uses of such information. Service providers must also comply with this limitation if they receive an opt-out request or signal from a business associate, and have actual knowledge that the personal information they are using and/or processing is sensitive.
  5. Data Minimization and Purpose Limitation — The CPRA establishes a new general obligation (1798.100) that a business’s collection, use, retention, and sharing of a consumer’s personal information “shall be reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed, or for another disclosed purpose that is compatible with the context in which the personal information was collected, and not further processed in a manner that is incompatible with those purposes.”
  6. Additional Notification Obligations — Covered businesses that collect information must still, pursuant to the CCPA, inform consumers of the categories of personal information collected. Additionally, under the CPRA, covered businesses that collect information must inform consumers of the categories of sensitive personal information collected; for what purposes; if that information is sold or shared; and the length of time the businesses intend to keep each category of information. 
  7. Clarification on Loyalty Programs — Under existing law, companies cannot retaliate against consumers for exercising their privacy rights, but may offer differential pricing for digital services if the pricing is “reasonably related to the value provided to the business by the consumer’s data.” (1798.125(a)(2)). The CPRA further clarifies that the anti-discrimination provision “does not prohibit a business from offering loyalty, rewards, premium features, discounts, or club card programs.” (Sec. 11).
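As a rough illustration of how the opt-out preference signal mentioned in item 1 works in practice, the sketch below shows how a server might detect the draft Global Privacy Control (GPC) signal, which the GPC proposal conveys through the "Sec-GPC: 1" HTTP request header. This is a simplified, illustrative example under those assumptions, not an implementation prescribed by the CPRA or the forthcoming regulations.

```python
def honors_gpc_opt_out(request_headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control opt-out signal.

    Per the draft GPC specification, participating browsers send the header
    "Sec-GPC: 1" (and expose navigator.globalPrivacyControl to scripts) when
    the user has enabled the signal.
    """
    return request_headers.get("Sec-GPC") == "1"


# Example: a business's server-side check before "selling" or "sharing" data.
incoming_headers = {"User-Agent": "ExampleBrowser/1.0", "Sec-GPC": "1"}
if honors_gpc_opt_out(incoming_headers):
    print("Treat this consumer as opted out of sale/sharing.")
```

How such a signal must be honored will ultimately depend on the regulations adopted by the new Privacy Protection Agency.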

In scope, the CPRA retains the same basic structure as the CCPA, with minor changes to the kinds of businesses that are regulated. For example, the law doubles the CCPA’s threshold amount of personal information that must be processed for a business to be subject to the law, from 50,000 to 100,000 consumers or households. However, the law retains the CCPA’s applicability to for-profit businesses “doing business in California,” and the law’s exemption for the processing of “publicly available data.” 

The CPRA also extends the California Legislature’s sunset provisions for the employee and business-to-business data exemptions to January 1, 2023, and expands existing service provider obligations to contractors. 

LOOKING AHEAD

The establishment of a dedicated Privacy Protection Agency is a major milestone for privacy in the US, a development that could even potentially lead to discussions with EU officials regarding the adequacy and interoperability of California privacy law with Europe’s General Data Protection Regulation (GDPR). The CPRA expands consumer rights for Californians in important ways, including extending rights to access and correct information, opt-out of sharing and sale, and limit uses of sensitive information. 

Most importantly, we expect passage of the California Privacy Rights Act to energize efforts to pass comprehensive federal privacy legislation. Congress and the next Administration will have an opportunity to pass privacy legislation that establishes national protections for all US consumers and gives businesses clear obligations. 

We look forward to working with the new California Privacy Protection Agency as it establishes the state’s approach to allowable uses of health data, de-identification practices, and other challenging questions.


Stacey Gray is a Senior Counsel at FPF and leads FPF’s legislative analysis and policymaker education team. Katelyn Ringrose is the Christopher Wolf Diversity Law Fellow at FPF, Polly Sanderson is a Policy Counsel at FPF working on U.S. federal and state privacy legislation, and Veronica Alix is a Fall 2020 Legal Policy Intern. Contact us at [email protected]

Understanding Blockchain: A Review of FPF’s Oct. 29th Digital Data Flows Masterclass

Authors: Hunter Dorwart, Stacey Gray, Brenda Leong, Jake van der Laan, Matthias Artzt, and Rob van Eijk

On 29 October 2020, Vrije Universiteit Brussel (VUB) and the Future of Privacy Forum (FPF) hosted the eighth Digital Data Flows Masterclass. The masterclass on blockchain technology completes the VUB-FPF Digital Data Flows Masterclass series.

The most recent masterclass explored the basics of how blockchain technologies work, including established and proposed use cases, which were then evaluated through the lens of privacy and data protection. How does blockchain technology work? Does blockchain present opportunities for a privacy-by-design approach? What is the legal compliance analysis of blockchain systems, particularly addressing the roles and responsibilities of various parties engaging with the technology?

FPF’s Brenda Leong and Dr. Rob van Eijk moderated the discussion following expert overviews from Jake van der Laan (Director Information Technology and Regulatory Informatics at the Financial and Consumer Services Commission of New Brunswick) and Dr. Matthias Artzt (Senior Counsel, Deutsche Bank).

The slides of the presenters can be accessed here and here.

The recording of the class can be accessed here.

Blockchain Technology – What is it and how does it function?

The term blockchain refers to a distributed ledger technology, composed of recorded transactions in set groupings, called “blocks,” that are linked to each other using cryptographic hashes. These linked blocks form the “chain” (Figure 1), and if any block is tampered with after it’s been added to the chain, that change would be immediately noticeable because the hashed link would be broken (would no longer match between the two blocks).


Figure 1. Analogy of a train for chaining together blocks of transactional information.
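To make the chaining mechanism concrete, below is a minimal, illustrative Python sketch (our own simplification, not part of the masterclass materials) in which each block stores the hash of its predecessor, so tampering with an earlier block breaks the chain.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministically hash a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    # Each new block records the hash of the previous block, forming the "chain".
    prev_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev_hash, "transactions": transactions})

def verify_chain(chain: list) -> bool:
    # Altering any earlier block changes its hash, so the stored link no longer matches.
    return all(curr["prev_hash"] == block_hash(prev)
               for prev, curr in zip(chain, chain[1:]))

chain = []
add_block(chain, ["Alice pays Bob 5"])
add_block(chain, ["Bob pays Carol 2"])
print(verify_chain(chain))                            # True
chain[0]["transactions"] = ["Alice pays Bob 500"]     # tamper with an earlier block
print(verify_chain(chain))                            # False: the hashed link is broken
```

Real blockchains add consensus and replication on top of this basic linking, as described below.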

Additional blocks of information (e.g., recorded transactions) are added to the ledger once participating nodes verify that an accompanying mathematical challenge has been successfully solved (by “miners”). This verification process takes several minutes, but once 51% of the participating nodes have verified the result, the block is permanently added to the chain.

Because these transactions are simultaneously recorded on ledger systems in independent locations around the world (called “nodes”), all copies of which are updated at the same time when a new transaction is recorded, the information recorded on a blockchain is considered “immutable,” or unchangeable (Figure 2). Originally designed specifically to manage a cryptocurrency system such as Bitcoin, this process was meant to enable trust and reliability within a system that lacks a central manager, like a bank or national financial system.


Figure 2. The information contained in the records is extremely resistant to alteration.

Expanding Use Cases

Although blockchain systems were initially used only for cryptocurrencies, the technology can potentially be used for other purposes. Blockchain was originally designed to alleviate the need for a manager or controlling entity, recording transactions for anonymous or pseudonymous users, with the expectation that all related information was included in the transaction as stored on the chain. However, as the underlying technology has been considered for other uses, there have been a number of changes to that design.

Distributed ledgers can now be created in ways that are private, public, or hybrid in nature. Public ledgers operate as open networks that allow anyone to download the protocol and participate in the network. Private ledgers, by contrast, work on an invite-only basis with a single entity governing the network. Hybrid systems combine elements of these two approaches. Blockchains thus vary both in the type of ledger they create and in what activity the ledger specifically records.

Blockchain has in recent years been considered for applications such as supply-chain management; recording of property or real estate records; smart contracts; and even voting functions. One of the leading commercial providers of blockchain systems, Ethereum, is a global, open-source platform for many decentralized applications. The full usefulness of blockchain for such applications is not yet fully understood, and many use cases are still being developed or explored.

Legal Implications of Blockchain for Privacy and Data Protection – Roles and Responsibilities

Blockchain technology raises many privacy and legal implications that warrant awareness from industry professionals, policymakers, and the general public. When a blockchain ledger is used to manage or even directly store personal information, the processing of that personal information will fall under the auspices of most privacy and data protection laws, including the General Data Protection Regulation (GDPR).

Those managing or using a blockchain-based system could assume certain responsibilities depending on their role in processing the data. Under the GDPR, a “controller” means “the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data.” Art. 4(7). A “processor” is “a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller.” Art. 4(8).

Dr. Artzt described the roles and the responsibilities of blockchain participants under the GDPR and similar international data protection regimes.

Data Subject Rights, Minimization and Purpose Limitation

Because storage counts as data processing under the GDPR and new uses of blockchain technology involve some direct storage of data, the use of the technology raises other privacy implications and trade-offs. Clarifying the nature of these implications will be increasingly important not only for users and operators of blockchain but also for policymakers who must ask difficult questions about the appropriate scope of regulation. For this reason, our speakers recommended as a best practice storing personal data off-chain, or removing personal data that is no longer needed (“legacy data”) from the blockchain, storing it in an external database off-chain, and linking the stored information to the recorded exchanges and transactions via on-chain hashes.

As the discussion highlighted, data subjects exercising their rights under the GDPR – such as the right to deletion – may run into problems determining who should respond to such requests. Here, the distinction between public and private blockchains becomes critical. With private blockchains, a data subject may theoretically treat the governance body as the responsible controller of their information. But with public blockchains, the data subject faces the dual challenge of identifying the appropriate controller and ensuring the controller carries out its obligation.

What’s more, the nature and design of the technology itself compounds such challenges by making it nearly impossible to access or modify the information contained in the blockchain. For instance, a controller in a public blockchain won’t be able to comply with data subject access requests as a matter of feasibility. Any data on the public blockchain will be there to stay and can’t be deleted or rectified.

Similarly, the immutability of data in blockchains creates tension with other data privacy principles such as purpose limitation and data minimization. By nature, blockchain continuously processes data, forever recording information related to the full history of past transactions. This makes it practically impossible to limit data use to what is necessary for a particular transaction.

Techniques to Mitigate Data Protection Risks

In some cases, the data protection challenges of complying with data subject rights and other legal obligations can be mitigated through the use of private blockchains; off-chaining the data; and cryptographic techniques such as hashing.

Adopting a private blockchain resolves certain issues because the governance body has more control and is better equipped to respond to data subject requests. However, such a solution impacts the utility and scalability of the technology. The fundamental problem with the immutability of the blockchain is the perpetual storage of information.

Creating a mechanism for the use of legacy data may provide one legal solution to this issue. In one scenario, legacy data from a mutable private blockchain, where the real-time processing takes place, could be transferred to an immutable public blockchain through interoperating multi-layered blockchains.

The most straightforward and efficient technique for resolving the issue with legacy data is the hashing-out approach: Legacy data is removed (hashed-out) from the blockchain and put externally (off-chain). The personal data is replaced with a hash value which remains on the blockchain and points to the reference data stored externally. In the case of a deletion request raised by an individual, the off-chain personal information can be deleted.

Off-chain storage may resolve some of the issues around erasure requests and enable greater data minimization and purpose limitation. Once the controller deletes the corresponding personal data in the external database, the hash value remaining on the blockchain becomes a random string with no meaning. Because blockchain uses cryptographic hash functions, in most cases it is impossible to reverse engineer the original (reference) data from the hash. As a result, the hashed section is no longer tied to personal information, and therefore no longer subject to the same legal implications.
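The sketch below illustrates the hashing-out approach in Python. It is a simplified illustration of the idea described above, not a production design; the salted hash and the dictionary-based stores are assumptions made for the example.

```python
import hashlib
import os

# Off-chain store: personal data lives in an ordinary, erasable database.
off_chain_db = {}        # maps on-chain hash value -> personal data record

# On-chain ledger: only opaque hash values are appended (treated as immutable).
on_chain_ledger = []

def record_transaction(personal_data: str) -> str:
    # Salting the hash (an assumption of this sketch) keeps low-entropy personal
    # data from being guessed by hashing candidate values.
    salt = os.urandom(16)
    pointer = hashlib.sha256(salt + personal_data.encode()).hexdigest()
    off_chain_db[pointer] = {"salt": salt, "data": personal_data}
    on_chain_ledger.append(pointer)   # only the hash reference stays on-chain
    return pointer

def erase_personal_data(pointer: str) -> None:
    # Honoring a deletion request: remove the off-chain record. The hash left
    # on-chain now points at nothing and no longer reveals personal data.
    off_chain_db.pop(pointer, None)

ptr = record_transaction("Jane Doe, account 12345")
erase_personal_data(ptr)
print(ptr in on_chain_ledger)   # True: the opaque hash remains on the ledger
print(ptr in off_chain_db)      # False: the personal data itself is gone
```

After deletion, the value left on-chain functions only as a meaningless reference, consistent with the approach our speakers described.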

Under the GDPR or similar legal frameworks, controllers will likely have to carry out a data protection impact assessment (DPIA). Such an assessment provides an opportunity for controllers of a blockchain-based system to evaluate the appropriate technical and organizational measures they can adopt to minimize privacy risks. Utilizing various mitigation measures may offer controllers ways to apply privacy-preserving technical solutions while benefiting from potential innovative capabilities.

Because of the many unresolved privacy and regulatory questions around blockchain systems, policymakers and other stakeholders must be aware of the particular concerns and challenges involved with adopting this technology for new applications and use cases.



To learn more about the eight masterclasses in the VUB-FPF Digital Data Flows Masterclass series click here.

To learn more about FPF in Europe, visit https://fpf.org/eu.

21st Century Cures Act Final Rule: Key Health Data Privacy Considerations

On the eve of a key compliance date, the HHS Office of the National Coordinator for Health Information Technology (ONC) extended deadlines for entities working in health information technology to comply with a new federal rule intended to “promote health care choice and competition across the United States” and “advance interoperability and support the access, exchange, and use of electronic health information.” Organizations now have until 2021 and 2022 to comply with key provisions of the 21st Century Cures Act.

In March, the ONC announced the Final Rule, which implements certain provisions of the 21st Century Cures Act, under the President’s 2017 Executive Order 13813. The Final Rule became effective on June 30, 2020, requiring organizations that hold patient data to take steps intended to make the information more portable and useful for individuals. However, the ONC initially deferred enforcement to November 2, 2020 in response to the COVID-19 pandemic and has now further extended compliance dates into next year and 2022. The agency will host an informational public webinar on the topic at 3pm ET today (registration for that webinar is available here).

As organizations move to implement privacy and data-related provisions of the 21st Century Cures Act final rule, they are weighing key privacy considerations, including:

1. Privacy Exceptions Around Information Blocking

The Final Rule clarifies the definition of “information blocking” in the 21st Century Cures Act. The Act restricts information blocking by organizations that hold patient data, with the goal that more portable, interoperable health information will benefit individuals, clinicians, and researchers. The rule focuses on what qualifies as the intentional withholding of patient health information in either provider-to-provider or provider-to-patient data transfers. Three categories of “actors” are regulated by the rule’s information blocking provisions: 1) healthcare providers (e.g. healthcare facilities, laboratories, pharmacies, physicians, and providers), 2) health information networks; and 3) health information technology (IT) developers of certified health IT.

The Final Rule also discusses when “an actor’s practice of not fulfilling a request to access, exchange, or use EHI in order to protect an individual’s privacy” may not be considered information blocking (referred to as the “Preventing Harm Exception”). Measures that would otherwise qualify as “information blocking” can be exempt from the rule’s restrictions if they are intentional measures to safeguard the privacy of an individual’s EHI. In such circumstances, organizations must weigh the net individual benefits of granting EHI access against the net privacy risks. However, if the net portability benefits outweigh the privacy risks, arguments can be made that refusing a request to access, exchange, or use EHI would qualify as “information blocking.” The ONC will evaluate alleged violations of information blocking regulations on a case-by-case basis.

2. Privacy and Security Transparency Attestation Criteria

The ONC adopted privacy and security transparency attestation criteria while also recognizing the wide variation that exists in authentication needs and approaches across the health industry. The criteria are intended to serve as a key accountability mechanism to bolster security and privacy safeguards for health data held by covered organizations. Noting the large public support for multi-factor authentication (MFA) as a privacy and security certification criterion for developers of certified health IT, the ONC generally views MFA and “encrypting authentication credentials” as “best practices for privacy and security in healthcare settings.”

Although the ONC supports these two measures as best practices, there may be cases where either one or both measures could be inapplicable or inappropriate (e.g. when a Health IT Module does not support MFA). In such cases, in the spirit of consistency and transparency to engender goodwill among users and regulators, developers can provide rationale or context when attesting “no” to certain privacy and security attestation criteria.

3. Disclosing Patient Data Under the HIPAA Privacy Rule Right of Access Provisions

Right of access provisions under HIPAA regulation 45 CFR 164.524(a)(1) give individuals a right of access to their Protected Health Information (PHI) in the form of a “designated record set.” This includes EHI, health insurance claims data, and any information used “by or for the covered entity to make decisions about individuals.” The right of access provision excludes information that is not used to make decisions about individuals (e.g. provider performance evaluations or quality control records) and other information that is outside the scope of the “designated record set.”

The Final Rule provides guidance on the appropriate means of disclosing or ensuring patient access to PHI under the HIPAA right of access provisions. The phrase “appropriate means” rests on the nature and scope of the definition of “interoperability element” within the Final Rule, which is defined as:

“Hardware, software, integrated technologies or related licenses, technical information, privileges, rights, intellectual property, upgrades, or services that:

(1) may be necessary to access, exchange, or use EHI; and

(2) is controlled by the actor, which includes the ability to confer all rights and authorizations necessary to use the element to enable the access, exchange, or use of EHI.”

Web and mobile applications (e.g. direct-to-consumer apps) that rely on application programming interfaces (APIs) fall within the definition and scope of “interoperability element.” Privacy considerations emerge, however, when web and mobile applications interface with EHI systems, especially when the applications are run or hosted by third parties that are not HIPAA covered entities or business associates. Some public comments on the Final Rule discuss privacy concerns, including the potential for EHI to be mismatched or mishandled when shared via web or mobile applications that rely on APIs (third-party or otherwise). Despite these concerns, the ONC states that the rule’s “Preventing Harm Exception” should not interfere with legally permissible access, exchange, or use of patient EHI.

4. Best Practice Recommendations for EHI in Interoperable Mobile Applications

Recent expert analysis notes that “there is concern and uncertainty about transmitting a patient’s data to a health app of unknown security and privacy protection and whether the physician or covered entity may be liable if the patient’s app or its developer subsequently breaches or improperly uses or discloses the data.” There may be healthcare providers and patient app users with unresolved privacy and security concerns about adopting web or mobile applications to transmit, receive, or access health information.

To help address these concerns, the CARIN Trust Framework and Code of Conduct and Attestation (CARIN Code) offers an opportunity for consumer-facing apps that “collect personal data and are offered to and used by consumers in the United States, regardless of whether or not they are covered by HIPAA,” to publicly commit to standards of transparency, consent, use and disclosure, individual access, security, provenance, and accountability.

According to the CARIN Alliance, apps can “publicly endorse and agree to a set of questions regarding how they use, manage, and secure the consumer’s health data based on the principles in the code of conduct.”

App developer privacy policies, which are enforceable by the Federal Trade Commission and State Attorneys General, are an important area to consider when analyzing the privacy practices of mobile apps that interface with systems housing EHI. Beyond language in the Final Rule, the ONC provides five recommendations for privacy policies and practices to which third-party apps should adhere with regard to the access, exchange, or use of EHI. One key recommendation: a privacy policy should include “a statement of whether and how the individual’s EHI may be accessed, exchanged, or used by any other person or other entity, including whether the individual’s EHI may be sold at any time (including in the future).”

In 2016, FPF published Best Practices for Consumer Wearables and Wellness Apps and Devices. Although these FPF best practices exclude data governed by HIPAA (i.e. EHI), moving forward FPF will continue to convene industry and key other stakeholders to determine privacy best practices in light of the ONC Final Rule, CARIN Code, and other relevant guidance.

Public-facing commitment opportunities like the CARIN Code and FPF’s Best Practices for Consumer Wearables and Wellness Apps and Devices and Privacy Best Practices for Consumer Genetic Testing Services give organizations that generate and manage health data across the patient-consumer spectrum important opportunities to establish public trust and engage in transparent, enforceable self-regulation. In the absence of a comprehensive, federal privacy law, it is important for stakeholders that collect and use health data to establish privacy best practices and standards that add value to health management across the broadening patient-consumer spectrum.

Subscribe to the FPF mailing list to stay up to date on these issues and check out more of our top stories in health.

For additional information about FPF’s Health Initiative, please contact Dr. Rachele Hendricks Sturrup ([email protected]) and Katelyn Ringrose ([email protected]).

Acknowledgements to Samuel Adams, Policy Intern, for his background research and editorial contributions and Katelyn Ringrose, Policy Fellow, for her editorial contributions.