Event Report: From “Consent-Centric” Frameworks to Responsible Data Practices and Privacy Accountability in Asia Pacific
On September 16, the Asia-Pacific office of the Future of Privacy Forum (FPF) held its first event since its launch in August 2021. The event was hosted by the Personal Data Protection Commission (PDPC) of Singapore during the popular "Personal Data Protection Week" (PDP Week 2021).
The theme of the event was Exploring trends: From "consent-centric" frameworks to responsible data practices and privacy accountability in Asia Pacific, which is also the theme of a larger project carried out jointly by FPF and the Asian Business Law Institute (ABLI) across 14 Asian jurisdictions. The event was co-organized by ABLI and FPF under a cooperation agreement signed by the two organisations in August 2021.
This post summarizes the discussions in the two stellar panels featuring regulators, thought leaders, and practitioners from across the region, and highlights key takeaways:
- Consent requirements for the collection and processing of personal data, the "notice & choice" principle, and the exceptions and alternatives to those requirements together form the area where regulatory coherence is most needed in Asia-Pacific (APAC). Over-reliance on consent has led to a "tick-the-box" approach to data protection, consent fatigue, and unnecessary compliance costs caused by contradictory requirements across the region.
- Modern data protection laws should shift the onus of data protection from users to organizations, by promoting an accountability-based approach to data protection over a “consent-centric” one. Different avenues may be used to rebalance consent and privacy accountability in APAC, including through concepts such as legitimate interests, compatible uses and equivalent notions.
- Making consent meaningful again in APAC can happen in a variety of ways, including winding back the range of circumstances in which consent is sought; requiring consent sparingly and only where it can be given thoughtfully and with understanding; and supporting enhanced transparency and consent through UX and UI design, with due attention to users' differing needs and literacy levels.
- Harmonization is illusory in the face of Asia’s extreme diversity, but a bottom-up approach to convergence can work in the context of regional cooperation.
1. Repositioning consent requirements in APAC’s fragmented data protection landscape
Dr. Clarisse Girot, Director of FPF Asia Pacific and ABLI Senior Fellow, opened the discussion by explaining that a comparative look at "consent" requirements across the region was chosen as a key topic following suggestions from a vast network of stakeholders. Feedback showed that consent requirements applying to the collection and processing of personal information, the "Notice & Choice" principle, and the exceptions and alternatives to those requirements together form the area where regulatory coherence is most needed in the Asia Pacific (APAC) region.
In practice, the cumulative application of consent requirements for data processing in the region has led to a "tick-the-box" approach to data protection in many jurisdictions. In APAC as elsewhere, organisations' over-reliance on consent as a lawful ground has produced general consent fatigue and unnecessary compliance costs due to contradictory requirements.
A consensus is therefore forming across Asian jurisdictions that modern data protection laws should shift the onus of data protection from users to organizations by promoting an accountability-based approach to data protection over a "consent-centric" one. This calls for putting the role of consent in perspective and returning it to the place originally assigned to it by the very first data protection frameworks: one element among many in a regulatory ecosystem that seeks to balance the role and interests of individuals, the responsibility of organizations, and broader social and societal interests in the processing of personal data.
The main goal of the workshop was therefore to identify similar discussions taking place across APAC jurisdictions and to explore possibilities of convergence among them. The discussion will also feed into a joint comparative study, with recommendations for convergence on consent and related data protection requirements, to be published by FPF and ABLI before the end of the year.
Both panels were composed of data protection professionals from different APAC jurisdictions and disciplines. Each speaker contributed with an original and expert point of view that could help identify commonalities, pathways for interoperability between Asian data protection frameworks, and concrete solutions to provide meaningful data protection to individuals — with or without consent.
Such reflections and recommendations are particularly timely, as key jurisdictions in Asia, including India, Indonesia, Thailand, Vietnam, Hong Kong SAR, Malaysia, and Australia, are adopting new data protection frameworks or amending their laws, while new laws or major amendments have recently come into force in jurisdictions such as Thailand, Korea, New Zealand, China, and Singapore.
2. Rebalancing consent and privacy accountability
The title of the first panel was “Switching from a consent-centric approach to privacy accountability: a comparative view of APAC data protection laws”.
The panel was moderated by Yeong Zee Kin, Assistant Chief Executive, Infocomm Media Development Authority (IMDA), and Deputy Commissioner, PDPC, Singapore, with input from Peter Leonard, Principal and Director at Data Synergies, Sydney; Takeshige Sugimoto, Managing Director at S&K Brussels, Tokyo; Shinto Nugroho, Chief Public Policy and Government Relations at Gojek, Jakarta; and Marcus Bartley-Johns, Asia Regional Director, Government Affairs and Public Policy at Microsoft, Singapore.
The goal of this first panel was to identify commonalities and pathways for interoperability between Asian data protection frameworks with regard to balancing the protection of individuals, accountability, and broader social and societal interests. This includes the role of consent, lawful grounds to process personal data, and/or other privacy principles in jurisdictions which do not contain provisions on “lawfulness” of processing.
The most important points highlighted during the discussion were the following:
2.1 How to achieve convergence across APAC’s fragmented and diverse landscape?
As an introductory note, Yeong Zee Kin stressed that APAC jurisdictions take different approaches to privacy and data protection, and that their laws are at different stages of development (e.g., Japan and South Korea have had privacy laws for a long time, while Singapore, the Philippines, and Malaysia are more recent players). One may add that data protection and privacy laws in the region follow different structures and are not all modelled on the EU GDPR, so some key provisions (e.g., on the "lawfulness" of data processing) have no equivalent in other jurisdictions.
A challenge which is endemic in APAC is therefore to identify a common ground in order to achieve convergence, while respecting the different inspirations and the particular culture that are enshrined in each jurisdiction’s privacy laws.
This raised a key question for participants: should APAC stakeholders aim for harmonisation or for more targeted convergence efforts, for instance through the mutual recognition of specific legal standards?
2.2 Over-reliance on consent and need for alternatives
Speakers highlighted that APAC-based organisations tend to overly rely on consent, even in cases where another solution or legal basis would be available and more appropriate. The potential consequence of such a practice is the erosion of the value of consent.
A view expressed by Peter Leonard and shared across the panel was that consent, "informational self-determination", or "citizen self-management" of privacy settings remains important. However, individuals should only be expected to self-manage what is realistically manageable by them. There is a need to reduce both the frequency of consent requests and the level of noise in privacy policies and collection notices, and to rethink the role of those documents.
Among “noise reduction measures”, he specifically cited appropriately targeted exceptions, whether through legitimate interests, industry codes or standards, class exemptions by regulators, or new generic concepts such as “compatible data practices”, in such a way that the control of individuals over their personal data is not adversely affected. As a baseline, moving away from consent requires recognizing the importance of concepts like “reasonableness” or “fairness” to support the alternative requirements of data protection laws.
Unambiguous express consent should remain necessary for categories of processing that create a higher risk of privacy harm to individuals, in particular for manifestly sensitive data (including data about children) and for processing that directly contradicts individuals' rights and interests or cannot reasonably be expected by them. This may also tie in with the concept of "no-go zones" as it has been developing in Canada, which has gained some popularity in Australia.
2.3 Varying approaches and interpretations in different jurisdictions: Japan, Indonesia, Vietnam
Another point raised by the moderator and panellists was that material differences in the protections awarded by legal systems in APAC countries may hinder the path towards harmonisation. There is therefore a need to better understand how each law works before proposing solutions for convergence, so that they can be meaningful for all.
Takeshige Sugimoto commented on the "consent by default" situation that currently prevails in Japan. He noted that the Japanese data protection law (APPI) does not have a "legitimate interest" legal basis, but that, contrary to common belief, it does not take a consent-centric approach either. Rather, it permits processing of personal data on the "business necessity" ground, as long as the data subject can reasonably expect the intended further use of his or her data. The boundaries of permissible processing under the APPI are therefore similar to those under the GDPR, even without "legitimate interests" as a legal basis. In its adequacy decision on Japan, the European Commission states that the Japanese system also ensures that personal data is processed lawfully and fairly through the purpose limitation principle.
Sugimoto also mentioned the Japanese Personal Information Protection Commission (PPC Japan) guidelines, which list limited cases where consent must be sought, while pointing to other areas which are open to other legal bases and authorisation from the PPC. In other words, in his view there would be no significant difference, in practice, between what GDPR considers legitimate interests-based processing, and what APPI considers lawful processing.
Shinto Nugroho presented the situation in ASEAN from the perspective of Gojek, Indonesia's first decacorn and SuperApp, with operations in Indonesia, Vietnam, Singapore, Thailand, and the Philippines. Her particular focus was on the challenges of operationalizing consent in times of crisis, such as the current Covid-19 pandemic. She noted that Indonesia's data protection legislation is currently quite consent-centric, but that the draft Data Protection Bill soon to be adopted by the Parliament of Indonesia lists consent as only one of seven available lawful grounds for processing personal data (the others including contract, performance of a legal obligation, and legitimate interests).
Nugroho welcomed this development. She explained how consent as a legal ground is not always practical for controllers or protective for individuals, and can in fact sometimes even be harmful for citizens. For instance, in Indonesia, roughly 160 million out of 170 million inhabitants are eligible for vaccination against Covid-19. Gojek has secured large numbers of vaccination slots from the government, notably for its drivers, who are frontline workers. However, the government requires that everyone first be registered in the public vaccination system, for which consent is required. Not everyone has access to the Internet or the literacy required to register, and the vaccination register itself is a work in progress. Securing 100% opt-in consent from millions of drivers not only slows down the process; drivers may also miss the notification or fail to complete their registration. In such cases, the most appropriate legal basis for Gojek to register its drivers would be its "legitimate interests" as an employer, combined with clear purposes and adequate transparency, rather than mere consent. The consideration that drivers are exposed to a high risk of contamination at a time when the epidemic is hitting the country should override the need to obtain consent.
Lastly, Nugroho mentioned the ongoing discussions on Vietnam's future Data Protection Decree, expected to be adopted imminently. The Decree does not provide for a legitimate interests basis, but it similarly allows controllers to collect and process data on grounds other than consent (such as security, where permitted under the law, and research). Discussions on convergence must therefore factor in the fact that APAC data protection laws can vary even between neighbouring countries that have drawn inspiration from similar sources (primarily the EU GDPR) when drafting their future comprehensive data protection frameworks.
2.4 Transparency & choice as trust enablers
Marcus Bartley-Johns welcomed the nuance that the discussions introduced into the conversation: "making consent meaningful again" is a journey, and binary approaches ("for or against consent") must be avoided. He also concurred with Takeshige Sugimoto that laws and regulations can go in one direction while business practices and embedded behaviors go in another, and these variations are a key part of the discussion around consent.
Bartley-Johns shared a few data points on what consent means in the region. In 2019, Microsoft surveyed 6,300 consumers across Asia on consumer perceptions of trust; 53% of those surveyed said that they had had a negative trust experience related to privacy when using a digital service in the region. Younger people reported a higher share of negative experiences, and more than half of those said they would switch services if their trust was breached. Bartley-Johns added that it should be acknowledged that consumers have reasons to be wary, one of them being how difficult it is for individuals to find out and understand how their data is being collected and used.
Another data point relates to the privacy dashboard that enables Microsoft users globally to see and control their activity data, including location, search, and browsing data, across multiple services. 51 million unique visitors have used the dashboard since its launch in May 2018 (19 million in 2020). Japan, China, Australia, India, and Korea feature among the top 20 markets for dashboard use. In other words, Microsoft's experience shows that consumers wish to know what personal data is collected about them and to exercise their options and rights when given the opportunity to have their say.
Following up on this point, Peter Leonard added that transparency plays a double role: it allows individuals to know how their data are being used, while also providing safeguards against deceptive and manipulative statements by organisations, where appropriate "do only what you say" laws are in place at the local level.
2.5 “Legitimate interests” in context
On the whole, all the speakers expressed support for developing the concept of legitimate interests, or equivalent concepts, in APAC laws. The adoption across more privacy laws of alternative grounds for processing personal data, notably legitimate interests, is one of the potential avenues for strengthening privacy regulatory coherence in the region. Microsoft, for instance, advocated for this in a recent policy paper calling for stronger privacy regulatory coherence in Asia.
Speakers noted that one obstacle in APAC to increased reliance on legitimate interests as an alternative to consent is that lists of legitimate interests are varied and jurisdiction-specific. Entities operating across borders and seeking a common denominator in their privacy policies and consent requests will therefore continue to be incentivised to over-rely on consent, unless they are given some certainty about how lawmakers and regulators are likely to apply this notion. Convergence can be strengthened by the adoption of regulatory guidelines on implementing this approach and by information sharing on their implementation.
Peter Leonard suggested that, to make the legitimate interests lawful ground work in APAC, the region may need a mutual recognition scheme covering the differing definitions of and approaches to legitimate interests. In his view, this would not lead to absolute convergence, but it would allow a compromise that takes account of local legal systems and cultures across a diverse Asia. Failing this, data controllers will keep using consent as the common denominator by default.
In the view of Takeshige Sugimoto, a compilation of use cases clarifying whether legitimate interests or consent is the more appropriate legal basis in each case would help achieve a more holistic regional approach. This could lead to international consensus on specific use cases, which would be more efficient than waiting for joint regulatory guidance that might take years to be issued.
Marcus Bartley-Johns suggested that it would be valuable to check whether the consensus that emerged from this panel could also emerge within the regional and global regulatory community. This is important because regulations and guidance issued in Asia in recent months have tended to make transparency and consent requirements even more prescriptive. In this respect, there would be real value in practical guidance from regulators on these issues, as the PDPC has done, with indicative examples, use cases, and scenarios that provide a basis for a more holistic approach to balancing consent and other approaches in the region.
Seconding the comments by Sugimoto and Bartley-Johns, Yeong Zee Kin indicated that one of the sources of inspiration for drafting the PDPC's guidelines on legitimate interests under the recently amended PDPA was FPF's report on legitimate interests in the EU, which compiles regulatory guidance, decisions, and court cases clarifying the scope of the legitimate interests lawful ground. He suggested that the right way forward would probably be to identify real-world examples and use cases where a regional or global consensus can be reached on situations that do not require consent; the next step would be for regulators to contextualize the result within their respective legal systems (necessity, reasonableness, legitimate interests, contractual necessity, vital interests, etc.).
The moderator suggested that FPF and other stakeholders contribute to building this library of "legitimate interests", and that regulators could do their part by reaching out to their local industry for such use cases. Subscribing to a remark by Peter Leonard, however, he acknowledged that given the broad spectrum of cultures and histories in Asia, complete harmonization is not realistic. By contrast, a practical, bottom-up approach to convergence might get us somewhere: stakeholders should build on consensus as and when they find it, for instance bilaterally between like-minded partners and, perhaps more slowly, at the regional level.
3. Making consent meaningful (again)
The title of the second panel was "Shaping choices in the digital world: how consent can become meaningful again". The panel was moderated by Rajesh Sreenivasan, Head, Technology Media and Telecoms Law Practice, Rajah & Tann Singapore LLP. It further included interventions by Anna Johnston, Principal, Salinger Privacy (Sydney); Malavika Raghavan, Visiting Faculty, Daksha Fellowship and FPF Senior Fellow for India; Rob van Eijk, FPF Managing Director for Europe; and Edward Booty, Founder and CEO of reach52.
Rajesh Sreenivasan started by saying that the problem with consent lies not in the concept itself but in the way this legal ground has been used for processing personal data. Especially in APAC, where jurisdictions take very different approaches, he noted that obtaining meaningful consent requires answering two questions first: 1) Meaningful for whom: for the data subject or for the organisation? 2) Meaningful how? The moderator also openly asked participants whether, in their view, it was more pressing to make consent meaningful or to build alternative models for fair data processing, since consent might have become redundant given the speed at which data is used today.
3.1 Are current online consent-seeking practices fair?
Anna Johnston kicked off by supporting a burden shift, away from individuals and onto organisations, when it comes to consent standards. According to her, consent has almost lost its true meaning because it has been so over-used as a promise — in her own words, it has become like “your cheque is in the mail”!
The situation in Australia, as she sees it, is that consent is over-relied on but under-enforced. Guidance from the Australian Privacy Commissioner (the OAIC), backed by case law, establishes that consent under Australian law is similar to the GDPR: it cannot be bundled with other things, it cannot be included in mandatory terms and conditions or in a privacy policy, and it cannot be obtained on an "opt-out" basis; consent as a lawful basis for collecting, using, or disclosing personal information has to be the customer's clear "opt-in" choice, made freely and separately from all other choices. However, the law is under-enforced, so it is still very common to see business practices that bury the customer in fine print, make them agree to something the business knows they will not even read, and then claim that the customer has consented.
Surveys conducted by the OAIC suggest that only 20% of Australians feel confident that they understand the privacy policies they actually read. Recently, the Australian competition and consumer regulator, the ACCC, called out this kind of power imbalance and these behaviours by Big Tech platforms, and recommended that the Privacy Act be amended to make the standard required for consent much clearer in the law.
3.2 The boundaries of consent’s role
Overall, speakers agreed that there is a need to "make consent meaningful again", primarily by winding back the range of circumstances in which consent is sought by organisations. Consent should only be required, and sought, sparingly and where it can be given thoughtfully and with understanding. Consent is only real consent where an individual has a real choice [note: an increasing number of data protection laws in Asia recognize the concept of "free and unbundled consent"]. A discussion is needed about when requiring consent is sensible, and about how to ensure that individuals' ability to control their privacy settings is not compromised by any changes in consent requirements.
Winding back such requirements to improve data privacy may sound both radical and counter-intuitive. However, over both sessions a consensus formed that processing without consent should only be recommended where the processing is aligned with the ordinary expectations or direct interests of data subjects, and never at the expense of transparency.
Anna Johnston thus opined that there should be a clear distinction between business activities that require consent and those that do not. For example, activities that fall outside customers' expectations should require consent (e.g., asking someone to join a research project), whereas the same would not apply to unobjectionable, fair, and proportionate activities (such as including an individual in a customer database), nor to activities with public interest backing. She added that there should also be a list of activities that are prohibited even if consent is given, including profiling children for marketing purposes.
In his presentation, FPF's Managing Director for Europe, Dr. Rob van Eijk, concurred and added that much of the debate on the consequences of the datafication of society has been about limiting the collection of data, but also about its further use. Consent is one way to regulate these two "gateways", and there are multiple ways to ensure that everyone is on board. In practice, however, much of the burden falls on users to read and comprehend what is being put forward. This aspect was the key focus of this year's Dublin Privacy Symposium organized by FPF, entitled Designing for Trust: Enhancing Transparency & Preventing User Manipulation.
An important point made during the symposium is that organisations should proactively increase transparency from a design perspective, so as to present users with a real choice and encourage them to make deliberate decisions. Understandability (how people read through the information), for instance, can be tested through technology in the online space. Another important point is that organizations should ask themselves whether they should be collecting all the envisaged data in the first place (in line with the minimisation principle).
They must also take active steps to prevent user manipulation, not only in designing consent solutions (for instance, cookie banners), but also when they process data through machine learning algorithms. Finally, the question of vulnerable groups should be factored into the design of UX/UI ("have we left any groups behind?"). A lot can be done to make things more understandable, which in turn raises the question of the extent to which the expression of choice can be embedded in the technology.
3.3 Dealing with users with different needs and literacy levels in APAC
The Asia Pacific region is a region of contrasts, especially in terms of literacy levels, including financial and health literacy, owing among other reasons to differing educational levels and the wide linguistic variety found in some countries.
FPF's Malavika Raghavan shared findings from her research and extensive fieldwork in India, exploring how the mental models of Indian internet users shape these discussions on consent, with a particular focus on the financial sector (e.g., loan applications). She underlined the importance of understanding the context of non-Western users, particularly new generations of users in Asia, before even attempting to design laws and practices for obtaining meaningful online consent.
For instance, Raghavan pointed to surveys showing that many mainstream Indian users (i.e., modest-earning individuals from primarily rural areas) do not distinguish between their mobile phones, the internet, online services, and allied services like payment platforms, because they access them exclusively on their phones. Understanding this reality (users who have never used a computer, only mobile phones with preloaded apps, free allowances, etc.) is key to start thinking about designing consent, or even policymaking around consent.
However, literacy is not necessarily a barrier, and it is not correlated with digital skills: highly proficient digital users might not be literate, and vice versa. Moreover, many Indian families share their mobile devices, which means that consent in those scenarios should be understood as given by a group of individuals rather than by separate individuals; this mental model is very far from that of a designer or policymaker. Asking for one-to-one consent in such circumstances might not make sense. But however disadvantaged, individuals still have strong ideas about how their data may be shared.
Raghavan has analysed the limitations of consent in particular in her work on the Data Empowerment and Protection Architecture (DEPA) and the Consent Layer developed by Indiastack, which seeks to enable secure and effective sharing of personal data with third-party institutions in India through the concept of "consent managers". She highlighted how cognitive limitations affect individuals' decision-making about their personal data and how the threat of denial of service can make "taking consent" a false choice. To be effective, therefore, such systems must be supported by strong accountability systems and access controls that operate independently of consent. Relying solely on consent is not a good idea: a wealth of data protection and consumer protection thinking has shown that consent is necessary but not sufficient for data protection.
The panellist concluded that coders and digital platform designers should consider users' perceptions, literacy, and context when setting up online services; the law alone cannot fix what has been broken by technology. This, according to Raghavan, is particularly important in a jurisdiction where the highest courts have recognized privacy as a fundamental right (as in India) and where users have strong ideas and reasonable expectations about how digital data flows occur. In that exercise, unbundling ancillary data processing that requires consent from online services' terms and conditions should be front and center.
Edward Booty then shared his experience as CEO of reach52, a social enterprise and growth-focused start-up that provides accessible and affordable healthcare to the 52% of the world without access to health services, with five key markets: Cambodia, the Philippines, India, Indonesia, and Kenya.
reach52 uses technology and community outreach to widen access to health services while simultaneously lowering their costs. Booty explained that his company is still small but has accumulated a lot of sensitive data in the countries in which it operates. He shared his experience of collecting health data and profiling residents to provide better care in remote rural communities in the Philippines and Cambodia, and of uncovering data-driven insights to inform more targeted, effective access to healthcare solutions. Although it is sometimes disheartening that some users do not care, not having legitimate consent from users in a data-driven business model constitutes a risk to the start-up. Moreover, reach52 believes that it must help the people who use its services understand their rights around data collection and use, regardless of their education and literacy levels. Booty explained how consent was sought from the individuals who provided their data, using video, visuals, and progressive disclosure, paying attention to how terms are explained and consent is obtained so as not to fall short for people with low literacy and education levels. For this, support was obtained from the Facebook accelerator and IMDA.
A specific challenge explained by Booty is that local and national government authorities were then coming to reach52 to obtain access to the datasets for a variety of purposes, notably to manage different humanitarian crises. The speaker shared that, as pressure from those authorities mounted, the organisation started working on ways to get more meaningful and granular consent from individuals for each of the needs that their data could serve. This involved engaging designers to deliver simple flyers with information to individuals about what could happen to their data after its collection, as well as about their data-related rights. The process included testing with different age groups to make the message intelligible for a wide audience.
3.4 How UX and UI can support enhanced transparency and consent
The idea was raised several times during the session that designers, and improvements to the user experience and user interface (UX/UI), have an essential role to play in improving the regulation of architectures of choice.
In recent years, a growing number of academics and data protection regulators have underlined the fundamental role UX/UI design can play in user empowerment, and stressed that design and interfaces must now form part of the compliance analysis. Universally accepted icons could be one way to improve intelligibility, said Anna Johnston. In her presentation, she argued that web designers should try to think with the mind of a user, drawing on useful evidence and guidance on how to better design privacy notices, such as the UK Government's piece on better privacy notice design.
Various ideas for improving privacy notices are modelled on successful designs used in safety messaging (like traffic light indicators) and product labelling (such as star ratings and nutrition labels), but this form of notice still does not work at scale. Anna Johnston said the most innovative idea she has seen in this space comes from Data61, an arm of the Australian Government, which has proposed machine-readable icons modelled on the Creative Commons icons from copyright law: universally agreed, legally binding, clear, and machine-readable.
This latter suggestion was echoed by the findings of FPF's Dublin Privacy Symposium on manipulative design practices, which Dr. Rob van Eijk outlined during the session. According to him, the Symposium's speakers explained that providers should encourage users to make deliberate decisions online by avoiding so-called "dark patterns", consider the needs of vulnerable groups (such as visually impaired or colour-blind users), and think about the best way of informing users where data collection devices have no visual or audio interfaces (e.g., IoT). Van Eijk added that cookie walls, as they are developing in Europe, may be a radical solution, as they prevent users from accessing content unless they agree to pay a fee or accept online tracking.
Conclusion
Commissioner Raymund Liboro, National Privacy Commissioner of the Philippines, delivered the concluding remarks of the workshop.
To support the work of FPF and ABLI and the discussions of the day, Commissioner Liboro cited a topical case in the Philippines. In late August, his office ordered the take-down of money-lending apps from the Google Play Store to sanction the practices of certain online lending platforms. These platforms harvested excessive information from their users without legitimate purpose, using unreasonable and unnecessary app permissions, including saving and storing their clients' contact lists and photo galleries, ostensibly to evaluate their creditworthiness. Yet an applicant's creditworthiness may be determined through other lawful and reasonable means. Moreover, these apps have been the subject of more than 2,000 complaints of unauthorized use of personal data, which resulted in borrowers being harassed and shamed before the people in their devices' contact lists in order to collect debts.
Such behaviors and practices cannot be considered acceptable simply because users have supposedly given their "legitimate consent" to them, which was the companies' first line of defence. This, Commissioner Liboro said, combined with the privacy paradox, urges the data protection community to reconsider the regulatory paradigm that currently operates in Asia and globally. As policymakers now regulate at hyperscale, with comprehensive laws coming up in China, India, Indonesia, Thailand, and many ASEAN countries following suit, affecting millions of data subjects, the current dependence on consent and paper compliance should be replaced with accountability and an added onus on organisations to ensure and demonstrate compliance. Privacy accountability is a compelling force, and accountable organisations foster trust and thrive, said the Commissioner.
The workshop set the scene for and informed the discussion around consent and accountability in APAC jurisdictions. All participants agreed on the need to reconsider the use of the consent legal ground in the region. The datafication of society, as well as the global dimensions of privacy and data protection, will press policymakers to aim for convergence while respecting the legal culture and approach of each jurisdiction.
Commissioner Liboro concluded the event by expressing his appreciation to everyone who participated in the discussions, and reminded the participants that this conversation aims at setting the foundations of a collective response that will benefit the privacy ecosystem in the Asia-Pacific region.
The next steps of the FPF-ABLI project will be announced soon.
Sources:
Peter Leonard, Notice, Consent and Accountability: addressing the balance between privacy self-management and organisational accountability: A paper for the Office of the Australian Information Commissioner (June 2020)
Anna Johnston, Re-thinking transparency: If notice and consent is broken, what now? (29 May 2020)
Takeshige Sugimoto, A New Era for Japanese Data Protection: 2020 Amendments to the APPI (13 Apr 2021)
Microsoft Policy Paper, Strengthening Privacy Regulatory Coherence In Asia (2020)
Findings of FPF Europe Dublin Privacy Symposium Designing for Trust: Enhancing Transparency & Preventing User Manipulation (23 June 2021)
TTC Labs Data Innovation Program, Designing a visual consent flow for people with low literacy: the case of reach52 (Startup Station Singapore, Session 1, 2018)
Malavika Raghavan, Anubhutie Singh, Building safe consumer data infrastructure in India: Account Aggregators in the financial sector (Part I); Building safe consumer data infrastructure in India: Account Aggregators in the financial sector (Part II) (2019)