The European Commission Considers Amending the General Data Protection Regulation to Make the Digital Age of Consent Consistent
The European Commission published a Communication on its mandated two-year evaluation of the General Data Protection Regulation (GDPR) on June 24, 2020, in which it identifies as a future policy development “the possible harmonisation of the age of children consent in relation to information society services.” Notably, harmonizing children’s age of consent across the European Union is one of only two areas of the GDPR that the Commission is considering amending after further review of practice and case law. Currently, the GDPR allows individual Member States some flexibility to set the national age of digital consent for children anywhere between 13 and 16. Upon the two-year review, however, the Commission expressed concern that this variation across the EU creates legal uncertainty for information society services (broadly, any economic activity taking place online) and may hamper “cross-border business, innovation, in particular as regards new technological developments and cybersecurity solutions.”
“For the effective functioning of the internal market and to avoid unnecessary burden on companies, it is also essential that national legislation does not go beyond the margins set by the GDPR or introduces additional requirements when there is no margin,” the Commission stated in its report. Some believe stringent child privacy requirements can push companies to abandon developing online services for children altogether to avoid legal risk and technical burden, leaving a void to be filled by companies from countries with lax child privacy protections. In addition to the GDPR’s varying ages of digital consent, Member States also interpret the obligations that information society services owe to children differently. For example, the United Kingdom’s proposed Age Appropriate Design Code defines a child as a person under the age of 18 and lays out additional requirements for information society services to build in privacy by design to better protect children online.
Prior to the GDPR, European data protection law did not include special protections for children, instead providing the same privacy protections across all age groups. The GDPR recognized that children are particularly vulnerable to harm and exploitation online and included provisions extending a higher level of protection to children. However, there is no universal consensus on the age at which a person ceases to be a child, and the flexibility provided by the GDPR creates a fragmented landscape of parental consent ages across the EU. While complying with different ages of consent is relatively straightforward in the physical world, where activities generally remain within national boundaries, online services operate across borders, and the inconsistency of ages is a significant barrier for companies. Information society service providers are obliged to verify a user’s age and nationality and confirm the age of consent for children in that Member State before allowing access to their services. This burden may put companies operating in the EU at a competitive disadvantage or result in measures depriving children and teens of the benefits of these services, as companies choose either to invest significant resources in age verification and parental consent mechanisms or to abandon the market for children and age-gate their services instead.
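In practice, that obligation reduces to a per-user gating check against each Member State’s threshold. The following is a minimal sketch of such a check in Python, offered only as an illustration: the AGE_OF_DIGITAL_CONSENT table and needs_parental_consent function are hypothetical, the thresholds shown are examples within the 13–16 range the GDPR permits, and a real service would still need reliable age-verification and parental consent mechanisms behind it.

```python
from datetime import date
from typing import Optional

# Hypothetical lookup of each Member State's age of digital consent under
# Article 8 GDPR. Entries are illustrative; the GDPR lets each state pick
# a value between 13 and 16, so any real table must track national law.
AGE_OF_DIGITAL_CONSENT = {
    "DE": 16,  # example entry
    "FR": 15,  # example entry
    "ES": 14,  # example entry
    "DK": 13,  # example entry
}

def needs_parental_consent(birth_date: date, member_state: str,
                           today: Optional[date] = None) -> bool:
    """True if the user is below the applicable Member State's age of
    digital consent, so parental consent must be obtained first."""
    today = today or date.today()
    # Compute age in whole years, accounting for whether the birthday
    # has occurred yet this year.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    # If the Member State is unknown, fall back to the strictest
    # threshold the GDPR allows (16).
    return age < AGE_OF_DIGITAL_CONSENT.get(member_state, 16)

# The same 14-year-old needs parental consent in one state but not another:
reference = date(2024, 6, 1)
print(needs_parental_consent(date(2010, 5, 1), "DE", today=reference))  # True (threshold 16)
print(needs_parental_consent(date(2010, 5, 1), "DK", today=reference))  # False (threshold 13)
```

As the fragmentation the Commission describes suggests, every divergent national threshold adds another row to a table like this one, and every error in maintaining it is a compliance risk.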
The Commission also initiated a pilot project to create an infrastructure for implementing child rights and protection mechanisms online, scheduled to commence on January 1, 2021. The project aims to map existing age-verification and parental consent mechanisms both in the EU and abroad and to assess the results of that mapping to create “an interoperable infrastructure for child online protection including in particular age-verification and obtaining parental consent of users of video-sharing platforms or other online services.”
Currently, Member States require or recommend varying age verification and parental consent mechanisms. In addition to the UK’s Age Appropriate Design Code, the German youth protection law requires businesses to use scheduling restrictions that make content harmful to children unavailable during times of day when children are likely to be online; to use technical methods that keep children from accessing inappropriate content, such as sending adults a PIN after age verification; or to use age labeling that youth protection software, downloaded by parents onto their children’s devices, can read. The efficacy of these methods, however, remains unproven. As such, a sweeping review of existing methods may reveal best practices to be adopted widely within the EU and serve as a model for other countries, including the United States.