The EU Commission’s Approach to Age Verification: Mobile Apps, DSA Enforcement, and Challenging National Social Media Bans
On 29 April 2026, the European Commission published its Recommendation for a common approach for EU-wide age verification technologies, a non-binding policy document with the aim of harmonizing future measures for the protection of children online.
This blog post outlines the Commission’s emerging strategic approach to the implementation of EU-wide age verification measures, provides an analysis of the legal framework envisioned for their deployment, and includes notes on the Commission’s thinking with regard to possible social media bans in individual Member States. A number of key takeaways emerge:
- In response to growing tensions surrounding the possibility of social media bans in a number of EU countries, the Commission is accelerating its attempts to enable the roll-out of age verification solutions, urging Member States to implement these by 31 December 2026;
- An analysis of the applicable legal framework, primarily the Digital Services Act (DSA), shows that none of its Articles specifically mention minimum age requirements or age verification measures, leaving it unclear whether age verification solutions will be voluntary or mandatory; this does not mean, however, that age assurance methods need not be implemented, as emerging DSA enforcement on the topic shows;
- While the Commission’s 2025 Guidelines on the protection of minors under the DSA focus on a variety of age assurance methods, this Recommendation aims to advance the EU’s strategic approach to age verification in particular, contributing to a growing global trend focused on age verification for service access or limitations;
- The Commission aims to develop an EU age verification blueprint – a publicly available technical specification comprising the architecture, protocols, and interfaces to be used by Member States and providers to roll out national age verification measures;
- An EU age verification scheme will also be developed by the Commission to establish the framework for “proof of age attestations,” including a list of trusted EU-based providers for such attestations;
- While significant references are made to privacy and to ensuring that age verification measures are “privacy-preserving,” there is no reference to the GDPR and little detail regarding the technical parameters that will be expected;
- This gap is surprising, given that DPAs have started to enforce the GDPR in the context of age verification solutions in recent months, after the EDPB adopted a Statement on age assurance in February 2025;
- Invoking Directive 2015/1535 on technical regulations and two CJEU cases from 1996 and 2000, the Commission aims to make it procedurally challenging for any individual EU Member State to implement a social media ban.
1. Applicable legal framework – From the Digital Services Act to the (not-yet-published) Digital Fairness Act
Article 28(1) DSA states that “providers of online platforms accessible to minors shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service.” While the remainder of the Article covers advertising based on profiling and the further processing of personal data for the purpose of assessing whether the user is a minor, it makes no mention of age verification measures.
The Commission’s Recommendation, in paragraph 3, also makes reference to the July 2025 Guidelines for the protection of minors under the DSA, issued by the Commission, which provide general guidance on the application of age assurance measures. It is worth noting that, while the 2025 DSA Guidelines address self-declaration, age estimation, and age verification as tools to ensure the protection of minors online, the 2026 Recommendation aims to advance the EU’s strategic approach to age verification in particular, recognizing the higher degree of accuracy of the latter.
The Recommendation additionally references Articles 34 and 35(1) of the DSA, under which Very Large Online Platforms and Very Large Online Search Engines are required to “assess and mitigate actual or foreseeable risks that their service may pose to the protection of minors.” It also references Article 44(1)(j) DSA, which enables the Commission to develop voluntary targeted standards to protect minors online, and recognizes that no such standards have been developed yet.
The Audiovisual Media Services Directive, under which video-sharing platforms have an obligation to protect minors from accessing harmful audiovisual content, and the Unfair Commercial Practices Directive, which recognizes minors as vulnerable users who must be protected, similarly form part of the applicable legal framework for age verification in the EU. Finally, the upcoming Digital Fairness Act is expected to fill any gaps left unaddressed, though the Recommendation does not specify which ones.
Two notes are particularly relevant when considering the applicable legal framework:
- Mandatory or voluntary? – While the requirement to implement age verification tools is not explicitly included in any of the abovementioned laws as a legally binding obligation for digital services providers, both the Commission’s DSA Guidelines and the Recommendation may be taken into consideration by national Courts when interpreting existing, binding EU law.
- Emerging enforcement under the DSA is, however, showing the inadequacy of the age assurance methods currently implemented for compliance, which so far rely largely on self-declaration and age estimation (rather than age verification). For example, the Commission preliminarily found Meta in breach of the DSA (April 2026) for failing to prevent minors under 13 from accessing Facebook and Instagram, and opened an investigation into Snapchat (March 2026) for not preventing users under 13 from accessing the app and for not adequately assessing whether users are under 17, which it deems necessary to ensure an age-appropriate experience.
- Enforcement also shows inconsistencies in EU harmonization regarding the age of a minor – while there is no consistent, agreed-upon age of the child under EU law, the Recommendation defines a “minor” as anyone under the age of 18; across individual Member States, however, the age of the minor can range from 13 to 18.
- Under the GDPR, which is not referenced by the Recommendation, the processing of personal data of a child in relation to the offer of information society services directly to them is lawful where that child is at least 16 years old (Article 8(1)), though Member State law may provide for a lower age (which must not be under 13).
- Since Member States have discretion in defining the age of a minor within their national territory, “EU-wide” age verification measures may become fragmented depending on this definition.
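To make this fragmentation concrete, an "EU-wide" verifier would still need a per-Member-State age threshold. The sketch below is a minimal illustration only: national digital-consent ages under Article 8 GDPR vary between 13 and 16, and the per-country values used here are examples rather than an authoritative list.

```python
# Illustrative sketch: digital-consent ages under national implementations
# of Article 8 GDPR vary between 13 and 16. The values below are examples
# for illustration, NOT an authoritative mapping.
DIGITAL_CONSENT_AGE = {
    "FR": 15,
    "DE": 16,
    "ES": 14,
    "BE": 13,
}
GDPR_DEFAULT = 16  # Article 8(1) default where no national law lowers it


def required_age(member_state: str) -> int:
    """Age threshold an 'EU-wide' verifier would have to apply locally."""
    return DIGITAL_CONSENT_AGE.get(member_state, GDPR_DEFAULT)


# The same user could pass verification in one Member State and fail in
# another, purely because of the national definition.
print(required_age("FR"))  # 15
print(required_age("PL"))  # 16 (falls back to the GDPR default)
```

The lookup-plus-fallback shape captures the compliance problem: a single verification service cannot hard-code one threshold without risking either over- or under-restriction somewhere in the internal market.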
2. Age verification blueprint and age verification scheme
When it comes to operationalizing EU-wide age verification tools, the Commission will develop a blueprint consisting of the technical specifications that such tools should follow and an open-source implementation as a mobile app that can be customized to national contexts. The blueprint will be consistent with the EU Digital Identity Wallet, acting as an additional “age verification functionality,” which Member States are expected to operationalize by the end of 2026. It is worth noting that the EU Digital Identity Wallet is also voluntary for citizens and businesses, although Member States have an obligation to make the option available.
The Commission will additionally develop an age verification scheme, with requirements for providers of proof of age attestations and age verification solutions to meet, and including a list of EU-based trusted providers of such attestations. The role of the attestation is to ensure conformity with the criteria of effectiveness of the age verification solution, namely accuracy, reliability, robustness, non-intrusiveness, and non-discrimination (these criteria are outlined in the Commission’s 2025 DSA Guidelines, mentioned above).
Two notes are particularly relevant here:
- While the Recommendation does not include significant details regarding the proof of age attestations, its reference to conformity is reminiscent of the Conformity Assessment required under the EU AI Act, hinting at the further expansion of a product safety approach across the EU digital regulatory ecosystem;
- The Recommendation specifically notes that the trusted providers of such attestations, which can be public or private entities, must be EU-based, recalling the Commission’s broader strategic goals in the area of EU digital sovereignty.
From a global perspective, the Commission’s age verification scheme may be compared to recent age assurance developments in other jurisdictions. Examples include the ongoing rulemaking efforts by the New York Attorney General’s Office to establish age assurance standards and accuracy benchmarking requirements under the SAFE for Kids Act, and Australia’s Age Assurance Technology Trial, which assessed a variety of age assurance solutions and vendors but sought only to determine the feasibility of age assurance mechanisms rather than to assess provider conformity with legal requirements. Notably, the Commission’s efforts seemingly go beyond both New York’s and Australia’s, since it aims to establish conformity requirements supplemented by a list of EU-vetted, trusted providers for use in legal compliance.
3. “Privacy-preserving” age verification?
Notable references are made throughout the Recommendation to the importance of privacy. Through this Recommendation, the Commission aims to facilitate the development of “harmonised, privacy-preserving, cybersecure, data protection compliant and robust EU age verification solutions.” Without referencing the GDPR, the Recommendation nonetheless relies on key data protection principles and requirements, interpreting “privacy-preserving” as preventing unnecessary data collection and unauthorized access to, or misuse of, personal information.
To be privacy-preserving, the age verification solution should, by default, limit the information shared with the relying party to a true or false response regarding the age of the individual, without providing any further information about them. Additionally, the Recommendation states that verification methods “should include technical safeguards to protect citizens from privacy and data protection risks, such as tracking of their online activity, including the use of zero knowledge proofs.”
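The “true or false only” design can be sketched as a signed boolean attestation. The example below is a simplified illustration, not the Commission’s specification: it uses a symmetric HMAC where a real scheme would use asymmetric signatures (and zero-knowledge proofs for unlinkability), and all names are hypothetical. The key point it demonstrates is data minimization: the relying party receives only the over/under claim, never the birth date.

```python
import hashlib
import hmac
import json
from datetime import date

# Hypothetical issuer key for illustration; a real attestation provider
# would use asymmetric signatures, not a shared secret.
ISSUER_KEY = b"demo-issuer-secret"


def issue_age_attestation(birth_date: date, threshold: int = 18) -> dict:
    """Issue a minimal proof-of-age attestation: a signed true/false claim.

    The attestation deliberately omits the birth date itself, so the
    relying party learns only whether the threshold is met.
    """
    today = date(2026, 6, 1)  # fixed date so the example is reproducible
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    claim = {"over_threshold": age >= threshold, "threshold": threshold}
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": signature}


def verify_attestation(attestation: dict) -> bool:
    """Relying-party check: signature is valid and the boolean claim holds."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected, attestation["signature"])
        and attestation["claim"]["over_threshold"]
    )


adult = issue_age_attestation(date(2000, 3, 15))
minor = issue_age_attestation(date(2012, 3, 15))
print(verify_attestation(adult))  # True: signed claim checks out
print(verify_attestation(minor))  # False: claim is false, access denied
```

Note that the attestation dictionary contains no identifying attributes at all; that absence, rather than any particular cryptographic primitive, is what the Recommendation’s default of a bare true/false response requires.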
While there is no further elaboration of the expected technical safeguards or the privacy-enhancing technologies that could be deployed, it is likely that there will be significant interest in these attributes, particularly following the security flaws found in the EU “age checking app” launched by the Commission in early April.
4. On social media bans: From political debate to procedural impossibility
The Commission’s Recommendation is timely in that it comes as some individual EU Member States, such as France (for under 15s), Spain (for under 16s), and Germany (for under 14s, with stricter rules for under 17s), consider social media bans.
With a view to harmonization and the prevention of barriers within the internal market, the Recommendation invokes an administrative requirement found in Directive 2015/1535, which lays down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services. On this basis, where Member States consider introducing technical measures restricting minors’ access to online platforms, they have an obligation to notify such measures to the Commission before they are adopted. This notification triggers a 3-month (extendable) standstill period during which the Member State is prevented from adopting the restriction, as well as a series of dialogues both with the Commission and with other Member States through the Digital Services Expert Group. Digital Services Coordinators, on the basis of the DSA, can also bring the issue for consideration to the European Board for Digital Services, a cooperation forum for ensuring the coherent enforcement of the DSA.
Should a Member State fail to notify the Commission of a draft technical measure restricting minors’ access to online platforms, this would be considered “a procedural defect that renders the measure unenforceable against individuals in national court proceedings.” The Recommendation cites CJEU Case C-194/94, CIA Security and Case C-443/98, Unilever in its reasoning. Furthermore, the Commission could initiate infringement proceedings against a Member State should the proposed national measures restricting minors’ access to online platforms be found incompatible with the DSA.
As regulators globally continue to navigate the intensifying youth online safety space, the Commission’s Recommendation adds another thread to the global patchwork of proposals aimed at restricting or banning social media access for minors. Outside the EU, Australia and Indonesia both recently started implementing social media bans (for under 16s), while other countries have adopted or proposed targeted restrictions on social media access, such as Brazil (which requires that accounts of minors under 16 be linked to a parent account under the recently effective Digital ECA) and the US (where legislation is pending that would ban minors under 13 from holding accounts and restrict use of certain platform features within teen accounts).
5. Concluding Notes
It is still uncertain how the age verification landscape will develop across the EU. As enforcement shows that the currently implemented lower-accuracy age assurance measures are increasingly deemed incompatible with the DSA, and political pressure grows within and across Member States to more adequately protect minors online, the Commission is attempting to set the tone for a harmonized approach.
While the Recommendation is a non-binding, soft law instrument, it shows the Commission’s strategic direction and positioning regarding age verification measures. Nevertheless, specific details regarding the technical specifications, protocols, and interfaces, the interoperable and privacy-preserving features of such tools, and how (and when) each individual Member State will operationalize them remain open questions.