News & Insights

Client Alert

April 24, 2026

When Regulators Build the Age Verification App: It Is Not Necessarily for the Greater Good


On 15 April 2026, European Commission President Ursula von der Leyen announced that the EU's age verification app is "technically ready" and will soon be available to citizens, declaring that the world's biggest tech platforms have "no more excuses" for failing to check users' ages before granting access to restricted content such as pornography, gambling, and potentially social media.

The new app is presented as a free, open-source tool built on the same technical foundations as the forthcoming EU Digital Identity Wallets. While the Commission touts the solution as the "gold standard" for privacy, cybersecurity researchers identified vulnerabilities within hours of its release, and privacy advocates have raised pointed concerns about online anonymity and freedom of expression. But the real significance of the announcement may lie elsewhere: by offering a free, Commission-backed tool and positioning it as the compliance benchmark, the EU is effectively reversing the burden of proof. Platforms that decline to adopt it will need to demonstrate that their alternative mechanisms offer equivalent privacy and accuracy guarantees, a burden that grows heavier with each enforcement action the Commission brings.

In the following sections, we examine the legal foundations of the age verification initiative under the DSA, explain how the app works in practice, explore the key challenges it presents, and identify practical takeaways for companies that may need to integrate age verification into their services.

Key Takeaways for Online Providers

  1. Self-declaration is dead. Tick-box age confirmations are no longer sufficient. While most major platforms have already moved beyond simple self-declaration, the Commission’s guidelines and recent enforcement actions make clear that any residual reliance on self-reported age will attract regulatory scrutiny.
  2. Conduct a risk assessment now. Companies should map their services against the "5Cs" risk framework and calibrate age assurance measures accordingly. These assessments must be updated annually to reflect the evolving regulatory environment.
  3. The EU app is the benchmark, not a mandate. Platforms are not required to adopt it, but any alternative must demonstrably meet the same standards, and platforms bear the burden of proving equivalence.
  4. Privacy compliance is non-negotiable. Any age verification mechanism must comply with GDPR data minimization (Article 5(1)(c)) and Article 28(3) DSA, meaning platforms cannot collect more personal data than strictly necessary to determine a user’s age. The EDPB's Statement on Age Assurance provides detailed guidance on how to reconcile these twin obligations and should be treated as a critical reference point.
  5. Prepare for cross-jurisdictional complexity. National requirements continue to diverge significantly, with some Member States imposing outright social media bans for minors that go well beyond the DSA framework. Platforms operating across multiple EU jurisdictions should closely monitor these developments and be prepared to adapt their compliance strategies accordingly.
  6. Refresh your DPIA. Any age assurance deployment will require a Data Protection Impact Assessment explicitly addressing risks to children, a step that should not be deferred.
  7. Watch the security space. The rapid hacking of the EU app is a cautionary tale. Any solution must undergo rigorous security testing, and companies should monitor ongoing updates to the open-source code for patches and improvements.
  8. Plan for EUDI Wallet integration. The age verification app is a transitional solution. Companies would be well advised to begin planning their architecture now to accommodate both the standalone app and the wallet-integrated version that will eventually supersede it.

The Legal Backbone: Article 28 DSA and the Protection of Minors

The EU's age verification push is rooted in the Digital Services Act (Regulation (EU) 2022/2065, "DSA"), which establishes a tiered system of due diligence obligations for online intermediaries.

The primary provision addressing the protection of minors is Article 28, which requires all providers of online platforms accessible to minors to "put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service." In addition, Article 28 prohibits targeting minors with personalized advertising based on profiling. Critically, Article 28(3) stipulates that platforms are not required to process additional personal data solely to determine whether a user is a minor, thereby creating a dual obligation: platforms must protect children, but any age assurance mechanism they deploy must also comply with the GDPR's data minimization principle under Article 5(1)(c). In practice, this balance has proved extremely difficult to strike. The Commission’s 2026 proceedings against social media platforms and multiple pornography providers all centered, in part, on the inadequacy of existing age assurance measures, demonstrating that regulators view most current approaches as falling short.

At the same time, market solutions are evolving rapidly: AI-powered facial age estimation tools can now determine whether a user meets an age threshold in seconds from a selfie, without retaining biometric data or requiring document uploads, achieving accuracy rates that in some cases surpass human judgment.

For the largest platforms (VLOPs and VLOSEs), the DSA imposes additional layers of responsibility. These platforms must identify and assess systemic risks to users, including risks to children’s rights and wellbeing. Where significant risks are found, they must take proportionate steps to mitigate them, which may include adapting recommendation algorithms, redesigning user interfaces, or deploying age assurance mechanisms. The Commission's enforcement proceedings in February 2026, in which it provisionally found that one platform's "addictive design" features harmed minors, vividly illustrate these obligations in practice.

Against this backdrop, on 14 July 2025, the Commission published guidelines under Article 28(4). While these guidelines are not strictly legally binding, the Commission has stated that it will use them as a "significant and meaningful benchmark" when assessing compliance. They may also inform national Digital Services Coordinators in their enforcement actions, making them an indispensable reference for any platform seeking to navigate the evolving regulatory landscape.

In substance, the guidelines recommend that platforms assess risks to minors through a "5Cs" framework: content, conduct, contact, consumer, and cross-cutting risks. They further specify that age assurance methods must be accurate, reliable, robust, non-intrusive, and non-discriminatory, and they explicitly state that self-declaration alone is insufficient. Where age assurance is required, platforms should offer at least two methods and provide a redress mechanism for incorrect assessments.
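One way to operationalize these expectations internally is a simple risk register that maps each service feature to the "5Cs" categories and records which assurance methods and redress route apply. The sketch below (in Python) is a hypothetical illustration only; the category names come from the Commission's guidelines, while the fields, risk levels, and example entries are assumptions, not a template endorsed by any regulator.

    # Hypothetical "5Cs" risk register sketch. The category names follow the
    # Commission's guidelines; all other fields and thresholds are assumptions.
    from dataclasses import dataclass

    FIVE_CS = {"content", "conduct", "contact", "consumer", "cross-cutting"}

    @dataclass
    class FeatureRiskEntry:
        feature: str                   # e.g. "direct messaging", "live streaming"
        risk_categories: set[str]      # subset of the 5Cs
        risk_level: str                # "low" | "significant" | "high"
        assurance_methods: list[str]   # guidelines expect at least two where assurance is required
        redress_channel: str           # route for users to challenge an incorrect age assessment

        def validate(self) -> list[str]:
            issues = []
            if not self.risk_categories <= FIVE_CS:
                issues.append(f"{self.feature}: unknown risk category")
            if self.risk_level in {"significant", "high"} and len(self.assurance_methods) < 2:
                issues.append(f"{self.feature}: fewer than two assurance methods")
            if not self.redress_channel:
                issues.append(f"{self.feature}: no redress mechanism recorded")
            return issues

    register = [
        FeatureRiskEntry("adult content section", {"content"}, "high",
                         ["eu_age_verification_app", "photo_id_matching"], "in-app appeal"),
        FeatureRiskEntry("public comments", {"conduct", "contact"}, "significant",
                         ["facial_age_estimation"], "support ticket"),
    ]

    for entry in register:
        for issue in entry.validate():
            print("review needed:", issue)

Run against the example entries, the check flags the comments feature for offering only one assurance method, which mirrors the kind of gap the guidelines expect platforms to identify and remediate in their annual reviews.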

Notably, the guidelines identify specific scenarios in which full age verification, as opposed to softer age estimation, is required. These include access to high-risk content such as pornography, gambling, alcohol, and tobacco advertising, or situations where a risk review identifies significant risks that less intrusive measures cannot adequately manage. In such scenarios, the Commission recommends the EU age verification app and the EU Digital Identity Wallet as the reference standard, underscoring the Commission's willingness to enforce these obligations proactively.

How the App Works: Architecture and Privacy Design

The app is built on the same technical foundations as the forthcoming EU Digital Identity Wallets. Under the amended eIDAS Regulation (Regulation (EU) 2024/1183, “eIDAS 2.0”), every Member State must offer at least one EU Digital Identity Wallet ("EUDI Wallet") by 6 December 2026. The age verification app serves as a transitional tool: it uses the same technical specifications as the EUDI Wallets but focuses exclusively on proving age, giving platforms and users an early, practical taste of the broader digital identity infrastructure that will follow.

In practical terms, the app offers three methods to prove age: national ID card verification, passport-based verification (including biometric matching), or attestation via a trusted third party. This last category could include entities such as banks or established digital identity providers. Belgium’s Itsme, for example, which is used by over seven million people and is already certified at the highest eIDAS assurance level, illustrates the type of trusted intermediary that could serve as an attestation source within the EUDI Wallet ecosystem. Once verification is complete, the app stores only whether the user is above a certain age threshold (13, 15, or 18), without retaining names, dates of birth, or ID numbers, thereby minimizing the personal data footprint from the outset.

The app’s privacy architecture is designed around a key principle: proving age without revealing identity. It uses Zero-Knowledge Proof ("ZKP") cryptography, a technique that allows the user’s device to confirm a single fact, such as “this user is over 18”, without transmitting any identifying information to the platform. Each verification is independent, meaning that responses to different platforms cannot be linked or correlated, which prevents cross-service tracking of individual users.
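To make this privacy model concrete, the sketch below (in Python, purely illustrative) mocks the data flow just described: the device holds only the attested age-threshold flags, the platform issues a fresh nonce for each check, and the only fact that ever reaches the platform is the requested predicate, such as "age over 18". All class and function names are hypothetical and not drawn from the Commission's actual codebase, and the cryptographic proof is a simple placeholder standing in for the app's real ZKP scheme.

    # Conceptual sketch only: illustrates the data flow of a minimal-disclosure,
    # unlinkable age check. The "proof" is a mock placeholder, not real ZKP cryptography.
    import os
    import hashlib
    from dataclasses import dataclass

    @dataclass
    class AgeProof:
        predicate: str      # the only fact disclosed, e.g. "age_over_18"
        nonce: str          # binds the proof to a single verification session
        proof_blob: bytes   # stand-in for the zero-knowledge proof

    class WalletOnDevice:
        """Holds attested threshold flags; identity data never leaves the device."""
        def __init__(self, over_13: bool, over_15: bool, over_18: bool):
            self._flags = {"age_over_13": over_13, "age_over_15": over_15, "age_over_18": over_18}

        def prove(self, predicate: str, nonce: str) -> AgeProof:
            if not self._flags.get(predicate, False):
                raise ValueError("requested predicate cannot be attested")
            # Placeholder for proof generation (a real deployment would rely on the app's ZKP scheme).
            blob = hashlib.sha256(f"{predicate}|{nonce}".encode()).digest()
            return AgeProof(predicate=predicate, nonce=nonce, proof_blob=blob)

    class Platform:
        """A relying party: it learns 'over 18: yes/no' and nothing else."""
        def request_age_check(self, wallet: WalletOnDevice) -> bool:
            nonce = os.urandom(16).hex()  # fresh per session, so responses cannot be linked
            proof = wallet.prove("age_over_18", nonce)
            expected = hashlib.sha256(f"age_over_18|{nonce}".encode()).digest()
            return proof.nonce == nonce and proof.proof_blob == expected

    if __name__ == "__main__":
        wallet = WalletOnDevice(over_13=True, over_15=True, over_18=True)
        print(Platform().request_age_check(wallet))  # True, and that is all the platform learns

The point of the sketch is the asymmetry of information: the wallet never transmits a name or date of birth, and because each proof is bound to a one-off nonce, two platforms comparing their logs could not link the same user across services.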

The solution is fully open source, allowing independent security auditing and fostering broader trust. It is worth noting that the "EUDI Wallet" is a technical and legal standard, not a single centralized app. While each Member State must offer at least one compliant wallet, the eIDAS 2.0 framework also permits private sector solutions. In that sense, the aforementioned Itsme, developed by a consortium of Belgian banks and telecom operators, already provides eIDAS-certified digital identity services and is actively preparing for EUDI Wallet compliance. Seven frontrunner Member States - France, Denmark, Greece, Italy, Spain, Cyprus, and Ireland - are meanwhile piloting the Commission’s age verification app and planning to integrate it into their national wallets.

Beyond the app itself, eIDAS 2.0 also imposes mandatory acceptance obligations. VLOPs must accept the EUDI Wallet whenever they require user authentication, likely from December 2026 or early 2027. Other private parties required by law to implement strong authentication (including banks, healthcare providers, and telecoms) must accept it by 6 December 2027.

While the standalone age verification app is not mandatory, the Commission has made clear that the app is its "preferred solution" and that platforms choosing alternatives must demonstrate equivalent privacy and effectiveness guarantees. This framing effectively positions the app as the de facto compliance benchmark, even in the absence of a strict legal obligation to adopt it.

Key Challenges: Between Ambition and Execution

Security Concerns

Within hours of the source code being published on GitHub, a number of security researchers claimed to have hacked the app in under two minutes, confirmed that biometric authentication could be bypassed, and warned that the rushed launch "could undermine trust in future digital identity wallets." These revelations are particularly troubling given that more than 400 experts had already sent an open letter calling for a "moratorium on deployment plans" until there is scientific consensus on the benefits and harms of age assurance. Taken together, these developments suggest that the gap between the Commission's ambitious timeline and the app's technical maturity remains significant.

Privacy and Fundamental Rights

Despite the ZKP architecture, significant privacy concerns remain. First, the app relies on IP addresses to determine user location, meaning it can be circumvented with a VPN, precisely by the users most likely to seek to bypass age restrictions. Second, critics have raised concerns about government data access and tracking risks: EDRi has argued that "other digital rights, the freedom of information, of expression of young people and of the adults who won't be able to use the tool, are completely forgotten," while the Center for Democracy and Technology Europe has warned about insufficient protections for high-risk individuals such as journalists and human rights defenders. Third, the EDPB's Statement on Age Assurance has emphasized that platforms must carefully balance data minimization under Article 5(1)(c) GDPR against the safety objectives of Article 28 DSA, a standard that is far easier to prescribe than to execute in practice.

The Fragmentation Risk

In many ways, the Commission’s harmonized approach may be arriving too late. Several Member States have already moved ahead with national measures that go well beyond the DSA. France has banned social media for those under 15. Spain and Denmark have introduced their own age restrictions, and Greece has followed suit. Meanwhile, the European Parliament's IMCO Committee has urged an EU-wide "digital minimum age" of 16. With these national bans already taking effect and shaping the political landscape, the Commission’s app-based approach risks being overtaken by more radical legislative interventions. The Commission, for its part, views certain national initiatives as potentially clashing with the DSA's country-of-origin principle, creating a real risk of lengthy proceedings before the CJEU and, consequently, of delays to coherent enforcement across the bloc.

A Glance at the UK: A Parallel but Distinct Approach

The statistics speak for themselves: one in five UK internet users is a child, yet the internet they use was not designed for them. The Online Safety Act 2023 (“OSA”) now serves as the primary legislative framework for age-gating and age verification, supplemented by UK data protection laws affording heightened protections to children, the ICO’s Age Appropriate Design Code (applicable to information society services “likely to be accessed by children”), and advertising self-regulation aimed at shielding minors from harmful or age-restricted content.

The OSA imposes duties on certain regulated services to prevent children from encountering specified categories of content (e.g., self-harm and eating disorder material) and makes age verification and age estimation central to treating parts of a service as effectively adult-only. Where required, such measures must be “highly effective” at correctly determining whether a user is a child. Ofcom, as the regulator overseeing this framework, must recommend highly effective age assurance methods and, in doing so, must have regard to the ICO’s Age Appropriate Design Code, working closely with the ICO to ensure compliance with UK data protection laws.

Ofcom has released several pieces of guidance as part of the broader OSA implementation, including its guidance on highly effective age assurance dated 24 April 2025 (“Age Assurance Guidance”), which details the four criteria (technical accuracy, robustness, reliability, and fairness) that any age assurance process must fulfill. Notably, Ofcom has dismissed self-declaration and payment methods not requiring users to be over 18 as incapable of being highly effective, while endorsing photo-ID matching, facial age estimation, open banking, and credit card checks (UK individuals must be 18 or over to obtain a credit card) as methods capable of meeting the standard. Ofcom is expected to publish its age assurance statutory report by the end of July 2026, assessing how services have used age assurance in practice.

Unlike the EU’s Commission-backed app, the UK framework is not prescriptive: no single mechanism has been singled out. Instead, it adopts a risk-based approach, setting guardrails and parameters for selecting appropriate methods, including third-party solutions, to limit access to certain content. Platforms must use safe, proportionate, and secure methods, and any company that misuses personal data or fails to protect users could face penalties of up to the greater of £18 million and 10% of qualifying worldwide revenue.

Conclusion

The Commission's announcement is not merely a product launch; it is a regulatory signal of considerable force. By developing and offering the app for free, the Commission is doing something more subtle than creating a useful technology: it is establishing a compliance benchmark that effectively shifts the burden of proof onto every platform that chooses a different path. Any alternative mechanism must now demonstrably match the app’s privacy and accuracy guarantees, a standard that will only tighten as enforcement accelerates. At the same time, the age assurance technology landscape is evolving rapidly, with AI-powered solutions, reusable identity credentials and zero-knowledge proof architectures maturing at a pace that may outstrip the Commission’s own tool. Platforms that invest in flexible, privacy-by-design systems today will be better placed to meet regulatory expectations regardless of which specific technology ultimately prevails.

At the same time, the regulatory environment remains far from settled. The technical vulnerabilities exposed within hours of the app's release, the unresolved tension between child safety and online anonymity, and the patchwork of national measures all underscore how contested this space remains. The recent collapse of the EU’s CSAM scanning framework, where the temporary derogation allowing platforms to voluntarily detect child sexual abuse material expired in April 2026 amid a standoff between privacy advocates and child safety groups, offers a cautionary parallel: even where the political will to protect children is strong, finding a legally and technically sustainable path forward is anything but straightforward. That instability, however, does not reduce the compliance burden; it increases it. Platforms must build systems flexible enough to adapt to a moving target while robust enough to satisfy regulators today. 

The regulatory landscape will continue to evolve rapidly as Member States roll out national wallets, the Digital Fairness Act takes shape, and the Commission refines its enforcement approach. Remaining vigilant, monitoring legislative and technical developments closely, and stress-testing compliance frameworks against each new iteration will not simply be prudent — it will be essential.