EU Pushes New Age Verification App to "Tighten Online Child Protection Rules"

· novinite.com

The European Commission is urging EU member states to accelerate the rollout of a new European age verification app, aiming to have it operational by the end of the year as part of broader "efforts to strengthen online protection for children".

The tool is designed to allow users to confirm they meet minimum age requirements without disclosing personal details such as exact age or identity. Authorities say it is intended to reduce exposure of minors to harmful or inappropriate online content while maintaining user privacy.
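The principle described above, attesting that a user meets an age threshold without revealing a birth date or identity, can be illustrated with a minimal sketch. This is purely hypothetical and is not the EU app's actual protocol: it assumes a trusted issuer that checks the birth date locally and signs only a yes/no claim, which the verifier then checks without ever seeing personal data.

```python
# Hypothetical illustration of privacy-preserving age verification:
# the issuer sees the birth date, the verifier sees only a signed boolean.
import hashlib
import hmac
import json

ISSUER_KEY = b"demo-issuer-secret"  # placeholder key for this sketch only

def issue_age_proof(birth_year: int, min_age: int, current_year: int) -> dict:
    """Issuer checks the age locally, then signs only the yes/no claim."""
    claim = {
        "meets_min_age": current_year - birth_year >= min_age,
        "min_age": min_age,
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    # The proof carries no birth date and no identity attributes.
    return {"claim": claim, "sig": sig}

def verify_age_proof(proof: dict) -> bool:
    """Verifier learns only the boolean claim and that the issuer vouched for it."""
    payload = json.dumps(proof["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, proof["sig"]) and proof["claim"]["meets_min_age"]

proof = issue_age_proof(birth_year=2000, min_age=18, current_year=2025)
print(verify_age_proof(proof))  # True: threshold confirmed, birth date never shared
```

A production system of this kind would use asymmetric signatures or zero-knowledge proofs rather than a shared HMAC key, so that verifiers cannot forge attestations; the sketch only shows the data-minimization idea.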

Countries will be able to either deploy the system as a standalone application or integrate it into the European Digital Identity framework. The Commission has issued technical guidance to ensure rapid deployment and cross-border compatibility, with member states now responsible for adapting and implementing the system.

Protecting minors online has been identified as a central objective under the EU’s Digital Services Act, which requires platforms to maintain high standards of safety, privacy, and security for younger users.

“Effective and privacy-friendly age verification is another piece of the puzzle we are working towards for an online space where our children are safe and able to consume it positively and responsibly, without restricting the rights of adults,” said Henna Virkkunen, the Commission’s executive vice president for technological sovereignty, security, and democracy.

The push comes as the EU continues broader scrutiny of major technology companies over their handling of underage users. In a separate case, the Commission has accused Meta of failing to properly enforce its own age restrictions across platforms such as Facebook, Instagram, and WhatsApp.

Officials said preliminary findings suggest that Meta’s systems allow users under the age of 13 to bypass safeguards by entering false birth dates during account creation. Regulators also pointed to weaknesses in reporting tools, which they said can be cumbersome and ineffective for flagging underage accounts.

The company has been given an opportunity to respond before any final decision is made. If confirmed, violations of EU rules could result in significant financial penalties under the Digital Services Act framework.

“Terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users, including children,” Virkkunen said in relation to enforcement expectations.

The European Union’s broader digital strategy has focused on regulating platforms rather than imposing blanket bans on access for minors. However, several countries are exploring stricter approaches, including age-based restrictions on social media use.

Internationally, Australia has already introduced a ban on social media use for users under 16, while countries including the United Kingdom, France, and Denmark have debated similar measures. Germany has also expressed support for raising minimum age thresholds for platforms such as Instagram and TikTok in an effort to reduce screen time among children.

A key privacy criticism is that the app could still concentrate sensitive verification processes in a single system, even though it is designed to avoid storing exact personal data.

Opponents argue that any form of centralized age-check infrastructure may create new risks, since it could become a target for cyberattacks or enable indirect tracking through underlying digital identity links. There is also concern about “function creep,” where a tool introduced for age verification could later be expanded into broader identity controls.

Privacy advocates warn that even a system anonymized on the surface may still allow correlation or data reuse in its backend if safeguards differ across EU states, potentially undermining the original goal of limiting personal data exposure online.