Apr 30, 2026 · 7 min read
Brussels Just Told Meta Its Age Check Is a Birthday Field—And 12% of European Children Under 13 Are Already On Instagram
The European Commission's preliminary finding under the Digital Services Act says Meta cannot rely on self-declared birthdays to enforce its own minimum age. The reporting tool for flagging a minor's account takes up to seven clicks to reach. The maximum penalty is 6% of Meta's global annual turnover.
The Preliminary Finding
On April 29, 2026, the European Commission issued a preliminary finding that Meta is in breach of the Digital Services Act for failing to prevent minors under 13 from using Instagram and Facebook. The investigation began in May 2024. The conclusion, after almost two years of evidence gathering: Meta's age controls are not credible enforcement, and underage usage is widespread.
The Commission's specific complaint is structural. Per the press release, Meta has not "diligently identified, assessed and mitigated systemic risks" related to underage access. The age gate is a self-declared birthday at signup, with no verification. Detection of underage accounts after the fact is described as "limited." And the reporting tool that lets users flag a child's account is, in the Commission's words, "difficult to use": it takes up to seven clicks to reach.
10 to 12 Percent of European Children Under 13 Are Already on the Platforms
The number that anchors the Commission's case is the prevalence figure. According to the Commission's own measurement, between 10% and 12% of children under 13 are on Instagram or Facebook in the EU, despite Meta's stated minimum age of 13.
That figure directly contradicts Meta's internal risk assessments, which claim the under-13 population is negligible. In language quoted by Euronews, the Commission says Meta "disregarded readily available scientific evidence indicating that younger children are more vulnerable to potential harms."
If a tenth of children under 13 in Germany, France, Italy, and Spain are on Instagram, the question stops being whether Meta missed a few accounts. The question becomes whether self-declared age has any enforcement value at all.
Six Percent of Global Revenue
If the Commission moves from preliminary finding to formal non-compliance decision, the maximum DSA penalty is 6% of Meta's worldwide annual turnover. Meta reported total revenue of approximately $164 billion in 2024, so a maximum DSA fine would land near $9.8 billion.
For context, that exceeds every prior privacy fine imposed on a tech company. Meta's largest GDPR fine to date, the €1.2 billion penalty from the Irish Data Protection Commission in 2023 for unlawful EU-to-US data transfers, would be roughly an eighth of the DSA maximum. Per CNBC's coverage, the Commission's playbook in prior DSA cases has been to negotiate consent decrees with major behavioral changes attached, then reserve maximum fines for repeat violations.
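The arithmetic behind those figures is simple enough to check directly. A quick sketch, using the approximations quoted in this piece rather than audited numbers (the EUR/USD rate is an illustrative assumption):

```python
# Back-of-envelope check of the DSA fine ceiling discussed above.
# All figures are the article's approximations, not audited numbers.

meta_revenue_2024 = 164e9   # Meta's reported 2024 revenue, ~USD 164 billion
dsa_max_rate = 0.06         # DSA ceiling: 6% of worldwide annual turnover

max_fine = meta_revenue_2024 * dsa_max_rate
print(f"Maximum DSA fine: ${max_fine / 1e9:.2f}B")   # → Maximum DSA fine: $9.84B

# Comparison with Meta's largest GDPR fine (EUR 1.2B; an assumed
# EUR/USD rate of ~1.08 is used here for illustration only).
gdpr_fine_usd = 1.2e9 * 1.08
print(f"GDPR fine as a share of the DSA maximum: {gdpr_fine_usd / max_fine:.2f}")
# → roughly 0.13, i.e. about an eighth
```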
The Seven Click Reporting Tool
The most operationally specific part of the Commission's finding is the critique of Meta's reporting flow. Per TechPolicy.Press's analysis, the form an EU citizen has to fill in to report a suspected underage account takes up to seven separate clicks to reach. The Commission characterized this as "difficult to use" and noted that even submitted reports inconsistently triggered follow-up action.
The DSA includes specific obligations for "trusted flagger" mechanisms and accessible reporting. Burying the form behind seven clicks is the kind of design choice the regulation was written to challenge. The Commission's posture is that friction in user reporting is a deliberate compliance failure, not a UX accident.
The EU Age Verification App Lands the Same Day
The Commission's timing was not coincidental. On the same day as the Meta finding, the Commission adopted a formal recommendation that all EU member states make the EU age verification app available by the end of 2026, either as a standalone tool or integrated into the European Digital Identity Wallet.
The recommendation does what the Meta enforcement implies: it removes the excuse that "there's no scalable way to verify age." Once a government-issued, privacy-preserving age verification API exists at EU scale, platforms cannot defend a self-declared birthday by claiming there is no alternative. The app uses zero-knowledge proofs to confirm "this user is over 18" or "over 13" without revealing the underlying identity, a model designed to address the legitimate privacy concerns about age gating that have stalled implementation in the US and UK.
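The privacy model is easier to see in miniature. The sketch below is not real zero-knowledge cryptography and is not the EU app's actual protocol; it is a toy selective-disclosure flow with hypothetical names (issue_attestation, verify_attestation) that shows the core design idea: a trusted issuer checks the birthdate privately and signs only a boolean predicate, so the platform verifies a signature without ever seeing a date of birth or an identity.

```python
import hmac, hashlib, json, secrets

# Toy selective-disclosure sketch (NOT real zero-knowledge proofs, and
# not the EU app's protocol). A real deployment would use asymmetric
# signatures; HMAC with a shared key just keeps this sketch short.

ISSUER_KEY = secrets.token_bytes(32)  # held by the government issuer

def issue_attestation(birth_year: int, current_year: int = 2026) -> dict:
    """Issuer evaluates the age predicate privately and signs only the result."""
    claims = {
        "over_13": current_year - birth_year >= 13,  # the only disclosed fact
        "nonce": secrets.token_hex(8),               # guards against replay
    }
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}            # no birthdate inside

def verify_attestation(att: dict, issuer_key: bytes) -> bool:
    """Platform checks the signature; it learns only the boolean predicate."""
    payload = json.dumps(att["claims"], sort_keys=True).encode()
    expected = hmac.new(issuer_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, att["sig"])

att = issue_attestation(birth_year=2010)    # a 16-year-old in 2026
assert verify_attestation(att, ISSUER_KEY)
assert att["claims"]["over_13"] is True
assert "birth_year" not in json.dumps(att)  # the platform never sees a date
```

True zero-knowledge presentations go further than this sketch: the verifier cannot even link two presentations of the same credential, which is what makes the model acceptable for routine age gating.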
This is the same regulatory pattern that produced GDPR enforcement: first the rule, then the tooling, then the fines. The EU is now in phase three for age verification. Compliance officers reading this should treat the verification app's 2026 deployment date as the de facto deadline for any platform serving EU users.
Meta's Defenses Are Predictable—And Probably Insufficient
Meta has not yet issued a formal response to the preliminary finding. Based on its prior posture in DSA proceedings, the likely defenses are: (1) parental controls and Family Center are sufficient mitigation, (2) the under-13 prevalence figure is methodologically flawed, and (3) the company is investing in AI age estimation tools that improve accuracy without invasive verification.
Each of these has been raised in prior cases and rejected by the Commission. The DSA's standard is "diligent" mitigation, not best-effort compliance. The Commission has signaled that voluntary tools do not satisfy the obligation if measurable harm persists at the scale Brussels is documenting. Per Business Standard's coverage, Brussels has invited Meta to respond, but the procedural path now leads toward a formal decision unless Meta concedes structural changes.
What This Sets Up for the Rest of the Industry
The Meta finding is not a one-off. The Commission has parallel DSA proceedings against TikTok, X (Twitter), and AliExpress. The age verification finding establishes the legal standard those cases will be judged against. Any major platform with an EU presence that relies on self-declared age is now on notice.
The pattern echoes what Brussels did with consent banners: an initial enforcement establishes the precedent, recommendations and tools fill in the implementation path, and a wave of follow-up fines arrives over the next 18 to 24 months. Compliance teams reading the Meta press release should be modeling what their own age verification stack looks like under DSA scrutiny, and reading the related FTC COPPA rule update that took effect April 22 in the US for the parallel American enforcement direction. Earlier EU action against Italy's Poste Italiane app surveillance program shows how quickly preliminary findings become real fines.
What Parents and Users Should Take From This
The Commission's measurement—10 to 12 percent of children under 13 on the platforms—is itself the most useful number for parents. If Meta's age controls were effective, that figure would be near zero. It isn't.
Practical takeaways for non-EU users who don't get the regulatory protection:
- Don't assume platform age gating works. Self-declared birthdays are unenforced almost everywhere. If you have a child under 13 you don't want on Instagram or Facebook, device-level controls (Apple Screen Time, Google Family Link) are more effective than platform settings.
- Use the Meta reporting tool anyway. The Commission criticized it for being buried, but it works: the path Report → Account → They're a child under 13 is reachable from any account, and reports do trigger review, even if inconsistently.
- Watch the EU app rollout. If you live in the EU, the age verification app launching by end of 2026 will eventually become the standard for platforms operating in Europe. Once available, it's the privacy-respecting alternative to handing identity documents to every site.
- Compliance teams: your age verification approach is now a regulatory liability. The DSA standard moves the bar above "good faith effort." If your platform serves any EU users, plan for hard verification by 2027.
The Larger Direction
Brussels has now done with the DSA what it did with GDPR a decade ago: target the largest possible enforcement subject, set a precedent against the most defensible posture (Meta's "we tried our best" age gate), and pair the enforcement with implementation tooling so the rest of the industry can't claim there's no path forward. The math the Commission ran, with its 10% prevalence, 6% maximum fine, and under-13 target population, is a deliberate framing of how scale interacts with regulation.
The next 18 months will test whether Meta concedes operational changes or fights the finding through the Court of Justice. Either path lands in the same place: self-declared age is no longer an acceptable control, and any platform that relies on it for EU users is operating on borrowed time.