A child who can type a fake birthday can open a Facebook or Instagram account. That is the entirety of Meta’s age-verification system for European users — a text box and good faith. After nearly two years of investigation, the European Commission has concluded what anyone with a seven-year-old and a smartphone already knew.

On Wednesday, the Commission issued preliminary findings that Meta breached the Digital Services Act by failing to prevent children under 13 from accessing Facebook and Instagram. The investigation, opened in May 2024, found that Meta’s own terms set 13 as the minimum age — then did almost nothing to enforce it.

A Text Box and a Shrug

The specifics are damning in their banality. A minor can enter a false birth date during sign-up with no controls to verify it. The tool for reporting an underage user requires up to seven clicks to reach the form, according to the Commission, and even when a report is filed, there is often no follow-up. The child keeps scrolling.

The Commission estimates that 10 to 12 percent of children under 13 in the EU use Facebook or Instagram. Meta’s own risk assessment, which the Commission described as “incomplete and arbitrary,” contradicted “large bodies of evidence from all over the European Union.” The company, the Commission found, “seems to have disregarded readily available scientific evidence indicating that younger children are more vulnerable to potential harms caused by services like Facebook and Instagram.”

Henna Virkkunen, the Commission’s lead official on tech policy, was blunt. “Our preliminary findings show that Instagram and Facebook are doing very little to prevent children below this age from accessing their services,” she said.

An Industry-Wide Challenge

A Meta spokesperson said the company disagreed with the findings. “We’re clear that Instagram and Facebook are intended for people aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age,” the spokesperson told CNBC. The company called age verification “an industry-wide challenge, which requires an industry-wide solution” and said it would have “more to share next week about additional measures rolling out soon.”

“An industry-wide challenge which requires an industry-wide solution” is a useful construction. It distributes responsibility so thinly that no single company bears the full weight of accountability. It is also what a company says when it has been running the same ineffective system for years and would prefer not to change it.

The Invoice Question

If the findings are upheld, Meta faces fines of up to 6 percent of its global annual turnover. With $201 billion in revenue for 2025, that figure could reach roughly $12 billion.

That would sting. But Meta made $201 billion. The fine is a cost of doing business until it isn't. The structural question beneath all of this is whether the DSA was designed to change behavior or merely to put a price on its absence. Europe's regulatory apparatus can generate impressive invoices. Whether those invoices alter corporate conduct remains the open question.

Not an Isolated Problem

The Meta investigation sits in a widening field of scrutiny. In March, two US court rulings found the company liable — one for platform design contributing to addiction and mental health harms among teenagers, another for misleading users about children’s safety on its platforms. Australia has implemented a blanket social media ban for under-16s. The UK, Spain, and France are pursuing similar restrictions. The UK’s Children’s Wellbeing and Schools Bill, now approaching final passage, would give ministers power to impose age restrictions for under-16s.

The Commission’s separate investigation into whether Facebook and Instagram cause “behavioral addictions in children” through algorithmic “rabbit hole” effects remains ongoing.

Meta will now review the investigation file and mount a defense. The Commission has called on the company to update its risk-assessment methodology and implement more robust age verification. Separately, the EU is urging member states to deploy a union-wide age verification app by the end of the year — though that app was reportedly hacked in under two minutes during a demo. The Commission says the vulnerability has been fixed.

The preliminary findings “do not prejudge the final outcome of the investigation.” But they are not subtle. Meta set a rule — no one under 13 — then built a system a determined nine-year-old could defeat in under a minute. The company’s defense is that this is hard for everyone. The Commission’s reply is that “hard” is not the same as “impossible,” and Meta barely tried.
