The European Union has formally accused Meta Platforms of failing to keep underage children off Facebook and Instagram, escalating a year-long probe and exposing the company to a potential multibillion-dollar fine.
The European Commission issued preliminary findings on April 29 that Meta is breaching the Digital Services Act by not having effective age verification, a charge that could result in a fine of up to 6% of the social media giant's global turnover. The finding alleges a violation of DSA Article 28(1), which requires platforms accessible to minors to put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors.
"The DSA requires platforms to enforce their own rules: terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users, including children," Henna Virkkunen, the EU's top tech enforcer, said in a statement.
The finding targets Meta's reliance on self-declaration for age gating, a method the EU deems ineffective and easily circumvented. This action follows the Commission's launch of a privacy-preserving age verification app on April 15 and similar charges against four adult content sites in March, signaling a lower tolerance for trivially bypassable age controls across all platforms.
At stake for Meta is not just a potential fine that could run into billions of dollars based on its 2025 revenue, but also a mandate to overhaul its age verification systems. The finding sets a significant precedent for how the DSA's child safety rules will be enforced against mainstream social media platforms, moving beyond adult content and putting the entire sector on notice.
No More Excuses
The timing of the charge appears to be a deliberate strategic move by Brussels. Two weeks before charging Meta, Commission President Ursula von der Leyen unveiled a new EU-developed age verification app. "Online platforms can easily rely on our age verification app so there are no more excuses," von der Leyen said at the launch. "We will have zero tolerance for companies that do not respect our children’s rights." By providing a technical solution, the Commission has preemptively countered industry arguments that robust, privacy-preserving age verification is not feasible.
A spokesperson for Meta said the company disagrees with the findings, stating, "We're clear that Instagram and Facebook are intended for people aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age." The company maintains that determining a user's age is a complex, industry-wide challenge.
However, regulators pointed to research from the Interface-EU think tank in 2025 which demonstrated that a simulated 14-year-old could create an account on Instagram by simply entering a false date of birth, with no further checks. The Commission’s preliminary view is that Meta's current tools, which combine self-reporting with AI-based age estimation, are inadequate.
A Broadening Regulatory Front
This specific charge is part of a wider, formal investigation into Meta's child safety practices that the Commission opened in May 2024. The broader probe also includes concerns around addictive design and recommender systems. The EU's action brings the same legal standard for age verification previously applied to pornographic sites like Pornhub and XNXX to a mainstream platform with billions of users.
If the preliminary finding is confirmed, the Commission can impose a fine of up to 6% of Meta's global annual turnover and issue periodic penalty payments to enforce compliance. Meta now has the right to review the Commission's investigation file and submit a formal response before a final decision is made. The proceedings have no fixed deadline, but the simultaneous charges on multiple fronts indicate an acceleration of the EU's enforcement posture against Big Tech.
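To illustrate the scale of the DSA's fining power described above, here is a minimal sketch of the 6%-of-turnover ceiling. The revenue figure used is a hypothetical placeholder for illustration only, not Meta's actual reported 2025 turnover:

```python
def max_dsa_fine(global_turnover_usd: float, cap: float = 0.06) -> float:
    """Ceiling of a DSA fine: up to `cap` (6% under the Act) of a
    company's global annual turnover."""
    return global_turnover_usd * cap

# Hypothetical example: a company with $170B in annual turnover
# (an assumed figure, not Meta's reported revenue).
assumed_turnover = 170e9
print(f"Maximum fine: ${max_dsa_fine(assumed_turnover) / 1e9:.1f}B")
# → Maximum fine: $10.2B
```

Even at this illustrative scale, the cap runs well into the billions, which is why commentators describe DSA exposure in those terms.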
This article is for informational purposes only and does not constitute investment advice.