EU Investigation Accuses Meta of Inadequate Child Protection on Facebook and Instagram

Preliminary findings from an ongoing European Union investigation indicate that Meta, the parent company of Facebook and Instagram, is failing to adequately prevent underage users from accessing its platforms. The findings, disclosed on June 12, 2025, highlight concerns under the Digital Services Act (DSA), which mandates strict measures to protect young users online.

The European Commission, the EU’s executive arm, opened the probe in December 2024 to assess whether Meta’s policies and enforcement mechanisms comply with the DSA’s requirements. According to the preliminary assessment, Meta’s age verification and content moderation systems are insufficient, leaving minors vulnerable to harmful content and interactions.

Thierry Breton, the EU’s Commissioner for Internal Market, stated:

"We have serious concerns that Meta is not doing enough to protect children on its platforms. The preliminary findings indicate systemic failures in age verification and content moderation that put minors at risk. We expect Meta to take immediate action to address these issues."

The investigation focuses on two critical areas:

  • Age Verification: Meta’s current methods for verifying user age are deemed inadequate, allowing underage users to create accounts and access restricted content.
  • Content Moderation: The EU has identified gaps in Meta’s ability to detect and remove harmful content, including material that could lead to mental health issues, cyberbullying, or exposure to inappropriate interactions.

If the final assessment confirms these preliminary findings, Meta could face fines of up to 6% of its global annual revenue under the DSA. For Meta, which reported $134.9 billion in revenue for 2023, that ceiling would amount to penalties exceeding $8 billion.

Meta has responded to the allegations, asserting that it has invested heavily in child safety measures. A company spokesperson stated:

"We take our responsibility to protect young users seriously and have implemented numerous safeguards, including AI-driven content moderation and stricter privacy settings for minors. We are reviewing the EU’s preliminary findings and will continue to cooperate fully with the investigation."

The investigation is part of a broader crackdown by the EU on tech giants to enforce compliance with the DSA, which came into full effect in February 2024. Other companies, including TikTok and X (formerly Twitter), have also faced scrutiny over similar concerns.

As the investigation progresses, the EU may impose interim measures or additional requirements on Meta to ensure compliance. The final report is expected by the end of 2025.

Source: Engadget