Meta Charged by EU Over Failure to Stop Children From Using Instagram and Facebook
Meta was charged by European Union (EU) regulators with breaching landmark tech rules after failing to prevent children under 13 from accessing Instagram and Facebook.
The European Commission announced on Wednesday that its preliminary investigation found that Meta was violating the Digital Services Act, a law passed in 2022 requiring social media companies to more actively police their platforms. The legislation requires companies to “diligently identify and mitigate the risks” of children under 13 using their services.
Regulators say the tech giant failed to meet its own terms and conditions, which set 13 as the minimum age to use Facebook and Instagram safely. According to a report by CNBC, the Commission found that Meta did not have effective systems in place to verify users’ self-declared ages, in breach of online safety rules.
When creating an account, minors can enter a false date of birth without any verification checks, according to the Commission. EU regulators also found that Meta’s tool for reporting underage users is “difficult to use and not effective,” requiring up to seven steps to access the relevant form. Even when underage users are reported, the company often does not follow up, allowing accounts to remain active without review. Across the EU, regulators estimate that around 10 to 12 percent of children under 13 are accessing Instagram and Facebook.
A Meta spokesperson tells CNBC that the company disputes the European Commission’s preliminary findings.
“We disagree with these preliminary findings. We’re clear that Instagram and Facebook are intended for people aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age,” the company spokesperson says in a statement.
“We continue to invest in technologies to find and remove underage users and will have more to share next week about additional measures rolling out soon. Understanding age is an industry-wide challenge, which requires an industry-wide solution, and we will continue to engage constructively with the European Commission on this important issue.”
The European Union is also investigating Meta over other concerns, including whether Facebook and Instagram use addictive design features, as well as a separate case examining its recommender systems.
Meta Faces Growing Scrutiny Over Child Safety
Meta has faced increased scrutiny over child safety this year. In March, Meta was ordered to pay $375 million after a jury in New Mexico found it liable for failing to protect children from exploitation and harmful content on its platforms. In the same month, a Los Angeles jury found that Meta and Google were liable for harm caused to a young woman who said she became addicted to their platforms as a child, awarding her $6 million in damages in the landmark case.
Meanwhile, proposals to restrict children’s access to social media are gaining momentum globally. In Australia, a “world-first” law banning social media use for children under 16 came into effect in December. However, Australia recently warned it could take legal action against social media companies, including Meta, accusing them of failing to enforce the legislation and allowing children to remain on their platforms.
In April, Greece announced plans to ban access to social media for under-15s. In the U.K., the House of Lords backed an amendment to the government’s Children’s Wellbeing and Schools Bill in January supporting a ban on social media use for under-16s. France’s National Assembly has also approved legislation to ban social media use for children under 15, and Spain’s prime minister Pedro Sánchez said that he plans to introduce a similar under-16 ban. Denmark has also said it will move to ban social media use for children under 15.
Image credits: Header photo licensed via Depositphotos.