Brussels: The European Commission has said that Meta Platforms may have breached key online safety rules, warning that its platforms Facebook and Instagram are not doing enough to protect children under the age of 13.
The findings come under the Digital Services Act, which requires large online platforms to assess risks and put strong safeguards in place, especially for minors. EU regulators said Meta’s current systems are not effective at stopping children under 13 from accessing its services.
According to the Commission, many underage users are still able to create and use accounts, pointing to gaps in age verification and enforcement. It also noted that the tools for reporting underage users are limited and do not always lead to action.
EU technology chief Henna Virkkunen said platforms must take practical steps to ensure children are protected, not just rely on policies written in their terms of service. She called for stronger and more reliable systems to detect and remove accounts that do not meet the age requirement.
The Commission has asked Meta to improve how it evaluates risks to children and to introduce more effective protections. If the company is ultimately found to have violated the law, it could face a fine of up to 6 percent of its global annual turnover.
Meta has been given time to respond to the allegations before a final decision is made.
The case highlights increasing concern among governments about the impact of social media on children, with more countries considering stricter rules to limit access and improve online safety for young users.