Britain’s media and privacy regulators have warned major social media companies that they must do more to prevent children from accessing their platforms, saying firms are failing to enforce their own minimum age rules.
The warning came from Ofcom and the Information Commissioner’s Office (ICO), which said they were increasingly concerned about algorithm-driven feeds exposing young users to harmful or addictive content.
“These online services are household names, but they’re failing to put children’s safety at the heart of their products,” Ofcom chief executive Melanie Dawes said, warning that regulators would take action if companies did not improve safeguards quickly.
Platforms Told To Strengthen Age Checks
Under the latest phase of Britain’s Online Safety Act, Ofcom has asked several major platforms to explain how they will improve child protection measures by April 30.
Companies including Facebook and Instagram, owned by Meta, as well as Roblox, Snapchat, TikTok and YouTube have been instructed to tighten age verification, restrict strangers from contacting children and make content feeds safer.
They have also been told to stop testing new features or products on minors.
Calls For Modern Age Verification Tools
The ICO issued a separate open letter urging companies to adopt stronger age-assurance technologies to prevent children under 13 from accessing services not designed for them.
“There’s now modern technology at your fingertips, so there is no excuse,” said ICO chief executive Paul Arnold.
Meta said it already uses artificial intelligence to estimate users’ ages and applies built-in protections to teen accounts. The company added that age verification should ideally be handled centrally through app stores.
YouTube said it provides age-appropriate experiences but criticised Ofcom for moving away from a risk-based approach and urged regulators to focus on high-risk services that fail to comply with safety laws.
Roblox, Snapchat and TikTok did not immediately comment.
Risk Of Heavy Fines
Regulators have the authority to impose significant financial penalties on companies that fail to comply.
Ofcom can fine firms up to 10% of their global revenue, while the ICO can impose penalties of up to 4% of a company’s worldwide annual turnover.
Last month, the privacy watchdog fined Reddit nearly £14.5 million for failing to introduce effective age checks and for unlawfully processing children’s data.
(with inputs from Reuters)