New Ofcom Rules on Online Safety Criticized as Inadequate
The UK’s communications regulator Ofcom has unveiled new rules aimed at protecting internet users from harmful content, but critics argue they don’t go far enough.
Social Media Giants Face Fines Up to £18m
Under the new regulations, social media platforms such as Facebook, Twitter, and TikTok could face fines of up to £18 million or 10% of their annual global turnover, whichever is higher, if they fail to remove illegal content promptly. Smaller sites that fail to comply may be blocked entirely.
“Too many people are exposed to dangerous material online, from hate speech to fraud,” said Ofcom CEO Melanie Dawes. “These new rules will force tech companies to take responsibility for keeping their users safe.”
Campaigners: Rules Have “Serious Weaknesses”
However, online safety advocates say the measures fall short. “While it’s a step in the right direction, these rules have some serious weaknesses,” warned Jim Killock, director of the Open Rights Group.
Killock points out that the regulations only cover illegal content, not material that is legal but still harmful, such as cyberbullying or encouragement of self-harm. “This leaves vulnerable people, especially children, at risk,” he said.
Balancing Safety and Free Speech
The rules also require platforms to protect free speech and privacy, creating a delicate balancing act. Companies must have clear terms of service, allow users to easily report harmful content, and give them the right to appeal takedowns.
Ofcom says it will judge platforms’ actions based on the severity of the content and the number of users exposed. Fines will be a last resort. “We want to work with companies to keep people safe, not just punish them,” Dawes explained.
Critics: Rules Give Ofcom Too Much Power
Some free speech advocates worry the rules give Ofcom too much control over online expression. “This essentially allows a government agency to police the internet,” said Heather Burns of the Open Rights Group. “That sets a dangerous precedent.”
Others argue that without strict regulations, tech giants have little incentive to clean up their platforms. “Relying on these companies to self-regulate clearly isn’t working,” said Labour MP Jo Stevens. “We need strong enforcement and real consequences for inaction.”
Calls for Stronger Legislation
Many campaigners are now pushing for the government to strengthen the upcoming Online Safety Bill, which will give the new rules legal force. They want the bill to cover more types of harmful content and to hold company executives personally liable for failures.
“Ofcom’s rules are a bare minimum,” said Killock. “The Online Safety Bill is a chance to give them real teeth and create a safer internet for everyone. The government mustn’t waste this opportunity.”
The bill is expected to be introduced in Parliament later this year. In the meantime, Ofcom says it will continue working with tech companies, lawmakers, and the public to refine its approach. The new rules are set to take effect in 2024.