
Governments around the world have been grappling with how to protect citizens, especially children, from the perils of our increasingly digital lives for at least a decade, but the challenges for legislators are enormous.

The internet recognises few international borders and the businesses behind some popular online platforms have more financial clout than most countries. While Brexit may have given the UK the autonomy to shape its own response, the EU is one of very few markets in the world big enough to influence the US tech titans’ practices and policies.

Compounding the issue is the pace of digital change compared to the speed with which legislators can act. By the time a law can be drafted, considered and enacted, technology has long since moved on.

Nonetheless, the Online Safety Act 2023 attempts to regulate online services and protect UK users from exposure to harmful content and behaviour. While the law was passed almost two years ago, it is only this year that it has come into force, with November 3rd 2025 seeing the final implementation of key measures regulating user-to-user services.

What the Act does

While the legislation may have had major search, social media and pornography platforms squarely in its sights when it was passed two years ago, Ofcom has estimated that more than 100,000 online services could fall within its scope.

As a result, any business that offers internet search, pornographic content or user-to-user communication will need to comply.

Platforms that children are likely to access must take proactive steps to protect them from harmful content, including pornography, and material promoting self-harm, suicide, eating disorders or violence.

Services have a duty to introduce age verification or ‘age assurance’ technologies to stop children from accessing pornographic or otherwise inappropriate content, and to use age-appropriate design features that prioritise child safety by default.

The legislation acknowledges that adults can be vulnerable to online harms too. As well as requiring platforms to take proportionate measures to prevent users from encountering illegal content, such as child sexual abuse, terrorism and hate crimes, the Act obliges companies to put in place robust systems to swiftly detect and remove such material.

Search engines must also minimise the risk of users encountering illegal content and protect them from fraudulent advertisements.

Transparency tools

Major platforms must also now publish transparency reports, detailing the harmful content that appears on their services and what they've done to remove and prevent it. Platforms must provide clear terms of service that are easy to access and explain their content policies.

Users now have stronger rights to appeal decisions about content through effective complaints procedures, giving them more recourse should they disagree with a platform’s decisions.

An incomplete solution

While the Online Safety Act 2023 has finally delivered important safeguards, vigilance is still vital, especially for parents. Talking to children about online risks and maintaining parental limits and controls over their devices remains as important as ever.

Anyone suspecting that a digital service isn’t meeting its obligations should report it through the provider’s own channels first, before taking the matter to Ofcom.


Disclaimer - all information in this article was correct at time of publishing.
