The UK Online Safety Act 2023 is groundbreaking legislation whose core duties came into force in phases during 2025, requiring platforms with a significant UK user base, or that target the UK market, to implement age verification and content moderation systems. With fines reaching £18 million or 10% of qualifying worldwide revenue, whichever is higher, the Act affects over 100,000 companies worldwide through its extraterritorial reach.
Ofcom serves as the regulator with unprecedented powers to block services, while senior managers face personal criminal liability if they fail to comply with child safety duties.
Overview
The UK Online Safety Act 2023 represents the most significant transformation of internet regulation in British history, fundamentally changing how digital platforms operate for UK users. After years of development beginning with the 2019 Online Harms White Paper, this comprehensive legislation aims to make the UK “the safest place in the world to be online”.
Companies must now operate mandatory age verification systems and meet content moderation requirements or face severe financial penalties for non-compliance. The Act’s scope extends far beyond traditional social media platforms, encompassing search engines, messaging services, file-sharing sites, and even community forums, affecting an estimated 100,000+ companies globally.
The legislation establishes Ofcom as the online safety regulator with unprecedented enforcement powers, including the ability to impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.
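To make that penalty ceiling concrete, the sketch below computes the maximum fine as the greater of £18 million and 10% of qualifying worldwide revenue, as stated above. The function name and the revenue figures are illustrative assumptions, not anything defined in the Act or in Ofcom guidance.

```python
# Illustrative only: the statutory ceiling on a fine is the greater of
# £18 million and 10% of qualifying worldwide revenue.
def max_penalty_gbp(qualifying_worldwide_revenue_gbp: float) -> float:
    """Return the maximum possible fine for a given annual qualifying revenue."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue_gbp)

# Example: a small service vs. a platform with £100bn qualifying revenue.
print(max_penalty_gbp(5_000_000))        # 18000000 -> the £18m floor applies
print(max_penalty_gbp(100_000_000_000))  # 10000000000.0 -> 10% of revenue applies
```

In practice Ofcom sets the actual fine case by case; the formula above only bounds how large it can be.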
These new regulations apply to any service with a significant number of UK users or that targets the UK market, regardless of where the company is based. This extraterritorial jurisdiction has sparked tensions between large technology companies and the UK Government.
The Act categorises services into tiers: the largest user-to-user platforms, such as Facebook (Meta), fall into Category 1 and face the most stringent requirements, including algorithmic transparency and protections for content of democratic importance, while the largest search services, such as Google, face enhanced duties of their own. Implementation follows a phased approach: illegal content duties have applied since March 2025, and child safety measures became enforceable in July 2025.
The Act requires platforms to conduct comprehensive risk assessments, implement “highly effective age assurance” technologies, and maintain detailed compliance records subject to Ofcom inspection.
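The Act does not prescribe a particular technology for “highly effective age assurance”, so the snippet below is only a hypothetical sketch of how a service might gate age-restricted content behind an assurance check while keeping a record it could show an inspector. Every name in it (AgeAssuranceResult, check_age_assurance, the audit-log fields) is an assumption made for illustration, not an API defined by the legislation or by Ofcom.

```python
# Hypothetical sketch only: all names and fields are invented for illustration.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AgeAssuranceResult:
    user_id: str
    method: str          # e.g. facial age estimation, ID document check
    over_18: bool
    checked_at: datetime

def check_age_assurance(user_id: str, method: str, over_18: bool) -> AgeAssuranceResult:
    """Wrap the outcome of an (external) age-assurance check in a record."""
    return AgeAssuranceResult(user_id, method, over_18, datetime.now(timezone.utc))

def can_view_age_restricted(result: AgeAssuranceResult, audit_log: list) -> bool:
    """Gate age-restricted content and append an entry to a compliance log."""
    audit_log.append({
        "user_id": result.user_id,
        "method": result.method,
        "over_18": result.over_18,
        "checked_at": result.checked_at.isoformat(),
    })
    return result.over_18

# Usage with a dummy in-memory audit log.
log: list = []
outcome = check_age_assurance("user-123", "facial-age-estimation", over_18=True)
print(can_view_age_restricted(outcome, log))  # True
```

A real deployment would rely on a third-party assurance provider and durable record-keeping; the point of the sketch is only that access decisions and the evidence behind them are captured together.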
Criminal liability extends to senior managers who fail to ensure compliance with information requests or child protection duties, marking a significant escalation in personal accountability for digital platform leadership under UK internet law.