UK Online Safety Act 2023
Overview
The UK's Online Safety Act 2023 introduces a comprehensive regulatory framework aimed at making the internet safer, especially for children. It gives the Office of Communications (Ofcom) oversight of regulated internet services, requiring them to assess and manage risks from illegal and harmful content.
Regulation Summary
Timeline
- 26 October 2023 – The Act received Royal Assent.
- 10 January 2024 – Key sections came into force, including requirements related to risk assessments and safety duties for illegal content and child protection.
- 31 July 2025 – Deadline to finalize risk assessments and implement the necessary safety measures.
What Businesses Are Affected
The Act applies to regulated services, including:
- User-to-user services (e.g., social media platforms, messaging apps)
- Search services (e.g., search engines)
- Pornographic content services
The Act applies to such services if they:
- Have a significant number of UK users
- Target the UK market
- Are accessible in the UK and pose a material risk to UK users
Exemptions
- Email, SMS, and MMS services, and one-to-one live aural communications (e.g., voice calls)
- Internal business tools
- Services where user interaction is limited to comments or reviews on provider-published content
Responsibilities for Businesses
- Conduct regular risk assessments regarding illegal content, children’s safety, and user empowerment features
- Take proportionate steps to mitigate risks, including using age assurance (age verification or estimation) when appropriate
- Implement safety systems and content moderation, including tools to restrict or remove harmful content
- Provide user reporting mechanisms, internal complaint systems, and clear records of decisions (an illustrative sketch follows this list)
- Apply and enforce transparent terms of service
- Report serious incidents and risk assessment outcomes to Ofcom
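As one illustration of the reporting and record-keeping duties above, the sketch below accepts a user report about a piece of content and stores an auditable record of the decision taken. It is only an illustrative outline, not an implementation prescribed by the Act or Ofcom; the ContentReport shape, the ReportStore interface, and the decision values are assumptions made for the example.

```typescript
// Hypothetical sketch of a user reporting mechanism with an auditable decision record.
// The types and store below are illustrative assumptions, not terms defined by the Act.

type ReportReason = "illegal_content" | "harmful_to_children" | "terms_violation";

interface ContentReport {
  reportId: string;
  contentId: string;
  reporterId: string;
  reason: ReportReason;
  submittedAt: Date;
}

interface ModerationRecord {
  report: ContentReport;
  decision: "removed" | "restricted" | "no_action";
  decidedAt: Date;
  notes: string;
}

interface ReportStore {
  save(record: ModerationRecord): Promise<void>;
}

async function handleReport(
  report: ContentReport,
  decide: (report: ContentReport) => Promise<"removed" | "restricted" | "no_action">,
  store: ReportStore
): Promise<ModerationRecord> {
  // Apply the platform's own moderation policy to reach a decision.
  const decision = await decide(report);

  // Keep a clear record of the decision so it can be reviewed or evidenced later.
  const record: ModerationRecord = {
    report,
    decision,
    decidedAt: new Date(),
    notes: `Report ${report.reportId} on content ${report.contentId} resolved as "${decision}".`,
  };
  await store.save(record);
  return record;
}
```

Keeping the decision record alongside the original report makes it easier to handle follow-up complaints and to show how reports were resolved.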
Specific Responsibilities for Website Owners
If a website is considered a regulated service under the Act (such as a platform with user-to-user functionality, search, or pornographic content), the provider must:
- Use age verification or age estimation where required to protect children from harmful content
- Prevent children from encountering primary priority content (e.g., pornography and content promoting suicide, self-harm, or eating disorders) unless highly effective age assurance is in place (see the sketch after this list)
- Offer easy-to-use content reporting and complaint tools
- Publish clear content moderation policies, including use of automated tools
- Provide tools for adult users to control the content they encounter (e.g., filters or blocking options)
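To make the age-gating duty concrete, here is a minimal sketch of how a site might withhold age-restricted categories of content until an age-assurance check has passed. It is illustrative only: the AgeAssuranceProvider interface, the verifyAge call, and the content labels are hypothetical placeholders for whatever age-assurance vendor and content taxonomy a provider actually uses.

```typescript
// Hypothetical sketch: gate restricted content behind an age-assurance check.
// AgeAssuranceProvider, ContentItem, and the labels below are illustrative, not part of the Act.

interface AgeAssuranceProvider {
  // Returns true only when the user has passed an age check
  // (e.g., verification or estimation by a third-party vendor).
  verifyAge(userId: string): Promise<boolean>;
}

type ContentLabel = "primary_priority" | "priority" | "general";

interface ContentItem {
  id: string;
  label: ContentLabel;
  body: string;
}

async function serveContent(
  item: ContentItem,
  userId: string,
  ageAssurance: AgeAssuranceProvider
): Promise<ContentItem | { blocked: true; reason: string }> {
  // General content is served without an age check.
  if (item.label === "general") {
    return item;
  }

  // Restricted categories require a passed age-assurance check.
  const isAdult = await ageAssurance.verifyAge(userId);
  if (isAdult) {
    return item;
  }

  // Otherwise withhold the item; in practice the decision would also be logged.
  return {
    blocked: true,
    reason: `Age assurance not satisfied for ${item.label} content`,
  };
}
```

In practice the age-assurance step would be delegated to a dedicated verification or estimation service, and blocked outcomes would be recorded so the decision can be evidenced later.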
Additional Requirements
- Special protections for certain content: Category 1 services have duties to protect content of democratic importance, journalistic content, and content from recognized news publishers when making moderation decisions
- Fraudulent advertising duties: Category 1 and 2A services must implement measures to detect and prevent fraudulent ads
- Code of practice compliance: Providers must follow Ofcom's statutory codes of practice—or demonstrate alternative measures that are equally effective
Individual Rights
- Platforms must respect privacy and freedom of expression, especially when using automated moderation or age estimation tools.
- Measures taken must comply with the UK GDPR and Data Protection Act 2018.
Enforcement
- Ofcom is the enforcement authority under the Act. It can:
- Investigate non-compliance through audits, information requests, and inspections
- Issue fines of up to £18 million or 10% of global annual turnover, whichever is greater
- Require platforms to take, or cease, specific actions
- Seek court orders to restrict or block non-compliant services
Questions?
If you would like to learn more, our compliance experts are happy to support you.
Leave us a message: support@clym.io
+1 980 446 8535 | +1 866 275 2596