23 Jul 2025
by Oliver Alderson

Online Safety Act Implementation: What’s Changing and What’s Next

As new age assurance rules take effect this week, techUK looks at what’s changing now – and what’s still to come under the Online Safety Act. 


This week marks a major moment for Online Safety Act implementation, as the 25 July 2025 deadline approaches for companies in scope to comply with Ofcom’s child safety duties. 

User-to-user services that are likely to be accessed by children must now take steps to prevent young users from encountering certain types of harmful material. This includes preventing access to “primary priority” content, such as pornography and content promoting suicide or self-harm, and reducing exposure to other “priority” harms. 

To meet these new responsibilities, some services will need to implement “highly effective” age assurance measures. The implementation of these rules represents one of the most significant steps towards full rollout of the Online Safety Act. 

Passed in October 2023, the Online Safety Act is one of the UK’s most wide-ranging digital laws. It introduces new legal duties for online services, aimed at reducing harm to users, particularly children. These include requirements to tackle illegal content, introduce safety-by-design features, and operate with greater transparency and accountability. 

Ofcom, the UK’s online safety regulator, is responsible for enforcing the Act. Since late 2023, it has been working through a phased implementation plan - engaging with stakeholders and publishing guidance to help services understand their responsibilities. 

So far, the rollout has been structured around three main phases: 

  • Phase 1: Tackling illegal content 
  • Phase 2: Content harmful to children 
  • Phase 3: Duties for categorised services 

Each phase brings its own guidance, codes of practice, and compliance timelines. Implementation will continue into early 2026, with further duties – especially for larger platforms – expected over the next 12 to 18 months. 

This week’s deadline marks a critical shift from planning to delivery, as age checks become a legal expectation and the next stage of the Act comes into force. 

What’s changing in July 2025? 

From 25 July 2025, Ofcom’s Protection of Children Codes of Practice will come into effect – a major milestone in the rollout of the Online Safety Act. These Codes introduce binding duties for online services that are likely to be accessed by children, with the aim of reducing children’s exposure to harmful content and requiring stronger safety systems by default. 

The Codes apply to both user-to-user and search services and were finalised by Ofcom in early July, following Parliamentary approval. From this point on, services in scope must either follow the detailed safety measures set out in the Codes, or demonstrate that they are applying equally effective alternative approaches. 

For user-to-user services, duties focus on proportionate and effective measures to: 

  • Prevent children from encountering the most harmful content – such as pornography, self-harm, suicide, or eating disorder material – using highly effective age assurance. 
  • Protect children from a wider range of harmful material, including bullying, hate content, violent imagery, and dangerous challenges (priority content), as well as other content that poses a material risk of harm (non-designated content). 
  • Mitigate both the risks and the potential impact of exposure to harmful content, using insights from a service’s children’s risk assessment. 
  • Set out safety measures clearly in terms of service, and maintain accessible reporting and complaints mechanisms. 

For search services, the duties are similar in scope but tailored to search functionality. They include: 

  • Minimising the risk of children being exposed to harmful results - particularly the most high-risk content 
  • Managing risks for different age groups based on the outcome of a children’s risk assessment 
  • Publishing a publicly available statement on how these duties are being met 
  • Providing clear routes to report harmful content and raise complaints 

These new requirements are part of Phase 2 of Ofcom’s implementation roadmap. Earlier this year, services were expected to complete children’s access assessments and children’s risk assessments between April and July to inform their approach to compliance. 

From this point onwards, Ofcom will be able to enforce compliance. This includes the power to impose fines or take action against services that fail to meet their safety duties. 

[Figure: Expected timetable for implementing the Online Safety Act. Source: Ofcom]

What's still to come? 

Expected – July 2025 
Ofcom is expected to publish the Register of Categorised Services, formally designating which online services fall under specific regulatory categories within the Act. Categorised services are likely to face additional duties and will have the opportunity to respond to consultations before these requirements come into force. 

Deadline: 20 October 2025 
Ofcom’s Consultation on Additional Safety Measures, which is expected to close on 20 October, sets out proposed requirements for categorised services – including content moderation systems, governance standards, and systemic risk controls. 

Expected – August to November 2025 
Ofcom is expected to begin issuing draft and final transparency notices to categorised services. These notices will outline specific reporting obligations, including how services assess risks, apply safety measures, and prepare to publish annual transparency reports from 2026. 

Expected – Q4 2025 
The Secretary of State for Science, Innovation and Technology is expected to lay regulations before Parliament to set the Qualifying Worldwide Revenue (QWR) threshold. Providers whose global turnover exceeds the threshold (anticipated to be £250 million) will be required to notify Ofcom and pay annual regulatory fees. 

Expected – October 2025 to March 2026 
Ofcom is expected to open a consultation on further duties for categorised services, covering proposals on systemic risk management, governance obligations, and other long-term regulatory measures. While responding will be voluntary, this will be a key opportunity for industry to help shape the future of the regime. 

Expected – 31 December 2025 
The super-complaints regime is expected to come into force. This new mechanism will allow designated organisations to escalate systemic issues – such as repeated failures by a service to uphold safety duties – directly to Ofcom. It aims to strengthen the role of civil society and ensure that widespread or unresolved harms are addressed more effectively. 

Preparing for the online safety fees regime 

Another key element of implementation is the introduction of a fees and penalties regime, enabling Ofcom to recover the costs of delivering its online safety duties. 

On 26 June 2025, Ofcom published its final policy statement confirming that regulated providers will need to pay annual fees if their Qualifying Worldwide Revenue (QWR) exceeds £250 million. However, providers with less than £10 million in UK-derived revenue will be exempt from fees, even if they exceed the global threshold. 
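To illustrate how the two thresholds interact, the sketch below walks through the fee-liability test described above. It assumes the £250 million QWR threshold and £10 million UK-revenue exemption are confirmed in secondary legislation as anticipated; the function and variable names are purely illustrative and are not part of any Ofcom tool or guidance.

```python
# Illustrative sketch only: thresholds reflect Ofcom's June 2025 policy statement
# and remain subject to secondary legislation. Names are hypothetical.

QWR_THRESHOLD_GBP = 250_000_000        # Qualifying Worldwide Revenue threshold
UK_REVENUE_EXEMPTION_GBP = 10_000_000  # UK-derived revenue exemption floor


def is_liable_for_fees(qualifying_worldwide_revenue: float, uk_derived_revenue: float) -> bool:
    """Return True if a provider would be expected to notify Ofcom and pay annual fees."""
    # Providers at or below the global QWR threshold are outside the fees regime.
    if qualifying_worldwide_revenue <= QWR_THRESHOLD_GBP:
        return False
    # Providers above the QWR threshold are still exempt if their UK-derived
    # revenue is below £10 million.
    if uk_derived_revenue < UK_REVENUE_EXEMPTION_GBP:
        return False
    return True


# Example: £300m global revenue but only £5m UK revenue -> exempt.
print(is_liable_for_fees(300_000_000, 5_000_000))   # False
print(is_liable_for_fees(300_000_000, 50_000_000))  # True
```

In other words, both conditions must hold before fees apply: the global revenue test brings a provider into scope, and the UK revenue test can take it back out.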

The QWR model is designed to ensure proportionality and is expected to be backed by secondary legislation in late 2025, when the Secretary of State for Science, Innovation and Technology lays the regulations before Parliament. Ofcom is due to publish further guidance on calculating QWR later this year. 

The fees regime is expected to come into force by October 2025, at which point in-scope providers should notify Ofcom of their liability. This will mark another shift toward embedding the long-term operational framework of the OSA. 

Get in touch 

techUK is working with members on the ongoing implementation of the Online Safety Act. For any questions, or to get involved in our online safety work, please contact [email protected], or reach out to the team below. 


Samiah Anderson

Head of Digital Regulation, techUK

Audre Verseckaite

Senior Policy Manager, Data & AI, techUK

Daniella Bennett Remington

Policy Manager - Digital Regulation, techUK

Oliver Alderson

Junior Policy Manager, techUK

