Ofcom publishes first post-OSA assessment of the tech sector

Ofcom’s first assessment since the Online Safety Act came into force highlights early progress, remaining gaps, and what businesses should expect in 2026. 


On 4 December, Ofcom published its first assessment of how the technology sector has responded to the Online Safety Act (OSA) since major duties came into force earlier this year. The report sets out the regulator’s view of progress made so far, the areas where expectations have not yet been met, and the direction of travel in 2026. 

For businesses operating in scope of the OSA, the report provides insight into Ofcom’s view of the regulatory environment. It sets out how the regulator intends to approach 2026 and highlights the issues that companies will need to prepare for over the coming year.

A year of transition: OSA implementation

2025 marked the first year in which many of the OSA’s requirements translated into live operational changes for services. Providers have now completed their first statutory risk assessments and introduced new safety measures across content moderation, age assurance and product design. Ofcom notes that this has already resulted in visible change for users, particularly around strengthened age assurance, reduced access to high-risk content, and updates to reporting tools.

However, the regulator also highlights that these early efforts vary significantly in quality. Many providers did not name a senior individual responsible for the risk assessment process, and in a substantial number of cases, assessments of illegal and harmful content were inconsistent or incomplete. Ofcom also reports that risks arising from service design, such as the algorithmic visibility of harmful content, were often only lightly addressed. In several assessments, providers did not set out how they would monitor the effectiveness of their measures over time. Ofcom has signalled that it will pay sharper attention to these gaps over the next year.

Priority areas for 2026 

A significant portion of the report highlights the areas where Ofcom expects providers to “go further” in the year ahead. For businesses, these priorities are useful signals of where regulatory scrutiny is likely to be most intense:

1. Effective age assurance 

Ofcom recognises that age-check systems have been rolled out by a wide range of services, including in areas where content poses the highest risk. However, it makes clear that the next step will involve evaluating whether these systems are accurate, reliable and proportionate. The regulator notes concerns about over-moderation and inconsistent implementation, and highlights a need for clearer evidence on how children’s experiences have changed following deployment.

For businesses, this reinforces the importance of understanding the performance, privacy impacts, and unintended consequences of any age assurance approach they adopt. 

2. Stronger protections for children 

Children’s safety remains central to Ofcom’s programme for 2026. While a wide range of platforms popular with under-18s have expanded safety features, Ofcom identifies areas where industry’s own assessment of risk may underestimate potential harms. Notably, no service in scope classified itself as presenting a “high” risk of exposing children to suicide or self-harm content, despite this being one of the key harm categories Ofcom has prioritised. The regulator suggests that some providers relied on weak or insufficient justifications when assigning low or negligible risk levels, and reiterates that where evidence is inconclusive, services should adopt a precautionary approach. 

Looking ahead, Ofcom plans to collect more detailed information from the largest platforms early in 2026, including evidence on how personalised feeds and recommendation systems are configured to minimise exposure to harmful content. Providers will also be expected to demonstrate how they validate whether their interventions are working in practice, particularly in light of Ofcom’s finding that 70% of children reported encountering some form of harmful content in the four weeks prior to the survey period.

3. Tackling child sexual abuse material (CSAM) and grooming 

Ofcom identifies online sexual exploitation as an area where progress has been uneven. While many services are using hash-matching or proactive detection technologies, gaps remain, particularly among smaller providers and in high-risk service categories. 

In 2026, the regulator intends to broaden its enforcement focus beyond file-sharing services, looking more closely at how different types of services prevent harmful contact and identify illegal material. Businesses should be prepared for closer examination of detection systems, reporting pathways, and the strength of internal escalation procedures. 

4. Illegal content and content moderation processes 

While most services have core moderation systems in place, Ofcom found that many risk assessments provided limited detail on how these systems operate in practice or how providers test their effectiveness. In several cases, providers did not fully account for risks relating to child sexual abuse material or terrorist content, despite these being priority areas for enforcement. The regulator also notes gaps in providers’ descriptions of how their detection tools, recommender systems, or automated workflows reduce risk, and how these interventions are monitored for effectiveness over time. 

As a result, Ofcom will place greater emphasis in 2026 on auditing the robustness of providers’ moderation processes. This includes examining whether services are effectively removing illegal content, whether recommender systems inadvertently elevate harmful material, and whether governance structures support timely updates when new risks emerge. 

5. Governance and accountability 

One of the clearest messages from Ofcom’s assessment is the priority it places on strengthening internal governance. Around two-thirds of the risk assessments Ofcom reviewed did not name a specific individual responsible for producing or overseeing the assessment, despite the regulator’s guidance recommending this as a core part of good record-keeping. Ofcom also highlights widespread gaps in how risks were documented and justified, including missing evidence inputs and limited explanation of how providers determine that their safety measures are effective.

In 2026, governance arrangements will therefore form a major area of scrutiny. Services will be expected to demonstrate clearer ownership of risk, more detailed documentation of the reasoning behind risk level decisions, and greater use of internal and external evidence sources when assessing harms. The regulator is particularly focused on ensuring that risk assessments are not treated as one-off exercises but form part of an ongoing process as services evolve. 

Wider sector insight 

While the report focuses on compliance, it also reflects broader shifts in the digital environment. Ofcom’s commentary highlights emerging challenges including: 

  • Generative AI and chatbots, where new safety risks, such as harmful outputs, deepfakes and inappropriate interactions, require updated risk assessments. 
  • VPN usage among children, which Ofcom is monitoring to understand whether it undermines safety interventions; the regulator is not yet sure how much of the recent spike in usage is driven by children. 
  • New forms of fraud and online scams, which remain among the most common harms experienced by UK users. 
  • Developing regulatory technologies, with Ofcom pointing to increasing uptake of machine-learning-based moderation tools and cross-industry collaboration. 

For industry, these observations suggest a regulatory landscape that is highly adaptive and responsive to emerging risks, rather than static or purely compliance-driven. 

Conclusion 

Ofcom’s first post-OSA assessment provides valuable clarity on how the regulator views the sector’s progress and where it believes further action is required. For techUK members, Ofcom’s direction of travel is clear: the next 12 months will be characterised by deeper scrutiny and higher expectations of more robust systems.

techUK will continue to review developments and engage with members as Ofcom’s supervision and enforcement activity evolves. We encourage members interested in online safety policy to join our Digital Regulation Group (members-only) and to sign up to Policy Pulse, our weekly tech policy newsletter, which is open to all.


Contact techUK team

Samiah Anderson
Head of Digital Regulation, techUK

Daniella Bennett Remington
Policy Manager - Digital Regulation, techUK

Oliver Alderson
Junior Policy Manager, techUK
