13 May 2021

Government publishes long-awaited Draft Online Safety Bill

The UK Government has published the long-awaited draft Online Safety Bill, which puts years of thinking and debate about how to make the UK the safest place to be online into practice.

The draft Online Safety Bill follows the December 2020 Full Government Response to the Online Harms White Paper, which confirmed Ofcom as the regulator and outlined how companies in scope will have a ‘duty of care’ towards their users, which they can fulfil by following different codes of practice.

What is the draft Online Safety Bill?

The draft Bill is separated into seven sections. The first three sections address the scope and duties of different platforms and codes of practice while the remaining four outline Ofcom’s powers, relating to media literacy, enforcement, information gathering and appeals. 

There is much to work through and digest, yet there are some significant developments to address and areas where legal certainty is still needed for tech companies to understand what will be required of them.

Scope and exemptions

The draft Bill defines “regulated services” in scope as “user-to-user services” where content can be generated and shared by users and “search services” such as search engines.

Email, SMS and MMS, aural communication, internal business services, limited functionality services which only allow comments and reviews, as well as public bodies, are all set to be out of scope, as seen in Schedule 1 of the text (p.121).

This remains largely the same as the Full Government Response to the White Paper, where techUK welcomed the removal of ‘low risk functionality services’ from scope.

Categorisation of companies

The draft Bill does not provide any certainty on exactly where the thresholds will lie between different categories of service. It does, however, introduce three sets of categories that will have specific ‘threshold conditions’ to be set by the Secretary of State with Ofcom responsible for the designation of companies (p.125).

  • Category 1 ‘regulated user-to-user services’ threshold conditions will be determined by a) number of users and b) functionalities.
  • Category 2A ‘regulated search services’ threshold conditions will be determined by a) number of users and b) any other factors the Secretary of State considers relevant.
  • Category 2B ‘regulated user-to-user services’ threshold conditions will be determined by a) number of users b) functionalities and c) any other factors the Secretary of State considers relevant.

techUK is disappointed not to see further clarity on where the thresholds will lie between categories, whether companies are likely to change category, and how this might affect their obligations. Providing certainty here will be vital for the many tech companies that scale fast and need to understand, as they grow, how their responsibilities change and what action to take.

Legal process and duties 

As outlined in the full Government response to the White Paper, companies will have different duties to fulfil based on their category and whether their service can be accessed by children.

Here is an outline of what will be required of different services and platforms:

  • All companies will be required to conduct risk assessments relating to illegal content and have duties about rights to freedom of expression and privacy.
  • Services that can be accessed by children will have to conduct “children’s risk assessments” and protect children from harmful content.
  • Category 1 services will have separate additional duties to protect adults from harmful content, as well as to “protect content of democratic importance” and “journalistic content”.

Content of democratic importance encompasses “news publisher content” and “content that is or appears to be specifically intended to contribute to democratic political debate in the UK”, while journalistic content refers to “content generated for the purposes of journalism”.

Companies will be able to fulfil their duties by following codes of practice set by Ofcom, which will consist of systems and processes for companies to put in place. When preparing codes, Ofcom must consult the Secretary of State and consider the “different kinds, sizes and capacities” of companies.

When it comes to reporting, as outlined in Part 3, Chapter 1 (p.44), companies in both categories will be required to follow Ofcom’s guidance on “producing an annual transparency report”, which will be subject to Ofcom’s requirements.

techUK supports the differentiation between companies when considering codes of practice but warns against an over-prescriptive and complex approach which could overburden smaller companies with excessive reporting and reviewing requirements.

Definitions of harmful content

One of the biggest gaps in this draft legislation is the lack of legal definitions on what constitutes harmful content.

Part 2, Chapter 6 (p.37) outlines types of illegal content in scope, including “a terrorism offence” and “a CSEA offence”, while types of harmful content are not specified in the draft Bill and will be left to secondary legislation.

Deferring this to secondary legislation delays a fundamental part of the regime, and it is important to remember that this legislation is not just about regulating companies; it is about regulating people’s behaviour. Companies of all sizes and functions need some level of certainty to understand where the line should be drawn between individual freedoms and possible harm.

Placing the onus on the companies to decide what is and is not acceptable online has the potential to create unequal standards, interrupt technological innovation and undermine democratic process and individual rights.

Enforcement

Part 4, Chapter 6 of the draft Bill (p.70) outlines how Ofcom will be responsible for enforcement of the regime, with maximum fines for non-compliance of up to 10% of worldwide revenue or a maximum of £18 million. The draft Bill also includes provisions relating to the “liability of corporate officers”, yet it is not clear how this might work in practice.

Although the Government indicated in the Full Government Response to the White Paper that this would be a ‘last resort’, there is a risk that including such a provision in regulation will have a chilling effect on smaller companies and investment.

Media literacy

It is encouraging to see that Part 4, Chapter 8 of the draft Bill (p.93) includes duties to promote media literacy, something techUK has long called for. “Encouraging educational initiatives” and supporting individuals to “understand the nature of online material” are welcome provisions to enable literacy online.

Digital skills and media literacy initiatives are part of a longer-term solution to support positive online experiences. They allow individuals to develop agency and resilience to respond to potential online risks and should be prioritised alongside regulation. techUK supports the development of this work.  

What are the next steps of the Draft Bill?  

The draft Online Safety Bill was the first Bill to be published following the Queen’s Speech. It now requires careful and detailed pre-legislative scrutiny to ensure it delivers on its objectives. The next step is for the House to appoint the pre-legislative scrutiny committee and outline when this work will begin.

The pre-legislative scrutiny committee will act similarly to a select committee, with oral and written evidence sessions alongside reports. We anticipate this could run through to the next Parliamentary session, with the Bill entering Parliament after summer 2021.

Once in Parliament, the next stage of scrutiny will begin, which can take several years to complete. techUK looks forward to working closely with the Government, officials and Parliamentarians as we move through this process together.

If you have any questions about our work or on the draft Online Safety Bill, please get in touch.