The final Online Safety Bill – what is in it and what does it mean for techUK members?

The Government has published the final Online Safety Bill following years of consultation and a detailed pre-legislative scrutiny process that resulted in a long list of recommendations from parliamentarians to improve the draft Bill.

What is the Online Safety Bill?

The Online Safety Bill is a complex and wide-ranging piece of legislation that aims to create safer online spaces for adults and children, protect freedom of expression and promote innovation by regulating over 25,000 tech companies and requiring providers of pornographic sites to operate strict age verification.

The Bill consists of 12 Parts, 194 Clauses and 14 Schedules, which together with the Explanatory Notes, Impact Assessment and Powers Memorandum amount to 550 pages of text to read and digest.

The structure and much of the substance of the final Bill aligns with the draft Bill, although new developments around illegal content, fraudulent advertising and identity verification have the potential to significantly change the nature of the regime.

What does the Online Safety Bill mean for techUK members?

Scope (Part 2) and exemptions (Schedule 1 and 2)

The Bill will impact over 25,000 tech companies, including ‘user-to-user’ and ‘search’ services as well as ‘internet services that provide pornographic content’, all of which must have links to the UK.

‘User-to-user’ services are defined as services that have a functionality allowing content to be ‘generated, uploaded, shared or encountered’ by users, while a ‘search service’ refers to a search engine.

Exempt services include email, SMS, MMS, one-to-one aural communication, internal business services and limited functionality services, although any of these that ‘provide pornographic content’ will be subject to age verification requirements.

Categorisation of companies (Part 7, Schedule 10)

In-scope services will be registered into categories: Category 1 covers the highest-risk user-to-user services, Category 2A covers search services and Category 2B covers lower-risk user-to-user services. Category 1 services will have additional duties beyond those of Category 2 services (see below under 'duties for all regulated services').

The final Bill does not provide any clarity on where the thresholds between Category 1 and Category 2 services will lie, and instead sets out a lengthy procedure for how the Secretary of State will determine thresholds in secondary legislation following research from Ofcom.

When deciding on thresholds, the Secretary of State will be required to consider size, functionality and any other factors deemed relevant, as well as Ofcom’s research on the relationship between the dissemination of harmful and illegal content and relevant duties for each category of service. 

Ofcom will be given 6–18 months from when the Act is passed to complete this research, and any decisions by the Secretary of State will be conditional on reviewing the findings. This risks delaying legal certainty for tech businesses, many of which scale fast and will need to understand, as they grow, how their responsibilities change and what action to take.

Duties for all regulated user-to-user and search services (Part 3)

Part 3 is one of the most substantial sections, outlining duties for regulated services. The process starts with Ofcom issuing guidance on risk assessments; once this is published, services will have three months to return their assessments.

Following completion of risk assessments, services will be required to follow Ofcom’s codes of practice which will consist of proportionate systems and processes, or services can introduce alternative measures that meet Ofcom’s guidance.

The following duties apply to all regulated services:

  • Illegal content – 1) complete an illegal content risk assessment addressing the user base, types of ‘priority illegal content’ and other illegal content; 2) follow Ofcom codes or introduce alternative measures; 3) allow users to easily report content and issue complaints.
  • Child safety – 1) if a child can access your service, meaning you do not have age verification in place (Explanatory Notes, p. 35), complete a child safety risk assessment addressing factors such as harmful content and algorithmic risk; 2) follow Ofcom codes or introduce alternative measures; 3) allow users to easily report content and issue complaints.
  • Freedom of expression and privacy – have ‘regard to the importance of protecting users’ right to free expression within the law’ and protect users from unwarranted infringement of privacy when designing and implementing policies and safety measures.

Additional duties apply for category 1 services:

  • Harmful content towards adults – 1) complete an adult risk assessment addressing the user base, level of risk to adults, levels of risk of harm from priority content that is harmful to adults, design and operation, business model, use of pro-active tech and media literacy; 2) follow duties around transparency, take down, restricting access and limiting recommendations; 3) allow users to easily report content and issue complaints.
  • User empowerment – allow adults to use features that prevent non-verified users from interacting with their content and reduce the likelihood of encountering content from non-verified users.
  • Content of democratic importance – include provisions in terms of service around the proportionate steps that will be taken to ensure that the free expression of ‘news publisher content’ and content that is ‘intended to contribute to political debate’ is protected.
  • Journalistic content – include provisions in terms of service around the proportionate steps that will be taken to ensure that the free expression of ‘news publisher content’ and ‘content generated for the purpose of journalism that is UK linked’ is protected.

techUK is encouraged to see a continued focus on proportionality, with Ofcom developing codes based on systems and processes, yet the regime still lacks clarity on what types of content are considered ‘for the purpose of journalism’, ‘democratically important’ and ‘harmful’.

Placing the onus on the companies to decide what is and is not acceptable online through risk assessments has the potential to create unequal standards, interrupt technological innovation and undermine individual rights.

New communications offences (Part 10) and priority illegal content offences (Schedule 7) 

The final Bill adopts some of the Law Commission proposals, endorsed by the Joint Committee, on harmful communications offences and adds priority illegal content offences to the face of the Bill. 

The new criminal offences include ‘harmful, false or threatening’ communications that are ‘likely to cause harm to a likely audience’, with harm defined as ‘physical or psychological harm amounting to at least serious distress’. These offences will replace existing provisions in the Communications Act, meaning they are not limited to the online environment and could extend to cover other types of communication, such as writing letters (Explanatory Notes, p. 96).

Schedule 7 (p 183) outlines a range of new priority illegal content offences that, in addition to terror content and CSAM, will help inform Ofcom’s illegal content code(s) of practice. The new offences include:

  1. Assisting suicide
  2. Threats to kill
  3. Harassment, fear, or provocation of violence
  4. Drugs and psychoactive substances
  5. Firearms and other weapons
  6. Assisting illegal immigration
  7. Sexual exploitation and images
  8. Proceeds of crime
  9. Fraud

Listing the types of priority illegal content offences on the face of the Bill is something techUK has long called for; however, we have concerns that leaving types of harmful content to secondary legislation will infringe on the ability of in-scope services and the regulator to make quick and effective decisions.

Paid for advertising and fraud (Part 3)

One of the most significant additions to the Bill is the inclusion of duties on paid-for fraudulent advertising, with fraud added as a new priority offence.

Category 1 services will have a duty to prevent individuals from encountering fraudulent ads, minimise the amount of time the fraudulent ad is present and swiftly remove the ad once they are made aware of it through any means. Category 2A services will have a similar duty addressing fraudulent ads that are encountered by users via search.

The types of fraud offences are taken from the Financial Services and Markets Act 2000 (FSMA) and the Fraud Act 2006, and include false claims, restrictions on financial promotions and false representation.

techUK believes the most effective way to disrupt fraudsters is to work across the private and public sectors to form collaborative solutions rather than focusing on siloed responses. It will be crucial for Ofcom and the Financial Conduct Authority to work transparently and ensure that the Online Safety Bill regime remains focused on delivering its objectives.

Enforcement and liability (Part 7)

Part 7 outlines the powers of the regulator including duties to maintain a register of companies, gather information and carry out impact assessments. Some of the more significant additions to the Bill include Ofcom’s enforcement powers around pro-active technologies.

Ofcom will have the power to require regulated services to use pro-active technologies to address illegal content, content that is harmful to children and fraudulent advertising, with non-compliance resulting in financial penalties.

There is a risk that this might require general monitoring – either for companies to avoid being issued a ‘confirmation decision’ around pro-active technologies, or as a result of the use of such technologies – which would undermine the intermediary liability protections set out in Article 15 of the eCommerce Directive (Explanatory Notes, p. 7).

Intermediary liability is fundamental for an open internet and techUK questions whether changes to existing global norms are necessary for the Online Safety Bill to meet its objectives. We would welcome an economic impact assessment on potential changes to liability to inform future direction of regulation.

Next steps

The Online Safety Bill will soon enter Parliament where MPs and Lords will be able to table amendments to change the regime.

Following what will inevitably be a long amendment process, both houses will need to approve a final version of the Bill for Royal Assent. This process is expected to be complete by the end of 2022, and services will likely be given a 12-month transition period to prepare for compliance.

Lulu Freemont

Head of Digital Regulation, techUK

Lulu is Head of Digital Regulation at techUK, working across areas related to digital regulation, such as online harms and competition.

Prior to working at techUK, Lulu worked at social enterprise Parent Zone for a number of years, heading up the Policy and Public Affairs team. Working closely with technology companies, Parliamentarians and schools, her focus was on building digital resilience to help improve outcomes for children growing up in a digital world.

Lulu holds an MA (Hons) in Human Rights Law from SOAS, and a BA (Hons) in Politics from the University of Exeter.

Email: [email protected]
Website: www.techuk.org/
LinkedIn: https://www.linkedin.com/in/lulu-freemont-b087a8a2



