Online Safety Bill: A compliance headache for tech firms
The UK government published its draft Online Safety Bill in May this year, setting out an ambitious new framework to protect users against harmful content online. Much of the Bill reflects the Government’s previously announced position, but it does include new obligations for services with the widest reach (‘Category 1’), adding to their already substantial compliance burden.
Wide range of duties applicable to all
Broadly, all search engines and services which allow user interaction will need to:
- Conduct risk assessments to understand the risk of harm to users from illegal content and implement systems and processes to protect users against illegal content;
- Conduct regular compliance reviews, and keep records of all risk assessments and steps taken to comply with their obligations;
- Have regard to freedom of expression and user privacy when designing systems and processes;
- Have effective user reporting and complaints mechanisms; and
- Have clear and accessible T&Cs and apply those consistently.
All services will need to determine if they are likely to be accessed by children and, if so, conduct risk assessments to understand the risk of harm to children from harmful content and implement systems and processes to protect children against such content.
Services whose worldwide revenue is at least equal to a threshold set by Ofcom (and approved by the Secretary of State) will need to notify Ofcom and pay an annual fee.
Additional duties for Category 1 services
Category 1 services will also be required to conduct risk assessments to understand the risk of harm to adult users from lawful but harmful content and specify in T&Cs how they will deal with such content. They will also be required to produce an annual transparency report that meets criteria set by Ofcom.
The Bill also introduces duties for Category 1 services that had not previously been detailed, requiring them to:
- Periodically assess the impact of new and existing policies on the protection of freedom of expression and user privacy;
- Design systems and processes so that the free expression of content of ‘democratic importance’ and of journalistic content is given due consideration when deciding how to treat content and users; and
- Have a dedicated and expedited complaints procedure for journalistic content.
Possible senior management liability
The Bill includes a criminal offence for senior managers, but this will not be introduced for at least two years after the regime is fully operational and only if services do not ‘step up their efforts to improve safety’. Even if individual liability is introduced, it will only apply if companies do not comply with Ofcom’s information requests.
However, the Bill does introduce a criminal offence for ‘officers’ (anyone who is, or purports to be, a director, manager, secretary or other officer) where the officer has consented to, or connived in, a service’s failure to comply with Ofcom’s information requests.
Be ready, act now
Complying with this regime alone would be a major undertaking for many services. Yet this is just one of many regulatory regimes either in effect or being developed across the globe. Putting in place systems and processes that can comply with each regime, while maintaining a broadly consistent user experience, is a challenge that services will need to begin tackling sooner rather than later.
While preparation is key, the Bill’s lack of detail in key areas, such as what constitutes harmful content and the thresholds for Category 1 services, means that platforms will not be able to prepare fully until the legislation is in final form. Services will therefore need to maintain some flexibility when planning their compliance regimes at this stage.
Lulu is Head of Digital Regulation at techUK, working across areas related to digital regulation, such as online harms and competition.
Prior to joining techUK, Lulu worked at the social enterprise Parent Zone for a number of years, heading up its Policy and Public Affairs team. Working closely with technology companies, Parliamentarians and schools, her focus was on building digital resilience to help improve outcomes for children growing up in a digital world.
Lulu holds an MA (Hons) in Human Rights Law from SOAS and a BA (Hons) in Politics from the University of Exeter.
- [email protected]