17 Jan 2022

Online Safety: A guiding hand for tech companies

Georgina Kon and Peter Church from Linklaters LLP consider Ofcom's approach to the VSP regime and what it may reveal about how the regulator will approach the broader online safety regime.

Governments across the globe face unique challenges in creating a regulatory framework to control online harms. The need to protect users must be balanced against rights to freedom of speech and difficult questions about positioning tech companies as arbiters of questions of fundamental rights. Moreover, the solutions need to work at scale. 

In the UK, the draft Online Safety Bill is the main legislative response addressing online safety. It targets both illegal content and some legal but harmful content (e.g. disinformation and cyberbullying), with the intention of making the UK “the safest place in the world to be online”. Although the Bill will not take effect until 2022 at the earliest, there is a separate regime already in place that requires UK-based video sharing platforms (“VSPs”) to take measures to protect users from harmful content.  

The VSP regime has attracted less attention than the online safety reforms and is much narrower in scope, but there are similarities between the two, including a focus on protecting users. The VSP regime is therefore an important testbed for the reforms under the Online Safety Bill, and the approach taken by Ofcom in regulating the VSP regime will provide insight as to how it may regulate the broader online safety regime.

We consider Ofcom’s recent VSP strategy paper (here) and guidance for VSPs (here) below.


Ofcom’s plans and approach for the coming year

Ofcom’s strategy paper sets out its priorities for the year ahead and provides insight into the types of content it will focus on and the sorts of measures it expects VSPs to prioritise.

1. Reducing child sexual abuse material (“CSAM”). Unsurprisingly, this is a key focus for Ofcom, which expects all VSPs to implement terms and conditions prohibiting CSAM, and to have robust processes for identifying and removing it.

2. Tackling hate and terror. Ofcom also expects VSPs’ terms and conditions to prohibit uploading content relating to terrorism, racism and xenophobia, and intends to look closely at how these restrictions are enforced.

3. Protections for under 18s. Another key concern is protecting children from accessing unsuitable material online. Ofcom intends to work with VSPs whose services are likely to be accessed by children to ensure they provide an age-appropriate experience.

4. Age verification on adult VSPs. Similarly, where VSPs provide adult content they should take stringent steps to restrict access by children. For example, pornographic material should be shielded by strict age verification systems.

5. Reporting and flagging. Finally, a key means of tackling unsuitable content is to deal rapidly with material flagged by users. Ofcom intends to review VSPs’ processes, considering how reports are actioned and what providers are doing to increase users' engagement with these safety measures.


Ofcom’s harm and measures guidance

Accompanying these priorities is broader guidance from Ofcom about the measures it expects VSPs to implement. Importantly, this is not about the regulation of individual videos but rather the wider systems and processes necessary to address these issues at scale.

The starting point is to have the right governance. VSPs must put in place a risk management framework, with good oversight from senior decision-makers in the organisation, that includes processes for identifying emerging risks.

The guidance also gives examples of the substantive measures used to address these risks. VSPs should be prepared to explain any deviation from those measures and to monitor their effectiveness in practice. An important element is clearly setting out in the relevant terms and conditions what types of content the VSP prohibits, and allowing users to flag content that is harmful.

Recognising that this is a new and difficult area, Ofcom has indicated it wants to work collaboratively with VSPs when carrying out supervision and enforcement. It has also suggested that in relation to particular types of harms, providers may want to collaborate with relevant specialist organisations (e.g. the NSPCC). 


Conclusions

The Online Safety Bill will eventually supersede the VSP legislation and apply to a much wider range of online services. Companies within the scope of the Bill will no doubt be watching carefully to see how Ofcom exercises its powers under the VSP regime for a possible indication of how this broader regime is likely to play out.

