UK Government publishes final online harms response
The Government has confirmed new rules for companies to regulate online content, with a focus on illegal content, greater protection for children and stronger enforcement measures, such as fines of up to 10% of business turnover and possible senior management liability.
This announcement confirms Ofcom as the regulator and broadly outlines a differentiated approach towards regulation, where companies will fall into different categories based on Ofcom’s assessment.
All companies in scope will have a ‘duty of care’ towards their users, which they can fulfil by following different codes of practice to be issued by Ofcom in the next stages, in line with the ‘objectives of the legislation’ set by the Government. Voluntary non-binding interim codes on terrorism and child sexual exploitation and abuse have also been published to ‘help companies begin to implement the necessary changes and bridge the gap until Ofcom issues its statutory codes’.
In this Government response there is a lot to be supportive of, including the exclusion of financial harms and low-risk services, the focus on systems and processes, the greater protection for children, and the recognition of differences between online platforms. However, further clarity is needed on exactly how the key elements of the regime will be operationalised for the 175,000 tech companies still in scope, to ensure a proportionate outcome which supports the UK digital economy.
Some of the key areas that will need to be discussed as we move to the next stages of the ‘Online Safety Bill’ include legal but harmful content, defining categories, private messaging, available technical tools, and enforcement measures. Clarity will be essential to help ensure that the Government’s direction of travel secures the stated aims without being disproportionate and excessive for the range of companies in scope.
‘Legal but harmful’
The Government has updated its position on ‘legal but harmful’ content: all companies in scope will be required to take action against such content when directed at children, and some when directed at adults. Yet there is little real direction on how this content will be defined.
The Government’s response has confirmed that a ‘limited number of priority categories of harmful content’ towards adults will be introduced in secondary legislation, with the aim of providing ‘legal certainty for companies and users’. It also references how companies will need to understand the ‘risk of harm to individuals on their services’ under the duty of care.
Companies with services ‘likely to be accessed by children’ will ‘need to make clear what is acceptable on their services’, following Ofcom’s guidance on ‘appropriate levels of protection’ and codes of practice which are expected to include harms such as ‘cyberbullying’ and ‘online pornography’. It is not confirmed which additional harms will fall into this category and how they will be defined, creating uncertainty about what exactly will be required for companies to protect children from harmful content.
techUK supports the ambition to provide ‘legal certainty’, yet the subjectivity of online experience places this content in a legal grey area. Clear definitions and an evidence-led process for defining harm will be essential to both help companies understand their obligations and protect freedom of expression.
Taking an approach that considers ‘differentiated expectations on companies’ regarding content and activity is a welcome step in the Government’s response. In addition to protecting children from harmful and illegal content, ‘high risk and high reach category 1’ services will be required to protect adults from harmful content and conduct transparency reporting. ‘Category 2’ services will not have these additional requirements.
The Government will be creating the thresholds, while Ofcom will be able to ‘add companies to category 1 services if they meet the threshold’. When creating the thresholds, consideration should be given to the actions which might result in a service changing category. The shift from ‘category 2’ to ‘category 1’ would place a range of new responsibilities on companies, which could be unplanned or unexpected. Outlining what this transition might look like, when it could occur and any implications will be beneficial for companies to understand what this differentiation means in practice.
The regulatory framework will apply to ‘private and public communication channels and services where users expect a greater degree of privacy’. At this stage, it is unclear exactly what this means for encrypted services, with users’ ‘expectations of privacy’ being somewhat subjective. Clear legal definitions will help companies understand what steps are required under a ‘user’s expectation of privacy’. Individuals already have a ‘right to privacy’, as prescribed and defined in international law, and the Government should consider its existing human rights obligations and how the ‘Online Safety Bill’ will interact and align with them.
Supporting and protecting children
The Government rightly places emphasis on the need to provide greater protections for children. Our members have access to varying technical tools and resources, depending on their size and the functionality of their services, and will make every appropriate effort to protect children from illegal and harmful online content.
When considering technical tools to ‘prevent anonymous adults from contacting children’, clarity is needed on exactly what this means for the range of different companies in scope. We welcome the Government’s commitment not to put ‘any new limits on online anonymity’, yet how this might work in practice alongside the suggestion of preventing anonymous adults from contacting children is not confirmed. Outlining how these technical solutions will work, including whether they must be applied to both illegal and legal harms towards children, will be important in assessing their appropriateness for each different service. There is overriding support from the sector for protecting children online from illegal and legal harms, yet the feasibility and efficacy of technical tools must be debated as we move into the next stages of the ‘Online Safety Bill’.
Enforcement
The Government has outlined how Ofcom will have the power to ‘issue sanctions in the form of civil fines of up to £18 million or 10% of annual global turnover’. This seems excessive for the range of companies in scope and has the potential to discourage investment, while not necessarily achieving the stated aims of the regulation. We welcome Ofcom taking a ‘proportionate approach to its enforcement activity’, yet it is not clear how this will work in practice, or whether this approach will be effective in combating non-compliance.
The Government will also ‘reserve the right to introduce criminal sanctions for senior managers’ if a company fails to respond to the regulator over a breach of the duty of care. Although this power is in reserve and will act as a ‘last resort’, there is a risk that including this provision in the regulation will have a chilling effect on smaller companies and on investment in the UK digital economy.
Media Literacy Strategy
The Government has confirmed it will publish its online media literacy strategy in spring 2021. We welcome the recognition of ‘the vital role education can play in supporting adults and children to navigate the online world safely’, and companies are already taking action to help adults and children develop skills and resilience online. Developing media literacy is part of a longer-term solution to enable positive experiences online, and we are disappointed to see a delay to the publication of the Government’s strategy. techUK and our members will continue working to demonstrate the importance of education and digital citizenship, which must go hand in hand with any form of regulation.
Next steps
This Government response shapes the path for the publication of the ‘Online Safety Bill’, which is due in early 2021. This primary legislation is likely to include a framework for how some of these processes might work, including setting out the ‘high level factors which lead to significant risk of harm’ that will help determine category 1 services.
The Government has deferred many of the key decisions on legal definitions and boundaries to ‘secondary legislation’, including the categories of legal but harmful content and the objectives for the regulator’s codes of practice. This means that further decisions are still to be made before we have clarity on some of the most important elements of the regime. techUK looks forward to working with Government and Parliament as we move through this process to ensure the outcome is effective in protecting users, while also being workable for the variety of companies in scope.
The ‘Online Safety Bill’ is entering a crowded digital regulatory space and coherence could not be more important. We support the Government’s intention for coordination, and it is essential that Ofcom, ICO and CMA continue to work closely together to form an approach which does not disproportionately impact innovation and investment in UK digital markets.
Lulu is a Policy Manager at techUK, working across areas related to digital regulation, such as online harms and competition.
Prior to working at techUK, Lulu worked at social enterprise Parent Zone for a number of years, heading up the Policy and Public Affairs team. Working closely with technology companies, Parliamentarians and schools, her focus was on building digital resilience to help improve outcomes for children growing up in a digital world.
Lulu holds an MA (Hons) in Human Rights Law from SOAS, and a BA (Hons) in Politics from the University of Exeter.