techUK is pleased to see continued progress being made by Government on the issue of online harms. Our members remain absolutely committed to keeping users safe online and today's announcement marks the next phase of close and constructive working between Government and Industry.
The direction of travel is encouraging, but much more work is needed to help build a framework that is effective and proportionate – protecting and empowering users whilst ensuring the UK remains pro-innovation and investment.
The Government’s response provides some clarity while also raising many questions of its own. We look forward to engaging over the coming months to resolve these issues and to create workable and effective regulation for the whole sector.
Critical to this will be clarity on the definition of harm. Leaving this decision to industry ignores the core challenge in tackling harmful content, and why the wrong decisions can sometimes be reached. It is difficult to define whether someone is simply teasing or is a toxic troll; whether an individual has a forceful personality or is a bully; or whether someone spreading disinformation is simply misinformed.
Requiring industry to define these issues for itself, and proposing sanctions when it gets the decision wrong, is not the best way forward. This is one of many challenges presented by the current state of the debate. techUK has looked through the Government’s response below, highlighting issues of concern and areas where more detail is still needed, including:
Scope of Regulation:
It is highly encouraging to see the range of B2B companies removed from the scope of the regulation. This is something we have called for since day one, and it is the right decision, creating a more proportionate and risk-based approach than previously suggested.
However, the proposed scope of the regulation remains much too wide, and further guidance will be needed on whether some companies are in scope. While the Government says this will affect only 5% of UK businesses, that still accounts for approximately 300,000 UK businesses, far beyond the traditional social media companies in many people’s minds.
This regulation will affect SMEs like Mumsnet, online retailers with a review section like Argos, and even online media that allow user comments, such as the Daily Mail.
As the economy digitises this scope will only grow, as more and more businesses create an online presence that may facilitate user-generated content, presenting an ever greater challenge to the regulator. While enforcement will no doubt be proportionate, this will only serve to increase the costs and stress faced by many small businesses.
Defining Harmful Content:
It is important to bear in mind that while this is advertised as regulation of technology companies, it is in fact regulating user speech, with the company acting as an enforcer. It is right that the regulator must give due regard to freedom of expression, and the Government’s commitment that regulation will not prevent adults from accessing or posting legal content, nor require companies to remove specific pieces of legal content, is welcome.
It is appropriate that illegal content and legal but harmful content are treated differently. However, the proposed regulation still avoids the toughest questions that would assist companies in the identification and removal of harmful content, primarily the definition of harm.
Who defines harmful content will be a key aspect of any upcoming regulatory regime, and it is important that there are strong democratic safeguards in place so that legal content is not made de facto illegal online. It is not appropriate to simply delegate this decision to industry, requiring them to police legal content and then sanctioning them for potential mistakes.
Age Verification & Assurance:
Beyond deciding what type of legal content or behaviour is acceptable on their services, the proposals suggest a further responsibility to take “reasonable steps to protect children from harm” and offer children a higher level of protection.
It is unclear if this obligation overrides the ability of companies to decide what content should be allowed on their service, or how it could impact the commitment for Government not to restrict adults’ ability to view, share and post legal content.
The proposal to rely on age assurance and age verification technology is a worrying development. As we have highlighted in our responses to the ICO’s Age Appropriate Design Code, this is an area fraught with challenges. There is a risk that regulation would lead to age verification becoming the norm for most, if not all, services in scope. This could have very significant implications that need to be assessed and thought through carefully.
There are real questions about whether the wider use of age verification is in the interests of either the user of a service or the service provider. Implemented badly, this could lead to a situation where companies are encouraged to collect more data, including documentation to verify age, and to introduce log-in measures to minimise disruption to the user experience.
Moreover, it is questionable whether robust, privacy-centric and user-friendly age-verification tools are sufficiently well developed to be deployed at the scale and pace that would be required for companies to comply. Many companies have no desire to collect highly personal ID that may be used to verify age, such as passports.
Not only would this place significant burdens on companies, it could also lead to the restriction of children’s access to vital online services, either because they are unable to purchase new forms of ID, or because some services may opt to make themselves available only to adults to reduce their liability under the regulation.
The Safer Internet Centre recently published research which showed how critical the internet is to young people’s development and identity. We should be wary of any proposals that would restrict this.
Ofcom as the Regulator:
Ofcom's experience makes it an appropriate voice in this debate, but if it is to take on this new role, vastly expanding its current remit, it must be given the appropriate resources and be upskilled to meet the challenge ahead. More information is needed on what Ofcom’s responsibility towards tech innovation and proportionality will mean in practice, but this is encouraging and certainly a move in the right direction.
However, not all of Ofcom’s lessons from broadcast regulation translate appropriately. The range of businesses Ofcom would need to deal with would vastly increase under these proposals. Currently, Ofcom regulates a small number of TV and radio stations; under the current scope of the regulation, this would grow to up to 300,000 businesses.
Furthermore, broadcast regulation covers a much smaller volume of content, with only 24 hours in a day. In comparison, there are over 500 million tweets a day, 500 hours of content are uploaded to YouTube every minute, and billions of people are active on Facebook every day. This will require a different approach by Ofcom, one that looks at processes and not posts.
Such scale also raises a question about the funding of the regulator. Ofcom is currently funded by a fee on all those within its scope; however, this would not be appropriate for an online world and would amount to a tax on user-generated content. Careful consideration will need to be given to how the regulator is to have the proper resources it needs to get on with the job.
We await details of the kinds of enforcement powers that the regulator would have. However, proposals such as senior manager liability would be a huge disrupter to the success of the UK tech sector. It would discourage start-ups from choosing the UK as their destination of choice, and discourage new investment at a time when we should be capitalising on the sector’s record growth and investment.
Moreover, increasingly harsh sanctions ignore that these are difficult decisions, where the wrong call is sometimes made not out of malice or negligence but because of the subjective nature of some of the harms. Large fines or manager liability would not make these decisions any easier.
Such sanctions would also be targeted at the good actors in this space, those regularly engaging with Government to reach the right decisions, rather than at the most harmful actors. The latter are often companies from less-democratic regimes that may have no interest in abiding by any regulation or nominating a UK director. It is important we look at how enforcement would help tackle these actors, rather than those currently engaging constructively in the process.
Far more work is needed to answer these and other questions in the coming months. It is right that proposed regulation is carefully thought through in consultation with industry, civil society and parliamentarians, with pre-legislative scrutiny forming a core part of this.
techUK is committed to working with Government as it continues to develop these proposals, and helping reach an outcome that works for all.