There is a clear desire to develop better solutions to tackle online harms, but we must be careful to ensure that any actions taken are targeted, proportionate and effective against the harms they are meant to address.
Tech companies are committed to working constructively with government to find the best way forward. While the Committee found little evidence linking the internet and social media to negative effects on young people's health, we support the call for further research. The internet brings enormous social and mental health benefits to young people and can provide them with a groundswell of support, but it is right that any potential downsides are thoroughly investigated.
There are some good suggestions in the report: the tech industry has long supported mandatory PSHE in schools, including an online element. Companies already host and provide a range of initiatives and toolkits across the country for children, parents and teachers on these subjects, but it is right that we look at how Government can support these efforts to ensure that people young and old can effectively use the tools provided to manage their online experience.
We do not believe there is a standards lottery in place. Online companies, including social media companies, are subject to a range of regulation depending on the activity. Where there are adverts, ASA rules apply, and where there is finance, the FCA regulates. As the Law Commission's review of online abuse legislation highlighted, speech that is illegal offline is illegal online, something that is reflected in new regulations such as the EU AVMS Directive, which explicitly states that online video-sharing platforms are within scope.
Other proposals, such as a broad duty of care, are not yet fully developed or understood. In its widest form, a duty of care could require platforms to monitor all speech on their services, in breach of other fundamental rights. Solutions must be found that are effective and proportionate, taking into account the very real differences between content that is illegal and content that is legal but might be harmful to some people in some contexts.
These are some of the same issues faced by proposals for strict time limits for removing illegal content, which we do not think would be effective in helping tackle this material. In Germany, the similar NetzDG law has been criticised by rights groups for promoting censorship, as companies opt to remove legitimate content for fear of fines.
These are difficult issues that impact everyone and it is vital that we all work together constructively to get the solutions right.