Social media and screen time: the impact on young people’s health

There is a clear desire to develop better solutions to tackle online harms but we must be careful to ensure that any actions taken are targeted, proportionate and effective at tackling the harms we want to tackle.

Tech companies are committed to working constructively with government to find the best way forward. While the Committee found little evidence linking the internet and social media to negative health outcomes for young people, we support the call for further research. The internet brings enormous social and mental health benefits to young people and can be a groundswell of support for them, but it is right that any potential downsides are thoroughly investigated.

There are some good suggestions in the report: the tech industry has long supported mandatory PSHE in schools, including an online element. Companies already host and provide a range of initiatives and toolkits across the country for children, parents and teachers on these subjects, but it is right to look at how Government can support these efforts so that people young and old can effectively use the tools provided to manage their online experience.

We do not believe there to be a standards lottery in place. Online companies, including social media platforms, are subject to a range of regulation depending on the activity: where there are adverts, ASA rules apply, and where there is finance, the FCA regulates. As the Law Commission’s review of online abuse legislation highlighted, speech that is illegal offline is illegal online, something that is reflected in new regulations such as the EU AVMS Directive, which explicitly states that online video-sharing platforms are within scope.

Other proposals, such as a broad duty of care, are not yet fully developed or understood. In its widest form, a duty of care could require platforms to monitor all speech on their services, potentially in breach of other fundamental rights. Solutions must be found that are effective and proportionate, taking into account the very real differences between content that is illegal and content that is legal but might be harmful to some people in some contexts.

Some of the same issues face proposals for strict time limits for removing illegal content, which we do not think would be effective in tackling this material. In Germany, the similar NetzDG law has been criticised by rights groups for promoting censorship, as companies opt to remove legitimate content for fear of fines.

These are difficult issues that impact everyone and it is vital that we all work together constructively to get the solutions right.

Ben Bradley

Senior Policy Manager | Digital Strategy
T 07834 126 826
