The impact of social media and screen time on young people’s health

There is a clear desire to develop better solutions to online harms, but we must be careful to ensure that any actions taken are targeted, proportionate and effective at addressing the harms we want to tackle.

Tech companies are committed to working constructively with government to find the best way forward. While the Committee found little evidence linking the internet and social media to negative health outcomes in young people, we support the call for further research. The internet brings enormous social and mental health benefits to young people and can be a vital source of support for them, but it is right that any potential downsides are thoroughly investigated.

There are some good suggestions in the report: the tech industry has long supported mandatory PSHE in schools, including an online safety element. Companies already host and provide a range of initiatives and toolkits across the country for children, parents and teachers on these subjects. It is right, however, to look at how Government can support these efforts so that people young and old can effectively use the tools provided to manage their online experience.

We do not believe there to be a standards lottery in place. Online companies, including social media platforms, are subject to a range of regulation depending on the activity: where there are adverts, ASA rules apply, and where there is finance, the FCA regulates. As the Law Commission’s review of online abuse legislation highlighted, speech that is illegal offline is illegal online. This is reflected in new regulations such as the EU AVMS Directive, which explicitly states that online video-sharing platforms are within scope.

Other proposals, such as a broad duty of care, are not yet fully developed or understood. In its widest form, a duty of care could require platforms to monitor all speech on their services, in breach of other fundamental rights. Solutions must be found that are effective and proportionate, taking into account the very real differences between content that is illegal and content that is legal but might be harmful to some people in some contexts.

These are some of the same issues faced by proposals for strict time limits for removing illegal content, which we do not think would be effective in helping tackle this material. In Germany, the similar NetzDG law has been criticised by rights groups for promoting censorship, as companies opt to remove legitimate content for fear of fines.

These are difficult issues that impact everyone and it is vital that we all work together constructively to get the solutions right.
