Social media and screen time on young people’s health

There is a clear desire to develop better solutions to online harms, but we must ensure that any actions taken are targeted, proportionate and effective at addressing the harms in question.

Tech companies are committed to working constructively with government to find the best way forward. While the Committee found little evidence linking the internet and social media to negative health outcomes in young people, we support the call for further research. The internet brings enormous social and mental health benefits to young people and can be a vital source of support for them, but it is right that any potential downsides are thoroughly investigated.

There are some good suggestions in the report: the tech industry has long supported mandatory PSHE in schools, including an online safety element. Companies already host and provide a range of initiatives and toolkits across the country for children, parents and teachers on these subjects, but it is right to look at how Government can support these efforts to ensure that people young and old can effectively use the tools provided to manage their online experience.

We do not believe there to be a standards lottery in place. Online companies, including social media platforms, are subject to a range of regulation depending on the activity. Where there are adverts, ASA rules apply, and where there is finance, the FCA regulates. As the Law Commission's review of online abuse legislation highlighted, speech that is illegal offline is illegal online, something reflected in new regulations such as the EU AVMS Directive, which explicitly states that online video-sharing platforms are within scope.

Other proposals, such as a broad duty of care, are not yet fully developed or understood. In its widest form, a duty of care could require platforms to monitor all speech on their services, in breach of other fundamental rights. Solutions must be found that are effective and proportionate, taking into account the very real differences between content that is illegal and content that is legal but might be harmful to some people in some contexts.

These are some of the same issues faced by proposals for strict time limits for removing illegal content, which we do not think would be effective in tackling this material. In Germany, the similar NetzDG law has been criticised by rights groups for promoting censorship, as companies opt to remove legitimate content for fear of fines.

These are difficult issues that impact everyone and it is vital that we all work together constructively to get the solutions right.
