The Government's Growing Up in the Online World consultation closes on 26 May. techUK sets out its positions ahead of submitting a formal response.
A legal minimum age of 16 for social media will dominate the headlines around the Government’s ‘Growing Up in the Online World’ consultation - but it is not the question that will most shape children's experience online. The questions that matter are about scope, functionality, AI, and age assurance, and the answers need to be risk-based, proportionate, and grounded in evidence. techUK's response sets out how.
Britain already has one of the most comprehensive frameworks for protecting children online anywhere in the world. The Online Safety Act 2023 imposes serious obligations on platforms - risk assessments, age-appropriate design, proactive removal of harmful content - with Ofcom as a regulator that can enforce with huge fines. It is still being implemented and is already producing results.
None of which means the conversation should end there. Technology moves fast, harms evolve, and the Government is right to keep asking whether the framework is keeping pace. But it would be unwise to reach for unworkable solutions lacking a proper evidence base. The risk with this consultation is that it produces measures that feel decisive while shifting harm rather than reducing it. techUK is not opposed to age-appropriate access, nor to targeted new obligations where genuine gaps exist. What we are opposed to is poorly designed regulation that mistakes visible action for actual progress. This insight lays out where we stand.
Regulate by risk, not by service label
techUK’s position: scope should be determined by the features and functionalities a service offers and whether it is accessed by children - not platform labels, brand recognition, or size. We are calling on Government to adopt universal standards: all services with the relevant features and functionalities, accessed by children, held to the same expectations. That is a stronger, fairer and more future-proof basis for regulation than category-by-category definitions that will be out of date before the ink is dry.
The most consequential decision in this consultation will not be where to set the age limit, but rather which services that limit applies to.
"Social media" is not a coherent regulatory category. The features people are worried about - user-to-user interaction, content recommendations, messaging, livestreaming - appear across gaming platforms, educational tools, and messaging apps that nobody would recognise as social media. Regulatory lines drawn too narrowly around a handful of well-known platforms simply redirect children to less moderated alternatives. Drawn too broadly, they sweep up low-risk services and impose compliance costs that protect nobody.
The criteria that actually matter are the nature and scale of user interaction, the degree of content exposure, the moderation complexity involved, and the underlying incentive structure of the service. A gaming platform that pairs children with strangers and allows unmoderated voice chat is a meaningfully different proposition from a closed educational tool, even if both technically permit "user-to-user interaction." Regulation that cannot draw that distinction will not make children safer - it will just provide more work for compliance lawyers.
Size-based thresholds are no better. Some of the most harmful online spaces are not the largest ones. A workable scope definition needs to be built around functional characteristics, not fixed lists of named services that will be obsolete the moment the market shifts.
Strengthen the OSA — don't bypass it
techUK's position: Ofcom-led, service-specific risk assessment under the Online Safety Act is the right mechanism for tackling harmful features and design. We are calling on Government to strengthen and accelerate that approach rather than replace it with prescriptive feature bans that cannot keep pace with how technology works in practice.
The consultation asks whether specific features - livestreaming, disappearing messages, algorithmic recommendations, push notifications, infinite scroll - should be restricted or banned for children. The concern is understandable; however, the solutions need to be proportionate and workable.
Features are the right unit of regulatory analysis – but they have to be assessed in context. The same feature can be benign on one service and high-risk on another, depending on how it is configured by default, what friction exists around it, the moderation systems supporting it, and the incentive structures of the service it sits within. A livestream on a carefully moderated children's creative platform is not the same risk as one on a network with minimal safety investment. A recommendation algorithm that serves a child relevant learning content is not the same thing as one designed purely to maximise session length. Statutory bans imposed in primary legislation cannot make those distinctions; Ofcom-led risk assessment under the OSA is built to. Policy that treats them identically will disrupt the services doing the right things while leaving the incentive structures of the worst offenders largely intact.
On recommendation algorithms specifically, the policy debate too often treats them as inherently risky. They are not. Recommendation systems are how modern platforms deliver age-appropriate experiences at scale - filtering inappropriate content, demoting harmful material, surfacing safer alternatives, and enforcing age-based standards across billions of pieces of content. Blanket restrictions on algorithmic recommendations for children would not make their experiences safer; in many cases they would make them meaningfully less safe. The question is not whether algorithms should serve children, but how they are designed, what they optimise for, and what oversight applies.
Many services already configure teen experiences differently - restricting engagement mechanics, reconfiguring defaults, building supervised account models. This is not universal, and it is clearly not sufficient everywhere. But blanket feature bans would cut across that progress without replacing it with anything more effective.
The stronger argument in the consultation focuses on business models rather than individual features. Where a platform's design is fundamentally oriented around maximising the time children spend online, that warrants intervention. Ofcom-led, service-specific risk assessment is built for exactly that kind of nuanced judgment.
There is a reasonable case for harder requirements on the clearly high-risk functions - direct messaging with strangers, live image-sharing - where the harm evidence is concrete and where some platforms have simply declined to act. Where individual platforms have reduced trust and safety investment, the right response is targeted regulatory action against those specific failures, not blanket restrictions on services that have moved significantly in the right direction.
Distinguish AI companions from AI tools
techUK's position: AI regulation in a children's safety context must be service-specific and feature-specific. We are calling on Government to scope new AI obligations narrowly to companion services and to expressly exclude embedded AI features in non-companion products – the risks are different, and treating them the same will produce worse outcomes for children, not better.
The concerns raised about AI chatbots are real. Internet Matters research finds that 64 per cent of 9 to 17-year-olds already use them, with a proportion saying they turn to chatbots because they have no one else to talk to. The risk of emotional dependence, particularly where services are designed to feel like relationships, is a serious one.
Extending Online Safety Act duties to chatbots not currently in scope is, in principle, reasonable. But it matters enormously that we get this right.
A companion chatbot designed to simulate emotional intimacy is a categorically different proposition from an AI tutoring tool embedded in a school platform. Treating them as equivalent would over-regulate services with genuine educational value - AI's potential to support personalised learning, accessibility, and creativity is substantial and should not be casually constrained - while potentially under-scrutinising the services that actually warrant attention.
The questions that should determine the scope of new obligations are whether a service encourages strong emotional attachment, whether it personalises in ways that deepen dependence, and whether it is used in a context where children are already vulnerable. Those questions produce very different answers for different services, and the framework should reflect that.
The Department for Education's product safety standards for AI tutoring tools are instructive here: no anthropomorphisation, no manipulative design, robust safeguarding protocols. That is the right kind of approach.
Proportionate age assurance — not blanket verification
techUK's position: age assurance should function as an enabler of differentiated, proportionate access calibrated to the features and content involved - not a uniform restriction applied across all services regardless of what they offer. We are calling for verification requirements to be explicitly calibrated to the sensitivity of the features and content being accessed, and for Government to prioritise a national approach to interoperability.
Age assurance is a legitimate tool within an online safety regime. On its own, it is not sufficient - and applied without proportionality, it can create problems of its own.
The technology has real limits, which the consultation at least acknowledges honestly. Distinguishing a 14-year-old from a 16-year-old is significantly harder than distinguishing a child from an adult. Facial age estimation is a useful tool, but it cannot deliver the precise age thresholds this debate assumes. Circumvention through shared devices, borrowed accounts, and VPNs is persistent.
Early data from Australia's eSafety Commissioner shows a substantial proportion of under-16s retained access after the ban came into force, with enforcement uneven and largely platform-initiated, and outcomes varying significantly between services. The regulator itself acknowledges ongoing compliance issues.

The level of verification required should match the level of risk involved. Requiring document-level ID to access a content feed is not proportionate. Requiring it to access explicit content clearly is. A framework without that calibration will burden ordinary users - adults as much as children - without delivering meaningfully better safety outcomes for the children it is supposed to protect.
The structural problem that the consultation underweights is interoperability. Users currently verify their age separately across multiple services through multiple providers. The result is unnecessary friction, inconsistent outcomes, and repeated exposure of personal data. A coherent national standard built around data minimisation and layered verification would be substantially more effective, and substantially less intrusive, than the current patchwork.
On VPNs specifically: the evidence cited in the consultation does not support age-restricting them. The spike in VPN usage after the Online Safety Act's age assurance requirements came in was not driven by children trying to circumvent the rules. Many children use VPNs for legitimate privacy and safety reasons, which the consultation itself acknowledges. Restricting access would harm those children while doing little to stop anyone determined to circumvent the rules.
The test that matters
techUK supports what this consultation is trying to do. Children should have safe, enriching online lives, and the industry has a genuine responsibility for the environments it creates.
But every measure that comes out of this process should be held to a straightforward question: will it actually reduce harm to children, or will it move harm somewhere less visible while giving the appearance of action?
Age limits without effective enforcement, feature restrictions without contextual judgment, and age assurance requirements without proportionality all struggle to pass that test.
The Online Safety Act is a good foundation that is already working to protect children online. The right approach is to build on it with targeted, evidence-based additions where real gaps exist - not to duplicate, contradict, or undermine what is already there.
That is the case techUK will make in its formal response, and we welcome engagement from members and policymakers who want to get this right.
Doniya Soni-Clark
Associate Director of External Affairs, techUK
Doniya Soni-Clark is Associate Director of External Affairs at techUK, where she leads the organisation's relationships with key political stakeholders and ensures the voice of the UK tech sector is heard loud and clear - in Westminster, Whitehall, and beyond. She is responsible for shaping techUK's political engagement strategy and representing members' interests to media, translating complex tech policy into compelling narratives that cut through.
With a career spanning public affairs, public policy, and industry advocacy, Doniya brings a combination of political instinct and policy depth. She began her career at Westbourne Communications before joining techUK in 2015 as Policy Manager for Skills — her first stint at the organisation. She then moved to the Greater London Authority as Principal Policy Officer, before spending seven years at Multiverse, the fast-growing British Edtech scale-up, as Head of Policy and Public Affairs. Her return to techUK brings that experience full-circle.
Doniya is a committed advocate for the UK's tech ecosystem and understands what it takes to grow a business in a complex regulatory and political environment - having lived it from the inside.
Doniya holds an MSc in Public Policy from the London School of Economics and a BA in Economics & Politics from the University of Exeter.
Samiah Anderson is the Head of Digital Regulation at techUK.
With over seven years of Government Affairs expertise, Samiah has built a solid reputation as a tech policy specialist, engaging regularly with UK Government Ministers, senior civil servants and UK Parliamentarians.
Before joining techUK, Samiah led several public affairs functions for international tech firms and coalitions at Burson Global (formerly Hill & Knowlton), delivering CEO-level strategic counsel on political, legislative, and regulatory issues in the UK, EU, US, China, India, and Japan. She is adept at mobilising multinational companies and industry associations, focusing on cross-cutting digital regulatory issues such as competition, artificial intelligence, and more.
She holds a BA (Hons) in Politics, Philosophy, and Economics from the University of London, where she founded the New School Economics Society, the Goldsmiths University chapter of Rethinking Economics.
Oliver is a Junior Policy Manager at techUK, working across Public Affairs and Digital Regulation policy. He supports the organisation’s engagement with government and parliament, contributes to shaping techUK’s regulatory agenda, and plays a key role in coordinating political outreach, policy projects, and flagship events.
He joined techUK in November 2023 as a Team Assistant to the Policy and Public Affairs team, before stepping into his current role. He has been closely involved in efforts to ensure the tech sector’s voice is heard in the policymaking process.
Oliver holds a Master’s in Policy Research from the University of Bristol and a BSc in Policy from Swansea University. During his studies, he contributed to mental health research as a Student Research Assistant for the SMaRteN network.
Outside of work, Oliver is a keen debater and remains active in the UK debating community, having previously led the Swansea University Debating Union. He enjoys exploring complex issues from multiple perspectives and values clear, thoughtful communication in policy discussions.