12 Sep 2025
by Tom Moule

Why access to AI privacy must be part of the digital inclusion conversation

As we mark End Digital Poverty Day 2025, it’s worth pausing to reflect on the impact of artificial intelligence (AI) on digital inequality.   

The divide between those who have access to data, devices, and connectivity and those who don’t is well documented. Jisc’s recently launched 2025 digital experience insights surveys show that the number of students across both higher and further education experiencing problems with access to equipment and poor connectivity has increased year on year since 2022.

But as AI tools become more embedded in education and employment, a new layer of disparity is emerging: the gap between AI platforms that offer data privacy and those that don’t. Users experiencing disadvantage are often the ones who miss out.

Some AI tools require trading personal data for access. Prompts may be logged, analysed, and used to train future models. Outputs may be shaped by user profiling. And while this might seem like a fair exchange, it has deeper implications, not just for rights and ethics, but for learning itself.

Privacy and Proficiency: A Hidden Link  

Privacy isn’t just a legal or moral concern; it’s a practical one. When students use AI tools they don’t consider secure, they often hold back. They censor their prompts, avoid sharing personal context, and steer clear of sensitive or complex ideas. And this matters, because how you use AI shapes what you learn from it.

Here are four ways that a lack of privacy can quietly undermine skill development:

1. Limited Willingness to Experiment  

When students suspect their prompts may be stored or scrutinised, they tend to play it safe. They avoid testing out innovative ideas, speculative scenarios, or controversial topics. But experimentation is a crucial part of building confidence and proficiency. Without privacy, curiosity shrinks and learning narrows.  

2. Limited Depth of Inputs  

Effective AI use depends on detail and context. Students who feel watched often strip their prompts back to basics, omitting the nuance that can unlock better outputs. Over time, this could condition them to engage superficially, leaving them less skilled at guiding AI through complex, real-world tasks.

3. Caution in High-Stakes Contexts  

Students who trust their AI platform will apply it to meaningful, high-impact work. Those without that trust are less likely to go beyond low-stakes tasks. The result is a divide: some learn to use AI as a genuine problem-solving partner, while others miss out on that capability.

4. Erosion of Creativity and Confidence  

Privacy creates the conditions for free thinking. When users feel safe, they are freer to explore bold, original, and potentially valuable ideas. Without that sense of safety, they may default to more guarded interactions. Over time, this not only limits creativity but also prevents users from developing critical skills, such as judging the validity of information and deciding what to share and how to frame it.

 

In short, privacy enables depth. It allows students to engage fully, think freely, and learn meaningfully. Without this, the result is shallow usage in the short term, and a skills deficit in the long term.  

Why This Matters  

As AI continues to disrupt the job market, proficiency with these tools will become increasingly important. Students who can engage deeply and creatively with AI will be better equipped to navigate a changing economy, but those who can’t afford paid versions, and the privacy protections that often come with them, may be left behind.

This isn’t just a theoretical concern. It’s a real-world divide, and it echoes other areas of digital life. It has been found, for instance, that young people listening to free versions of music streaming platforms are more likely to be bombarded with ads for ultra-processed food: a subtle but powerful example of how a person’s means can shape their experience. The same principle applies to AI. Those who can’t pay for privacy may find themselves nudged toward lower-quality interactions, weaker learning outcomes, and ultimately fewer opportunities.

End Digital Poverty Day 2025

This blog is part of our blog series for End Digital Poverty Day 2025. You can find a full list of member blogs, as well as a summary of techUK’s digital inclusion work, on our hub page.




Authors

Tom Moule

Senior AI Specialist, Jisc