29 Apr 2026
by Jules Anderson

Accelerating agentic AI in digital forensics

Guest blog by Jules Anderson, Business Development Director at Oxford Dynamics #digitaljusticeimpactday2026


The case for action is not abstract. More than 30,000 prosecutions collapsed in England and Wales between 2020 and 2024 due to failures in evidence handling. Digital forensics investigators report being overwhelmed. Individual prosecutors are carrying caseloads more than double what they managed pre-pandemic. And criminal networks are already deploying AI at scale to commit, conceal, and complicate offences.

The technology to address this exists. The question is whether the CPS and policing can move fast and carefully enough to deploy it in a way that delivers more successful prosecutions rather than creating new grounds for cases to fail.

The opportunity: what agentic AI can do right now

Agentic AI platforms, systems that actively pursue investigative goals rather than simply answering questions, represent a genuine step change in forensic capability. The best of these are already operational in high-stakes government environments, having been used to synthesise millions of words and images across thousands of documents in complex national security contexts. That same capability is now directly applicable to criminal investigation.

In a digital forensics context, this means triaging terabytes of seized device data in hours rather than weeks; correlating communications, financial records, and social media into a single evidential picture across multiple suspects; autonomously mapping criminal networks from a single seed identity; and detecting synthetic or manipulated media with full audit trails. Crucially, every output is source-referenced, confidence-scored, and traceable to its origin: in effect, the architecture that courts require.

The barriers: why progress has been slow

Three interconnected problems are holding back adoption. First, the legal framework. The Post Office Horizon scandal exposed the danger of assuming technology is infallible, and the Ministry of Justice's 2025 call for evidence on the presumption of computer reliability reflects how unsettled the legal landscape remains. Any AI tool deployed in the forensic chain must be explainable, auditable, and defensible under cross-examination, a standard many generic AI platforms are struggling to meet.

Second, the literacy gap. The 2024 Northumbria University Digital Forensics Project found that little attention is paid across the system to whether digital evidence represents fact or opinion, or to the reliability of the methods that produced it. If investigators, prosecutors, and barristers cannot articulate how an AI tool reached its conclusions, the evidence will not survive defence challenge.

Third, capacity and fragmentation. With 43 police forces operating different standards, funding levels, and toolsets, and with the CPS's own digital infrastructure still in active transformation under Steve O’Connor and his leadership team, there is no consistent national baseline from which to scale AI-assisted forensics.

The acceleration plan: five practical steps

Establish a joint CPS-policing AI forensics standard. Building on the CPS's published AI mission statement and the Policing AI Covenant, a shared validation framework, specifying the explainability, chain-of-custody, and bias-testing requirements that any AI forensic tool must meet, would provide procurement clarity and court defensibility simultaneously.

Prove agentic AI on high-volume, high-harm offence types first. Child sexual exploitation, fraud, and cybercrime cases involve the greatest data volumes, the longest backlogs, and the clearest public interest. Starting here maximises impact and generates the evidential track record needed to build judicial confidence incrementally.

Invest in cross-system digital literacy. The CPS's commitment to AI training must extend to defence practitioners and the judiciary. AI-derived evidence that cannot be explained to a court is evidence that will be challenged. Shared training, modelled on the Judicial College's work on digital evidence, reduces that vulnerability across the system.

Deploy sovereign, air-gapped platforms for sensitive casework. Platforms that operate fully on-premises, without routing data through third-party cloud environments, protect against disclosure vulnerabilities and data sovereignty concerns that would otherwise expose sensitive investigations. This is not optional for counter-terrorism, serious organised crime, or national security-adjacent cases.

Mandate human accountability at every evidential step. The CPS's own AI vision is clear: "Individuals are responsible for each use and output generated using AI, just as they would be if an AI tool had not been used." This principle must be embedded in police Digital Forensics Unit protocols as a non-negotiable condition of deployment and not a governance afterthought.

The bottom line

Agentic AI does not guarantee more successful prosecutions. Deployed carelessly, it will create new grounds for cases to collapse. Deployed correctly, with explainability built in from the ground up, human oversight at every decision point, and a legal framework that has caught up with the technology, it transforms what is currently a system under immense pressure into one capable of meeting the volume and complexity of 21st-century criminal casework.

The tools are ready. The legal impetus has never been stronger. What is now needed is the institutional will to build the governance framework that turns capability into convictions.


Digital justice impact day 2026

Explore how technology is transforming the justice system, from digital services to data-driven decision making. Gain insight into key themes, challenges and opportunities highlighted through Digital Justice Impact Day 2026. Read the update to understand where innovation is delivering impact and what comes next for the sector.
