29 Apr 2026
by Finbarr Murphy

Using technology to support better decision making in justice

Guest blog by Finbarr Murphy, Co-founder & CEO at Modular Data Ltd #digitaljusticeimpactday2026

Finbarr Murphy

Co-founder & CEO, Modular Data Ltd

Technology can extend the reach of justice into gaps the system currently cannot fill. The condition is that it is built on trusted data, with human judgment at the centre and every decision traceable to its source.

Friday afternoon. A man leaves prison with minimal resources and faces urgent needs before his first probation consultation next week. In the days between, he accumulates questions and problems with nowhere to turn for professional support.

This gap is a structural feature of the current system. The Probation Service operates at 79% of its target staffing level, 1,479 officers short nationally. According to the Public Accounts Committee’s February 2026 report, probation officers have been working at around 118% capacity on average for several years, with highs of 126% in some regions. HMPPS has also acknowledged that it seriously underestimated the time required to complete core tasks, meaning actual workloads were higher still. Every year, 37% of adults released from custody reoffend within twelve months. For those serving short sentences, the rate rises above 55%. MoJ estimates the social and economic cost of reoffending at around £20.9 billion a year in 2024–25 prices.

Probation is one illustration of a broader pattern. Across the justice system, from courts under pressure to prisons managing complex caseloads to the services supporting victims, technology is being considered as a way to extend reach, free up practitioner capacity, and support better decisions. The conversation about how to use AI in justice is active and moving fast.

Two conditions must hold. The first is reach: technology should extend the system's capacity to support people when human availability is constrained. The second is explainability: every decision the technology supports must be traceable, verifiable, and open to challenge. Neither is optional. The PAC report found that HMPPS has no clear red lines on AI risk thresholds. Without them, the sector has no consistent basis for governing AI deployment across probation or anywhere else in justice. Organisations in other regulated industries routinely set risk thresholds above the regulatory minimum, in anticipation of future regulatory change; justice AI should do the same. AI built without deliberate, well-governed guardrails produces output that sounds authoritative, moves at speed, and cannot be interrogated. In justice, where decisions affect liberty, housing, safety, and rehabilitation, that risk falls hardest on the people already most exposed.

At Modular Data, we built a proof of concept to test whether both conditions could be met simultaneously in a high-stakes justice context. The focus was on the critical first seven days after release from custody, the highest-risk window in the reoffending cycle. We wanted to understand whether AI could provide meaningful, accurate support to people during that period, with practitioners retaining full control of every consequential decision.

What we built drew responses exclusively from authoritative, governed sources, with every response traceable to its origin. When a question fell outside the system's ability to answer reliably, it was escalated to a human officer with full context. Officers arriving on Monday morning had a structured view of what had happened over the weekend: who had sought help, what they needed, and which cases warranted immediate attention. The AI supported triage. Officers made the decisions.

Three things emerged from building and testing the system. First, the quality of the underlying data is decisive. Responses built from governed, quality-assured sources produced guidance that practitioners could stand behind. Speed is irrelevant when the answer is unreliable; in justice contexts, AI built on weak or unverified data distributes risk invisibly across every interaction it touches. Second, human oversight shaped through the architecture, rather than imposed as an afterthought, produced a fundamentally different kind of system. Officers remained in control because the system was designed around their judgment. Third, the purpose of AI in this context is capacity release, not efficiency. Time returned to practitioners through AI support goes back into the human relationship at the centre of probation work. A service operating at 118% capacity, with officers repeatedly answering the same questions, risks crowding out important relationships. Technology that restores those connections can help reduce the £20.9 billion annual cost of reoffending.

This approach can be applied across the justice system. A risk assessment in a court context, a resource allocation decision in a prison, a victim support referral in a multi-agency setting: each draws on data, each involves human judgment, and each carries consequences for real people. The discipline required to make AI trustworthy in those settings is the same discipline we applied in our proof of concept. Governed data with clear ownership and quality assurance. Deliberately conservative guardrails, calibrated to the risk profile of each individual rather than applied as a blanket threshold. Human oversight built into the architecture. Every output traceable to its source.
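One of the disciplines listed above, guardrails calibrated to the individual rather than a blanket threshold, can be made concrete with a small sketch. The tier names and numbers below are illustrative assumptions, not policy values or anything from the proof of concept; the idea is simply that the confidence an automated answer must reach before release scales with the assessed risk, and anything that misses the bar goes to a human.

```python
# Hypothetical risk-calibrated guardrail: the confidence bar an automated
# answer must clear rises with the individual's assessed risk tier.
# Tiers and thresholds are illustrative assumptions only.
THRESHOLDS = {"low": 0.80, "medium": 0.90, "high": 0.97}

def release_automated_answer(confidence: float, risk_tier: str) -> bool:
    """True only if the answer clears the tier's bar; otherwise the
    query is routed to a human officer."""
    # Unknown tiers default to the strictest threshold, failing safe.
    required = THRESHOLDS.get(risk_tier, max(THRESHOLDS.values()))
    return confidence >= required
```

Defaulting an unrecognised tier to the strictest bar is the conservative choice the article argues for: when the system cannot classify the risk, it behaves as if the risk were highest.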

The UK AI White Paper sets out transparency, explainability, accountability, and contestability as core principles for AI in high-risk settings. The NAO’s 2024 report on AI in government identified data quality and legacy infrastructure as the primary barriers to realising that ambition. The PAC’s February 2026 probation report goes further, finding that HMPPS lacks the risk thresholds needed to govern AI deployment responsibly. The policy framework is sound. The operational conditions to deliver it are not yet in place.

Technology can support better decision making in justice. The sector has the policy intent, the investment signals, and the operational need. What it requires now is the discipline to build AI on foundations that make every decision explainable to the person it affects. Everyone deserves an outcome they can trust. Technology can help deliver that. The sector needs to build it properly.

Finbarr Murphy is CEO and co-founder of Modular Data, which helps public sector organisations build trusted, explainable intelligence for consequential decisions.

