Using technology to support better decision making in justice
Guest blog by Finbarr Murphy, Co-founder & CEO at Modular Data Ltd #digitaljusticeimpactday2026
Technology can extend the reach of justice into gaps the system currently cannot fill. The condition is that it is built on trusted data, with human judgment at the centre and every decision traceable to its source.
Friday afternoon. A man leaves prison with minimal resources and urgent needs, and his first probation appointment is not until next week. In the days between, questions and problems accumulate, with nowhere to turn for professional support.
This gap is a structural feature of the current system. The Probation Service operates at 79% of its target staffing level, 1,479 officers short nationally. According to the Public Accounts Committee’s February 2026 report, probation officers have been working at around 118% capacity on average for several years, with highs of 126% in some regions. HMPPS has also acknowledged that it seriously underestimated the time required to complete core tasks, meaning actual workloads were higher still. Every year, 37% of adults released from custody reoffend within twelve months. For those serving short sentences, the rate rises above 55%. MoJ estimates the social and economic cost of reoffending at around £20.9 billion a year in 2024–25 prices.
Probation is one illustration of a broader pattern. Across the justice system, from courts under pressure to prisons managing complex caseloads to the services supporting victims, technology is being considered as a way to extend reach, free up practitioner capacity, and support better decisions. The conversation about how to use AI in justice is active and moving fast.
Two conditions must hold. The first is reach: technology should extend the system's capacity to support people when human availability is constrained. The second is explainability: every decision the technology supports must be traceable, verifiable, and open to challenge. Neither is optional. The PAC report found that HMPPS has no clear red lines on AI risk thresholds. Without them, the sector has no consistent basis for governing AI deployment across probation or anywhere else in justice. Organisations in other regulated industries routinely set risk thresholds above the regulatory minimum, anticipating future regulatory change; justice AI should do the same. AI built without deliberate, well-governed guardrails produces output that sounds authoritative, moves at speed, and cannot be interrogated. In justice, where decisions affect liberty, housing, safety, and rehabilitation, that risk falls hardest on the people already most exposed.
At Modular Data, we built a proof of concept to test whether both conditions could be met simultaneously in a high-stakes justice context. The focus was on the critical first seven days after release from custody, the highest-risk window in the reoffending cycle. We wanted to understand whether AI could provide meaningful, accurate support to people during that period, with practitioners retaining full control of every consequential decision.
What we built drew responses exclusively from authoritative, governed sources, with every response traceable to its origin. When a question fell outside the system's ability to answer reliably, it was escalated to a human officer with full context. Officers arriving on Monday morning had a structured view of what had happened over the weekend: who had sought help, what they needed, and which cases warranted immediate attention. The AI supported triage; officers made the decisions.
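The answer-or-escalate pattern described above can be sketched in a few lines. This is an illustrative sketch, not Modular Data's implementation: the `Answer`, `Escalation`, and `handle_question` names, the confidence floor, and the retrieval function passed in are all hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Answer:
    text: str
    sources: list       # provenance: every answer cites its governed sources
    confidence: float

@dataclass
class Escalation:
    question: str
    context: str        # full context handed to the officer
    raised_at: datetime

CONFIDENCE_FLOOR = 0.9  # deliberately conservative guardrail (illustrative value)

def handle_question(question, retrieve, escalations):
    """Answer only from governed, citable sources; otherwise escalate.

    `retrieve` is assumed to search a governed, quality-assured corpus and
    return an Answer (or None). Anything unsourced or low-confidence is
    queued for a human officer rather than answered.
    """
    answer = retrieve(question)
    if answer is not None and answer.sources and answer.confidence >= CONFIDENCE_FLOOR:
        return answer  # traceable: the answer carries its source citations
    escalations.append(
        Escalation(question, context="conversation history", raised_at=datetime.now())
    )
    return None  # no answer given; a human follows up with full context
```

The `escalations` list doubles as the Monday-morning triage view: it records who asked what, and when, for officers to review.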
Three things emerged from building and testing the system. First, the quality of the underlying data is decisive. Responses built from governed, quality-assured sources produced guidance that practitioners could stand behind. Speed is irrelevant when the answer is unreliable, and in justice contexts AI built on weak or unverified data distributes risk invisibly across every interaction it touches. Second, human oversight shaped through the architecture, rather than imposed as an afterthought, produced a fundamentally different kind of system. Officers remained in control because the system was designed around their judgment. Third, the purpose of AI in this context is capacity release, not efficiency. Time returned to practitioners through AI support goes back into the human relationship at the centre of probation work. A service operating at 118% capacity, with officers repeatedly answering the same questions, risks crowding out those relationships. Technology that restores them can help reduce the £20.9 billion annual cost of reoffending.
This approach can be applied across the justice system. A risk assessment in a court context, a resource allocation decision in a prison, a victim support referral in a multi-agency setting: each draws on data, each involves human judgment, and each carries consequences for real people. The discipline required to make AI trustworthy in those settings is the same discipline we applied in our proof of concept. Governed data with clear ownership and quality assurance. Deliberately conservative guardrails, calibrated to the risk profile of each individual rather than applied as a blanket threshold. Human oversight built into the architecture. Every output traceable to its source.
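To illustrate the last of those disciplines, a guardrail calibrated to an individual's risk profile can be as simple as a confidence floor that rises with assessed risk, so that more of a higher-risk person's questions route to a human officer. The function below is a hypothetical sketch; the `base` and `ceiling` values and the 0-to-1 risk scale are illustrative assumptions, not drawn from any real assessment tool.

```python
def escalation_threshold(risk_score: float,
                         base: float = 0.80,
                         ceiling: float = 0.99) -> float:
    """Confidence floor below which a question is escalated to an officer.

    risk_score: assessed risk on a hypothetical [0, 1] scale.
    Higher risk -> stricter floor -> more human review, instead of one
    blanket threshold applied to everyone.
    """
    if not 0.0 <= risk_score <= 1.0:
        raise ValueError("risk_score must be in [0, 1]")
    return base + (ceiling - base) * risk_score
```

The design choice is the point: the threshold is a function of the person's circumstances, not a single system-wide constant.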
The UK AI White Paper sets out transparency, explainability, accountability, and contestability as core principles for AI in high-risk settings. The NAO’s 2024 report on AI in government identified data quality and legacy infrastructure as the primary barriers to realising that ambition. The PAC’s February 2026 probation report goes further, finding that HMPPS lacks the risk thresholds needed to govern AI deployment responsibly. The policy framework is sound. The operational conditions to deliver it are not yet in place.
Technology can support better decision making in justice. The sector has the policy intent, the investment signals, and the operational need. What it requires now is the discipline to build AI on foundations that make every decision explainable to the person it affects. Everyone deserves an outcome they can trust. Technology can help deliver that. The sector needs to build it properly.
Finbarr Murphy is CEO and co-founder of Modular Data, which helps public sector organisations build trusted, explainable intelligence for consequential decisions.
Digital justice impact day 2026
Explore how technology is transforming the justice system, from digital services to data-driven decision making. Gain insight into key themes, challenges and opportunities highlighted through Digital Justice Impact Day 2026. Read the update to understand where innovation is delivering impact and what comes next for the sector.