Why ethics might be the most important factor in AI-enabled policing
Whether we realise it or not, automation and AI are having a significant impact on how public services in the UK are delivered. You only need to look at the automation and AI journey undertaken at the DWP to understand that these technologies affect us all and are becoming a critical business resource supporting the day-to-day running of the public sector. Policing is no different.
Whilst this notion may conjure up images of RoboCop or scenes from Minority Report, for better or worse, these films aren't an accurate reflection of automation and AI in policing. These technologies are, however, having a direct impact on the daily lives of police officers and the general public in ways you might not imagine.
Automation in Policing
Automation is already being used to unlock the value of the human workforce at a large policing authority that has partnered with Accenture. By automating the mundane, high-volume, repetitive tasks previously carried out by skilled police officers, teams have saved hundreds of hours per month. From a business-case perspective, that saving is impressive in itself, but the important lesson is to look beyond the hours saved and appreciate how automation adds real value for the police.
Relieving highly skilled, trained officers of the admin-heavy, repetitive tasks that consume their time brings a number of potential benefits:
- Improved service delivered to the public: the authority has been able to reduce the time to serve from 24 hours to 45 minutes in one automated business process. A reduced time to serve doesn't just benefit police operations; it can also help improve public perceptions of the police. Officers can spend more time on public-facing roles that no robot or algorithm can carry out – delivering a public service with empathy.
- Increased compliance: on routine tasks, automations work faster than human beings and, with the right architecture in place, can be scaled far more readily than human teams to meet fluctuations in process volume. At this authority, automation has helped teams meet their SLAs by reducing the time taken to complete processes, mitigating the risk of hefty regulatory fines.
- Improved staff morale: automating mundane tasks can help people find more intrinsic value in their daily work and focus on the reason many of them joined the police: making a difference in their community.
AI and Machine Learning in Policing
Beyond automation, artificial intelligence and machine learning are already being used by many police authorities in the UK. To name a couple of examples, Durham Constabulary's Harm Assessment Risk Tool (HART) predicts how likely an offender is to re-offend, and the Metropolitan Police Service is trialling facial recognition software to automatically identify people in CCTV footage. Objectively, the benefits of using these tools over manual processing appear large, and there is little doubt that the use of AI and machine learning will continue to grow. The outgoing Met Commissioner, Cressida Dick, has placed on record her support for allowing such solutions to help police officers solve crimes.
The potential benefits of using AI in policing are vast, and a wide range of use cases are already emerging. For example, it could help improve officer safety by training officers for difficult situations using VR, or help map out criminal networks using data analytics. However, the technology also presents a number of challenges. As is widely reported, the way AI and algorithms are developed can introduce or exacerbate unconscious biases.
Through a policing lens, the risk is that if historic crime data is used to train an AI solution, it may inadvertently encode historic biases and reinforce them today: recorded crime reflects where and how the police have operated in the past as much as it reflects underlying offending, so a model trained on it can learn those patterns as if they were ground truth. With this in mind, preventing these risks from becoming serious issues for police forces and the public should be a priority.
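To make that mechanism concrete, here is a minimal, purely illustrative sketch. The data is synthetic and the "area" proxy feature is invented for the example; nothing here is drawn from real crime records. It shows that a model trained on historically skewed labels simply reproduces the skew in its own predictions:

```python
# Minimal illustrative sketch: a model trained on historically skewed labels
# reproduces that skew. All data below is synthetic and invented for this example.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# 'area' is a proxy feature: area 1 was historically policed more heavily, so its
# recorded outcomes are inflated regardless of underlying behaviour.
area = rng.integers(0, 2, n)
risk = rng.normal(0, 1, n)  # a genuine risk signal, identical in both areas
historic_label = (risk + 0.8 * area + rng.normal(0, 1, n) > 0.5).astype(int)

model = LogisticRegression().fit(np.column_stack([area, risk]), historic_label)
preds = model.predict(np.column_stack([area, risk]))

for a in (0, 1):
    print(f"area {a}: historic positive rate = {historic_label[area == a].mean():.2f}, "
          f"predicted positive rate = {preds[area == a].mean():.2f}")
```

Both the historic and the predicted positive rates come out noticeably higher for area 1, even though the underlying "risk" signal is identical in both areas, which is exactly the kind of disparity an ethics review should catch before deployment.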
Currently, there is no AI-specific regulation in the UK, although the UK government's recently published National AI Strategy is a step in that direction. In the meantime, a good place for police authorities to start could be four key areas:
- Governance: an independent governance board, sitting across all UK police forces, could be established to regulate new AI solutions. Within individual authorities, data ethics teams could be set up, such as the one recently formed at West Midlands Police, responsible for ensuring that ethics and people's rights sit at the centre of any AI solution delivered. At both levels, independence is key to ensuring a disconnect from any internal bias that may exist.
- Process: ensure that robust delivery models for new AI solutions include stress testing through red teams and algorithmic impact assessments. This can help prevent bias from harming the stakeholders the technology will affect.
- Technology: creating explainable AI solutions by design, and making use of tools and frameworks such as IBM's AI Fairness 360 or Google's What-If Tool, can help identify and mitigate biases if they do arise (see the sketch after this list).
- People: increasing diversity across the force is already a key goal, but in this area it could be especially important. By bringing lived experience to their work, people may be better able to recognise where biases might manifest and to challenge teams to do more to tackle them.
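As a pointer to what such tooling looks like in practice, the sketch below uses IBM's AI Fairness 360 toolkit (the open-source aif360 Python package) to quantify disparity in a labelled dataset before any model is trained on it. The dataframe, column names and group definitions are invented purely for illustration:

```python
# Illustrative sketch using IBM's AI Fairness 360 (pip install aif360).
# The data, column names and groupings below are invented for this example.
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric

# Toy historic data: 'flagged' is the recorded outcome, 'area' a potential proxy attribute.
df = pd.DataFrame({
    "area":    [0, 0, 0, 0, 1, 1, 1, 1],
    "priors":  [0, 1, 2, 0, 0, 1, 2, 0],
    "flagged": [0, 0, 1, 0, 0, 1, 1, 1],
})

dataset = BinaryLabelDataset(
    df=df,
    label_names=["flagged"],
    protected_attribute_names=["area"],
    favorable_label=0,   # not being flagged is treated as the favourable outcome
    unfavorable_label=1,
)

metric = BinaryLabelDatasetMetric(
    dataset,
    privileged_groups=[{"area": 0}],
    unprivileged_groups=[{"area": 1}],
)

# A difference far from 0, or a ratio far from 1, flags a skew between groups
# that should be investigated before any model is trained on this data.
print("Statistical parity difference:", metric.statistical_parity_difference())
print("Disparate impact ratio:", metric.disparate_impact())
```

Metrics like these don't answer the ethical question on their own, but they give a data ethics team something concrete to interrogate, and both toolkits also include mitigation techniques (such as reweighing in AI Fairness 360) that can be applied once a skew has been identified.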
The future of automation, AI and machine learning is exciting, but police authorities in the UK need to be transparent with the public about their work in this area and ensure they get their governance structures right.
This content is provided for general information purposes and is not intended to be used in place of consultation with our professional advisors. This document may refer to marks owned by third parties. All such third-party marks are the property of their respective owners. No sponsorship, endorsement, or approval of this content by the owners of such marks is intended, expressed, or implied.
Further reading
AI in Policing and Security, UK Government
AI ethics & governance, Accenture
Author:
Matthew Cheesman, Public Safety, Accenture UK