Guest blog: The CESIUM case study

  • techUK
    Friday 06 Dec 2019
    Opinions

    Guest blog by Katrina Petersen, Senior Research Analyst at Trilateral Research as part of our #CounciloftheFuture campaign week

Ethically designed technology to support local government in combating child exploitation: the CESIUM case study 
 

Government services throughout the UK gather more data than the various agencies can process, leaving them wondering who might be slipping through the cracks for lack of the capacity to analyse it. When safeguarding children, disconnects between the data held by different agencies can mean lost opportunities to identify victims, to mitigate risks earlier, and to save a young life.
 
Project CESIUM, funded by Innovate UK, brings together Lincolnshire Police, the National Working Group, and Trilateral Research to advance risk assessment practices by applying machine learning to this big data to prevent and combat child exploitation. The aim is to develop techniques that produce a more complete and socially equitable picture, pointing to potential future harm while not subjecting any child or family to unnecessary intervention or possible social stigma.
 
The ethical challenge
 
Algorithmic decisions can create power struggles, such as political debates over who gets to define indicators of vulnerability, need, and neglect. They also raise fears of such tools displacing professional decision-making.
 
Various critiques of predictive policing have already identified deficiencies in how these new technologies are deployed. For example, algorithms that extrapolate historical trends into the future can reproduce bias and discrimination. Related critiques arise when an area's potential to be crime-ridden is inferred from past police responses rather than actual crime, when forecasts about violence unintentionally turn geography into a proxy for race, or when causal factors derived from online interactions are applied to face-to-face ones.
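The feedback loop behind that first critique can be made concrete with a toy simulation. The sketch below is purely illustrative and is not drawn from Project CESIUM: two hypothetical areas have identical true incident rates, but one starts with more recorded incidents because it was historically patrolled more heavily. Allocating patrols in proportion to past records then keeps reproducing the original disparity, even though nothing in the underlying behaviour differs.

```python
import random

random.seed(42)

# Hypothetical illustration (not from Project CESIUM): two areas with
# IDENTICAL true incident rates, but area A starts with more recorded
# incidents because it was historically patrolled more heavily.
TRUE_RATE = 0.5                  # same underlying incident rate in both areas
recorded = {"A": 20, "B": 10}    # historical records, not actual crime
PATROLS_PER_DAY = 10

for day in range(200):
    total = sum(recorded.values())
    for area in recorded:
        # Patrols are allocated in proportion to *recorded* incidents...
        patrols = round(PATROLS_PER_DAY * recorded[area] / total)
        # ...and each patrol can only record what it is present to see,
        # so more patrols mean more records at the same true rate.
        recorded[area] += sum(random.random() < TRUE_RATE
                              for _ in range(patrols))

share_a = recorded["A"] / sum(recorded.values())
print(f"Share of records attributed to area A: {share_a:.0%}")
```

Despite equal true rates, area A's share of records stays near its historical two-thirds rather than converging to one half: the data reflects where the police looked, not where the crime was.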
 
Transparency and new methods to assess accuracy can help, but they alone do not create fairness in data analytics. 
 
CESIUM response: develop technologies that promote ethical decisions and trusted government actions 
 
The Project CESIUM team takes these ethical tensions as its starting point for developing ethically designed algorithms.
 
Armed with ethical impact assessment methods that support ethics-by-design, CESIUM aims to design machine learning that can draw insights from diverse risk assessment practices and data, bridge the scattered pieces of the child exploitation vulnerability puzzle, and make it easier for Lincolnshire Police to identify and assess the underlying risk factors that shape such exploitation.
 
By starting the design of the analytics from these ethical tensions, CESIUM takes a step back from asking ‘what data is available to be used’, ‘what data should be gathered’, or even ‘what data is best’.


CESIUM’s interdisciplinary team - made up of social scientists, legal experts and data scientists who work across the socio-technical divide - starts with questions around ethical impact:
 
• What are acceptable results of a tool’s use?

• What should better mitigation of child exploitation look like?

• Who is responsible for such use and decisions?

• To whom are designers and users accountable?

• What is needed to maintain the integrity of the police using the algorithm?

• What is needed to gain and maintain the trust of those whose data is collected?

• How can the data, the algorithms, and the use of the outputs be used to preserve a child’s dignity?

• How do these tools help balance personal data protection and individual freedom with community justice? 
 
Framing the design of data analytics within these questions, with units of ethical analysis alongside the units of data analysis, can act as a form of societal accountability: building novel forms of responsibility into the tools from the start and integrating societal concerns into the practices and outcomes they enable.
 
It invites scrutiny of the types of data used, the methodological approaches, and the value of results in various contexts of use, in order to understand the ethical implications of different algorithmic options, which bring together not just information but also the different conceptions of vulnerability, care, and responsibility encoded in that data.
 
By digging into what predictive algorithms can actually make better, what service changes can arise, and which populations or communities benefit, CESIUM can better assess what such tools ought to do, and can do, to improve both government services and community outcomes.
 
For more information about the project contact communications@trilateralresearch.com, and to receive updates about Project CESIUM join the project’s mailing list here
 
Author: Katrina Petersen, Senior Research Analyst at Trilateral Research  
