CDEI publish interim report on algorithmic bias in decision-making

The Centre for Data Ethics and Innovation (CDEI) recently published its interim report on algorithmic bias in decision-making, along with a landscape summary conducted by the Open Innovation Team.

The CDEI’s review focuses on exploring bias in four key sectors: policing, financial services, recruitment and local government. They have taken a phased approach to each sector, starting with policing and then moving on to financial services and recruitment, with work on local government starting in autumn 2019. Included in the interim report is a commitment from the CDEI to produce a briefing paper on Facial Recognition Technology (FRT) later in the autumn, which will examine the wider ethical concerns surrounding the technology. This will not be limited to the use of FRT by the police.

The review seeks to answer three sets of questions:

  1. Data: Do organisations and regulators have access to the data they require to adequately identify and mitigate bias?
  2. Tools and techniques: What statistical and technical solutions are available now, or will be required in future, to identify and mitigate bias, and which represent best practice?
  3. Governance: Who should be responsible for governing, auditing and assuring these algorithmic decision-making systems?

Data

The interim report highlights that data itself is often the source of bias but, at the same time, is a core element of tackling the issue. One issue raised is that some organisations are not collecting diversity information, out of nervousness that this data might be perceived as being used in a biased way. This in turn limits their ability to properly assess whether a system is leading to biased outcomes. There is a tension between creating algorithms that are blind to protected characteristics and checking for bias against those same characteristics.
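To make that tension concrete, the sketch below (a purely illustrative example, not drawn from the CDEI report; the group labels and figures are hypothetical) shows why an organisation still needs protected-characteristic data, held separately from the model's inputs, in order to check whether outcomes differ across groups.

```python
# Minimal, hypothetical sketch: the model never sees protected characteristics,
# but diversity data collected for monitoring is needed to audit its outcomes.

from collections import defaultdict

def selection_rates(decisions, groups):
    """Rate of positive decisions per group, using protected-characteristic
    data held separately from the model's inputs."""
    totals, positives = defaultdict(int), defaultdict(int)
    for decision, group in zip(decisions, groups):
        totals[group] += 1
        positives[group] += decision
    return {g: positives[g] / totals[g] for g in totals}

# Hypothetical audit data: model decisions (1 = positive outcome) and each
# individual's self-reported group, collected only for bias monitoring.
decisions = [1, 0, 1, 1, 0, 1, 0, 0]
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]

print(selection_rates(decisions, groups))
# {'A': 0.75, 'B': 0.25} -- a gap that would warrant further investigation
```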

Tools and techniques

CDEI’s early work suggests that new approaches to identifying and mitigating bias are required and that specific tools are already starting to be developed. However, there is limited understanding of the full range of tools and approaches available and what constitutes best practice. This makes it difficult for organisations that want to mitigate bias in their decision-making processes to know how to proceed and which tools and techniques they should use.

Governance

According to the review, there is currently limited guidance and a lack of consensus about how to balance significant trade-offs (for example, between different kinds of fairness), or even how to have constructive and open conversations about them. In the policing sector, the CDEI are developing a Code of Practice in collaboration with the sector to help address this issue.

The report also highlights that a certain level of transparency about the performance of algorithms will be necessary if customers and citizens are to trust that they are fair. Giving algorithm developers the space and opportunity to test their systems against standard datasets, or to benchmark performance against industry standards, may enable a consensus to develop about appropriate definitions of fairness. The CDEI suggest that new functions and actors, such as third-party auditors, may also be required to independently verify claims made by organisations about how their algorithms operate.

The CDEI will submit a final report with recommendations to government in March 2020. If you’d like to find out more about this programme of work, please contact Katherine.

  • Katherine Mayes
    Programme Manager | Cloud, Data, Analytics and AI
    T 020 7331 2019
