The future of ethical data sharing: the role of data intermediaries

New report from the Centre for Data Ethics and Innovation explores the role of data intermediaries in the future of data sharing

Last week, the Centre for Data Ethics and Innovation (CDEI) published a report on data intermediaries, commissioned by the Department for Digital, Culture, Media & Sport. The report covers seven different types of data intermediaries and highlights five particular challenges they can help overcome: lack of data-sharing incentives; lack of knowledge; commercial, ethical and reputational risks; legal and regulatory risks; and the costs of data access and sharing.

The paper provides a comprehensive overview of how these challenges can be addressed through 12 case studies, covering a range of areas with potential for trustworthy data sharing. Two of these are highlighted below; both are worth watching as organisations and individuals gain greater awareness of the power of their own data and autonomy over who it is shared with.

Using data intermediaries to audit data-driven technologies

Bias in decision-making supported by algorithms is a long-standing concern, and one repeatedly raised in examples from across the world. In fields ranging from advertising to criminal justice, wrong decisions may be reached through a reliance on models that make assumptions based on personal characteristics.

Part of the challenge of algorithmic bias is the quality and quantity of data used to train and test models. Sensitive personal data is often required to do this well, which can be difficult to access at scale while respecting individuals’ right to privacy and maintaining public trust.

The CDEI report highlights how data intermediaries can help overcome this challenge. A data intermediary could collect and manage demographic data on behalf of organisations, with a carefully designed governance structure and technical infrastructure. In this way, data intermediaries can increase the security of personal data, and can also act as quality assurers by licensing or accrediting datasets.

A case study in the report focuses on facial recognition technology, where lower levels of accuracy for women and people of colour are often flagged as a way injustices may be amplified. At the same time, scraping data from online photographs to diversify testing data is an approach that can further undermine public trust. In the US, the National Institute of Standards and Technology therefore acts as a data custodian (a type of data intermediary), providing secure access to large datasets for the testing of facial recognition technology; this has been used for almost 200 facial recognition algorithms so far.

Citizens donating data

Another area of potential pointed to in the report is giving citizens autonomy over their own data and opportunities to ‘donate’ it to what they consider worthy causes.

Personal Information Management Systems (a type of data intermediary) aim to store all information related to an individual in one place and allow it to be managed through a single interface. This can give users greater confidence and control over how their data is used and who can access it, and is therefore a way for citizens to protect their own privacy.

It can also be used to ‘donate data’ to research that individuals would like to contribute to. The report uses a hypothetical example of a citizen whose family has been impacted by mental health conditions and who therefore wants to contribute to research by providing data about himself. To protect the privacy of such sensitive information, he does this through an accredited “data donation service provider”, which can collect data on his behalf and then administer it for selected research and charitable purposes.

This relationship with our own data may gain traction in the future: the European Commission’s proposed Data Governance Act introduces provisions that would set up a system of ‘data altruism’, under which individuals can donate personal data, and organisations non-personal data, to not-for-profit organisations for ‘objectives of general interest’.

techUK will follow this and other uses of data intermediaries closely. If you would like to share your reflections or specific use cases, please get in touch with techUK’s Programme Manager for Digital Ethics & AI, Emilie Sundorph.

Emilie Sundorph

Programme Manager – Digital Ethics and Artificial Intelligence, techUK

Emilie joined techUK in June 2021 as the Programme Manager for Digital Ethics & AI.

Prior to techUK, she worked as the Policy Manager at the education charity Teach First and as a Researcher at the Westminster think tank Reform. She is passionate about the potential of technology to change people's lives for the better, and working with the tech industry, the public sector and citizens to achieve this.

Emilie holds a master's degree in Philosophy and Public Policy from LSE. In her spare time she is currently trying to learn Persian and improve her table tennis skills.

Email:
[email protected]
Phone:
+44 (0) 07523 481 331
Twitter:
@ESundorph
LinkedIn:
https://uk.linkedin.com/in/emilie-sundorph

