Thoughtful design as the gateway to ethical data processing

This article was prompted by a debate at techUK's Digital Ethics Summit, where I took part in a panel discussing the concept of data ownership and the need to usefully re-frame the debate to focus on people's data rights and controls. 

Is data ownership really the issue? 

I agree that the idea of data ownership is unhelpful. If we apply the traditional legal meaning to the word ‘ownership’ here, the idea of data ownership becomes misleading. For example, it’s often impossible to force the return or repair of our data, as we can with something tangible we’ve purchased. Additionally, once we have given our data to one organisation, there’s nothing to stop us giving the same data to other organisations. So personal data isn’t exclusive or exhaustible in the way something tangible is.

So how should we frame the debate concerning data ownership, rights and controls? I think we should avoid talking about ownership, and focus more on rights.

It isn’t that individuals, groups and organisations need more rights; we need to get better as individuals at exercising the rights we already have, and organisations need to improve how they help individuals exercise those rights. To some extent this point echoes the development of data rights over the last 30 years. If we look back to the rights captured in some of the first data protection instruments, such as Convention 108, these haven’t changed significantly.

If we accept this, the challenge becomes how can we encourage individuals to exercise their rights, and how can organisations get better at facilitating them? I think the answer to this lies in design.

Thoughtful design: the gateway to ethical data processing

Organisations talk about providing their users with “as much control over their data as possible”. This often translates into countless toggle switches asking for our consent, agreement or acknowledgement to our data being used in a particular way.

This approach has developed at least partly because conversations around privacy design tend to focus on back-end systems, and developing platforms so users can toggle a simple yes or no to particular uses of their data. While attempting to give users this degree of control is laudable, the implementation of these control mechanisms at the user interface level is often extremely poor. This poor design has a knock-on effect on the individual and the privacy landscape generally.

The problem with consent fatigue

Offering too much control in this way drives the wrong behaviour from the individual. It causes consent fatigue – agreeing to everything or to nothing just to make these privacy notifications stop. This ultimately undermines the role the individual plays in ensuring organisations are complying with the law. I say this because the compliance efforts I see in practice are usually driven by one of two things: the fear of the Information Commissioner’s Office knocking on your door; or a response to an individual’s complaint.

If we continue to overload individuals with too many control mechanisms, we erode their motivation to participate in the privacy conversation. I’m not saying that individuals aren’t interested in their privacy rights, simply that poorly implemented design carries this risk of motivational erosion.

At best, this disengages an incredibly important stakeholder from the privacy conversation: the individual. At worst, it means the organisation isn’t able to process individuals’ data lawfully and fairly because it has failed to adequately explain how it plans to use that data. The current state of user interface design has also led some academics to suggest that even talking about the concept of “consent” in data protection is fundamentally flawed and that it should be removed altogether. I don’t think we are quite there yet; I think we can do better.

The way forward

So how do we push this forward? I believe the goal of the debate over data rights and controls should be to provide an ethical design framework for what a good user interface looks like, from a privacy perspective.

The design framework should acknowledge it’s often impossible to communicate everything to users during the few seconds they are willing to interact with privacy notices. So it should focus on the key points that really matter to them (which won’t be the same for every project) and then link out to the detail.

Ultimately, I would like us to reach a position where the design framework acts as a litmus test for whether a particular app, cross-site tracking or enrichment project can proceed, complementing organisations’ existing impact assessment processes. If we decide that we can’t fit a planned use of data into this ethical design framework, we should either re-work our plans, or recognise it’s time to stop using personal data in that way. 

For further information on techUK's work in this area, take a look at techUK's Digital Ethics in 2019 paper. 
