Surveillant Spaces and Places: time to limit the “vision”?

Guest Blog: Sue Chadwick, Strategic Planning Advisor for Pinsent Masons LLP #AIWeek2021

Exponential developments in sensory technologies and machine learning mean that land and buildings are evolving into dynamic data receptors, where information can be sourced and analysed in real time. This offers enormous benefits in building intelligent connections between public environments such as roads and open spaces, infrastructure such as public transport systems, individual homes, and the humans who inhabit and use them all.

The speed, reach and sophistication of these combined technologies also introduce a new range of risks. Is there a way to exploit the possibilities without creating unintended legal risks and new ethical dilemmas?

The Technology

The recently published draft AI Regulation defines biometric data as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person.” As well as facial recognition, this includes fingerprint readers, temperature sensors and gait monitoring; exactly the kinds of technologies that are finding their way into the built form. Applications could range from safer individual homes for an ageing population through to public infrastructure such as the 3D bridge that uses data from everyone who walks, runs or cycles over it to inform its maintenance needs. As the Annual Review from the Cambridge Centre for Smart Infrastructure and Construction recognises: “Buildings are often perceived as static inanimate systems…sensors embedded in the structure bring the building to life.”

The Issues

There are many ways in which this combination of information and analytic techniques can work for the individual - and societal - good. A fingerprint can open a phone, and a phone can turn on the heating at home. Domestic sensors offer safer homes and facilitate ageing in place; automated facial recognition could help with crime detection and prevention; environmental data can inform policy and help to quantify the impacts of a development proposal. Project Odysseus is a recent example, combining different datasets, including JamCam cameras and traffic intersection monitors, to improve understanding of the “busyness” of an urban environment during the Covid-19 pandemic.

However, as with all new technology, there are risks as well as rewards. Automated facial recognition came under scrutiny last year when the Court of Appeal ruled that its use by South Wales Police was unlawful. In January, a Government briefing note on the use of live facial recognition technology noted that it presents a range of ethical issues, and a recent report by the Ada Lovelace Institute stated that “What constitutes trustworthy, responsible, proportionate use of biometric technologies is one of the most complex and urgent questions facing our society today”.


None of these emerging concerns should stop us taking advantage of the technologies. But whether they are being considered as part of a new building or public space, or a “covid-secure” retrofit of an existing development, the use of sensors, especially when combined with algorithmic analytics, should be approached with care. What does this mean in practice? How can we benefit from advances in technology while mitigating the risk of legal liability and ethical breaches?

The draft Regulation just published proposes a risk-based response to all uses of AI technology, with a regulatory framework for high-risk AI systems. Unsurprisingly, biometric identification and categorisation come within the definition of a high-risk AI system. In addition, the Regulation defines a publicly accessible space as “any physical place that is accessible to the public”, including streets, parts of government buildings, transport infrastructure, cinemas, theatres, shops and shopping centres. The Regulation recognises that the use of biometric technologies in these spaces can lead to discriminatory effects and recommends a range of controls, including formal assessment of conformity and human oversight.

The Regulation is still in draft form, but for anyone considering the use of smart biometric technologies on land or buildings, it is time to think carefully about how to maximise the benefits without creating reputational or legal risks. These are some of the measures that should be considered:

  • A strategy that includes how the owner or operator of land or buildings sources, shares and uses its data;
  • A procurement approach that includes considerations such as digital ethics, embedded bias, interoperability and cybersecurity;
  • Risk assessments that include the specific issues raised by the use of biometric technology and AI, consistent with guidance published by the Surveillance Camera Commissioner last year;
  • The adoption of good digital practices including minimisation of data, robust consent processes, codes of conduct such as those anticipated by the Regulation, and consultation on the digital as well as the physical aspects of a proposed development.


One encouraging element of the Regulation is its promotion of regulatory sandboxes in the development of regulation that is “innovation-friendly, futureproof and resilient to disruption”. While these technologies are still nascent, there is an opportunity for developers to work with local authorities and data institutions such as the Ada Lovelace Institute and the ODI on the best ways to integrate their benefits into the fabric of modern life.



Sue Chadwick, Strategic Planning Advisor for Pinsent Masons LLP


You can read all insights from techUK's AI Week here

Katherine Holden

Associate Director, Data Analytics, AI and Digital ID, techUK

Katherine joined techUK in May 2018 and currently leads the Data Analytics, AI and Digital ID programme. 

Prior to techUK, Katherine worked as a Policy Advisor at the Government Digital Service (GDS) supporting the digital transformation of UK Government.

Whilst working at the Association of Medical Research Charities (AMRC), Katherine led AMRC’s policy work on patient data, consent and opt-out.

Katherine has a BSc degree in Biology from the University of Nottingham.

[email protected]
020 7331 2019
