05 May 2022

What does the EU’s AI Act mean for the UK’s AI Industry?

AI lawyer Charlie Lyons-Rothbart answers the question "What does the EU’s AI Act mean for the UK’s AI Industry?" in this essential article as part of techUK's AI Week #AIWeek2022

AI is soon to be a regulated sector. While we await the UK’s white paper on its approach to regulating AI, the EU’s AI Act (the “AI Act”) is already on its way, with the final text expected around the end of 2022.

What does this mean for the UK’s AI industry? 

The AI Act’s impact on AI in the UK 

Much like the GDPR, the AI Act will reach beyond the borders of the EU and will apply to any developer or user of an AI system outside the EU, where the output of that AI system has an effect within the EU.  

Given that many UK businesses are not solely focused on the UK market, it is likely they will be impacted by the AI Act. It’s encouraging that UK tech and policymakers are thinking about this in advance of the AI Act becoming law. 

Challenges  

AI system developers will quickly realise the obvious challenges posed by the AI Act’s requirements, especially those in Title III Chapter 2 (more on those here). But these are not the only obstacles the UK’s AI industry will face.  

The new regulatory regime is going to be most burdensome on the start-up community, where a significant proportion of innovation within the AI industry occurs. While the European Council appears to have recognised this and recommends that EU member states take this into account when implementing the AI Act at national level, so far the specific recommendations fall short.  

The current recommendations are: (i) that member states implement regulatory sandboxes so that AI systems can be stress-tested for regulatory compliance before being placed on the market; and (ii) that costs for conformity assessments be applied proportionately for small-scale providers (small and micro-businesses under the SME definition).

Although these measures are welcome, they do not go far enough, and they apply only to high-risk AI systems. Sandboxes may have benefits, but will this regime help or hinder start-ups? And will help be available for all developers, given that the AI Act urges voluntary compliance outside the high-risk category?

Perhaps more crucially, how will these measures help UK businesses? The obligation to implement these recommendations falls on member states and won’t apply to the UK. Consequently, the UK needs to consider what tools, policies, and procedures it should implement to help the UK’s AI sector comply with the AI Act and continue to thrive.

Opportunities 

These are real challenges for the UK’s AI sector and need to be taken seriously by policymakers and industry alike. As with most challenges, however, there is opportunity. 

The AI Act requires member states to encourage voluntary regulatory compliance by implementing codes of conduct, which can be established by an organisation, an industry, or those representing them. While the UK, as a non-member state, is not bound by these requirements, UK businesses caught by the AI Act will need help with voluntary compliance too.

There are promising opportunities here for the UK to lead the way in helping its AI sector retain its competitive edge, both in aiding compliance with the AI Act and in activity beyond it.

The work the AI Council, the Centre for Data Ethics and Innovation, the Alan Turing Institute and the AI Standards Hub are doing will be key to the UK’s success in this area. It’s exciting to watch this evolve, and I would encourage those who’ll be most impacted by this regulation to get involved and help shape the governance landscape within and for the UK.

Author:

Charlie Lyons-Rothbart, Senior Associate, Taylor Vinters