Humanising Autonomy: A shift back to people-centric technology can drive the future of advanced mobility
Investment in electric vehicles can help create safer, more pleasant communities – but companies will need more than a list of features for their technology stacks.
American retail giant Walmart recently confirmed its purchase of 4,500 electric vehicles (EVs) to be used for last-mile deliveries, with deployment expected in 2023. The agreement contributes to Walmart’s goal of zero emissions by 2040; if successful, the multinational corporation has the option to buy up to 10,000 units.
This is the latest in a growing list of investments into EV-related initiatives, including Hyundai’s first dedicated EV factory in South Korea, and Sony Group Corporation and Honda’s joint venture in establishing an advanced mobility company, with the intent of selling EVs and mobility services by 2025.
These investments place Walmart, Hyundai, Sony, and Honda – and other organisations similarly working in advanced mobility – in a powerful position to shape the communities and places in which they operate. By owning large fleets of next generation EVs, these companies can spread and multiply the positive impact EVs have on air quality and health, environmental benefits, and reduction of maintenance costs.
However, there is a larger consideration often overlooked when designing technology stacks – one that can directly influence the creation of safer, more efficient, and more pleasant cities: a shift back to people-centric technology, such as place-based behavioural intelligence and behaviour-based machine learning.
Imagine if vehicles could see, understand, and even predict human behaviour
In 2020, techUK wrote that “the heart of a place-based approach is the citizen”, with place-based approaches in technology helping improve the quality of life for people by enabling smarter, greener places. Behaviour-based machine learning (behaviour artificial intelligence, or AI) prioritises the needs and experience of people and is designed to make human-machine interactions intuitive, effective, and more pleasant.
Behaviour AI considers the physical behaviour of people and teaches their patterns to machines so that any triggered response is more relevant, accurate and within the context of any given situation (and hence, place). Behaviour AI can be defined as an AI system that directly interacts with humans or needs to understand human behaviour for further decision-making.
Modern sensor-based technologies can provide real-time data on telemetry, physical measurements, object movement, and any deviation from normal status. However, without a visual representation of the scene, considerable context is lost when relying on sensors alone.
In comparison, Behaviour AI uses videos and images captured from cameras to reliably extract the nuances of what the people (and machines) in the footage are doing. With this technology, the camera ‘sees’ and can ‘understand’ what’s happening in the place, even predicting human trajectories when deployed on edge computing hardware. It is important to note that Behaviour AI technology should be GDPR compliant and protect individual privacy, to prevent unfair bias or inadvertent surveillance and policing.
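To make the idea of trajectory prediction concrete, here is a minimal sketch of the simplest possible motion model: extrapolating a pedestrian’s future positions from their last two observed positions at constant velocity. This is an illustrative baseline only – the function and data structure are hypothetical, and real Behaviour AI systems learn far richer, context-aware motion patterns from video.

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def predict_trajectory(track: list[Point], horizon: int = 5) -> list[Point]:
    """Extrapolate future positions from the last two observations
    using a constant-velocity model (a deliberately naive baseline;
    production behaviour models are far more sophisticated)."""
    if len(track) < 2:
        raise ValueError("need at least two observations")
    prev, last = track[-2], track[-1]
    vx, vy = last.x - prev.x, last.y - prev.y
    # Step the last known position forward one velocity increment per frame.
    return [Point(last.x + vx * t, last.y + vy * t) for t in range(1, horizon + 1)]
```

Even this toy model shows why prediction matters: a vehicle that anticipates where a pedestrian will be, rather than merely where they are, can react earlier.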
Incorporating behaviour-based machine learning and place-based behavioural insight into EVs at the time of production can unlock a new generation of smart vehicles and enhance the driver experience, while reducing video data storage costs, cutting false positive alerts, and renewing trust in AI adoption.
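One way such a system could cut video storage costs is event-gated recording: keeping only the frames surrounding a detected behaviour event and discarding routine footage. The sketch below is an assumption about how that filtering might work, not a description of any vendor’s actual pipeline.

```python
def frames_to_keep(frame_events: list[bool], pre: int = 2, post: int = 2) -> list[int]:
    """Return indices of frames worth storing: only frames within a
    small window around a detected behaviour event survive, so hours
    of uneventful footage never reach storage (illustrative only)."""
    keep: set[int] = set()
    for i, has_event in enumerate(frame_events):
        if has_event:
            # Retain a context window around each event.
            for j in range(max(0, i - pre), min(len(frame_events), i + post + 1)):
                keep.add(j)
    return sorted(keep)
```

With a filter like this running at the edge, only a small fraction of frames ever needs to be uploaded or retained – the source of the storage savings the article describes.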
Making a difference in road safety, Vision Zero, and driver training
Applications in the mobility industry powered by Behaviour AI include the smart tracking of objects, speed and distance detection of other vehicles, real-time driver alerts on fleet vehicles to signal vulnerable road user (VRU) behaviour and intent predictions, as well as real-time recognition and prediction of human behaviour.
Just some of the early adopters who are already using or have explored Behaviour AI technology for specific projects include:
- Transport for London, for road safety, in support of its Vision Zero by 2041 initiative
- Transport for Greater Manchester, as part of understanding social gathering and movement analytics during the Covid-19 pandemic
- Ceva Logistics, to analyse VRU behaviours impacting fleet operations
- Nextbase, with planned integration into its next-generation smart dashcam
- Japanese automotive supplier Aisin
EV manufacturers and those purchasing EV fleets have a great opportunity to turn their already substantial investments into ones that can shape society and evolve the future of advanced mobility.
While there is a lot of effort and thought put into selecting the software and apps inside the vehicle, these are often designed for entertainment, experience, and convenience.
By incorporating behaviour-based machine learning and place-based behavioural insight into EVs – and considering how people-centric a proposed technology stack is from the beginning – the next generation of EVs can evolve from the latest in a series of models into a product that can change the world.
Interested in Behaviour AI? Watch Humanising Autonomy’s Intro to Behaviour AI video on YouTube.