Humanising Autonomy cameras and AI

Humanising Autonomy uses behavioural psychology and computer algorithms to make cities safer for pedestrians and cyclists.


Using cameras and AI to protect vulnerable road users

Our Zenzic CAM Creator series continues with Raunaq Bose, co-founder of Humanising Autonomy.

Before establishing predictive artificial intelligence (AI) company Humanising Autonomy in 2017, Raunaq Bose studied mechanical engineering at Imperial College London and innovation design engineering at the Royal College of Art. Focusing on the safety of vulnerable road users, Humanising Autonomy aims to redefine how machines and people interact, making cities safer for pedestrians, cyclists and drivers alike.

RB: “Our model is a novel mix of behavioural psychology, deep learning and computer algorithms. We work with OEMs and Tier 1 suppliers on the cameras on vehicles, with the aftermarket on retrofitted dashcams, and also with infrastructure. Our software works on any camera system to look for interactions between vulnerable road users, vehicles and infrastructure in order to prevent accidents and near misses. While most AI companies use black box systems where you can’t understand why decisions are made, we set out to make our models more interpretable, ethically compliant and safety friendly.

“When it comes to questions like ‘Is this pedestrian going to cross the road?’, we look at body language and factors like how close they are to the edge of the pavement. We then put a percentage on the intention. Take distraction, for example: we cannot see it directly, but we can infer it. Are they on the phone? Are they looking at the oncoming vehicle? Is their view blocked? These are all behaviours you can see, and our algorithm identifies them and puts a numerical value on them. So we can say, for example, that we’re 60% sure this pedestrian is going to cross. This less binary approach is important in building trust – you don’t want lots of false positives, for the system to be pinging all the time.

“One of the main things we’re likely to see over the next decade is increased use of micromobility, such as cycling and e-scootering. At the same time you will see more communication between these different types of transportation, and also with vehicles and infrastructure. The whole point of ADAS is to augment the driver’s vision, to reduce blind spots and, if necessary, take control of the vehicle to avoid a shunt. Then there’s the EU agreement that by 2022 all buses and trucks must have safety features to detect and warn of vulnerable road users.

“We currently only look at what’s outside the vehicle, but with self-driving there will be monitoring of the cabin. In terms of privacy, we have a lot of documentation about our GDPR processes and how we safeguard our data. Importantly, we never identify people – for example, we never track a particular individual between camera streams. We look to the future with autonomous cars, but for now we’re focused on what’s on the road today.”
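The percentage-based intent scoring Bose describes above can be illustrated with a minimal sketch: observable behavioural cues are each given a numerical value, combined, and squashed into a probability-like score that can then be thresholded to avoid constant alerts. The cue names, weights and threshold here are invented for illustration – Humanising Autonomy’s actual model is proprietary.

```python
# Illustrative sketch: scoring pedestrian crossing intent from observable cues.
# All feature names and weights are hypothetical, chosen only to show the idea
# of putting numerical values on visible behaviours.
from dataclasses import dataclass
import math


@dataclass
class PedestrianCues:
    distance_to_kerb_m: float   # distance from the pavement edge, in metres
    facing_road: bool           # body/head oriented toward the road
    on_phone: bool              # visibly using a phone (distraction cue)
    looking_at_vehicle: bool    # gaze directed at the oncoming vehicle


def crossing_intent(cues: PedestrianCues) -> float:
    """Return a probability-like score in [0, 1] that the pedestrian will cross."""
    score = 0.0
    # Closer to the kerb edge => higher score (capped at 1 m away).
    score += 2.0 * max(0.0, 1.0 - cues.distance_to_kerb_m)
    score += 1.0 if cues.facing_road else -1.0
    score += -0.5 if cues.on_phone else 0.0          # distraction lowers the score
    score += 0.5 if cues.looking_at_vehicle else 0.0
    # Logistic squash turns the weighted sum into a 0-1 value.
    return 1.0 / (1.0 + math.exp(-score))


ped = PedestrianCues(distance_to_kerb_m=0.2, facing_road=True,
                     on_phone=False, looking_at_vehicle=True)
print(f"crossing intent: {crossing_intent(ped):.0%}")
```

Rather than a binary yes/no, a downstream system would only warn the driver when the score crosses a tuned threshold (say, 0.6), which is one way to keep false positives – the “pinging all the time” problem – in check.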

For further info visit


Author: Neil Kennett

Neil is MD of Featurebank Ltd, which he launched in 2019.