Dr Subhajit Basu

Dr Basu issues stark warning on need to earn public trust in self-driving technology.


UK lawyer warns of a dangerous lack of evidence on safe driverless-car-to-driver handover


Dr Subhajit Basu, of The University of Leeds’ School of Law, is a lawyer with impeccable credentials and a strong sense of public duty… and he’s got serious concerns about “handover” – the moment when a self-driving vehicle transfers control back to a human driver.

An editor at The International Review of Law, a Fellow of the Royal Society of Arts (RSA), and Chair of the British and Irish Law Education Technology Association (BILETA), he recently supervised research into “Legal issues in automated vehicles: critically considering the potential role of consent and interactive digital interfaces”.

The report, first published in the prestigious journal Nature, concluded: “An urgent investigation is needed into the technology that allows self-driving cars to communicate with their operators”. Why? Because the “digital interfaces may be unable to adequately communicate safety and legal information, which could result in accidents”.

That is a stark warning indeed and Dr Basu believes the Government and the automotive industry need to be much more up-front about the issues.

Cover of the ‘Legal issues in automated vehicles’ report

SB: “The main safety messages surround the extreme difficulty most drivers will encounter when an autonomous vehicle suddenly transfers the driving back to them. Even if a driver responds quickly, they may not regain enough situational awareness to avoid an accident.

“The general public is not aware of their vulnerability, and it is doubtful that an interface in an automated vehicle will communicate this point with sufficient clarity.

“The article in Nature was part of a multidisciplinary international project, PAsCAL, funded by the EU’s Horizon 2020, into public acceptance of connected and autonomous vehicles (CAVs).

“My expertise is in the regulation of emerging technologies. I’m one of those people who sees autonomous vehicles not as a disruptive force, but as something which can improve human life. However, in order to do that, we have to put public safety, public service and public trust before profit. I always emphasise that transparency is paramount, but the autonomous vehicle industry can be extremely secretive.

“The overall goal of PAsCAL was to create a guide to autonomy – a set of guidelines and recommendations to deliver a smooth transition to this new system of transport, particularly with regards to the behaviour of the driver, or user in charge, of an autonomous vehicle. 

“You have to recognise that an Automated Lane Keeping System (ALKS) is basically an evolution of the lane departure warning systems that lots of cars already have, but in general self-driving cars are not an evolution but a revolution – they will change our way of life.

“We want to understand not just how the technology works, but also how people see it. The aim is to capture the public’s acceptance and attitudes – not just the users in charge, but pedestrians and other road users too – and to take their concerns into consideration.

“With any new technology there has to be a proper risk assessment. Take the example of smart motorways – it’s a brilliant idea in theory and it works in other countries, but there has been a lack of understanding in the UK. We didn’t create enough stopping places and the cameras weren’t good enough to monitor all the cars in real time. You need an artificial-intelligence-driven system which can identify a slowing car in a fraction of a second.

“Similarly with autonomous vehicles, if you want to deliver something like this you should have the right technologies in place. In this case, that means the human machine interface. The vehicle manufacturers (VMs) will basically give responsibility to the driver, the user in charge, saying ‘when you are warned, you should take over, okay?’.

“In our report, we argue that there will not be enough time for an individual to understand the legal complexities, what they are accepting liability for. The communication of that risk will not be easy for the user in charge to understand. Honestly, how many people have read the terms and conditions of Facebook?

“In autonomous vehicles, the human machine interface will communicate very important safety information and legally binding information, with civil or criminal implications if the driver fails to adequately respond.

“If you look at the proposed change to the Highway Code, it assumes that the driver will be able to take back control when prompted by the vehicle. We are concerned that even the most astute and conscientious driver may not be able to take back control in time. The effectiveness of the human machine interface is one limiting factor and then there is the driver – every driver has different cognitive abilities and different skill levels.

“Human beings are all different, they react differently to different circumstances, so defining the right timeframe for a handover is a difficult balance to strike. Are you going to assess people on their cognitive abilities, on the speed of their reflexes?

“In some circumstances, I have doubts about whether it is fair to have a handover even within 30 to 40 seconds. Certainly, I have found no scientific evidence that 10 seconds has been established as an adequate time. Cognitively, a blanket 10 seconds simply may not be possible – that’s my major concern.

“This is something we have been talking about for quite some time now. The UK government seems to be very much in favour of pushing ahead with this technology quickly, because it fits with the ‘Build Back Better’ tagline. There is a huge risk that we are disregarding safety in the name of innovation.

“I think the automotive industry has a responsibility here. When you are travelling in a self-driving car, the manufacturer is responsible for your safety, for ensuring that the technology is up to standard.

“The industry also has a responsibility to ensure that drivers are adequately trained and adequately educated. The argument that accidents happen and can be used for development is vulgar. Go and tell that to the person who has lost a relative – that this is a learning process.

“I am not against autonomous vehicles. What I am saying is that we need evidence-based conclusions. We need to be sure that the reaction time is well-founded and supported, so we don’t create a system which will fail.

“Personally, I propose that we should first create a comprehensive legal framework which should mandate additional driver training for the safe use of self-driving systems. The automotive industry could take a lead on this, actively push for it.

“At the end of the day, this is about road safety, it is about saving human lives. I believe that autonomous vehicles can reduce congestion and can be good for the climate, but they also have the potential to become deathtraps, because we are over-relying both on the technology to work perfectly and on human ability, without the evidence-based research to find out whether we can react within the stipulated time.

“As a lawyer, it is my responsibility to uphold public safety, to highlight the risks. If the government and the automotive industry don’t face these issues, then people will lose trust in this amazing technology.”

For more, you can read the full Nature article here.


Author: Neil Kennett

Neil is MD of Featurebank Ltd. He launched Carsofthefuture.co.uk in 2019.