One of the biggest barriers to the successful introduction of driverless cars is confusion over what constitutes true self-driving.
In America, the controversial autonomous vehicle expert, Alex Roy, has suggested a self-driving litmus test called Roy’s Razor. “Can you get in, pick a destination and safely go to sleep?” he asks. “If yes, it’s self-driving. If no, it’s not.”
While this has some merit, the key word “safely” gets somewhat lost. The internet is awash with less-than-sensible people climbing out of the driver’s seat while their Tesla is on Autopilot.
So, here’s an idea to head off such recklessness… the best way to tell if a car is truly self-driving is to ask this simple question: Has it got a steering wheel?
Audi has apparently been down this road in the thinking behind its new Grandsphere concept car. When in “hands-off” mode, the steering wheel folds neatly away.
That certainly removes any doubt as to whether the driver is responsible for driving or just a user in charge, to use The Law Commission of England and Wales’ new lingo.
“We will be ready for Level 4 driving in the second half of this decade,” said Josef Schloßmacher, Audi’s spokesperson for concept cars.
“That’s an important timeframe for us and we will interact with authorities in the different continents and countries in all important markets on the homologation of this new technology.”
While somewhat open to the accusation of a fudge – if it is truly self-driving, why do you need a steering wheel at all? – this looks like progress.
Driverless Toyota e-Palette bus hits blind Japanese judo star
A golden PR opportunity for driverless cars backfired badly this week when a Toyota self-driving e-Palette shuttle bus hit a visually impaired athlete at the Tokyo Paralympic Games.
It had all been going so well. A fleet of eye-catching autonomous electric vehicles successfully ferrying competitors and officials around the Olympic village was a major triumph for the self-driving industry, and Toyota in particular.
But this Olympic fairy tale received a nasty reality check when a slow-moving e-Palette collided with Japanese judo veteran Aramitsu Kitazono, apparently ending his medal hopes.
Kitazono had been due to face Ukraine’s Dmytro Solovey the following day, but didn’t take to the mat. Toyota Chief Executive Akio Toyoda swiftly apologised, but the damage was done.
Somewhat ironically now, given the accident involved a visually impaired athlete, our earlier headline endorsed “flickering lights to replace eye contact in facilitating trust”. Perhaps audible warnings are also warranted.
“Throughout the development process, athletes, especially Paralympians, helped us understand how the e-Palette could be adapted and upgraded to better meet their needs for simple, convenient and comfortable mobility,” said Takahiro Muta, the project’s development leader, in 2019.
Hindsight is a wonderful thing. Last December, the idea of these autonomous vehicles playing a practical role at this showcase sporting event was enticing, to say the least – some questioned whether it would even be possible.
Now we are left with Toyoda’s grim assessment of the incident. “It shows that autonomous vehicles are not yet realistic for normal roads,” he said.
Use of the e-Palette fleet was suspended for several days but has now resumed.
As accusations of slow progress fly, the UK self-driving industry is accelerating.
There’s a lot of talk about the shift to autonomous vehicles slowing. Indeed, the question “Why has the driverless car revolution stalled?” was posed in preparation for the upcoming Reuters Automotive 2021 event [at which yours truly is moderating the AV session – sorry, shameless plug!].
In the UK, a good barometer of such things is Oxford-based Oxbotica, and they’ve made several significant announcements recently.
Back in January, we reported on the Oxford University spin-out securing huge BP investment, with CEO, Ozgur Tohumcu, teasing “exciting deals in the pipeline”.
Shortly afterwards, Tohumcu struck a big deal himself, leaving to become MD of Automotive at Amazon Web Services.
Oxbotica Co-founder and CTO, Professor Paul Newman, was lavish in his praise for ‘Ozo’, saying on LinkedIn: “A chunk of everything we do will always be because of what you made these past few years.”
One major goal was swiftly achieved: offering public AV passenger rides in the UK. Oxbotica was instrumental in this long-awaited milestone, providing the software for Project Endeavour’s well-publicised road trials in Birmingham and London.
Part-funded by the Centre for Connected and Autonomous Vehicles (CCAV), and delivered in partnership with Innovate UK, Project Endeavour applied BSI’s new safety case framework specification, PAS 1881:2020 Assuring the Safety of Automated Vehicle Trials and Testing.
Oxbotica therefore became the first company to have its safety case assessed against these stringent new requirements.
In Greenwich, six modified Ford Mondeos were deployed on a five-mile route to help transport planners and local authorities understand how autonomy can fill mobility gaps and play a role in the long-term sustainability of cities.
Dr Graeme Smith, Senior Vice President (SVP) at Oxbotica and Director of Project Endeavour, said: “This is a one-of-a-kind research project that is allowing us to learn about the challenges of deploying autonomous vehicles in multiple cities across the UK – a key part of being able to deploy services safely and at scale.
“So far, it has been a real collaborative effort, bringing everyone into the discussion, from local authorities to road safety groups, transport providers and, most importantly, the general public.”
Not everyone was convinced, however. MyLondon carried this barbed comment from local Stephen McKenna: “What’s the purpose it’s filling that we don’t already have?” Clearly, the industry still has work to do on the public perception front.
Impressive new products can only help and, in May, Oxbotica and Navtech Radar launched Terran360, “the world’s first all-weather radar localisation solution for industrial autonomous vehicles”.
This pioneering technology is apparently accurate to 10cm on any vehicle, in any environment, up to 75mph. It has been comprehensively tested in industrial settings, on roads, railways and for marine use.
Phil Avery, Managing Director at Navtech, said: “Thanks to decades of experience in delivering radar solutions for safety and mission-critical applications, and together with Oxbotica’s world-leading autonomy software platform, Terran360 is trusted to answer the fundamental question for autonomous vehicles – ‘Where am I?’ – everywhere, every time.”
If that weren’t enough, outside of the UK, Oxbotica has deepened its partnership with BP by running an AV trial at its Lingen refinery in Germany.
Described as “a world-first in the energy sector”, BP now aims to deploy its first AV for monitoring operations at the site by the end of the year.
Morag Watson, SVP for digital science and engineering at BP, said: “This relationship is an important example of how BP is leveraging automation and digital technology that we believe can improve safety, increase efficiency and decrease carbon emissions in support of our net zero ambition.”
So much for AV progress stalling!
Pressing data privacy questions as car computer processing power increases.
The sheer volume of data being collected by connected cars is soaring. Forget megabytes (MB), gigabytes (GB) and even terabytes (TB), it’s time to start thinking in petabytes (PB) and exaflops (EFLOPS).
A petabyte is equal to one quadrillion (one thousand trillion) bytes. However, rather than looking at storage capacity, there’s now been a shift towards performance, measured in floating-point operations per second (FLOPS).
At the CVPR 2021 Workshop on Autonomous Driving event earlier this year, Tesla unveiled its new in-house supercomputer, boasting an eyewatering 1.8 EFLOPS.
The University Information Technology Services tells us that: “To match what a one EFLOPS computer system can do in just one second, you’d have to perform one calculation every second for 31,688,765,000 years.”
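That conversion is easy to sanity-check. Here is a minimal Python sketch confirming the scale of the quoted figure (the 365.25-day year length is our assumption, which is why the result differs from the quoted number in the later digits):

```python
# One petabyte is 10**15 bytes; one EFLOPS is 10**18 floating-point
# operations per second.
PETABYTE = 10**15          # one quadrillion bytes
EFLOPS = 10**18            # operations per second

# At one hand calculation per second, matching one second of a
# 1 EFLOPS machine takes 10**18 seconds. Convert to years,
# assuming a 365.25-day year.
SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60   # 31,557,600
years = EFLOPS / SECONDS_PER_YEAR

print(f"{years:,.0f} years")   # on the order of 31.7 billion years
```

The answer lands at roughly 31.7 billion years, in line with the quoted figure to within rounding of the year length used.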
Behind this unprecedented processing power sit important questions. Back in 2019 we asked “Connected cars: whose data is it anyway?”, with Bill Hanvey, CEO of the Auto Care Association, warning that “carmakers have no incentive to release control of the data collected from our vehicles”.
Under the headline “Customer trust is essential to large-scale adoption of connected cars”, Engineering and Technology (E&T) recently highlighted a survey, by automotive engineering company Horiba MIRA, which asked 1,038 car owners from the UK, Germany and Italy about privacy in their connected vehicles. 42% said they were not made aware that they could withdraw their consent.
Garikayi Madzudzo, advanced cybersecurity research scientist at Horiba MIRA, commented: “Industry sources estimate that on average about 480 terabytes of data was collected by every automotive manufacturer in 2013, and it is expected that this will increase to 11.1 petabytes per year during the course of 2021.
“With such large volumes of personal information being collected, it is inevitable that privacy will be a challenge.”
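Taken at face value, Madzudzo’s figures imply a striking growth rate. A quick Python check (assuming decimal units, i.e. 1 PB = 1,000 TB):

```python
# Industry estimates quoted by Horiba MIRA: ~480 TB collected per
# manufacturer in 2013, rising to an expected 11.1 PB per year in 2021.
# Decimal units assumed: 1 PB = 1,000 TB.
tb_2013 = 480
tb_2021 = 11.1 * 1000

growth = tb_2021 / tb_2013                 # overall multiple over 8 years
cagr = growth ** (1 / (2021 - 2013)) - 1   # implied compound annual growth

print(f"{growth:.1f}x overall, roughly {cagr:.0%} per year")
```

That works out to a roughly 23-fold increase, or compound growth approaching 50% a year – which puts the privacy challenge in context.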
This dovetails with a survey by Parkers which found that 86% of people wouldn’t be happy to share driving habit data with third-party companies.
Parkers.co.uk editor, Keith Adams, told Fleet News: “We’re agreeing to all manner of terms and conditions on a daily basis – I shudder to think what Google knows about me – but it comes as a surprise to see so few drivers are aware of what their cars know about them.”
Meanwhile, The Star Online has published some interesting thoughts on data privacy from Volkswagen Group chief executive, Herbert Diess.
“In Europe, data belongs to our customers first and foremost – they decide what happens with it,” he said.
“In China, data is considered a common good, available for the people’s good. In America, data is predominantly seen as an economic good, is not public, but remains with the companies, with Google, with Apple, in order to serve the business model there.”
Dr Basu issues stark warning on need to earn public trust in self-driving technology.
Dr Subhajit Basu, of The University of Leeds’ School of Law, is a lawyer with impeccable credentials and a strong sense of public duty… and he’s got serious concerns about “handover” – the moment when a self-driving vehicle transfers control back to a human driver.
An editor at The International Review of Law, a Fellow of the Royal Society of Arts (RSA), and Chair of The British and Irish Law Education Technology Association (BILETA), he recently supervised research into “Legal issues in automated vehicles: critically considering the potential role of consent and interactive digital interfaces”.
The report, first published in the prestigious Nature journal, concluded that: “An urgent investigation is needed into the technology that allows self-driving cars to communicate with their operators”. Why? Because the “digital interfaces may be unable to adequately communicate safety and legal information, which could result in accidents”.
That is a stark warning indeed and Dr Basu believes the Government and the automotive industry need to be much more up-front about the issues.
SB: “The main safety messages surround the extreme difficulty most drivers will encounter when an autonomous vehicle suddenly transfers the driving back to them. Even if a driver responds quickly, they may not regain enough situational awareness to avoid an accident.
“The general public is not aware of their vulnerability, and it is doubted that an interface in an automated vehicle will communicate this point with sufficient clarity.
“The article in Nature was part of a multidisciplinary international project, PAsCAL, funded by the EU’s Horizon 2020, into public acceptance of connected and autonomous vehicles (CAVs).
“My expertise is in the regulation of emerging technologies. I’m one of those people who sees autonomous vehicles not as a disruptive, but as something which can improve human life. However, in order to do that, we have to put public safety, public service and public trust before profit. I always emphasise that transparency is paramount, but the autonomous vehicle industry can be extremely secretive.
“The overall goal of PAsCAL was to create a guide to autonomy – a set of guidelines and recommendations to deliver a smooth transition to this new system of transport, particularly with regards to the behaviour of the driver, or user in charge, of an autonomous vehicle.
“You have to recognise that an Automated Lane Keeping System (ALKS) is basically an evolution of the lane departure warning systems that lots of cars already have, but in general self-driving cars are not an evolution but a revolution – they will change our way of life.
“We want to understand not just how the technology works, but also how people see it. The aim is to capture the public’s acceptance and attitudes – not just the users in charge, but pedestrians and other road users too – and to take their concerns into consideration.
“With any new technology there has to be a proper risk assessment. Take the example of smart motorways – it’s a brilliant idea in theory and it works in other countries, but there has been a lack of understanding in the UK. We didn’t create enough stopping places and the cameras weren’t good enough to monitor all the cars in real time. You need an artificial intelligence driven system which can identify a car which is slowing in a fraction of a second.
“Similarly with autonomous vehicles, if you want to deliver something like this you should have the right technologies in place. In this case, that means the human machine interface. The vehicle manufacturers (VMs) will basically give responsibility to the driver, the user in charge, saying “when you are warned, you should take over, okay?”.
“In our report, we argue that there will not be enough time for an individual to understand the legal complexities, what they are accepting liability for. The communication of that risk will not be easy for the user in charge to understand. Honestly, how many people have read the terms and conditions of Facebook?
“In autonomous vehicles, the human machine interface will communicate very important safety information and legally binding information, with civil or criminal implications if the driver fails to adequately respond.
“If you look at the proposed change to the Highway Code, it assumes that the driver will be able to take back control when prompted by the vehicle. We are concerned that even the most astute and conscientious driver may not be able to take back control in time. The effectiveness of the human machine interface is one limiting factor and then there is the driver – every driver has different cognitive abilities and different skill levels.
“Human beings are all different, they react differently to different circumstances, so defining the right timeframe for a handover is a difficult balance to strike. Are you going to assess people on their cognitive abilities, on the speed of their reflexes?
“In some circumstances, I have doubts about whether it is fair to have a handover even within 30 to 40 seconds. Certainly, there is nothing I have found where scientifically they have viewed 10 seconds as an adequate time. Cognitively, a blanket 10 seconds simply may not be possible – that’s my major concern.
“This is something we have been talking about for quite some time now. The UK government seems to be very much in favour of pushing ahead with this technology quickly, because it fits with the “Build Back Better” tagline. There is a huge risk that we are disregarding safety in the name of innovation.
“I think the automotive industry has a responsibility here. When you are travelling in a self-driving car, the manufacturer is responsible for your safety, for ensuring that the technology is up to standard.
“The industry also has a responsibility to ensure that drivers are adequately trained, adequately educated. The argument that accidents happen and can be used for development is vulgar. Go and tell that to the person who has lost a relative – that this is a learning process.
“I am not against autonomous vehicles. What I am saying is that we need evidence-based conclusions. We need to be sure that the reaction time is well-founded and supported, so we don’t create a system which will fail.
“Personally, I propose that we should first create a comprehensive legal framework which should mandate additional driver training for the safe use of self-driving systems. The automotive industry could take a lead on this, actively push for it.
“At the end of the day, this is about road safety, it is about saving human lives. I believe that autonomous vehicles can reduce congestion, can be good for the climate, but they also have the potential to become deathtraps because we are getting over-reliant on the technology to work perfectly and over-relying on human ability, without the evidence-based research to find out whether we can react within the stipulated time.
“As a lawyer, it is my responsibility to uphold public safety, to highlight the risks. If the government and the automotive industry don’t face these issues, then people will lose trust in this amazing technology.”
For more, you can read the full Nature article here.
PAVE is on a mission to inform the US public about self-driving vehicles.
There are many lessons America can teach us Brits about the safe introduction of driverless cars, and the vital work of Partners for Automated Vehicle Education (PAVE) is a prime example.
The US is well ahead of the UK in terms of on-road testing and there have been crashes. These high-profile incidents have dented consumer confidence and calls for greater oversight have now been met.
On 29 June 2021, The National Highway Traffic Safety Administration (NHTSA) announced that the manufacturers and operators of vehicles equipped with SAE Level 2 advanced driver assistance systems (ADAS), or higher-level automated driving systems, must report crashes.
Against this background, PAVE has a mission “To inform the public about automated vehicles and their potential so everyone can fully participate in shaping the future of transportation”.
Executive Director of PAVE, Tara Andringa, explains: “PAVE was born at CES in Las Vegas in 2019 and unites industry, academia, non-profits and the public sector. PAVE aims to bridge the gap between the huge resources that industry is investing in AV technology, and opinion polls that show that the public is largely confused and distrustful. Our mission is to educate and engage the public.
“We don’t advocate for any particular policy. We are all about education, having a conversation and raising the level of understanding – we want to equip everyone to be part of the conversation. We started with 18 members at CES, and we’ve grown to over 80 members. There has been a lot of agreement about the need for this kind of effort, including many big industry players.”
Importantly, PAVE now has many of these big players on-board: vehicle manufacturers including Audi, Ford, Toyota and VW; AV specialists Cruise, Oxbotica and Waymo; IT and comms giants Intel and Blackberry; motoring bodies including the National Automobile Dealers Association (NADA); influential campaign groups like Mothers Against Drunk Driving (MADD); and charities such as The National Federation of the Blind.
Andringa continues: “Although our organisation includes very diverse members with diverse missions, we find that our efforts are more impactful if all of these groups come together.
“We like to put on demonstration events to demystify the technology and the good news is that knowledge and experience change attitudes. When we get people into AVs, they often say it is just like being in a human-driven car, and it’s almost boring. For us, that’s a success. It builds trust and understanding, which are universal concepts.
“We also conduct surveys and have found a lot of confusion about the technology that’s on the road today – from people who say self-driving cars will never happen, to people who think their cars are already equipped to drive themselves.
“In particular, people confuse driver assistance with self-driving. We very much believe ADAS can improve safety, but we always emphasise that all cars for sale today require a responsible driver behind the wheel.
“Another way we have reached a lot of people is through our weekly panel discussions looking at all different aspects of AVs. These originally came about due to the pandemic, but they have gotten over 12,000 views on YouTube.
“Recently we partnered with the State of Ohio to engage the public sector. Town and city authorities want to be ready, but they have lots of questions. We ran a workshop on how AVs work from the point of view of regulation, freight, law enforcement and linking with existing transport. The response was incredibly positive.”
For more information, including links to the panel discussions and other helpful resources, visit pavecampaign.org
Cars of the Future editor Neil Kennett talks driverless cars, driver assistance systems, The Highway Code and more.
In a wide-ranging interview, our editor Neil Kennett discusses driverless cars, driver assistance systems, proposed changes to The Highway Code, robotaxis, data privacy, the trolley problem, artificial intelligence, and the Smokey and The Bandit theme song, with Dean and Sarah Gratton on the Tech Uncorked podcast.
“I’ve been a motoring journalist for 20-odd-years and I’ve become increasingly obsessed with connected and autonomous vehicles, and very dissatisfied with the majority of national media coverage,” he said.
“As I saw it, driverless cars were presented as either goodies like Kitt from Knight Rider or baddies like The Terminator, and you didn’t really get beyond that, so I launched Carsofthefuture.co.uk to explore the issues in more depth.”
Barrister Alex Glassbrook specialises in road transport and has written two books on UK autonomous vehicle (AV) law. An expert in the law of advanced, automated and electric vehicles, serious personal injury, motor insurance and high-value vehicle damages cases, he begins by highlighting three recent developments:
The Automated and Electric Vehicles Act 2018 coming into force on 21 April 2021;
The government announcement on 28 April that it isn’t yet publishing a list of AVs under Section 1 of the Act, but that it does expect to list vehicles equipped with Automated Lane Keeping Systems (ALKS) as “automated”; and
The proposed amendments to The Highway Code, published for consultation with a deadline of 28 May.
AG: “My work overwhelmingly involves car accidents as the source of serious injury, so the AEV Act coming into force was an historic moment. Immediately though, it was clear there was something missing: the list of automated vehicles under Section 1 of the Act, which the Secretary of State is required to publish as soon as it is prepared. There was a presumption that the Act and the list would come together, but they didn’t. We have the Act but no list. In traffic light terms, we’ve gone past amber but there’s no green. What’s going on?
“That question was answered a week later with the Centre for Connected and Autonomous Vehicles’ publication of its paper for the Department for Transport on whether vehicles equipped with ALKS would be listed as automated. In summary, it said the list is not yet being published because we’re waiting to find out if these vehicles will get Whole Vehicle Type Approval from the Vehicle Certification Agency (VCA). If that happens, then the Secretary of State does expect to list them as automated under the AEV Act.
“This has huge implications for liability because it brings into effect a new line of motor insurance. Currently, under the Road Traffic Act, the motor insurer is effectively the body that will satisfy any judgment against a liable driver, or indeed can be sued directly under the direct rights against insurers regulations.
“The new AEV Act does something very different, something particular to AVs: it makes the insurer of the vehicle directly liable. This brings two important changes. One is the direct liability, which is slightly different from the direct rights regs. Second, it attaches to the vehicle rather than the driver, which is quite a radical step.
“There are obviously practical considerations behind this. Would publishing the list before the vehicles get Type Approval be putting the cart before the horse? Even so, it’s a little bit curious because the Act has already come into effect. Moreover, it’s not yet certain that ALKS-equipped vehicles will be classed as automated. The Secretary of State could change his mind.
“Running alongside this, we have the proposed amendments to the Highway Code. They’re quite eye-catching. The current Highway Code reiterates the orthodoxy, that the driver must at all times be in control of the vehicle and must understand the manufacturer’s instructions. The new proposed version is currently out for consultation, but the consultation period is very short, with a deadline of 28 May.
“The key section reads: “On the basis of responses to the call for evidence, and the step-change that the expected introduction of the first legally recognised automated vehicles represents, we have decided to make a more ambitious amendment to The Highway Code, coinciding with the code’s 90th year anniversary.” To me, the fact it is 90 years since the Highway Code was first published in 1931 is neither here nor there. What is notable is the reference to “more ambitious”, because that implies there was an earlier draft.
“The next sentence has the wow factor. It says: “Automated vehicles no longer require the driver to pay attention to the vehicle or the road when in automated mode, except to resume control in response to a transition demand in a timely manner.” The implications of those words are immense.
“The document continues: “Automated vehicles are vehicles that are listed by the Secretary of State for Transport. While an automated vehicle is driving itself, you are not responsible for how it drives, and you do not need to pay attention to the road.”
“Well, we don’t have that list yet, and what follows is really quite striking. It proposes an instruction in the Highway Code, the official guidance to drivers, to do nothing – to pay no attention to how the vehicle is driving or what’s happening on the road. It positively advises drivers to switch off their attention.
“The next paragraph sets some parameters: “If the vehicle is designed to require you to resume driving after being prompted to, while the vehicle is driving itself, you MUST remain in a position to be able to take control. For example, you should not move out of the driving seat.”
“So, you shouldn’t get out of the driving seat – that’s quite a low standard. This appears to be saying it’s fine to watch a movie, it’s fine to go on Instagram, it’s fine to read and respond to business emails. All these tasks are entirely absorbing of concentration and require some disengaging from.
“I’ve done a lot of trials in which I’ve asked witnesses about their appreciation of time during a crash and heard expert evidence about what can happen within a short window of time. Particularly when you’ve got three or four lanes of motorway, multiple vehicles, an awful lot can happen in 10 seconds.
“There are two fairly well-known exceptions to driver control recognised in the law. One is a medical emergency, if a driver is suddenly incapacitated. The other is moments of peril, sometimes known as agony of the moment – when it is such a difficult situation that a driver causing injury by their evasive manoeuvre is not to be judged by the usual demanding standard.
“So, the common law has formed exceptions to liability, but in this case it’s more complex. First, it introduces, for want of a better phrase, artificial intelligence (AI) into the picture. Adjudicating the actions of AI is still a very undeveloped area of law. Second, it brings into the picture something that has been manufactured, namely a computer and sensor system within a moving vehicle. Again, the laws of product liability are at a very early stage of development in relation to AI and new technologies. A notorious example of that is over-the-air (OTA) software, which is not understood as goods.
“From a legal perspective, it is vitally important not to have guidance which leaves open very obvious questions. Unfortunately, these proposed changes to the Highway Code do just that. On the one hand, the Highway Code might say it’s perfectly fine to completely distract yourself from driving. But on the other hand, it’s not okay to do things like climbing out of the driving seat. That leaves open a very broad set of situations and the courts are going to find themselves dealing with some very difficult problems.
“Of course, what the court has to deal with is very much secondary, the primary question must be: what is safe? There are plenty of lessons in the history of motor vehicles when innovation has overreached. Famously, Ralph Nader’s book, Unsafe At Any Speed (published in the USA in 1965), highlighted rear suspension which lost traction when going round corners. That changed product liability law across all sectors.
“I’m not a road traffic engineer but, as an observer of many road traffic accident cases over many years, I have real doubts as to the safety of this guidance.”
With its laudable aim “to demonstrate entrepreneurship in the global public interest while upholding the highest standards of governance”, The World Economic Forum finds transformational technologies like autonomous vehicles natural territory. Here, we get the considered views of the Forum’s Automotive & Autonomous Mobility Lead, Tim Dawkins – an Englishman working for the Geneva-based organisation in sunny California.
Tell us about your path to autonomous vehicles and The World Economic Forum
TD: “I started out studying motorsport engineering at Brunel and my first job out of university was in vehicle security for automotive consulting firm, SBD, helping manufacturers meet Type Approval requirements with anti-theft technologies. When SBD opened an office in North America, I went there to lead their consulting in autonomous driving. Then, in 2018, I got my MBA and wound up joining The World Economic Forum.
“Here at the Forum, our mission is greater than to convene events for business leaders, but actually to improve the state of the world. In my domain, that means making sure that the future of transportation is as safe as possible. Broadly, we work with governments and industry leaders to help them understand each other better. In the world of autonomous vehicles that means helping governments understand how the technology is evolving and the creation of new governance structures – which can be used in regulations, standards and assessment criteria.
“A crude analogy is to think about a driving test for the self-driving cars of the future – what does that look like? It’s obviously a lot more nuanced and complex than that, but being a neutral entity – bringing together the likes of Aurora and Cruise with leading academics and regulators to have focused discussions around autonomous vehicle operation and deployment, or what it means to define a safe autonomous vehicle – is a very effective way of achieving better outcomes for all.
“It’s not just about the advanced technologies of the future, our portfolio also includes road safety research – improving the infrastructure, reducing crashes and fatalities with today’s ADAS technologies, and looking ahead to creating a safer future of mobility with autonomous vehicles.”
With your global perspective on autonomous mobility, how is the UK doing in terms of the government’s stated aim of being “at the forefront of this change”?
TD: “The automotive industry has always been very important to the UK economy, so it is natural that that industry and the government agree on the strategic priority to make the UK an attractive place to develop and test these technologies. We have world-leading engineering talent, universities and research and test facilities within our borders, so it’s shifting the focus from sheet metal and engines over to Connected and Autonomous Vehicle (CAV) technologies. Really, it’s a great fit.
“What UK governments have done – I say governments plural, because this has been going on for over 10 years – is to create institutions which spur development. There’s been dedicated funding and research grants not only to grow the CAV ecosystem within the UK, but to encourage international organisations to come and develop in the UK as well.
“What we see now is the result of many years of building the business case, to position the UK as a competitive place to test and develop new technologies. This top-down industrial policy, combined with an open code of practice to facilitate automated vehicle trialling, makes the UK a great place to test and develop AVs.
“This ecosystem view is something we study here at the Forum. We recently published a joint paper with The Autonomous – The AV Governance Ecosystem: A Guide for Decision-Makers – which looks at how the standards bodies, alliances and consortia are coming together to develop solutions which will become policy, or at least be used in future governance. You will notice that a lot of UK entities feature very prominently in this study.
“For example, BSI is one of the long-established standards institutions that have been mission-aligned to further CAV mobility, delivering technical standards and guidance to address governance gaps in the sector, such as the new Publicly Available Specification (PAS) 1881, 1882 and 1883 documents and a vocabulary of CAV terms. Then you have entities such as Zenzic, which create the business environment and inform the overall roadmap to making autonomous vehicles a reality, supported by entities such as Innovate UK, and a whole ecosystem of universities and research institutions creating a thriving network for innovation.
TD: “One of the things our team like to tackle is how to incentivise these companies to go not just where they can make the most profit, but to provide services to those who most need transportation. This means providing services in areas that are underserved by public transport.
“Think about commuting into London – you drive to the train station, then get onto the TfL network. If you can make that journey more efficient, more affordable and more accessible, suddenly the economic opportunities that come with commuting into London are open to a greater swathe of people. It’s a very local issue. You have to look at each city and say: where are the areas with the least economic opportunity, and how can mobility provide them with greater access to jobs, healthcare and all the things they need?
“Fundamentally, mobility should be considered a human right. It’s not codified as one, but the link between good access to mobility and access to a good future is extremely strong. When we talk to city regulators, for example, they’re very keen to view autonomous vehicles as a way of making their transportation ecosystem more efficient – using AVs to get people onto the existing network, rather than replacing buses or train services.”
That’s certainly opened our eyes to the important work of the World Economic Forum, and we’ll be hearing more from Tim’s colleague, Michelle Avary, Head of Automotive and Autonomous Mobility, at next month’s Reuters event, Car Of The Future 2021.
Law Commission proposes user-in-charge – a new legal role reflecting the responsibilities of being less than a driver but more than a passenger.
The Automated Vehicles Review at the Law Commission of England and Wales plays a pivotal role in the UK government’s push to be at the forefront of the burgeoning global self-driving industry.
Since 2018, when the Centre for Connected and Autonomous Vehicles (CCAV) asked The Commission to undertake a far-reaching three-year review of the UK’s regulatory framework for automated vehicles, Jessica Uguccioni, the lead lawyer for the review, has been immersed in reforms to enable their safe and effective deployment.
Notably, in December 2020, The Commission unveiled a consultation setting out a comprehensive regulatory scheme for automated vehicles. The consultation closed in March 2021 and the outcomes are not yet public.
Two concepts are particularly striking: 1) a start-to-finish self-driving vehicle safety assurance scheme; and 2) a user-in-charge.
Under the proposals, when the vehicle is driving in automated mode the person in the driving seat is no longer a driver, but instead a ‘user-in-charge’, with responsibilities to take over driving following a transition demand and to carry out driver duties that do not relate to dynamic driving (like maintenance of the vehicle, or ensuring children are wearing seatbelts).
Importantly, the user-in-charge would not be criminally liable if an accident occurred while the vehicle was in self-driving mode. Transport Minister Rachel Maclean hailed the work as “leading the way on the regulation of this technology”.
JU: “Our analysis is still evolving, not just in terms of the framework we would like to see, but suggesting changes to existing legislation and identifying gaps.
“For passenger cars, there are two main routes to market: gradually adding driving automation features to consumer vehicles, which may be capable of self-driving for part of a journey but still rely on a human driver to complete a trip; and the ride hail model, with vehicles that can carry passengers or drive empty, and can complete trips while self-driving.
“The oversight needs to be very different, although there is some common ground. The safety assurance scheme applies regardless of the use case. But for cars which cannot complete a journey in self-driving mode, it is important to have a user-in-charge – a new legal role reflecting the responsibilities of being less than a driver but more than a passenger. On the other hand, fleet operators play a crucial supervisory role for automated vehicles that do not need a user-in-charge.
“There is a lot of unease over the safety of the transition process: human factors input is crucial to ensure the human can be brought back into the loop and take over driving in a safe manner. Circumstances (the ‘operational design domain’ or ODD) must also be taken into account. For example, being in a dedicated lane travelling at 10mph is a very different safety case to motorway driving.
“The SAE levels are helpful, but they don’t tell the whole story. The AV must be safe within its ODD, but any public place brings an amount of randomness. The AV therefore needs to be able to cope with a wide variety of situations. For example, pedestrian safety needs to be taken into consideration for ALKS on motorways – people shouldn’t be walking along or across motorways, but sometimes they are. We need to make sure that redistribution of risk does not disadvantage vulnerable road users – that’s a priority.”