Self-driving experts talk constantly of the need to earn public trust, but driverless cars continue to divide opinion. Indeed, recent surveys have shown that people are becoming more, not less, wary of them.
Just this week, in Inverness, Scotland, where an autonomous bus trial is due to start later this year, The Inverness Courier reported significant resistance to the idea: 69% of respondents to its survey of local residents said they would refuse to get on a driverless bus.
What we need, of course, is for the media to convey an informed and nuanced safety message. Hmm! To illustrate the scale of the task, here’s a list of our top five hyperbolic headlines:
One of the biggest barriers to the successful introduction of driverless cars is confusion over what constitutes true self-driving.
In America, controversial autonomous vehicle expert Alex Roy has suggested a self-driving litmus test called Roy’s Razor. “Can you get in, pick a destination and safely go to sleep?” he asks. “If yes, it’s self-driving. If no, it’s not.”
While this has some merit, the key word “safely” gets somewhat lost. The internet is awash with less-than-sensible people climbing out of the driver’s seat with their Tesla in Autopilot.
So, here’s an idea to head off such recklessness… the best way to tell if a car is truly self-driving is to ask this simple question: Has it got a steering wheel?
Audi has apparently been down this road in the thinking behind its new Grandsphere concept car. When in “hands-off” mode, the steering wheel folds neatly away.
That certainly removes any doubt as to whether the driver is responsible for driving or just a ‘user in charge’, to use the Law Commission of England and Wales’ new lingo.
“We will be ready for Level 4 driving in the second half of this decade,” said Josef Schloßmacher, Audi’s spokesperson for concept cars.
“That’s an important timeframe for us and we will interact with authorities in the different continents and countries in all important markets on the homologation of this new technology.”
While somewhat open to the accusation of a fudge – if it is truly self-driving, why do you need a steering wheel at all? – this looks like progress.
Driverless Toyota e-Palette bus hits blind Japanese judo star
A golden PR opportunity for driverless cars backfired badly this week when a Toyota self-driving e-Palette shuttle bus hit a visually impaired athlete at the Tokyo Paralympic Games.
It had all been going so well. A fleet of eye-catching autonomous electric vehicles successfully ferrying competitors and officials around the Olympic village was a major triumph for the self-driving industry, and Toyota in particular.
But this Olympic fairy tale received a nasty reality check when a slow-moving e-Palette collided with Japanese judo veteran Aramitsu Kitazono, apparently ending his medal hopes.
Kitazono had been due to face Ukraine’s Dmytro Solovey the following day, but didn’t take to the mat. Toyota Chief Executive Akio Toyoda swiftly apologised, but the damage was done.
We first covered the e-Palette last year in our interview with Yosuke Ushigome, Director at Takram, who worked on Toyota’s future car concepts.
Somewhat ironically now, given the accident involved a blind man, our headline endorsed “flickering lights to replace eye contact in facilitating trust”. Perhaps audible warnings are also warranted.
“Throughout the development process, athletes, especially Paralympians, helped us understand how the e-Palette could be adapted and upgraded to better meet their needs for simple, convenient and comfortable mobility,” said Takahiro Muta, the project’s development leader, in 2019.
Hindsight is a wonderful thing. Last December, the idea of these autonomous vehicles playing a practical role at this showcase sporting event was enticing, to say the least – some questioned whether it would even be possible.
Now we are left with Toyoda’s grim assessment of the incident. “It shows that autonomous vehicles are not yet realistic for normal roads,” he said.
Use of the e-Palette fleet was suspended for several days but has now resumed.
As accusations of slow progress fly, the UK self-driving industry is accelerating.
There’s a lot of talk about the shift to autonomous vehicles slowing. Indeed, the question “Why has the driverless car revolution stalled?” was posed in preparation for the upcoming Reuters Automotive 2021 event [at which yours truly is moderating the AV session – sorry, shameless plug!].
In the UK, a good barometer of such things is Oxford-based Oxbotica, and they’ve made several significant announcements recently.
Back in January, we reported on the Oxford University spin-out securing huge BP investment, with CEO Ozgur Tohumcu teasing “exciting deals in the pipeline”.
Shortly afterwards, Tohumcu struck a big deal himself, leaving to become MD of Automotive at Amazon Web Services.
Oxbotica Co-founder and CTO, Professor Paul Newman, was lavish in his praise for ‘Ozo’, saying on LinkedIn: “A chunk of everything we do will always be because of what you made these past few years.”
One major goal was swiftly achieved: offering public AV passenger rides in the UK. Oxbotica was instrumental in this long-awaited milestone, providing the software for Project Endeavour’s well-publicised road trials in Birmingham and London.
Part-funded by the Centre for Connected and Autonomous Vehicles (CCAV), and delivered in partnership with Innovate UK, Project Endeavour applied BSI’s new safety case framework specification, PAS 1881:2020 Assuring the Safety of Automated Vehicle Trials and Testing.
Oxbotica therefore became the first company to have its safety case assessed against these stringent new requirements.
In Greenwich, six modified Ford Mondeos were deployed on a five-mile route to help transport planners and local authorities understand how autonomy can fill mobility gaps and play a role in the long-term sustainability of cities.
Dr Graeme Smith, Senior Vice President (SVP) at Oxbotica and Director of Project Endeavour, said: “This is a one-of-a-kind research project that is allowing us to learn about the challenges of deploying autonomous vehicles in multiple cities across the UK – a key part of being able to deploy services safely and at scale.
“So far, it has been a real collaborative effort, bringing everyone into the discussion, from local authorities to road safety groups, transport providers and, most importantly, the general public.”
Not everyone was convinced, however. My London carried this barbed comment from local resident Stephen McKenna: “What’s the purpose it’s filling that we don’t already have?” Clearly, the industry still has work to do on the public perception front.
Impressive new products can only help and, in May, Oxbotica and Navtech Radar launched Terran360, “the world’s first all-weather radar localisation solution for industrial autonomous vehicles”.
This pioneering technology is apparently accurate to 10cm on any vehicle, in any environment, up to 75mph. It has been comprehensively tested in industrial settings, on roads, railways and for marine use.
Phil Avery, Managing Director at Navtech, said: “Thanks to decades of experience in delivering radar solutions for safety and mission critical applications, and together with Oxbotica’s world-leading autonomy software platform, Terran360 is trusted to answer the fundamental question for autonomous vehicles – ‘Where am I?’ – everywhere, every time.”
If that weren’t enough, outside of the UK, Oxbotica has deepened its partnership with BP by running an AV trial at its Lingen refinery in Germany.
Described as “a world-first in the energy sector”, BP now aims to deploy its first AV for monitoring operations at the site by the end of the year.
Morag Watson, SVP for digital science and engineering at BP, said: “This relationship is an important example of how BP is leveraging automation and digital technology that we believe can improve safety, increase efficiency and decrease carbon emissions in support of our net zero ambition.”
So much for AV progress stalling!
The warning lights are flashing on draft guidance to drivers in driverless cars.
Barrister Alex Glassbrook specialises in road transport and has written two books on UK autonomous vehicle (AV) law. An expert in the law of advanced, automated and electric vehicles, serious personal injury, motor insurance and high-value vehicle damages cases, he begins by highlighting three recent developments:
The Automated and Electric Vehicles Act 2018 coming into force on 21 April 2021;
The government announcement on 28 April that it isn’t yet publishing a list of AVs under Section 1 of the Act, but that it does expect to list vehicles equipped with Automated Lane Keeping Systems (ALKS) as “automated”; and
The proposed amendments to The Highway Code covering automated vehicles, currently out for consultation with a deadline of 28 May.
AG: “My work overwhelmingly involves car accidents as the source of serious injury, so the AEV Act coming into force was an historic moment. Immediately though, it was clear there was something missing: the list of automated vehicles under Section 1 of the Act, which the Secretary of State is required to publish as soon as it is prepared. There was a presumption that the Act and the list would come together, but they didn’t. We have the Act but no list. In traffic light terms, we’ve gone past amber but there’s no green. What’s going on?
“That question was answered a week later with the Centre for Connected and Autonomous Vehicles’ publication of its paper for the Department for Transport on whether vehicles equipped with ALKS would be listed as automated. In summary, it said the list is not yet being published because we’re waiting to find out if these vehicles will get Whole Vehicle Type Approval from the Vehicle Certification Agency (VCA). If that happens, then the Secretary of State does expect to list them as automated under the AEV Act.
“This has huge implications for liability because it brings into effect a new line of motor insurance. Currently, under the Road Traffic Act, the motor insurer is effectively the body that will satisfy any judgment against a liable driver, or indeed can be sued directly under the direct rights against insurers regulations.
“The new AEV Act does something very different, something particular to AVs: it makes the insurer of the vehicle directly liable. This brings two important changes. One is the direct liability, which is slightly different from the direct rights regs. Second, it attaches to the vehicle rather than the driver, which is quite a radical step.
“There are obviously practical considerations behind this. Would publishing the list before the vehicles get Type Approval be putting the cart before the horse? Even so, it’s a little bit curious because the Act has already come into effect. Moreover, it’s not yet certain that ALKS-equipped vehicles will be classed as automated. The Secretary of State could change his mind.
“Running alongside this, we have the proposed amendments to the Highway Code. They’re quite eye-catching. The current Highway Code reiterates the orthodoxy, that the driver must at all times be in control of the vehicle and must understand the manufacturer’s instructions. The new proposed version is currently out for consultation, but the consultation period is very short, with a deadline of 28 May.
“The key section reads: “On the basis of responses to the call for evidence, and the step-change that the expected introduction of the first legally recognised automated vehicles represents, we have decided to make a more ambitious amendment to The Highway Code, coinciding with the code’s 90th year anniversary.” To me, the fact it is 90 years since the Highway Code was first published in 1931 is neither here nor there. What is notable is the reference to “more ambitious”, because that implies there was an earlier draft.
“The next sentence has the wow factor. It says: “Automated vehicles no longer require the driver to pay attention to the vehicle or the road when in automated mode, except to resume control in response to a transition demand in a timely manner.” The implications of those words are immense.
“The document continues: “Automated vehicles are vehicles that are listed by the Secretary of State for Transport. While an automated vehicle is driving itself, you are not responsible for how it drives, and you do not need to pay attention to the road.”
“Well, we don’t have that list yet, and what follows is really quite striking. It proposes an instruction in the Highway Code, the official guidance to drivers, to do nothing – to pay no attention to how the vehicle is driving or what’s happening on the road. It positively advises drivers to switch off their attention.
“The next paragraph sets some parameters: “If the vehicle is designed to require you to resume driving after being prompted to, while the vehicle is driving itself, you MUST remain in a position to be able to take control. For example, you should not move out of the driving seat.”
“So, you shouldn’t get out of the driving seat – that’s quite a low standard. This appears to be saying it’s fine to watch a movie, it’s fine to go on Instagram, it’s fine to read and respond to business emails. All these tasks are entirely absorbing of concentration and take some disengaging from.
“I’ve done a lot of trials in which I’ve asked witnesses about their appreciation of time during a crash and heard expert evidence about what can happen within a short window of time. Particularly when you’ve got three or four lanes of motorway, multiple vehicles, an awful lot can happen in 10 seconds.
“There are two fairly well-known exceptions to driver control recognised in the law. One is a medical emergency, if a driver is suddenly incapacitated. The other is moments of peril, sometimes known as agony of the moment – when it is such a difficult situation that a driver causing injury by their evasive manoeuvre is not to be judged by the usual demanding standard.
“So, the common law has formed exceptions to liability, but in this case it’s more complex. First, it introduces, for want of a better phrase, artificial intelligence (AI) into the picture. Adjudicating the actions of AI is still a very undeveloped area of law. Second, it brings into the picture something that has been manufactured, namely a computer and sensor system within a moving vehicle. Again, the laws of product liability are at a very early stage of development in relation to AI and new technologies. A notorious example of that is over-the-air (OTA) software, which is not understood as goods.
“From a legal perspective, it is vitally important not to have guidance which leaves open very obvious questions. Unfortunately, these proposed changes to the Highway Code do just that. On the one hand, the Highway Code might say it’s perfectly fine to completely distract yourself from driving. But on the other hand, it’s not okay to do things like climbing out of the driving seat. That leaves open a very broad set of situations and the courts are going to find themselves dealing with some very difficult problems.
“Of course, what the court has to deal with is very much secondary; the primary question must be: what is safe? There are plenty of lessons in the history of motor vehicles from times when innovation has overreached. Famously, Ralph Nader’s book, Unsafe At Any Speed (published in the USA in 1965), highlighted rear suspension which lost traction when going round corners. That changed product liability law across all sectors.
“I’m not a road traffic engineer but, as an observer of many road traffic accident cases over many years, I have real doubts as to the safety of this guidance.”
Our Zenzic CAM Creator series continues with Jim Hutchinson, CEO of Fusion Processing.
As a partner in the ambitious CAVForth project, predicted by the Scottish Mail on Sunday to make Edinburgh “the most ‘driverless’ city in the world”, Fusion Processing is delivering on its promise to design and build world-leading systems for the automation of vehicles. Here, CEO Jim Hutchinson talks ADAS, cyclist detection and autonomous vehicle safety, explaining how CAVForth is set to make a major mark on the global self-driving map.
Please can you outline Fusion Processing’s work on connected and automated mobility?
JH: “We’ve been going since 2012. We set up to develop automated vehicle systems with the ultimate goal of being fully autonomous – able to do anything that a human-driven vehicle can. We knew from the start there were a lot of steps along the way and, for what is essentially a commercial company, we needed to have products along those steps rather than trying for a ‘Level 5 or nothing’ approach.
“We developed the CAVstar platform as a scalable solution – a drive system we could put into pretty much any vehicle, from small cars up to HGVs. Along the way we’ve been involved in some great schemes like the Venturer project, one of the original three UK AV projects.
“Then, more or less in parallel with that, we were involved in the Gateway project in London. We provided the autonomous drive system for the pods that drove along the Thames path. That was a big trial with random members of the public – some who came along specifically to experience it, and many others who just wanted to get from the O2 to the other end of the route. The pods encountered various other people on the route – the vehicles had to be mindful of dog walkers and cyclists. The feedback was by and large very positive, and it was a good proof point for us of how our system can be used off-highway.
“It also led to other things, notably our partnerships with Stagecoach and Alexander Dennis. First off, we explored using autonomy in bus depots. Every night a lot of operations have to happen involving the movement of vehicles – they have to be fuelled, washed and made ready for the morning – so we put together a system which could automate that. The concept was based on a fleet manager directing all this from a control tower once a bus arrives back at the depot.
“The system proved very successful, demonstrating operating efficiency and improved safety for those working in the depot, so that led to CAVForth – an autonomous bus service. Again, we’re working with Stagecoach and Alexander Dennis, joined by Transport Scotland, Bristol Robotics Laboratory and Napier University.
“The intent is to put into service a number of Level 4 autonomous buses between the Fife Park & Ride and the Hermiston Gait Interchange. It’s a commuter route so we’re expecting a large number of daily commuters who want to travel to the Hermiston Gait Interchange, where they can transfer on to trams for the city centre, the airport or the rail network. We expect tourists will want to use it too to reach the Forth Road Bridge, a UNESCO heritage site.
“It’s a useful service, running every day of the week, and the hope is that it will go from a pilot service to a full service. It’s being registered as a new route, providing a service that wasn’t previously there, and Stagecoach anticipate around 10,000 journeys a week.
“The route includes a mix of road environments – motorway, bus lanes, roundabouts, signalled interchanges – so from our point of view it makes for a great demonstration of capability. There’s the technology side, which Fusion is focussed on, but there’s also key research around public acceptance and uptake. That’s really exciting too.
“The launch date isn’t set in stone due to Covid uncertainties, and the point at which they start taking passengers is still to be determined, but we will be running autonomous buses this year. That’s an incredible milestone, absolutely huge. It will be a very significant achievement to demonstrate a Level 4 capability on that class of vehicle – a big thing for the UK which will be noticed around the world.
“There are one or two other groups working on similar projects, but I haven’t seen anything with this level of ambition, this level of complexity, or length of route. It’ll obviously be fantastic for us and our CAVForth partners, but also for the UK autonomous vehicle industry as a whole. It will really put us on the worldwide map.”
Please can you outline Fusion Processing’s work on driver assistance?
JH: “CycleEye is an important product for us. We identified a need for collision avoidance technology. There are lots of collisions with cyclists and quite often they occur because the bus driver doesn’t know the cyclist is there. CycleEye is like a subsystem of CAVstar in a lot of ways – one of those steps to get some proof points on bits of the technology. It recognises and classifies different types of vehicle, and the driver gets an alert when there’s a cyclist in the danger zone. It is currently being used in a few cities around the UK, including on the Bristol Metrobus. It’s a good system. Whenever it has been evaluated against other cyclist detection systems it has always come out on top.
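To make the danger-zone idea concrete, here is a minimal sketch (not Fusion Processing’s actual implementation) of how classified detections might be checked against a configurable nearside alert zone; the class names, coordinate convention and thresholds are assumptions for illustration only.

```python
# Illustrative only: a toy danger-zone check of the kind CycleEye provides,
# assuming detections arrive as (label, x, y) offsets in metres relative to the
# nearside front corner of the bus. All names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str   # e.g. "cyclist", "car", "pedestrian"
    x: float     # lateral offset, metres (positive = nearside of the bus)
    y: float     # longitudinal offset, metres (positive = ahead of the cab)

def cyclist_in_danger_zone(detections, lateral_limit=2.0, ahead_limit=6.0, behind_limit=3.0):
    """Return True if any classified cyclist falls inside the alert zone."""
    for d in detections:
        if d.label == "cyclist" and 0.0 <= d.x <= lateral_limit and -behind_limit <= d.y <= ahead_limit:
            return True
    return False

# Example: a cyclist 1 m to the nearside and 2 m behind the cab triggers an alert.
frame = [Detection("car", 3.5, 10.0), Detection("cyclist", 1.0, -2.0)]
if cyclist_in_danger_zone(frame):
    print("ALERT: cyclist in nearside danger zone")
```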
“We’re particularly excited about the next incarnation of CycleEye, evolving it to become a camera mirror system. It’s legal now to use cameras instead of mirrors, so we can provide that functionality too – monitors in the driver’s cab instead of mirrors. That has several benefits. Mirrors, on buses particularly, can be a bit of a liability – they quite often get knocked and sometimes they knock people. They stick out and head strikes are unfortunately quite common. They also get smashed, putting the bus out of service, which is an inconvenience and an operational cost. We think that being able to offer this camera mirror with CycleEye functionality is going to prove attractive to a lot of operators.”
Over what timescale do you expect Level 4 and 5 autonomy to be achieved in the UK and which sectors will be early adopters?
JH: “With CAVForth we’ll be running Level 4 autonomous vehicles, where you’ve got a restricted operational design domain (ODD), in the UK this year. Restricting these vehicles to particular routes or environments lends itself very well to public service, where the vehicles are maintained by an operator. That’s very achievable right now. As well as passenger service vehicles, other service vehicle fleets are easy wins, as well as off-highway stuff like industrial sites. Then you’ve got delivery vehicles.
“When it comes to true Level 5 – go anywhere, do anything vehicles – repair and maintenance is an issue. We know that with privately owned cars, some people maintain them exactly as they should, and other people don’t. There are other complications too – things that people perhaps don’t do that often but like their vehicles to be able to do, like parking in a farmer’s field at a festival – that’s a little bit further out still.
“If you just roll back slightly from true Level 5, if people want a city car or a comfortable car for a long motorway journey, nothing off-road, there’s a case for vehicles which have an autonomous mode. That certainly appeals to me.”
Can you address the concerns about ADAS, particularly handover of control, driver concentration levels and driver deskilling?
JH: “I’m not a big fan of Level 3. If you haven’t been driving for an hour and are suddenly asked to take the wheel because the car has encountered something it can’t handle, it’s just unrealistic. Whereas a Level 4 system, which can put itself into a safe state when it reaches the limits of its ODD – perhaps ready to be restarted in a manual mode when the driver wants to take control – that’s much more practical.
“If there are circumstances when the driver needs to take over then clearly the driver needs to be of a standard that they can drive safely. Once you have widespread adoption of autonomous systems, and people are not driving routinely, there is a risk of driver deskilling. If that were the case you’d really need to look at greater regulation of drivers.
“That said, you can sometimes envisage problems that don’t really transpire. We’ve had cruise control and adaptive cruise control for a while now and I don’t think they’ve had the effect of particularly deskilling drivers. So, with Lane Keep, maybe it’s not such a big deal. Once you get to the point where cars are properly self-driving, there is a danger. If you haven’t got anything to do your mind will wander, that’s human nature, so it is a concern.”
Thanks to LinkedIn, self-driving experts from the UK and New Zealand have united to decry the trolley problem in relation to driverless cars.
Mitchell Gingrich, President of Autonomous Consulting in Christchurch, New Zealand, responded to our interview with Professor John McDermid, Director of the Assuring Autonomy International Programme at the University of York, saying: “Spot on about the trolley problem.”
Professor McDermid had asserted that: “The trolley problem is a nonsense… all these elaborate versions require self-driving vehicles to make distinctions that you or I could not.”
The trolley problem is a thought experiment which runs like this: imagine there’s a runaway trolley and, ahead, five people are tied to the track. You are standing some distance off, next to a lever. If you pull it, the trolley will switch to a track to which only one person is tied. What do you do?
Or, as Professor McDermid puts it: “Who do you save, a child or an older person? The child because they can be expected to live longer and benefit more. However, this is based on false assumptions. I don’t believe in the split second of a crash you go into that sort of thought process – you focus on controlling the vehicle and in most cases the best option is to (try to) stop.”
Gingrich opined that the March 2018 fatal accident involving an Uber Advanced Technology Group (Uber ATG) self-driving vehicle can aid in evaluating the trolley problem. The National Transportation Safety Board (NTSB) in the US recently completed an 18-month-long investigation and concluded there were 20 contributing factors. Some of those concerned the software misclassifying a pedestrian. A significant contributing factor was the safety driver’s inattentiveness.
The trolley problem assumes that a person or system is not only aware of the task of driving but also of the present and future merits of the lives of road users, he says. However, experience demonstrates that, sadly and all too frequently, road users pay the price for a lack of vigilance.
It turns out that Gingrich, a lawyer by trade, has been on quite a journey with autonomous vehicles himself: from working for Uber ATG in Phoenix, and seeing first-hand the fallout from the Elaine Herzberg tragedy, to relocating to New Zealand and setting up Autonomous Consulting to push the case for driverless transport.
“I’m convinced that the future will be autonomous,” he says. “Whether it’s on public roads, in the air or on the seas, we will be utilising autonomous technology to transport our people and goods. That’s what autonomy is promising, but we’re in an interim period.
“New cars have advanced driver assistance systems (ADAS) like lane keep assist and automatic emergency braking. Some of us have been using cruise control for a long time, now it is adaptive – the car will keep its distance. These are autonomous features but not autonomy and we need to educate the public about the difference.
“Autonomy is about safety, resources and the environment. These ADAS systems expect me to pay attention to the road and the robot, and that’s not a recipe for safety. 93-94% of accidents are caused by human error, usually distraction – we think we’re paying attention, but we aren’t. There are repair and maintenance issues too, for example, around the correct calibration of sensors.
“In terms of resources, my personal car is a depreciating asset that isn’t used 90% of the time. Autonomous vehicles will also have a tremendous impact on town planning. An architect in the US imagined Manhattan pedestrianised and it freed up 60% of space.
“My freedom is not challenged by not having a personal vehicle. I’d have more money in my pocket and could use my smartphone to access different vehicles for different purposes.”
Our Zenzic CAM Creator series continues with Raunaq Bose, co-founder of Humanising Autonomy.
Before establishing predictive artificial intelligence (AI) company Humanising Autonomy in 2017, Raunaq Bose studied mechanical engineering at Imperial College London and innovation design engineering at the Royal College of Art. Focusing on the safety of vulnerable road users, Humanising Autonomy aims to redefine how machines and people interact, making cities safer for pedestrians, cyclists and drivers alike.
RB: “Our model is a novel mix of behavioural psychology, deep learning and computer algorithms. We work with OEMs and Tier 1 suppliers on the cameras on vehicles, with the aftermarket on retrofitted dashcams, and also with infrastructure. Our software works on any camera system to look for interactions between vulnerable road users, vehicles and infrastructure in order to prevent accidents and near misses. While most AI companies use black box systems where you can’t understand why decisions are made, we set out to make our models more interpretable, ethically compliant and safety friendly.
“When it comes to questions like ‘Is this pedestrian going to cross the road?’, we look at body language and factors like how close they are to the edge of the pavement. We then put a percentage on the intention. Take distraction, for example: we cannot see it, but we can infer it. Are they on the phone? Are they looking at the oncoming vehicle? Is their view blocked? These are all behaviours you can see and our algorithm identifies them and puts a numerical value on them. So we can say, for example, we’re 60% sure that this pedestrian is going to cross. This less binary approach is important in building trust – you don’t want lots of false positives, for the system to be pinging all the time.
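To illustrate the kind of scoring Bose describes, here is a minimal sketch, assuming a handful of observable cues combined into a single crossing-intent probability; the features, weights and output are invented for the example and are not Humanising Autonomy’s model.

```python
# Illustrative only: turning observable cues into a crossing-intent percentage.
# Weights and features are hypothetical, chosen purely to show the approach.
import math

def crossing_intent(distance_to_kerb_m, facing_traffic, on_phone, moving_towards_kerb):
    # Each cue nudges a score up or down; the logistic squashes it into [0, 1].
    score = (
        -0.9 * distance_to_kerb_m                 # closer to the kerb edge -> more likely to cross
        + 1.2 * (1 if moving_towards_kerb else 0)
        + 0.6 * (1 if facing_traffic else 0)      # checking the oncoming vehicle
        - 0.8 * (1 if on_phone else 0)            # distraction, inferred rather than seen directly
    )
    return 1.0 / (1.0 + math.exp(-score))

p = crossing_intent(distance_to_kerb_m=0.4, facing_traffic=True, on_phone=False, moving_towards_kerb=True)
print(f"Estimated crossing intent: {p:.0%}")      # e.g. "Estimated crossing intent: 81%"
```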
“One of the main things we’re likely to see over the next decade is increased use of micromobility, such as cycling and e-scootering. At the same time you will see more communication between these different types of transportation, and also with vehicles and infrastructure. The whole point of ADAS is to augment the driver’s vision, to reduce blind spots and, if necessary, take control of the vehicle to avoid a shunt. Then there’s the EU agreement that by 2022 all buses and trucks must have safety features to detect and warn of vulnerable road users.
“We currently only look at what’s outside the vehicle, but with self-driving there will be monitoring of the cabin. In terms of privacy, we have a lot of documentation about our GDPR processes and how we safeguard our data. Importantly, we never identify people – for example, we never watch for a particular individual between camera streams. We look to the future with autonomous cars but for now we’re focused on what’s on the road today.”
Our Zenzic CAM Creator series continues with Dr Charlie Wartnaby, chief engineer at Applus IDIADA.
Way back in 2019 we covered IDIADA’s role in the construction of the new CAVWAY testing facility, and that investment continued with a large new venture. With a PhD in physical chemistry from the University of Cambridge, Charlie Wartnaby was technical lead for the ground-breaking Multi-Car Collision Avoidance (MuCCA) project.
CW: “Certainly the funding from the Centre for Connected and Autonomous Vehicles (CCAV) for MuCCA and CAVWAY was a big win for us. Traditionally, we’d focused on automotive electrics and engine management, but we could see there was all this exciting CAV work. Now we’re working with an OEM I can’t name to run a field operational test using our IDAPT development tool – a high performance computer with GPS and car-to-car communications – as a spin-off from MuCCA.
“With the MuCCA project, we think we achieved a world first by having multiple full-sized vehicles do real-time cooperative collision avoidance. We still have the cars for further R&D when time, budget and Covid allow.
“In the UK, we’re focussed on building a new proving ground (CAVWAY) near Oxford, which should open in 2021. There’s also our CAVRide Level4 taxi project, at our headquarters near Barcelona. CAVRide shares some of the technology developed for MuCCA and they’ve done some really interesting vehicle-in-the-loop testing, having the real vehicle avoid virtual actors in a simulation environment.
“In the short term, we’re really working hard on the C in CAV. Connected vehicles offer massive safety and efficiency improvements, for example, by warning about stopped vehicles or advising on speed to get through traffic lights on green. There’s a bit of a VHS versus Betamax situation, with both WiFi-based short-range communications and the C-V2X 5G-based protocol, so we’ve upgraded IDAPT to support both.
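As a rough illustration of what supporting both protocol families can look like in software, here is a minimal sketch of a protocol-agnostic interface with two interchangeable radio backends; the class and method names are hypothetical and are not the actual IDAPT API.

```python
# Illustrative only: one way application code can stay agnostic to the
# "VHS versus Betamax" choice between 802.11p/ITS-G5-style short-range radio
# and C-V2X. Names are invented for the example.
from abc import ABC, abstractmethod

class V2XRadio(ABC):
    @abstractmethod
    def broadcast(self, message: bytes) -> None: ...

class ItsG5Radio(V2XRadio):
    def broadcast(self, message: bytes) -> None:
        print(f"[ITS-G5] tx {len(message)} bytes")   # stand-in for the real radio driver

class CV2XRadio(V2XRadio):
    def broadcast(self, message: bytes) -> None:
        print(f"[C-V2X] tx {len(message)} bytes")    # stand-in for the real radio driver

def warn_stopped_vehicle(radios: list[V2XRadio]) -> None:
    # Application logic (e.g. a stopped-vehicle warning) is written once,
    # then broadcast over whichever radios the test platform carries.
    for radio in radios:
        radio.broadcast(b"STOPPED_VEHICLE_AHEAD")

warn_stopped_vehicle([ItsG5Radio(), CV2XRadio()])
```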
“Personally I think that while heroic work by the big players shows robotaxi applications are feasible, economic viability is a long way off, 2030 maybe. Watch the latest Zoox and Waymo videos from America, they’re mesmerising! No way is that kind of tech going to be installed in private cars any time soon because it’s eye-wateringly expensive. Think about the costs involved in making every taxi driverless – it’d be out of all proportion to replacing driver salaries, especially considering backup teleoperators and maintenance and charging personnel.
“These big self-driving companies aren’t operating in the UK yet, but we do have very successful smaller players with intellectual property to sell. The UK government has been supporting a good number of R&D projects, via the CCAV and UK Research and Innovation (UKRI), and the regulatory environment has been reasonably friendly so far.
“I feel the first practical applications are likely to be low-speed shuttle buses and small autonomous delivery droids, but trucking is a very important area. If lorry drivers were permitted to stop their tachographs while napping in the back of the cab once on the motorway – only clocking up hours for parts of long journeys – that would make a viable economic case for a Level4 operating design domain (ODD) of ‘just motorways’, which is harder to justify merely as a convenience feature in private cars.
“In terms of current tech, emergency lane keeping systems (ELK), to stop drifting, are a major breakthrough, requiring cameras, sensors and autonomous steering. I welcome the road safety benefits. However, if drivers engage automation systems like ALKS (automated lane keeping) by habit, their skills will surely be affected. Perhaps there’s a case for the system enforcing some periods of manual driving, just as airline pilots perform manual landings to stay in practice even in planes that can land themselves.
“Concerns about timely handover are well-founded and I think there’s an industry consensus now that Level3 is not reasonable if it requires quick driver intervention. We see up to 20 seconds before some unprepared drivers are properly in control when asked to resume unexpectedly. It really requires that the vehicle can get itself into (or remain in) a safe state by itself, or at least there needs to be a generous takeover period. The difference between L3 and L4 is that the latter must always be able to achieve that safe state.”
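To make the L3/L4 distinction concrete, here is a toy sketch of the handover logic Wartnaby describes, assuming a generous 20-second takeover window after which the system falls back to a safe state on its own; the states and timings are illustrative, not any particular vehicle’s logic.

```python
# Illustrative only: a toy state machine for transition demands. A Level 4 system
# never depends on the driver responding in time; it achieves a safe state itself.
from enum import Enum, auto

class Mode(Enum):
    AUTOMATED = auto()
    TRANSITION_DEMAND = auto()        # driver has been asked to resume control
    MANUAL = auto()
    MINIMAL_RISK_MANOEUVRE = auto()   # the "safe state", e.g. slow and stop safely

TAKEOVER_WINDOW_S = 20.0              # matches the worst unprepared-driver times quoted

def step(mode: Mode, seconds_since_demand: float, driver_has_taken_over: bool) -> Mode:
    if mode == Mode.TRANSITION_DEMAND:
        if driver_has_taken_over:
            return Mode.MANUAL
        if seconds_since_demand >= TAKEOVER_WINDOW_S:
            return Mode.MINIMAL_RISK_MANOEUVRE   # fall back without the driver
    return mode

print(step(Mode.TRANSITION_DEMAND, 5.0, driver_has_taken_over=False))   # still waiting
print(step(Mode.TRANSITION_DEMAND, 21.0, driver_has_taken_over=False))  # minimal risk manoeuvre
```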
Our Zenzic CAM Creator series continues with Professor John McDermid OBE FREng, Director of the Assuring Autonomy International Programme at the University of York.
Professor John McDermid has been Director of the Assuring Autonomy International Programme, a partnership between Lloyd’s Register Foundation and the University of York, since 2018. He advises government and industry on safety and software standards, including Five and the Ministry of Defence, and was awarded an OBE in 2010. The author of 400 published papers, he wrote a 2019 article, Self-driving cars: why we can’t expect them to be ‘moral’, which was highly critical of the oft-quoted trolley problem in relation to driverless vehicles.
PJM: “I’ve been at York for 30 years working on the safety of complex computer-controlled systems. What you define as complex changes all the time. In January 2018 we started a new programme, looking at the assurance of robots and autonomous systems, including automated mobility, but also robots in factories, healthcare and mining.
“It’s important to demonstrate the safety and security of novel technologies like machine learning, but there’s often a trade-off involved, because you can make things so secure they become unusable. If I open my car with the remote key I have a couple of minutes before it automatically locks again, and there’s a small possibility that someone could get their finger trapped if they try to open the door just as it automatically re-locks. We encounter these types of trade-offs all the time.”
What major shifts in UK transport do you expect over the next 10-15 years?
PJM: “Over the next decade we will get to Level4 autonomous driving, so in defined parts of the road network cars will drive themselves. We will solve the safety problems of that technology, but I’d be surprised if it is within five years. Despite the rhetoric, Tesla’s approach is not on track for safe autonomous driving within the year.
“At the same time, there will be a trend towards Mobility as a Service (MAAS). I love my car, but I’ve had it for 18 months and have only driven 7,000 miles. I sometimes ask myself why I have this expensive piece of machinery. A recent study showed that the average car in the UK is only used for 53 minutes a day. Mostly, they sit doing nothing, which, considering the huge environmental impact of manufacturing all these vehicles, is very wasteful.
“If I could call upon a reliable autonomous vehicle and be 99% certain that it would arrive in a timely manner, say within five minutes, I’d probably give up my car. It should also be noted that the two trends go hand-in-hand. Having Level4 is critical to achieving MAAS, delivering all the convenience of having your own car without any of the hassle.”
Can you address some of the data privacy concerns surrounding connected cars?
PJM: “We are back to this issue of trade-offs again. I want my MAAS so I’ve called it up and given the service provider some information about where I am. If they delete that information after I’ve paid then I’m prepared to accept that. What if the company wants to keep the information but won’t allow access except for law enforcement – would that be acceptable to the public? What can government agencies require this company to do?
“Another example: What if your 10-year-old daughter needs MAAS to take her to school? A reasonable concerned parent should be able to track that. What if the parents are divorced, can they both access that data? There’s clearly a privacy issue and there needs to be a legislative framework, but it’s a balance. For the purposes of getting from A to B, most people would accept it, so long as their data is normally kept private.”
Can you address concerns about the trolley problem in relation to self-driving cars?
PJM: “My basic feeling is that the trolley problem is a nonsense, a distraction. All these elaborate versions require self-driving vehicles to make distinctions that you or I could not.
“The big Massachusetts Institute of Technology (MIT) study sets a higher standard for autonomous vehicles than any human can manage. Who do you save, a child or an older person? The child because they can be expected to live longer and benefit more. However, this is based on false assumptions. I don’t believe in the split second of a crash you go into that sort of thought process – you focus on controlling the vehicle and in most cases the best option is to (try to) stop.
“I don’t know why people find the trolley problem so compelling, why they waste so much energy on it. I really wish it would go away. Fortunately, most people seem to be coming to that conclusion, although one of our philosophy lecturers strongly disagrees with me.”
Which sectors do you think will adopt self-driving first?
PJM: “Farming applications might come first, as agriculture is short of people and the problems are simpler to overcome. If you geofence a field where you wish to use a combine harvester and equip it with technology so it doesn’t run over a dog lying asleep in the field – there’s already tech which is getting quite close to that – then that’s an attractive solution.
“Last mile freight via small delivery robots (like Nuro in the US and Starship here in the UK) might also come quickly, but longer distance freight will probably require a segregated lane. Even last mile robots come with risks, like people tripping over them.
“There’s a lot of commercial desire for robotaxis, and this is potentially a very big market. There are already genuine driverless taxis in the US now, but they have a much simpler road structure than here in the UK.
“The crucial technical bit is finding accepted ways of assuring the machine learning. I would say that – I work on it – but without that, regulators and insurers won’t allow it.”