The first quarter of 2022 has seen two giant leaps forward for self-driving in America. First, in February, General Motors–backed Cruise started offering robotaxi rides to the public in San Francisco… with no safety driver.
Then, in March, the National Highway Traffic Safety Administration (NHTSA) removed the requirement for autonomous vehicles to have manual controls, including, notably, a steering wheel.
Cruise self-driving robotaxi
Cruise posted a video showing consumers’ reactions to riding in a truly driverless taxi – they ranged from “This is so cool” to “Just weird”, “Slightly scary” to “A lot smoother than I was expecting”, and probably most astutely: “I am literally witnessing the future”.
General Motors (GM) chief executive, Mary Barra, told shareholders: “This major milestone brings Cruise even closer to offering its first paid rides and generating $50bn in annual revenue by the end of the decade.”
Make no mistake, this is a significant development: A household-name US vehicle manufacturer (VM) operating a driverless taxi with no safety driver in a popular global tourist destination.
Not just any old city either – the streets of San Francisco, so closely associated with the iconic high speed car chase from the Steve McQueen film Bullitt. For fans of burning rubber and squealing brakes, it will be hard to take, but that was 1968, over half a century ago. V8 Ford Mustangs and Dodge Chargers are history. Self-driving cars are the future.
If you need further convincing, you need only look to the historic NHTSA announcement, on 10 March 2022, eliminating the need for manufacturers to equip fully autonomous vehicles with a steering wheel.
It’s something we were speculating about at Cars of the Future just last summer – when we looked at Audi’s Grandsphere concept car, with a steering wheel which folds neatly away when in hands-free mode. It’s also a startling indicator of just how rapidly this industry is moving.
Audi Grandsphere self-driving concept
US self-driving law change
The legislative change follows lobbying by General Motors and updates the Federal Motor Vehicle Safety Standards related to occupant protection in vehicles with an automated driving system (ADS).
NHTSA Deputy Administrator, Steven Cliff, commented: “As the driver changes from a person to a machine in ADS-equipped vehicles, the need to keep the humans safe remains the same and must be integrated from the beginning.”
America is surging ahead in self-driving and if the UK wants to remain “at the forefront of this change”, as the Government says, we’d better get our skates on.
Neil Kennett reviews the CAM Innovators self-driving industry event in London, March 2022
As my first industry do in London for two years, the Zenzic Connected and Automated Mobility (CAM) Innovators event 2022 was always going to be memorable. Actually, it was much better than that. It was a fantastic day packed with astute analysis and exciting announcements about self-driving in the UK.
It was also a reminder of the shared vision – the belief that we’re on the cusp of something momentous, that this technology can deliver seismic safety and societal benefits. And this is no pipedream. Thanks to a lot of hard work over many years by an array of seriously talented people, there’s a detailed Roadmap of exactly how we’ll get there.
Let’s talk self-driving
For starters, we couldn’t have wished for a more impressive venue – The Institution of Engineering and Technology (IET) on The Embankment, near Waterloo Bridge. Passing the statue of Michael Faraday, the father of electromagnetism, I bumped into a former colleague before I’d even reached the front door. How nice to see Tom Flisher of Thatcham – a real live human – after all the remote communications of the pandemic.
Hands up, I missed the morning sessions on cyber resilience, vehicle to everything (V2X) and the Interoperable Simulation project. Catching the fast train into London to attend a real-world event is, admittedly, more time-consuming and expensive than clicking into a Teams meeting.
We’ll look at the Interoperable Simulation project in more detail another day as it’s a prime example of joined-up thinking, designed to enable seamless testing across the CAM Testbed UK facilities.
The main reason for attending, I thought, was to hear about the latest six UK-based companies selected for Zenzic’s CAM Scale-Up Programme – a business accelerator for almost ready-for-market products and services that can “meet required safety standards and operate in real-world environments”. There’s also the small matter of sharing £500,000 of government funding.
Of the six winners announced in October 2021, four are London-based: geolocation solution provider Albora; Intelligent CCTV designer Exeros; sensor fusion system developer Grayscale AI; and insurance claims visualiser Xtract 360. The other two are: Cambridge-based vulnerable road user safety specialist R4DAR; and Cardiff-based real-time movement experts Route Konnect.
Each will be supported by the UK government – via the Department for Transport’s Centre for Connected and Autonomous Vehicles (CCAV) – and innovation platform Plug and Play. They’ll get time at the testbeds, benefit from introductions to corporate partners (including Honda, Thales and Vodafone), and gain access to a global network investor platform. Watch this space for in-depth profiles.
The curious among you will have noted the “I thought” a couple of paragraphs ago. Of course, hearing from these exceptional innovators was great, but the best was yet to come.
Wired editor and futurist Jeremy White talks self-driving at the CAM Innovators event 2022
Following an entertaining whip through automotive history with Wired editor and futurist Jeremy White – who urged the self-driving industry to “hurry up!” and make connected and automated mobility a reality – we adjourned to the Haslett & Flowers room for networking drinks.
And that’s where the magic happened: Talking shop and shooting the breeze with people I’d just met, connected with on LinkedIn, interviewed on Zoom, been on mute with for hours. That’s where you hear the backstories and inspirations, discover obscure but pertinent bits of information, and see early signs of the next big things.
A maelstrom of tech wizards and engineers, CEOs and interns, the odd safety campaigner and motoring hack, most cautious about over-promising but overwhelmingly excited and optimistic about the fast-approaching road transport revolution.
That’s what self-driving industry events are all about. That’s what we’ve been missing.
Relationship between driverless cars, the media and consumer confidence reflected in five hyperbolic headlines.
Self-driving experts talk constantly of the need to earn public trust, but driverless cars continue to divide opinion. Indeed, recent surveys have shown that people are becoming more, not less, wary of them.
Just this week, in Inverness, Scotland, where an autonomous bus trial is due to start later this year, The Inverness Courier reported significant resistance to the idea. 69% of respondents to its survey of local residents said they would refuse to get on a driverless bus.
What we need, of course, is for the media to convey an informed and nuanced safety message. Hmm! To illustrate the scale of the task, here’s a list of our top five hyperbolic headlines:
On a mission to drive more sensible debate about self-driving, Carsofthefuture.co.uk has renewed its media partnership with Reuters Events for the Auto Tech 2022 digital conference on 14-15 June.
The prestigious two-day online event will enable technology providers and automotive companies to meet and do business with vehicle manufacturers (VMs) including Audi, BMW, Cadillac, Daimler, Fisker, Ford, GM, Honda, Hyundai, Nissan, Opel and Toyota.
Confirmed speakers include Mercedes Benz Mobility chief executive Franz Reiner, Hyundai Motor Company chief safety officer Brian Latouf, Polestar chief operating officer Dennis Nobelius and Lucid Motors vice president of software validation Margaret Burgraff.
Self-driving and AI at Auto Tech 2022
As part of The Key Steps Towards Safer Roads programme on 15 June, Carsofthefuture.co.uk editor Neil Kennett will moderate two sessions: 1) “The Growing Presence of AI”, with Sammy Omari, vice president of autonomy at Motional; and 2) “Where are we on the journey to full automation?”, with Xinzhou Wu, head of Xpeng Motors’ Autonomous Driving Centre.
The former will cover the value of artificial intelligence in testing cutting-edge systems and its role in autonomous vehicle (AV) decision making. The latter will cover autonomous driving, in-car connectivity and advanced driver assistance systems (ADAS), evaluating the progress made and exploring when carmakers expect to introduce fully automated features.
Carsofthefuture.co.uk editor, Neil Kennett, said: “I’m delighted to renew our partnership with Reuters and look forward to lively discussions about these phenomenal but controversial technologies. It’s a shame, given everything Tesla’s done for electric cars, that so many hyperbolic headlines are caused by its confusingly-named Full Self-Driving (FSD) package. It simply isn’t self-driving as the rest of the industry understands it.
“Conflating assisted and automated driving is dangerous, because it risks drivers misunderstanding what their cars are capable of. News of so-called driverless car crashes then dents consumer confidence – the last thing the industry needs at such a crucial time in terms of public perception. These are safety-critical issues and utmost clarity is vital. For the near future at least, the best advice is that drivers need to be alert at all times.”
Nabil Awan, automotive conference producer at Reuters Events, added: “The moves towards greater connectivity and autonomy that we are seeing will lead to safer roads while also deeply transforming the auto industry as we know it today. Our unique Auto Tech 2022 event will give innovators and technology providers a chance to discuss the latest advances and come away with valuable intelligence with which to drive the evolution of the sector.”
Carsofthefuture editor will moderate sessions on self-driving and AI at Auto Tech 2022
A few messages flew back and forth, and it transpired that he’s an expert in measuring driver behaviour, particularly driver-vehicle interactions in ADAS-equipped and self-driving vehicles. That was music to my ears, so we arranged a Zoom. What follows is the highly insightful result.
Lucas Noldus Ph.D., Founder of Noldus Information Technology
LN: “The future starts here. The world is changing. We see people living longer and there are more and more interactive devices – telephones, tablets, dashboards – with which we can interact, leading to greater risk of distraction while driving. I know personally how tempting it is to use these devices, even when trying to keep your eyes on the road.
“We already have fascinating developments in connected driving and now, with self-driving, the role of the driver changes significantly. That has triggered research institutes, universities, OEMs and tier one suppliers to pay more attention to the user experience for both drivers and passengers.
“All these experiences are important because how people perceive safety and comfort will influence their buying decisions, and their recommendations to other potential users.
“For autonomous driving, how far will we go towards level five? What happens at the intermediate stages? Over the coming decades, driving tasks will gradually diminish but, until full autonomy, the driver will have to remain on standby, ready to take over in certain situations. How will the vehicle know the driver is available? How quickly can he take over? These are the topics we’re involved in as a technology company.
“We make tools to allow automotive researchers to keep the human in the loop. Traditionally, automotive research focused exclusively on improving the vehicle – better engines, drivetrains etc. Until recently, nobody paid much attention to the human being (with a brain, skeletal system, muscles, motor functions), who needs to process information through his sensory organs, draw the right conclusions and take actions.
“Now, these aspects are getting more attention, especially in relation to reduced capacity, whether due to a distracting device, drugs, alcohol or neurodegeneration. As you get older your response time becomes longer, your eyesight and hearing abilities reduce, as does the speed at which you can process information.
“These are the challenges that researchers in automotive are looking at concerning the role of the driver, now and in the future. If the automated or semi-automated system wants to give control back to the driver because its AI algorithms decide a situation is too complex, can the driver safely take over while he’s been doing something like reading or taking a nap? How many milliseconds does the brain need to be alert again?
NK: “Draft legislation seems to be proceeding on a 10-second rule, but some studies say at least 30 seconds is required.”
LN: “Situational awareness – that’s a key word in this business. Not only where am I geographically, but in what situation. Oh, I’m in a situation where the road surface is very wet, there’s a vehicle just in front of me, the exit I need is near and I’m in the wrong lane. Understanding a situation like that takes time.
“If we take a helicopter view, from our perspective as a technology company, what should be measured to understand the driver behaviour? Which sensors should we use to pick up that information? If we use a microphone, a video camera, a heartbeat monitor and a link to the ECU, how do we synchronise that?
“That’s not trivial because one sensor may be sampling at 300Hz and another at 25 frames per second. That’s something my company has specialised in over the years. We’re very good at merging data from different sources, whether it’s a driving simulator continuously spitting out data, a real car, or sensors mounted in the infrastructure.
“You then need to analyse that data and pull out meaningful quantitative units that give you actionable insights. Generating large matrices is no big deal, making sense of that information is the real challenge.
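To make the synchronisation problem concrete, here is a minimal sketch – not Noldus code, and the sensor names, rates and signals are all invented for illustration – of aligning a 25 fps video-annotation stream with a 300Hz physiology signal so the two can be analysed on one timeline:

```python
import numpy as np

def resample_to_timeline(src_times, src_values, target_times):
    """Sample-and-hold resampling: for each target timestamp, take the
    most recent source sample at or before it."""
    idx = np.searchsorted(src_times, target_times, side="right") - 1
    idx = np.clip(idx, 0, len(src_values) - 1)
    return src_values[idx]

duration = 2.0                                  # seconds of recording
t_ecg = np.arange(0, duration, 1 / 300.0)       # 300 Hz physiology timeline
t_video = np.arange(0, duration, 1 / 25.0)      # 25 fps video timeline

ecg = np.sin(2 * np.pi * 1.2 * t_ecg)           # placeholder ECG-like trace
gaze_on_road = (t_video % 1.0) < 0.8            # placeholder per-frame label

# Align the 25 fps annotation with the 300 Hz signal for joint analysis.
gaze_300 = resample_to_timeline(t_video, gaze_on_road, t_ecg)
print(len(ecg), len(gaze_300))  # → 600 600
```

Production tools handle clock drift, dropped frames and many more streams, but the core idea – mapping every stream onto a common time base before analysis – is the same.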
“For example, in dashboard design, a manufacturer might be comparing two or three displays of road quality. A driver behaviour study with our tools will give the designer a clear answer on which design leads to the least cognitive workload, the least confusion.
Noldus DriveLab
“This same technical challenge can be applied to a vast number of design objectives. The vehicle manufacturer might be looking to make incremental improvements to, say, the readability of the dashboard under certain light conditions. Or they might be working on a completely new feature, like an intelligent personal in-car assistant. A number of brands are working on that, but the concept is still relatively new.
“You cannot test every scenario on the road, it’s just too dangerous, so we work with simulator manufacturers too. On the road or in the lab, we can measure a driver’s actions with eye-tracker, audio, video, face-reader and physiology in one.”
NK: “Back to LinkedIn again, I saw a post by Perry McCarthy, the F1 driver and original Stig on Top Gear, who said something like: Simulators are getting so good these days, when you make a mistake they drop three tonnes of bricks on your legs!”
LN: “You have so-called high fidelity and low fidelity simulators – the higher the fidelity, the closer you get to the real vehicle behaviour on the road, and there are all sorts of metrics to benchmark responsiveness.
“You have simple fixed-base simulators right up to motion-based simulators which can rotate, pitch and roll, move forward, backwards, sideways and up and down. For the best ones you’re talking about 10 million euros.
“We work with OEMs, tier one suppliers, research institutes and simulator manufacturers to build in our DriveLab software platform. We also advise on which sensors are recommended depending on what aspects of driver behaviour they want to study.
“We try to capture all the driver-vehicle interactions, so if he pushes a pedal, changes gear or turns the steering wheel, that’s all recorded and fed into the data stream. We can also record their body motion, facial expression, what they’re saying and how they’re saying it – it all tells us something about their mental state.
Multi-camera eye tracker (Smart Eye)
“Eye tracking measures the point of gaze – what your pupils are focused on. In a vehicle, that might be the left, right and rear mirrors, through the windscreen or windows, around the interior, even looking back over your shoulders. To capture all that you need multiple eye-tracking cameras. If you just want to look at, for example, how the driver perceives distance to the car in front, you can make do with just two cameras rather than six.
“Eye tracking generates all sorts of data. How long the eyes have been looking at something is called dwell time. Then there’s the direction the eyes are looking in and how fast they move from one fixation to another – that’s the saccade. People doing eye-tracking research measure saccades in milliseconds.
“Another important metric is pupil diameter. If the light intensity goes up, the pupil diameter decreases. Given a stable light condition, the diameter of your pupil says something about the cognitive load to your brain – the harder you have to think, the wider your pupils will open. If you’re tired, your blink rate will go up. There’s a normal natural blink rate to refresh the fluid on your eyes with a fully awake person, but if you’re falling asleep the blink rate changes. It’s a very useful instrument.
“Then there are body-worn sensors that measure physiology. It’s harder to do in-car, but in a lab people don’t mind wearing electromyography (EMG) sensors to measure muscle tension. If you’re a designer and you want to know how easy it is for an 80-year-old lady to operate a gearshift, you need to know how much muscle power she has to exert.
“We also measure the pulse rate with a technique called photoplethysmography (PPG), like in a sports watch. From the PPG signal you can derive the heart rate (HR). However, a more accurate method is an electrocardiogram (ECG), which is based on the electrical activity of the heart.
GSR (EDA) measurement
“Further still, we measure galvanic skin response (GSR), also called electrodermal activity (EDA), the level of sweating of your skin. The more nervous you get, the more you sweat. If you’re a bit late braking approaching a traffic jam, your GSR level will jump up. A few body parts are really good for capturing GSR – the wrist, palm, fingers, and the foot.
“We also measure oxygen saturation in the blood with near infrared spectroscopy (NIRS) and brain activity with an electroencephalogram (EEG). Both EEG and NIRS show which brain region is activated.
“Another incredibly useful technique is face reading. Simply by pointing a video camera at someone’s face we can plot 500 points – the surroundings of the eyebrows, the eyelids, the nose, chin, mouth, lips. We feed this into a neural network model and classify it against a database of tens of thousands of annotated images, allowing us to identify basic emotions – happy, sad, angry, surprised, disgusted, scared or neutral. You can capture that from one photograph. For other states, like boredom or confusion, you need a series of images.
“These days we can even capture the heart rate just by looking at the face – tiny changes in colour resulting from the pulsation of the blood vessels in the skin. This field of face reading is evolving every year and I dare to claim that we are leading the pack with our tool.
“Doing this in the lab is one thing, doing it in a real car is another challenge, being able to keep your focus on the driver’s face and deal with variable backgrounds. Of course, cars also drive at night so the next question is can you do all this in darkness? We turned our company van into an instrumented vehicle and my sons agreed to be the guinea pigs.
“It took some work – overcoming the issue of light striking the face and causing sharp shadows, for instance – but we can now use infrared illuminators with our FaceReader software to make these measurements in full darkness.
“The turning of the head is also very important in studying distraction, for example, if the driver looks sideways for too long, or nods their head in sleepiness. When something shocks someone, we see the face change and the blood pressure rise, and these readings are synchronised in DriveLab.
“It is well proven that even things like changing radio station can be very distracting. Taking your eyes off the road for just a few seconds is dangerous. As we move to more and more connected devices, touchscreens and voice commands, minimising distraction is vital to ensure safety.”
NK: “I absolutely love this tech but what I actually drive is a 7-year-old Suzuki Swift Sport with a petrol engine and a manual gearbox, and I quite like it that way.”
LN: “I’m doing research on cars of the future with my software but I am personally driving a 30-year-old soft-top Saab 900. That’s my ultimate relaxation, getting away from high tech for a moment.
“At Noldus, we’re constantly pushing the boundaries of research, working with top level organisations in automotive – Bosch, Cat, Daimler, Fiat, Honda, Isuzu, Land Rover, Mazda, Nissan, Scania, Skoda, Toyota, Valeo and Volvo, to name just a few – and also with the Netherlands Aerospace Centre (NLR) and the Maritime Research Institute Netherlands (MARIN).
“Our aim is to make it so that the client doesn’t have to worry about things like hardware-to-software connections – we do that for them so they can focus on their research or design challenge.”
Ahead of a flagship product launch later this week, Bill McKinley, Automotive Strategic Planner at Keysight Technologies, gives his thoughts on self-driving and the fast-changing connected and autonomous vehicle (CAV) landscape.
Avid readers may remember that Bill was on the panel I hosted at the Small Cells World Summit in May. He’s got 30+ years’ experience in wireless communications and his current focus is developing test solutions for the automotive sector.
BM: “The UK, in line with other nations around the world, is investing heavily in connectivity and electrification – both the vehicles themselves and the charging infrastructure. Connected vehicles have been demonstrated to enhance safety via cellular vehicle to everything (C-V2X) and dedicated short-range communication (DSRC).
“These technologies allow for more efficient driving, for example, by routing to avoid accidents or poor road conditions. They also enable higher levels of automation, all of which can lead to an improved overall driving experience.
“It is likely that the first fully automated vehicles will be delivery vehicles, controlled environment shuttle type services, and buses on specific routes. With the gradual introduction of robotaxis, we will no doubt start to see Mobility as a Service (MaaS) become more common over the next 10-15 years.
“Keysight was the first test and measurement company to be awarded Global Certification Forum (GCF) validation for C-V2X RF conformance. We have industry leading validated test cases for the C-V2X protocol conformance test, and we were the first to be awarded OmniAir Qualified Test Equipment (OQTE) status.
“Cybersecurity will play a critical role in connected mobility and Keysight is working with leading companies and organisations in this space to develop solutions to ensure vehicular communications remain safe and robust against attacks.
“Clearly, the main risks associated with self-driving vehicles are around the safety aspects, which in turn will heavily influence public acceptance of the technology. We are all very familiar with some of the headlines about Tesla vehicles.
“It remains incredibly challenging to overcome the complexities of urban automated driving, but things are improving all the time. Our autonomous driving emulator (ADE) system is designed with this in mind – to test many autonomous drive systems in a rich and authentic environment within the lab, before moving testing out into the field.”
More on that to follow soon. For further info see keysight.com
Navtech Radar puts figures on the benefits of port automation including reduced operating expenses and labour costs
While self-driving cars await a legislative framework, this ground-breaking technology is already being deployed in off-road settings. Ports are a good example and Madelen Shepherd, Growth Marketing Manager at Navtech, sets out a strong business case.
MS: “Ports are complicated operations and automation can massively improve efficiency, so we’ve been doing some financial analysis on the quantification of value. The benefits fall into three main areas: 1) reduced operating expenses; 2) reduced labour requirements; and 3) productivity increases.”
According to Navtech’s research, benefits resulting from port automation include a 31% reduction in operating expenses, a 40% reduction in labour costs and a 21% increase in productivity.
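As a back-of-envelope illustration, Navtech’s percentages can be applied to a hypothetical port’s annual figures. The baseline numbers below are invented, and note that in a real model labour costs would typically sit inside, not alongside, operating expenses:

```python
baseline = {
    "operating_expenses": 100_000_000,  # USD/year, hypothetical
    "labour_costs":        40_000_000,  # USD/year, hypothetical
    "teu_throughput":       2_000_000,  # containers/year, hypothetical
}

OPEX_REDUCTION   = 0.31  # 31% reduction in operating expenses
LABOUR_REDUCTION = 0.40  # 40% reduction in labour costs
PRODUCTIVITY_UP  = 0.21  # 21% increase in productivity

opex_saving   = baseline["operating_expenses"] * OPEX_REDUCTION
labour_saving = baseline["labour_costs"] * LABOUR_REDUCTION
extra_teu     = baseline["teu_throughput"] * PRODUCTIVITY_UP

print(f"Opex saving:      ${opex_saving:,.0f}/year")     # $31,000,000/year
print(f"Labour saving:    ${labour_saving:,.0f}/year")   # $16,000,000/year
print(f"Extra throughput: {extra_teu:,.0f} TEU/year")    # 420,000 TEU/year
```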
Automation at ports delivers significant cost savings
MS: “This kind of financial modelling is important for Navtech to demonstrate that our products are viable, but it also provides a compelling argument for automation in general.
“The findings are based on averages from multiple quotes, although there was quite a large range on the reduction in operating expenses, from around 25% up to 50%.
“Currently, only 3% of the world’s ports are automated, but the rate of growth is now exponential. Key drivers include the rise of megaships and the growth of next-day delivery.
“About 80% of the world’s goods go through ports. There’s already time pressure on everything and the increasing global population equals ever increasing demand.
“New ports are a massive investment. For example, the first phase of the Tuas project in Singapore, which will create the world’s largest container terminal, is nearly complete and has already cost $1.76bn. There are three more phases to come.
“Of course, any cost benefit analysis must also include risks. If you’re retrofitting an existing port, how much is installation going to disrupt operations? What about the social impact of job losses or a shift in employment profile? Are the new jobs higher paid or more secure? How much time and money would an infrastructure-free solution save in operational downtime during installation compared to an infrastructure dependent solution?
“Automation has created so-called ghost ports, which are largely human-free, so there are clear safety benefits. And with automation you get remote operation, so maybe one person can now operate two straddle carriers.
“Also, operating bulky vehicles like terminal tractors can require an additional member of staff to supervise the movement. By using technological solutions – installing sensors which act beyond human capabilities – that’s no longer necessary.
“Terran360, an infrastructure-free localisation solution, delivers a detailed 360-degree map made up of around 400 slices and uploads this to a cloud-based server. The vehicle drives down a route continually scanning all these different landmarks.
“We’re always looking for new partners in the shipping world and other industrial settings. This kind of radar is perfect for self-driving cars too, so that’s another exciting growth area.”
Dr Basu issues stark warning on need to earn public trust in self-driving technology.
Dr Subhajit Basu, of The University of Leeds’ School of Law, is a lawyer with impeccable credentials and a strong sense of public duty… and he’s got serious concerns about “handover” – the moment when a self-driving vehicle transfers control back to a human driver.
An editor at The International Review of Law, a Fellow of the Royal Society of Arts (RSA), and Chair of the British and Irish Law Education Technology Association (BILETA), he recently supervised research into “Legal issues in automated vehicles: critically considering the potential role of consent and interactive digital interfaces”.
The report, first published in the prestigious Nature journal, concluded that: “An urgent investigation is needed into the technology that allows self-driving cars to communicate with their operators”. Why? Because the “digital interfaces may be unable to adequately communicate safety and legal information, which could result in accidents”.
That is a stark warning indeed and Dr Basu believes the Government and the automotive industry need to be much more up-front about the issues.
Legal issues in automated vehicles report
SB: “The main safety messages surround the extreme difficulty most drivers will encounter when an autonomous vehicle suddenly transfers the driving back to them. Even if a driver responds quickly, they may not regain enough situational awareness to avoid an accident.
“The general public is not aware of their vulnerability, and it is doubted that an interface in an automated vehicle will communicate this point with sufficient clarity.
“The article in Nature was part of a multidisciplinary international project, PAsCAL, funded by the EU’s Horizon 2020, into public acceptance of connected and autonomous vehicles (CAVs).
“My expertise is in the regulation of emerging technologies. I’m one of those people who sees autonomous vehicles not as a disruption, but as something which can improve human life. However, in order to do that, we have to put public safety, public service and public trust before profit. I always emphasise that transparency is paramount, but the autonomous vehicle industry can be extremely secretive.
“The overall goal of PAsCAL was to create a guide to autonomy – a set of guidelines and recommendations to deliver a smooth transition to this new system of transport, particularly with regards to the behaviour of the driver, or user in charge, of an autonomous vehicle.
“You have to recognise that an Automated Lane Keeping System (ALKS) is basically an evolution of the lane departure warning systems that lots of cars already have, but in general self-driving cars are not an evolution but a revolution – they will change our way of life.
“We want to understand not just how the technology works, but also how people see it. The aim is to capture the public’s acceptance and attitudes – not just the users in charge, but pedestrians and other road users too – and to take their concerns into consideration.
“With any new technology there has to be a proper risk assessment. Take the example of smart motorways – it’s a brilliant idea in theory and it works in other countries, but there has been a lack of understanding in the UK. We didn’t create enough stopping places and the cameras weren’t good enough to monitor all the cars in real time. You need an artificial intelligence driven system which can identify a car which is slowing in a fraction of a second.
“Similarly with autonomous vehicles, if you want to deliver something like this you should have the right technologies in place. In this case, that means the human machine interface. The vehicle manufacturers (VMs) will basically give responsibility to the driver, the user in charge, saying ‘when you are warned, you should take over, okay?’.
“In our report, we argue that there will not be enough time for an individual to understand the legal complexities, what they are accepting liability for. The communication of that risk will not be easy for the user in charge to understand. Honestly, how many people have read the terms and conditions of Facebook?
“In autonomous vehicles, the human machine interface will communicate very important safety information and legally binding information, with civil or criminal implications if the driver fails to adequately respond.
“If you look at the proposed change to the Highway Code, it assumes that the driver will be able to take back control when prompted by the vehicle. We are concerned that even the most astute and conscientious driver may not be able to take back control in time. The effectiveness of the human machine interface is one limiting factor and then there is the driver – every driver has different cognitive abilities and different skill levels.
“Human beings are all different, they react differently to different circumstances, so defining the right timeframe for a handover is a difficult balance to strike. Are you going to assess people on their cognitive abilities, on the speed of their reflexes?
“In some circumstances, I have doubts about whether it is fair to have a handover even within 30 to 40 seconds. Certainly, I have found no scientific evidence that 10 seconds is an adequate time. Cognitively, a blanket 10 seconds simply may not be possible – that’s my major concern.
“This is something we have been talking about for quite some time now. The UK government seems to be very much in favour of pushing ahead with this technology quickly, because it fits with the “Build Back Better” tagline. There is a huge risk that we are disregarding safety in the name of innovation.
“I think the automotive industry has a responsibility here. When you are travelling in a self-driving car, the manufacturer is responsible for your safety, for ensuring that the technology is up to standard.
“The industry also has a responsibility to ensure that drivers are adequately trained, adequately educated. The argument that accidents happen and can be used for development is vulgar. Go and tell that to the person who has lost a relative – that this is a learning process.
“I am not against autonomous vehicles. What I am saying is that we need evidence-based conclusions. We need to be sure that the reaction time is well-founded and supported, so we don’t create a system which will fail.
“Personally, I propose that we should first create a comprehensive legal framework which should mandate additional driver training for the safe use of self-driving systems. The automotive industry could take a lead on this, actively push for it.
“At the end of the day, this is about road safety, it is about saving human lives. I believe that autonomous vehicles can reduce congestion and can be good for the climate, but they also have the potential to become deathtraps, because we are becoming over-reliant on the technology working perfectly and on human ability, without the evidence-based research to establish whether we can react within the stipulated time.
“As a lawyer, it is my responsibility to uphold public safety, to highlight the risks. If the government and the automotive industry don’t face these issues, then people will lose trust in this amazing technology.”
For more, you can read the full Nature article here.
The warning lights are flashing on draft guidance to drivers in driverless cars.
Barrister Alex Glassbrook specialises in road transport and has written two books on UK autonomous vehicle (AV) law. An expert in the law of advanced, automated and electric vehicles, serious personal injury, motor insurance and high-value vehicle damages cases, he begins by highlighting three recent developments:
The Automated and Electric Vehicles Act 2018 coming into force on 21 April 2021;
The government announcement on 28 April 2021 that it isn’t yet publishing a list of AVs under Section 1 of the Act, but that it does expect to list vehicles equipped with Automated Lane Keeping Systems (ALKS) as “automated”; and
The proposed amendments to the Highway Code, published for consultation on 28 April 2021.
Barrister Alex Glassbrook specialises in the law of connected and autonomous vehicles.
AG: “My work overwhelmingly involves car accidents as the source of serious injury, so the AEV Act coming into force was an historic moment. Immediately though, it was clear there was something missing: the list of automated vehicles under Section 1 of the Act, which the Secretary of State is required to publish as soon as it is prepared. There was a presumption that the Act and the list would come together, but they didn’t. We have the Act but no list. In traffic light terms, we’ve gone past amber but there’s no green. What’s going on?
“That question was answered a week later with the Centre for Connected and Autonomous Vehicles’ publication of its paper for the Department for Transport on whether vehicles equipped with ALKS would be listed as automated. In summary, it said the list is not yet being published because we’re waiting to find out if these vehicles will get Whole Vehicle Type Approval from the Vehicle Certification Agency (VCA). If that happens, then the Secretary of State does expect to list them as automated under the AEV Act.
“This has huge implications for liability because it brings into effect a new line of motor insurance. Currently, under the Road Traffic Act, the motor insurer is effectively the body that will satisfy any judgment against a liable driver, or indeed can be sued directly under the direct rights against insurers regulations.
“The new AEV Act does something very different, something particular to AVs: it makes the insurer of the vehicle directly liable. This brings two important changes. One is the direct liability, which is slightly different from the direct rights regs. Second, it attaches to the vehicle rather than the driver, which is quite a radical step.
“There are obviously practical considerations behind this. Would publishing the list before the vehicles get Type Approval be putting the cart before the horse? Even so, it’s a little bit curious because the Act has already come into effect. Moreover, it’s not yet certain that ALKS-equipped vehicles will be classed as automated. The Secretary of State could change his mind.
“Running alongside this, we have the proposed amendments to the Highway Code. They’re quite eye-catching. The current Highway Code reiterates the orthodoxy, that the driver must at all times be in control of the vehicle and must understand the manufacturer’s instructions. The new proposed version is currently out for consultation, but the consultation period is very short, with a deadline of 28 May.
Proposed amendments to the Highway Code published on 28 April 2021.
“The key section reads: “On the basis of responses to the call for evidence, and the step-change that the expected introduction of the first legally recognised automated vehicles represents, we have decided to make a more ambitious amendment to The Highway Code, coinciding with the code’s 90th year anniversary.” To me, the fact it is 90 years since the Highway Code was first published in 1931 is neither here nor there. What is notable is the reference to “more ambitious”, because that implies there was an earlier draft.
“The next sentence has the wow factor. It says: “Automated vehicles no longer require the driver to pay attention to the vehicle or the road when in automated mode, except to resume control in response to a transition demand in a timely manner.” The implications of those words are immense.
“The document continues: “Automated vehicles are vehicles that are listed by the Secretary of State for Transport. While an automated vehicle is driving itself, you are not responsible for how it drives, and you do not need to pay attention to the road.”
“Well, we don’t have that list yet, and what follows is really quite striking. It proposes an instruction in the Highway Code, the official guidance to drivers, to do nothing – to pay no attention to how the vehicle is driving or what’s happening on the road. It positively advises drivers to switch off their attention.
“The next paragraph sets some parameters: “If the vehicle is designed to require you to resume driving after being prompted to, while the vehicle is driving itself, you MUST remain in a position to be able to take control. For example, you should not move out of the driving seat.”
“So, you shouldn’t get out of the driving seat – that’s quite a low standard. This appears to be saying it’s fine to watch a movie, it’s fine to go on Instagram, it’s fine to read and respond to business emails. All these tasks completely absorb your concentration and take time to disengage from.
“I’ve done a lot of trials in which I’ve asked witnesses about their appreciation of time during a crash and heard expert evidence about what can happen within a short window of time. Particularly when you’ve got three or four lanes of motorway, multiple vehicles, an awful lot can happen in 10 seconds.
“There are two fairly well-known exceptions to driver control recognised in the law. One is a medical emergency, if a driver is suddenly incapacitated. The other is moments of peril, sometimes known as agony of the moment – when it is such a difficult situation that a driver causing injury by their evasive manoeuvre is not to be judged by the usual demanding standard.
“So, the common law has formed exceptions to liability, but in this case it’s more complex. First, it introduces, for want of a better phrase, artificial intelligence (AI) into the picture. Adjudicating the actions of AI is still a very undeveloped area of law. Second, it brings into the picture something that has been manufactured, namely a computer and sensor system within a moving vehicle. Again, the laws of product liability are at a very early stage of development in relation to AI and new technologies. A notorious example of that is over-the-air (OTA) software, which is not clearly understood as “goods” in law.
“From a legal perspective, it is vitally important not to have guidance which leaves open very obvious questions. Unfortunately, these proposed changes to the Highway Code do just that. On the one hand, the Highway Code might say it’s perfectly fine to completely distract yourself from driving. But on the other hand, it’s not okay to do things like climbing out of the driving seat. That leaves open a very broad set of situations and the courts are going to find themselves dealing with some very difficult problems.
“Of course, what the court has to deal with is very much secondary, the primary question must be: what is safe? There are plenty of lessons in the history of motor vehicles when innovation has overreached. Famously, Ralph Nader’s book, Unsafe At Any Speed (published in the USA in 1965), highlighted rear suspension which lost traction when going round corners. That changed product liability law across all sectors.
“I’m not a road traffic engineer but, as an observer of many road traffic accident cases over many years, I have real doubts as to the safety of this guidance.”
Our Zenzic CAM Creator series continues with Jim Hutchinson, CEO of Fusion Processing.
As a partner in the ambitious CAVForth project, predicted by the Scottish Mail on Sunday to make Edinburgh “the most ‘driverless’ city in the world”, Fusion Processing is delivering on its promise to design and build world-leading systems for the automation of vehicles. Here, CEO Jim Hutchinson talks ADAS, cyclist detection and autonomous vehicle safety, explaining how CAVForth is set to make a major mark on the global self-driving map.
Jim Hutchinson, CEO of Fusion Processing
Please can you outline Fusion Processing’s work on connected and automated mobility?
JH: “We’ve been going since 2012. We set up to develop automated vehicle systems with the ultimate goal of being fully autonomous – able to do anything that a human-driven vehicle can. We knew from the start there were a lot of steps along the way and, for essentially a commercial company, we needed to have products along those steps rather than trying for a ‘Level 5 or nothing’ approach.
“We developed the CAVstar platform as a scalable solution – a drive system we could put into pretty much any vehicle, from small cars up to HGVs. Along the way we’ve been involved in some great schemes like the Venturer project, one of the original three UK AV projects.
“Then, more or less in parallel with that, we were involved in the GATEway project in London. We provided the autonomous drive system for the pods that drove along the Thames path. That was a big trial with random members of the public – some who came along specifically to experience it, and many others who just wanted to get from the O2 to the other end of the route. The pods encountered various other people on the route – the vehicles had to be mindful of dog walkers and cyclists. The feedback was by and large very positive, and it was a good proof point for us of how our system can be used off-highway.
“It also led to other things, notably our partnerships with Stagecoach and Alexander Dennis. First off, we were exploring using autonomy in bus depots. Every night a lot of operations have to happen involving the movement of vehicles – they have to be fuelled, washed, made ready for the morning – so we put together a system which could automate that. The concept was based on a fleet manager directing all this from a control tower once the bus arrives back at the depot.
“The system proved very successful, demonstrating operating efficiency and improved safety for those working in the depot, so that led to CAVForth – an autonomous bus service. Again, we’re working with Stagecoach and Alexander Dennis, joined by Transport Scotland, Bristol Robotics Laboratory and Napier University.
Fusion Processing’s CAVstar
“The intent is to put into service a number of Level 4 autonomous buses between the Fife Park & Ride and the Hermiston Gait Interchange. It’s a commuter route so we’re expecting a large number of daily commuters who want to travel to the Hermiston Gait Interchange, where they can transfer on to trams for the city centre, the airport or the rail network. We expect tourists will want to use it too, to cross the Forth Road Bridge, alongside the Forth Bridge, a UNESCO World Heritage Site.
“It’s a useful service, running every day of the week, and the hope is that it will go from a pilot service to a full service. It’s being registered as a new route, providing a service that wasn’t previously there, and Stagecoach anticipate around 10,000 journeys a week.
“The route includes a mix of road environments – motorway, bus lanes, roundabouts, signalled interchanges – so from our point of view it makes for a great demonstration of capability. There’s the technology side, which Fusion is focussed on, but there’s also key research around public acceptance and uptake. That’s really exciting too.
“The launch date isn’t set in stone due to Covid uncertainties, and the point at which they start taking passengers is still to be determined, but we will be running autonomous buses this year. That’s an incredible milestone, absolutely huge. It will be a very significant achievement to demonstrate a Level 4 capability on that class of vehicle – a big thing for the UK which will be noticed around the world.
“There are one or two other groups working on similar projects, but I haven’t seen anything with this level of ambition, this level of complexity, or length of route. It’ll obviously be fantastic for us and our CAVForth partners, but also for the UK autonomous vehicle industry as a whole. It will really put us on the worldwide map.”
Please can you outline Fusion Processing’s work on driver assistance?
JH: “CycleEye is an important product for us. We identified a need for collision avoidance technology. There are lots of collisions with cyclists and quite often they occur because the bus driver doesn’t know the cyclist is there. CycleEye is like a subsystem of CAVstar in a lot of ways – one of those steps to get some proof points on bits of the technology. It recognises and classifies different types of vehicle, and the driver gets an alert when there’s a cyclist in the danger zone. It is currently being used in a few cities around the UK, including on the Bristol Metrobus. It’s a good system. Whenever it has been evaluated against other cyclist detection systems it has always come out on top.
“We’re particularly excited about the next incarnation of CycleEye, evolving it to become a camera mirror system. It’s legal now to use cameras instead of mirrors, so we can provide that functionality too – monitors in the driver’s cab instead of mirrors. That has several benefits. Mirrors, on buses particularly, can be a bit of a liability – they quite often get knocked and sometimes they knock people. They stick out and head strikes are unfortunately quite common. They also get smashed, putting the bus out of service, which is an inconvenience and an operational cost. We think that being able to offer this camera mirror with CycleEye functionality is going to prove attractive to a lot of operators.”
Van with Fusion Processing technology
Over what timescale do you expect Level 4 and 5 autonomy to be achieved in the UK and which sectors will be early adopters?
JH: “With CAVForth we’ll be running Level 4 autonomous vehicles – where you’ve got a restricted operational design domain (ODD) – in the UK this year. Restricting these vehicles to particular routes or environments lends itself very well to public service, where the vehicles are maintained by an operator. That’s very achievable right now. As well as passenger service vehicles, other service vehicle fleets are easy wins, as is off-highway work like industrial sites. Then you’ve got delivery vehicles.
“When it comes to true Level 5 – go anywhere, do anything vehicles – repair and maintenance is an issue. We know that with privately owned cars, some people maintain them exactly as they should, and other people don’t. There are other complications too – things that people perhaps don’t do that often but like their vehicles to be able to do, like parking in a farmer’s field at a festival – that’s a little bit further out still.
“If you just roll back slightly from true Level 5, if people want a city car or a comfortable car for a long motorway journey, nothing off-road, there’s a case for vehicles which have an autonomous mode. That certainly appeals to me.”
Can you address the concerns about ADAS, particularly handover of control, driver concentration levels and driver deskilling?
JH: “I’m not a big fan of Level 3. If you haven’t been driving for an hour and are suddenly asked to take the wheel because the car has encountered something it can’t handle, it’s just unrealistic. Whereas a Level 4 system, which can put itself into a safe state when it reaches the limits of its ODD – perhaps ready to be restarted in a manual mode when the driver wants to take control – that’s much more practical.
“If there are circumstances when the driver needs to take over then clearly the driver needs to be of a standard that they can drive safely. Once you have widespread adoption of autonomous systems, and people are not driving routinely, there is a risk of driver deskilling. If that were the case you’d really need to look at greater regulation of drivers.
“That said, you can sometimes envisage problems that don’t really transpire. We’ve had cruise control and adaptive cruise control for a while now and I don’t think they’ve had the effect of particularly deskilling drivers. So, with Lane Keep, maybe it’s not such a big deal. Once you get to the point where cars are properly self-driving, there is a danger. If you haven’t got anything to do your mind will wander, that’s human nature, so it is a concern.”