Lucas Noldus Ph.D. details the latest high tech ways to measure driver behaviour in ADAS-equipped and self-driving vehicles

Connected and self-driving car safety: Noldus keeps more than an eye on distracted driving

Isn’t LinkedIn marvellous? I met Lucas Noldus Ph.D., Founder & CEO of Netherlands-based Noldus Information Technology, after he liked my interview with his Global Partnership on Artificial Intelligence (GPAI) colleague, Inma Martinez.

A few messages flew back and forth, and it transpired that he’s an expert in measuring driver behaviour, particularly driver-vehicle interactions in ADAS-equipped and self-driving vehicles. That was music to my ears, so we arranged a Zoom. What follows is the highly insightful result.

Lucas Noldus Ph.D., Founder of Noldus Information Technology

LN: “The future starts here. The world is changing. We see people living longer and there are more and more interactive devices – telephones, tablets, dashboards – with which we can interact, leading to greater risk of distraction while driving. I know personally how tempting it is to use these devices, always trying to keep your eyes on the road.

“We already have fascinating developments in connected driving and now, with self-driving, the role of the driver changes significantly. That has triggered research institutes, universities, OEMs and tier one suppliers to pay more attention to the user experience for both drivers and passengers.

“All these experiences are important because how people perceive safety and comfort will influence their buying decisions and their recommendations to other potential users.

“For autonomous driving, how far will we go towards level five? What happens at the intermediate stages? Over the coming decades, driving tasks will gradually diminish but, until full autonomy, the driver will have to remain on standby, ready to take over in certain situations. How will the vehicle know the driver is available? How quickly can he take over? These are the topics we’re involved in as a technology company.

“We make tools to allow automotive researchers to keep the human in the loop. Traditionally, automotive research focused exclusively on improving the vehicle – better engines, drivetrains etc. Until recently, nobody paid much attention to the human being (with a brain, skeletal system, muscles, motor functions), who needs to process information through his sensory organs, draw the right conclusions and take actions.

“Now, these aspects are getting more attention, especially in relation to reduced capacity, whether due to a distracting device, drugs, alcohol or neurodegeneration. As you get older your response time becomes longer, your eyesight and hearing abilities reduce, as does the speed at which you can process information.

“These are the challenges that researchers in automotive are looking at concerning the role of the driver, now and in the future. If the automated or semi-automated system wants to give control back to the driver because its AI algorithms decide a situation is too complex, can the driver safely take over when he’s been doing something like reading or taking a nap? How many milliseconds does the brain need to be alert again?

NK: “Draft legislation seems to be proceeding on a 10-second rule, but some studies say at least 30 seconds is required.”

LN: “Situational awareness – that’s a key word in this business. Not only where am I geographically, but in what situation. Oh, I’m in a situation where the road surface is very wet, there’s a vehicle just in front of me, the exit I need is near and I’m in the wrong lane. Understanding a situation like that takes time.

“If we take a helicopter view, from our perspective as a technology company, what should be measured to understand the driver behaviour? Which sensors should we use to pick up that information? If we use a microphone, a video camera, a heartbeat monitor and a link to the ECU, how do we synchronise that?

“That’s not trivial because one sensor may be sampling at 300Hz and another at 25 frames per second. That’s something my company has specialised in over the years. We’re very good at merging data from different sources, whether it’s a driving simulator continuously spitting out data, a real car, or sensors mounted in the infrastructure.
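The multi-rate merging problem Noldus describes can be sketched in a few lines. This is a generic illustration, not DriveLab’s actual pipeline, and the signal names are invented: each stream is resampled onto a common analysis timeline by linear interpolation.

```python
import numpy as np

def merge_streams(t_ref, streams):
    """Align multiple sensor streams onto a reference timeline.

    t_ref   -- 1-D array of reference timestamps (seconds)
    streams -- dict name -> (timestamps, values), each at its own rate
    Returns a dict name -> values interpolated at t_ref.
    """
    return {name: np.interp(t_ref, t, v) for name, (t, v) in streams.items()}

# A 300 Hz physiological signal and a 25 fps video-derived signal,
# merged onto a shared 100 Hz analysis timeline.
t_ecg = np.arange(0, 1, 1 / 300)          # 300 Hz
t_cam = np.arange(0, 1, 1 / 25)           # 25 fps
ecg = np.sin(2 * np.pi * 1.2 * t_ecg)     # synthetic heartbeat-like trace
gaze_x = np.linspace(0.0, 1.0, t_cam.size)

t_ref = np.arange(0, 1, 1 / 100)          # common 100 Hz timeline
merged = merge_streams(t_ref, {"ecg": (t_ecg, ecg), "gaze_x": (t_cam, gaze_x)})
print(merged["ecg"].shape, merged["gaze_x"].shape)  # (100,) (100,)
```

In practice the hard part is clock alignment between devices; interpolation only works once all timestamps share one clock.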

“You then need to analyse that data and pull out meaningful quantitative units that give you actionable insights. Generating large matrices is no big deal; making sense of that information is the real challenge.

“For example, in dashboard design, a manufacturer might be comparing two or three displays of road quality. A driver behaviour study with our tools will give the designer a clear answer on which design leads to the least cognitive workload, the least confusion.

Noldus DriveLab

“This same technical challenge can be applied to a vast number of design objectives. The vehicle manufacturer might be looking to make incremental improvements to, say, the readability of the dashboard under certain light conditions. Or they might be working on a completely new feature, like an intelligent personal in-car assistant. A number of brands are working on that, but the concept is still relatively new.

“You cannot test every scenario on the road, it’s just too dangerous, so we work with simulator manufacturers too. On the road or in the lab, we can measure a driver’s actions with eye-tracker, audio, video, face-reader and physiology in one.”

NK: “Back to LinkedIn again, I saw a post by Perry McCarthy, the F1 driver and original Stig on Top Gear, who said something like: Simulators are getting so good these days, when you make a mistake they drop three tonnes of bricks on your legs!”

LN: “You have so-called high fidelity and low fidelity simulators – the higher the fidelity, the closer you get to the real vehicle behaviour on the road, and there are all sorts of metrics to benchmark responsiveness.

“You have simple fixed-base simulators right up to motion-based simulators which can rotate, pitch and roll, move forward, backwards, sideways and up and down. For the best ones you’re talking about 10 million euros.

“We work with OEMs, tier-one suppliers, research institutes and simulator manufacturers to build in our DriveLab software platform. We also advise on which sensors are recommended depending on what aspects of driver behaviour they want to study.

“We try to capture all the driver-vehicle interactions, so if he pushes a pedal, changes gear or turns the steering wheel, that’s all recorded and fed into the data stream. We can also record their body motion, facial expression, what they’re saying and how they’re saying it – it all tells us something about their mental state.

Multi-camera eye tracker (Smart Eye)

“Eye tracking measures the point of gaze – what your pupils are focused on. In a vehicle, that might be the left, right and rear mirrors, through the windscreen or windows, around the interior, even looking back over your shoulders. To capture all that you need multiple eye-tracking cameras. If you just want to look at, for example, how the driver perceives distance to the car in front, you can make do with just two cameras rather than six.

“Eye tracking generates all sorts of data. How long the eyes have been looking at something is called dwell time. Then there’s what direction the eyes are looking in and how fast the eyes move from one fixed position to another – that’s the saccade. People doing eye tracking research measure saccades in milliseconds.

“Another important metric is pupil diameter. If the light intensity goes up, the pupil diameter decreases. Given a stable light condition, the diameter of your pupil says something about the cognitive load on your brain – the harder you have to think, the wider your pupils open. If you’re tired, your blink rate will go up. A fully awake person has a normal, natural blink rate that refreshes the fluid on the eyes, but if you’re falling asleep that rate changes. It’s a very useful instrument.
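As an illustration of the kind of metric he mentions, blinks per minute can be estimated from an eye-openness signal by counting closures. This is a hedged sketch on synthetic data, not the eye tracker’s actual algorithm, and the threshold value is an assumption.

```python
import numpy as np

def blink_rate(openness, fs, threshold=0.3):
    """Estimate blinks per minute from an eye-openness signal.

    openness  -- 1-D array, 1.0 = fully open, 0.0 = fully closed
    fs        -- sampling rate in Hz
    threshold -- openness below this counts as 'eye closed'
    A blink is one contiguous run of closed samples.
    """
    closed = openness < threshold
    # Count rising edges of the 'closed' state: each is one blink onset.
    onsets = np.count_nonzero(closed[1:] & ~closed[:-1]) + int(closed[0])
    duration_min = len(openness) / fs / 60.0
    return onsets / duration_min

# Synthetic 60 s trace at 50 Hz with a brief closure every 4 s.
fs = 50
sig = np.ones(60 * fs)
for start in range(0, 60 * fs, 200):
    sig[start:start + 5] = 0.0
print(round(blink_rate(sig, fs)))  # 15
```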

“Then there are body-worn sensors that measure physiology. It’s harder to do in-car, but in a lab people don’t mind wearing electromyography (EMG) sensors to measure muscle tension. If you’re a designer and you want to know how easy it is for an 80-year-old lady to operate a gearshift, you need to know how much muscle power she has to exert.

“We also measure the pulse rate with a technique called photoplethysmography (PPG), like in a sports watch. From the PPG signal you can derive the heart rate (HR). However, a more accurate method is an electrocardiogram (ECG), which is based on the electrical activity of the heart.
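One simple way to derive heart rate from a PPG trace, as described above, is to count pulse peaks and average the beat-to-beat interval. This is a minimal sketch on a synthetic signal; real pipelines filter out motion artefacts first.

```python
import numpy as np

def heart_rate_bpm(ppg, fs):
    """Estimate heart rate from a PPG trace by counting pulse peaks.

    ppg -- 1-D array of PPG samples
    fs  -- sampling rate in Hz
    A peak is a local maximum above the signal mean; beats per minute
    follow from the mean interval between successive peaks.
    """
    above = ppg > ppg.mean()
    # Local maxima: greater than the left neighbour, not less than the right.
    peaks = np.flatnonzero(
        (ppg[1:-1] > ppg[:-2]) & (ppg[1:-1] >= ppg[2:]) & above[1:-1]
    ) + 1
    intervals = np.diff(peaks) / fs            # seconds between beats
    return 60.0 / intervals.mean()

# Synthetic 72 bpm pulse wave sampled at 100 Hz over 10 seconds.
fs = 100
t = np.arange(0, 10, 1 / fs)
ppg = np.sin(2 * np.pi * (72 / 60) * t)
print(round(heart_rate_bpm(ppg, fs)))  # 72
```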


GSR (EDA) measurement

“Further still, we measure galvanic skin response (GSR), also called electrodermal activity (EDA), the level of sweating of your skin. The more nervous you get, the more you sweat. If you’re a bit late braking approaching a traffic jam, your GSR level will jump up. A few body parts are really good for capturing GSR – the wrist, palm, fingers, and the foot.

“We also measure oxygen saturation in the blood with near infrared spectroscopy (NIRS) and brain activity with an electroencephalogram (EEG). Both EEG and NIRS show which brain region is activated.

“Another incredibly useful technique is face reading. Simply by pointing a video camera at someone’s face we can plot 500 points – the surroundings of the eyebrows, the eyelids, the nose, chin, mouth, lips. We feed this into a neural network model and classify it against a database of tens of thousands of annotated images, allowing us to identify basic emotions – happy, sad, angry, surprised, disgusted, scared or neutral. You can capture that from one photograph. For other states, like boredom or confusion, you need a series of images.
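The pipeline he outlines – landmark coordinates classified against a database of annotated examples – can be caricatured as nearest-neighbour voting over flattened landmark vectors. Everything below (the cluster data, the point counts, the five-landmark faces) is synthetic and purely illustrative; FaceReader itself uses a trained neural network, not this stand-in.

```python
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "surprised", "disgusted", "scared", "neutral"]

def classify(landmarks, db_vectors, db_labels, k=5):
    """Label a face by majority vote of the k nearest annotated examples.

    landmarks  -- flattened (x, y) landmark coordinates, shape (n_points*2,)
    db_vectors -- annotated database, shape (n_examples, n_points*2)
    db_labels  -- emotion index per database example
    """
    dists = np.linalg.norm(db_vectors - landmarks, axis=1)
    nearest = db_labels[np.argsort(dists)[:k]]
    return EMOTIONS[np.bincount(nearest).argmax()]

# Tiny synthetic 'database': one cluster of landmark vectors per emotion.
rng = np.random.default_rng(0)
centres = rng.normal(size=(len(EMOTIONS), 10))          # 5 landmarks -> 10 coords
db_vectors = np.vstack([c + 0.05 * rng.normal(size=(20, 10)) for c in centres])
db_labels = np.repeat(np.arange(len(EMOTIONS)), 20)

probe = centres[0] + 0.05 * rng.normal(size=10)         # a face near the 'happy' cluster
print(classify(probe, db_vectors, db_labels))  # happy
```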

“These days we can even capture the heart rate just by looking at the face – tiny changes in colour resulting from the pulsation of the blood vessels in the skin. This field of face reading is evolving every year and I dare to claim that we are leading the pack with our tool.

“Doing this in the lab is one thing; doing it in a real car is another challenge – keeping focus on the driver’s face and dealing with variable backgrounds. Of course, cars also drive at night, so the next question is: can you do all this in darkness? We turned our company van into an instrumented vehicle and my sons agreed to be the guinea pigs.

“It took some work – overcoming the issue of light striking the face and causing sharp shadows, for instance – but we can now use infrared illuminators with our FaceReader software to make these measurements in full darkness.

“The turning of the head is also very important in studying distraction, for example, if the driver looks sideways for too long, or nods their head in sleepiness. When something shocks someone, we see the face change and the blood pressure rise, and these readings are synchronised in DriveLab.

“It is well proven that even things like changing radio station can be very distracting. Taking your eyes off the road for just a few seconds is dangerous. As we move to more and more connected devices, touchscreens and voice commands, minimising distraction is vital to ensure safety.”

NK: “I absolutely love this tech but what I actually drive is a 7-year-old Suzuki Swift Sport with a petrol engine and a manual gearbox, and I quite like it that way.”

LN: “I’m doing research on cars of the future with my software but I am personally driving a 30-year-old soft-top Saab 900. That’s my ultimate relaxation, getting away from high tech for a moment.

“At Noldus, we’re constantly pushing the boundaries of research, working with top level organisations in automotive – Bosch, Cat, Daimler, Fiat, Honda, Isuzu, Land Rover, Mazda, Nissan, Scania, Skoda, Toyota, Valeo and Volvo, to name just a few – and also with the Netherlands Aerospace Centre (NLR) and the Maritime Research Institute Netherlands (MARIN).

“Our aim is to make it so that the client doesn’t have to worry about things like hardware-to-software connections – we do that for them so they can focus on their research or design challenge.”

For further info see noldus.com





Bill McKinley of Keysight Technologies explains how C-V2X and DSRC enable higher levels of self-driving

Keysight at forefront of self-driving safety standards and certification

Ahead of a flagship product launch later this week, Bill McKinley, Automotive Strategic Planner at Keysight Technologies, gives his thoughts on self-driving and the fast-changing connected and autonomous vehicle (CAV) landscape.

Avid readers may remember that Bill was on the panel I hosted at the Small Cells World Summit in May. He’s got 30+ years’ experience in wireless communications and his current focus is developing test solutions for the automotive sector.

BM: “The UK, in line with other nations around the world, is investing heavily in connectivity and electrification – both the vehicles themselves and the charging infrastructure. Connected vehicles have been demonstrated to enhance safety via cellular vehicle to everything (C-V2X) and dedicated short-range communication (DSRC).

“These technologies allow for more efficient driving, for example, by routing to avoid accidents or poor road conditions. They also enable higher levels of automation, all of which can lead to an improved overall driving experience.

“It is likely that the first fully automated vehicles will be delivery vehicles, controlled environment shuttle type services, and buses on specific routes. With the gradual introduction of robotaxis, we will no doubt start to see Mobility as a Service (MaaS) become more common over the next 10-15 years.

“From a Keysight perspective, we play a significant role at the very leading edge of connected and automated mobility. We participate in various global organisations developing the standards, test procedures and certification for the industry, including the 5G Automotive Association (5GAA), the Car 2 Car Communication Consortium (C2C CC), the China Academy of Information and Communications Technology (CAICT), the OmniAir Consortium and the Society of Automotive Engineers (SAE).

“Keysight was the first test and measurement company to be awarded Global Certification Forum (GCF) validation for C-V2X RF conformance. We have industry leading validated test cases for the C-V2X protocol conformance test, and we were the first to be awarded OmniAir Qualified Test Equipment (OQTE) status. 

“Cybersecurity will play a critical role in connected mobility and Keysight is working with leading companies and organisations in this space to develop solutions to ensure vehicular communications remain safe and robust against attacks. 

“Clearly, the main risks associated with self-driving vehicles are around the safety aspects, which in turn will heavily influence public acceptance of the technology. We are all very familiar with some of the headlines about Tesla vehicles.  

“It remains incredibly challenging to overcome the complexities of urban automated driving, but things are improving all the time. Our autonomous driving emulator (ADE) system is designed with this in mind – to test many autonomous drive systems in a rich and authentic environment within the lab, before moving testing out into the field.”

More on that to follow soon. For further info see keysight.com

Navtech Radar puts figures on the benefits of port automation including reduced operating expenses and labour costs

Navtech builds the business case for automation

Regular readers will recognise the name Navtech Radar from our recent update on Oxbotica. In May, the two Oxfordshire-based companies joined forces to launch Terran360, promoted as the world’s first all-weather radar localisation solution for industrial autonomous vehicles.

While self-driving cars await a legislative framework, this ground-breaking technology is already being deployed in off-road settings. Ports are a good example and Madelen Shepherd, Growth Marketing Manager at Navtech, sets out a strong business case.

MS: “Ports are complicated operations and automation can massively improve efficiency, so we’ve been doing some financial analysis on the quantification of value. The benefits fall into three main areas: 1) reduced operating expenses; 2) reduced labour requirements; and 3) productivity increases.”

According to Navtech’s research, benefits resulting from port automation include a 31% reduction in operating expenses, a 40% reduction in labour costs and a 21% increase in productivity.

Automation at ports delivers significant cost savings

MS: “This kind of financial modelling is important for Navtech to demonstrate that our products are viable, but it also provides a compelling argument for automation in general.

“The findings are based on averages from multiple quotes, although there was quite a large range on the reduction in operating expenses, from around 25% up to 50%.

“Currently, only 3% of the world’s ports are automated, but the rate of growth is now exponential. Key drivers for this include the rise of megaships and increasing next day deliveries.

“About 80% of the world’s goods go through ports. There’s already time pressure on everything and the increasing global population equals ever increasing demand.  

“New ports are a massive investment. For example, the first phase of the Tuas project in Singapore, which will create the world’s largest container terminal, is nearly complete and has already cost $1.76bn. There are three more phases to come.

“Of course, any cost benefit analysis must also include risks. If you’re retrofitting an existing port, how much is installation going to disrupt operations? What about the social impact of job losses or a shift in employment profile? Are the new jobs higher paid or more secure? How much time and money would an infrastructure-free solution save in operational downtime during installation compared to an infrastructure dependent solution?

“Automation has created so-called ghost ports, which are largely human-free, so there are clear safety benefits. And with automation you get remote operation, so maybe one person can now operate two straddle carriers.

“Also, operating bulky vehicles like terminal tractors can require an additional member of staff to supervise the movement. By using technological solutions – installing sensors which act beyond human capabilities – that’s no longer necessary.

“Terran360, an infrastructure-free localisation solution, delivers a detailed 360-degree map made up of around 400 slices and uploads this to a cloud-based server. The vehicle drives down a route continually scanning all these different landmarks.
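One way such scan-to-map localisation can work – an assumption-laden sketch, not Navtech’s actual method – is to score each stored route position by circularly cross-correlating its 400-slice signature with the current scan, so the vehicle’s heading at the moment of scanning doesn’t matter.

```python
import numpy as np

def best_match(scan, map_signatures):
    """Find which stored map position best explains the current scan.

    scan           -- current 360-degree radar signature, one range value
                      per azimuth slice (here 400 slices)
    map_signatures -- array (n_positions, 400) of signatures recorded
                      along the route
    Rotation of the vehicle shifts the signature circularly, so each
    candidate is scored at its best circular alignment, computed via
    FFT-based cross-correlation.
    """
    S = np.conj(np.fft.rfft(scan))
    scores = [np.max(np.fft.irfft(np.fft.rfft(sig) * S, n=scan.size))
              for sig in map_signatures]
    return int(np.argmax(scores))

# Synthetic route map of 50 positions, 400 azimuth slices each.
rng = np.random.default_rng(1)
route = rng.random((50, 400))
# Current scan: position 17, vehicle rotated by 90 slices, plus noise.
scan = np.roll(route[17], 90) + 0.1 * rng.standard_normal(400)
print(best_match(scan, route))  # 17
```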

“We’re always looking for new partners in the shipping world and other industrial settings. This kind of radar is perfect for self-driving cars too, so that’s another exciting growth area.”

Inma Martinez, author of new book The Future of the Automotive Industry, on self-driving and connected cars

Street smart cars of the future will drive like a local and diagnose Alzheimer’s

Described by Time magazine as “One of the best talents in human digital behaviour”, Inma Martinez advises business leaders and governments (including the UK’s Department of Culture, Media and Sport) on AI and digitisation. She’s just written a book called The Future of the Automotive Industry, so obviously we had to ask her about driverless cars.

How did you come to specialise in automotive?

IM: “I first got involved in the auto industry in the early 2000s, when BMW recognised that they had to attract female drivers and buyers. We made a series of short films with directors including Ridley Scott and John Woo, starring Clive Owen as The Driver. Guy Ritchie’s had Madonna in it. In those days, I was working as a human factors scientist, looking at how humans use technology.

“Previously, I had been a telecoms engineer specialising in internet protocols. Then, because Nokia bought two of my start-ups, I landed in their innovations department. Together with Intel, we came to the realisation that telecommunications companies had to create alliances with auto manufacturers for vehicle to everything (V2X) and vehicle to infrastructure (V2I) communications.

“I worked for Volkswagen Group designing cars with AI and met Mark Gallagher and all the Formula One crowd. I thought: I have to write about the future of this industry, because in the next five to ten years it will not look anything like today – the massive influence of the Internet of Things (IoT) and AI, sustainability and the green economy. I wrote the book during the pandemic and it came out in June.”

Setting EVs aside, how do you view the autonomous side of things?

IM: “I love the topic, firstly because it needs so much definition. People interchange ‘autonomous’ with ‘self-driving’, but they’re separate things. Unfortunately, the media is not very sophisticated in talking about that.

“For me, it’s something that’s been happening for 15 or 20 years, initially because the industry was pressed to improve safety. You got level one autonomous features, like cruise control and parking assistance, making things easier and safer. Now we’re at level three, and no one understands what on earth is going on!

“I hate it when Tesla put out press releases claiming full self-driving. The PR houses are doing a disservice to the industry because they’re confusing people. I delved into this for the book and came to the conclusion that we’re not going to see autonomous cars until the regulation is ready for them.

“The European Union put out a good first attempt to define self-driving in 2019, and Japan has changed a lot of its traffic laws to allow Honda to start putting level three cars on the road.

“This will only happen when the legal framework is defined. Otherwise, you have the massive legal issue of who’s at fault in a crash. There’s got to be an effort in the industry to help create these legal frameworks, and I don’t think it’s too complicated.

“The way I see it, we need to differentiate an autonomous car – a level five car which can do literally everything by itself – from self-driving cars which can drive and brake and accelerate and have situational awareness, but which can’t operate constantly by themselves and still need the driver to keep their eyes on the road.”

Proposed changes to the Highway Code talk of drivers not having to pay attention anymore. Is there a danger that regulators could jump the gun?

IM: “That is frightening. You can’t put vehicles on the road driving themselves with just computer vision, you need V2X, roadside units (RSUs), Vehicular Ad Hoc Networks (VANETs) – all the beacons that make roads smart. You need 5G infrastructure, so the car is actually guided by connectedness. This has to do with urban planning and smart cities, not with the automotive industry per se.

“The point is not just whether we can make cars autonomous, it is whether we can make them street smart. The way people drive is different in every country. In Rome, people brake all the time. In Kuala Lumpur, there are mopeds everywhere. So, the car of the future is going to have to be adaptive – the AI, computer vision, all the settings will be different depending on where it is.

“There’s a wonderful thesis that asks whether people are born street smart or whether they get it when they move to a big city. I began to think about autonomous cars driving around big urban centres – they’re going to have to get the pulse of how you drive in a certain city. We need to train the system to learn how to integrate itself.

“We’ve only just begun to consider what autonomous is, and we need to have a bold vision as to what it should be. In my view, we need to make cars smart, not just autonomous.”

What are the main risks in the shift to self-driving?

IM: “We need a legal framework. We need integration into the smart city infrastructure, including telecommunications. We also need definitions.

“Cars look fabulous at the Geneva Motor Show, but nobody talks about them in context. Should there be designated lanes for hands-free driving? How are we going to deal with a car parc that is not all digital, that still has a lot of older vehicles?

“Automotive is one of the hardest industries in which to innovate because you have the pressure of safety, safety, safety at all costs. For example, nobody’s working on voice commands anymore because it turned out they were a distraction, a nuisance.”

Can you address the challenges specific to the UK?

IM: “Yes – your road network. In the UK you have a lot of 60mph rural roads where you can barely see what’s coming. I drive in Somerset and holy cow! It’s only because humans drive in such a super intuitive way that there aren’t more crashes.

“Perhaps it’s also because your driving test is so rigorous. I did my test at school in a small town in Pennsylvania. The police would make you drive around the car park and give you your licence. That was it.

“Then you have London, which is like no other city. It is a Dickensian city with 21st century vehicles running through it. It is a costly challenge to test smart road infrastructure without creating congestion. Where are the budgets going to come from?”

Anything else you’d like to mention?

IM: “I was speaking to a board member at Volkswagen recently and he said that one of the revelations of the pandemic was that it motivated people to own a car, rather than use public transport, for health and safety reasons, and a certain level of freedom and privacy. People have conversations when driving that they wouldn’t have on a train.

“It is also worth highlighting the prospect of the automotive industry partnering with healthcare companies on predictive medicine – keeping track of your vital biometrics to help detect serious diseases. If you’re going to be sitting in this highly technical environment for two hours a day, data such as the way you check your mirrors can reveal early symptoms of things like Alzheimer’s.

“Connected cars will add another layer of personal profiling and data authentication. Digital fingerprinting companies will be able to see that it’s me on my usual route, doing what I normally do. The cybersecurity will have to be very strong though. Imagine somebody hacking into the traffic management system of a future city – that’d be the ultimate hack.”

And on that very Italian Job note, our time is up. Inma Martinez’s book The Future of the Automotive Industry is out now, or visit inmamartinez.io

Audi’s Grandsphere concept car features a retractable steering wheel for hands-off mode.

How to tell if a car is truly driverless: Has it got a steering wheel?

One of the biggest barriers to the successful introduction of driverless cars is confusion over what constitutes true self-driving.

In America, the controversial autonomous vehicle expert, Alex Roy, has suggested a self-driving litmus test called Roy’s Razor. “Can you get in, pick a destination and safely go to sleep?” he asks. “If yes, it’s self-driving. If no, it’s not.”

While this has some merit, the key word “safely” gets somewhat lost. The internet is awash with less than sensible people climbing out of the driver’s seat with their Tesla in Autopilot.

So, here’s an idea to head off such recklessness… the best way to tell if a car is truly self-driving is to ask this simple question: Has it got a steering wheel?

Audi has apparently been down this road in the thinking behind its new Grandsphere concept car. When in “hands-off” mode, the steering wheel folds neatly away.

Audi Grandsphere concept car with fold away steering wheel

That certainly removes any doubt as to whether the driver is responsible for driving or just a user in charge, to use The Law Commission of England and Wales’ new lingo.

“We will be ready for Level 4 driving in the second half of this decade,” said Josef Schloßmacher, Audi’s spokesperson for concept cars.

“That’s an important timeframe for us and we will interact with authorities in the different continents and countries in all important markets on the homologation of this new technology.”

While somewhat open to the accusation of a fudge – if it is truly self-driving, why do you need a steering wheel at all? – this looks like progress.

Driverless Toyota e-Palette bus hits blind Japanese judo star

Injury setback for self-driving at Tokyo Paralympics

A golden PR opportunity for driverless cars backfired badly this week when a Toyota self-driving e-Palette shuttle bus hit a visually impaired athlete at the Tokyo Paralympic Games.

It had all been going so well. A fleet of eye-catching autonomous electric vehicles successfully ferrying competitors and officials around the Olympic village was a major triumph for the self-driving industry, and Toyota in particular.

But this Olympic fairy tale received a nasty reality check when a slow-moving e-Palette collided with Japanese judo veteran Aramitsu Kitazono, apparently ending his medal hopes.

Kitazono had been due to face Ukraine’s Dmytro Solovey the following day, but didn’t take to the mat. Toyota Chief Executive Akio Toyoda swiftly apologised, but the damage was done.

We first covered the e-Palette last year in our interview with Yosuke Ushigome, Director at Takram, who worked on Toyota’s future car concepts.

Somewhat ironically now, given the accident involved a blind man, our headline endorsed “flickering lights to replace eye contact in facilitating trust”. Perhaps audible warnings are also warranted.

Tokyo Paralympics Toyota e-Palette

“Throughout the development process, athletes, especially Paralympians, helped us understand how the e-Palette could be adapted and upgraded to better meet their needs for simple, convenient and comfortable mobility,” said Takahiro Muta, the project’s development leader, in 2019.

Hindsight is a wonderful thing. Last December, the idea of these autonomous vehicles playing a practical role at this showcase sporting event was enticing, to say the least – some questioned whether it would even be possible.

Now we are left with Toyoda’s grim assessment of the incident. “It shows that autonomous vehicles are not yet realistic for normal roads,” he said.

Use of the e-Palette fleet was suspended for several days but has now resumed.

As accusations of slow progress fly, the UK self-driving industry is accelerating.

Has the driverless car revolution stalled? Not at Oxbotica

There’s a lot of talk about the shift to autonomous vehicles slowing. Indeed, the question “Why has the driverless car revolution stalled?” was posed in preparation for the upcoming Reuters Automotive 2021 event [at which yours truly is moderating the AV session – sorry, shameless plug!].

In the UK, a good barometer of such things is Oxford-based Oxbotica, and they’ve made several significant announcements recently.

Back in January, we reported on the Oxford University spin-out securing huge BP investment, with CEO, Ozgur Tohumcu, teasing “exciting deals in the pipeline”.

Shortly afterwards, Tohumcu struck a big deal himself, leaving to become MD of Automotive at Amazon Web Services. 

Oxbotica Co-founder and CTO, Professor Paul Newman, was lavish in his praise for ‘Ozo’, saying on LinkedIn: “A chunk of everything we do will always be because of what you made these past few years.”

One major goal was swiftly achieved: offering public AV passenger rides in the UK. Oxbotica was instrumental in this long-awaited milestone, providing the software for Project Endeavour’s well-publicised road trials in Birmingham and London.

Part-funded by the Centre for Connected and Autonomous Vehicles (CCAV), and delivered in partnership with Innovate UK, Project Endeavour applied BSI’s new safety case framework specification, PAS 1881:2020 Assuring the Safety of Automated Vehicle Trials and Testing.

Oxbotica therefore became the first company to have its safety case assessed against these stringent new requirements.

In Greenwich, six modified Ford Mondeos were deployed on a five-mile route to help transport planners and local authorities understand how autonomy can fill mobility gaps and play a role in the long-term sustainability of cities. 

Dr Graeme Smith, Senior Vice President (SVP) at Oxbotica and Director of Project Endeavour, said: “This is a one-of-a-kind research project that is allowing us to learn about the challenges of deploying autonomous vehicles in multiple cities across the UK – a key part of being able to deploy services safely and at scale.

“So far, it has been a real collaborative effort, bringing everyone into the discussion, from local authorities to road safety groups, transport providers and, most importantly, the general public.”

Not everyone was convinced, however. MyLondon carried this barbed comment from local Stephen McKenna: “What’s the purpose it’s filling that we don’t already have?” Clearly, the industry still has work to do on the public perception front.

Impressive new products can only help and, in May, Oxbotica and Navtech Radar launched Terran360, “the world’s first all-weather radar localisation solution for industrial autonomous vehicles”.

This pioneering technology is apparently accurate to 10cm on any vehicle, in any environment, up to 75mph. It has been comprehensively tested in industrial settings, on roads, railways and for marine use.

Phil Avery, Managing Director at Navtech, said: “Thanks to decades of experience in delivering radar solutions for safety and mission critical applications, and together with Oxbotica’s world-leading autonomy software platform, Terran360 is trusted to answer the fundamental question for autonomous vehicles: ‘Where am I?’, everywhere, every time.”

If that weren’t enough, outside of the UK, Oxbotica has deepened its partnership with BP by running an AV trial at its Lingen refinery in Germany.

Described as “a world-first in the energy sector”, BP now aims to deploy its first AV for monitoring operations at the site by the end of the year. 

Morag Watson, SVP for digital science and engineering at BP, said: “This relationship is an important example of how BP is leveraging automation and digital technology that we believe can improve safety, increase efficiency and decrease carbon emissions in support of our net zero ambition.”

So much for AV progress stalling!

Pressing data privacy questions as car computer processing power increases.

Connected car data surge: welcome to the world of petabytes and exaFLOPS

The sheer volume of data being collected by connected cars is soaring. Forget megabytes (MB), gigabytes (GB) and even terabytes (TB), it’s time to start thinking in petabytes (PB) and exaflops (EFLOPS).

A petabyte is equal to one quadrillion (one thousand trillion) bytes. However, rather than looking at storage capacity, there’s now been a shift towards performance, measured in floating-point operations per second (FLOPS).

At the CVPR 2021 Workshop on Autonomous Driving event earlier this year, Tesla unveiled its new in-house supercomputer, boasting an eyewatering 1.8 EFLOPS.

The University Information Technology Services tells us that: “To match what a one EFLOPS computer system can do in just one second, you’d have to perform one calculation every second for 31,688,765,000 years.”
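That figure checks out on the back of an envelope, assuming one EFLOPS means 10^18 floating-point operations per second (the standard definition) and an average year of 365.25 days:

```python
# Sanity-check the "one calculation per second" illustration.
# Assumption: 1 EFLOPS = 10**18 floating-point operations per second.
EFLOPS = 10**18  # operations per second

# Seconds in an average (Julian) year of 365.25 days.
SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60  # 31,557,600

# At one calculation per second, years needed to match
# what a 1 EFLOPS machine does in a single second:
years = EFLOPS / SECONDS_PER_YEAR
print(f"{years:,.0f} years")
```

The result lands at roughly 31.69 billion years, in line with the quoted 31,688,765,000.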

Behind this unprecedented processing power sit important questions. Back in 2019 we asked Connected cars: whose data is it anyway? with Bill Hanvey, CEO of the Auto Care Association, warning that “carmakers have no incentive to release control of the data collected from our vehicles”.

Under the headline “Customer trust is essential to large-scale adoption of connected cars”, Engineering and Technology (E&T) recently highlighted a survey, by automotive engineering company Horiba MIRA, which asked 1,038 car owners from the UK, Germany and Italy about privacy in their connected vehicles. 42% said they were not made aware that they could withdraw their consent.

Garikayi Madzudzo, advanced cybersecurity research scientist at Horiba MIRA, commented: “Industry sources estimate that on average about 480 terabytes of data was collected by every automotive manufacturer in 2013, and it is expected that this will increase to 11.1 petabytes per year during the course of 2021.

“With such large volumes of personal information being collected, it is inevitable that privacy will be a challenge.”

This dovetails with a survey by Parkers which found that 86% of people wouldn’t be happy to share driving habit data with third-party companies.

Parkers.co.uk editor, Keith Adams, told Fleet News: “We’re agreeing to all manner of terms and conditions on a daily basis – I shudder to think what Google knows about me – but it comes as a surprise to see so few drivers are aware of what their cars know about them.”

Meanwhile, The Star Online has published some interesting thoughts on data privacy from Volkswagen Group chief executive, Herbert Diess.

“In Europe, data belongs to our customers first and foremost – they decide what happens with it,” he said.

“In China, data is considered a common good, available for the people’s good. In America, data is predominantly seen as an economic good, is not public, but remains with the companies, with Google, with Apple, in order to serve the business model there.”

Dr Basu issues stark warning on need to earn public trust in self-driving technology.

UK lawyer claims dangerous lack of evidence on safe driverless car to driver handover

Dr Subhajit Basu, of The University of Leeds’ School of Law, is a lawyer with impeccable credentials and a strong sense of public duty… and he’s got serious concerns about “handover” – the moment when a self-driving vehicle transfers control back to a human driver.

An editor at The International Review of Law, a Fellow of the Royal Society of Arts (RSA), and Chair of The British and Irish Law Education Technology Association (BILETA), he recently supervised research into “Legal issues in automated vehicles: critically considering the potential role of consent and interactive digital interfaces”.

The report, first published in the prestigious Nature journal, concluded that: “An urgent investigation is needed into the technology that allows self-driving cars to communicate with their operators”. Why? Because the “digital interfaces may be unable to adequately communicate safety and legal information, which could result in accidents”.

That is a stark warning indeed and Dr Basu believes the Government and the automotive industry need to be much more up-front about the issues.

Dr Basu report cover
Legal issues in automated vehicles report

SB: “The main safety messages surround the extreme difficulty most drivers will encounter when an autonomous vehicle suddenly transfers the driving back to them. Even if a driver responds quickly, they may not regain enough situational awareness to avoid an accident.

“The general public is not aware of their vulnerability, and it is doubted that an interface in an automated vehicle will communicate this point with sufficient clarity.   

“The article in Nature was part of a multidisciplinary international project, PAsCAL, funded by the EU’s Horizon 2020, into public acceptance of connected and autonomous vehicles (CAVs).

“My expertise is in the regulation of emerging technologies. I’m one of those people who sees autonomous vehicles not as a disruptive, but as something which can improve human life. However, in order to do that, we have to put public safety, public service and public trust before profit. I always emphasise that transparency is paramount, but the autonomous vehicle industry can be extremely secretive.

“The overall goal of PAsCAL was to create a guide to autonomy – a set of guidelines and recommendations to deliver a smooth transition to this new system of transport, particularly with regards to the behaviour of the driver, or user in charge, of an autonomous vehicle. 

“You have to recognise that an Automated Lane Keeping System (ALKS) is basically an evolution of the lane departure warning systems that lots of cars already have, but in general self-driving cars are not an evolution but a revolution – they will change our way of life.

“We want to understand not just how the technology works, but also how people see it. The aim is to capture the public’s acceptance and attitudes – not just the users in charge, but pedestrians and other road users too – and to take their concerns into consideration.

“With any new technology there has to be a proper risk assessment. Take the example of smart motorways – it’s a brilliant idea in theory and it works in other countries, but there has been a lack of understanding in the UK. We didn’t create enough stopping places and the cameras weren’t good enough to monitor all the cars in real time. You need an artificial intelligence-driven system which can identify, in a fraction of a second, a car which is slowing.

“Similarly with autonomous vehicles, if you want to deliver something like this you should have the right technologies in place. In this case, that means the human machine interface. The vehicle manufacturers (VMs) will basically give responsibility to the driver, the user in charge, saying ‘when you are warned, you should take over, okay?’.

“In our report, we argue that there will not be enough time for an individual to understand the legal complexities, what they are accepting liability for. The communication of that risk will not be easy for the user in charge to understand. Honestly, how many people have read the terms and conditions of Facebook?

“In autonomous vehicles, the human machine interface will communicate very important safety information and legally binding information, with civil or criminal implications if the driver fails to adequately respond.

“If you look at the proposed change to the Highway Code, it assumes that the driver will be able to take back control when prompted by the vehicle. We are concerned that even the most astute and conscientious driver may not be able to take back control in time. The effectiveness of the human machine interface is one limiting factor and then there is the driver – every driver has different cognitive abilities and different skill levels.

“Human beings are all different, they react differently to different circumstances, so defining the right timeframe for a handover is a difficult balance to strike. Are you going to assess people on their cognitive abilities, on the speed of their reflexes?

“In some circumstances, I have doubts about whether it is fair to have a handover even within 30 to 40 seconds. Certainly, there is nothing I have found where scientifically they have viewed 10 seconds as an adequate time. Cognitively, a blanket 10 seconds simply may not be possible – that’s my major concern.

“This is something we have been talking about for quite some time now. The UK government seems to be very much in favour of pushing ahead with this technology quickly, because it fits with the “Build Back Better” tagline. There is a huge risk that we are disregarding safety in the name of innovation.

“I think the automotive industry has a responsibility here. When you are travelling in a self-driving car, the manufacturer is responsible for your safety, for ensuring that the technology is up to standard.

“The industry also has a responsibility to ensure that drivers are adequately trained, adequately educated. The argument that accidents happen and can be used for development is vulgar. Go and tell that to the person who has lost a relative – that this is a learning process.

“I am not against autonomous vehicles. What I am saying is that we need evidence-based conclusions. We need to be sure that the reaction time is well-founded and supported, so we don’t create a system which will fail.

“Personally, I propose that we should first create a comprehensive legal framework which should mandate additional driver training for the safe use of self-driving systems. The automotive industry could take a lead on this, actively push for it.

“At the end of the day, this is about road safety, it is about saving human lives. I believe that autonomous vehicles can reduce congestion, can be good for the climate, but they also have the potential to become deathtraps because we are getting over-reliant on the technology to work perfectly and over-relying on human ability, without the evidence-based research to find out whether we can react within the stipulated time.

“As a lawyer, it is my responsibility to uphold public safety, to highlight the risks. If the government and the automotive industry don’t face these issues, then people will lose trust in this amazing technology.”

For more, you can read the full Nature article here.

PAVE is on a mission to inform the US public about self-driving vehicles.

Letters from America: Partners for Automated Vehicle Education (PAVE)

There are many lessons America can teach us Brits about the safe introduction of driverless cars, and the vital work of Partners for Automated Vehicle Education (PAVE) is a prime example.

The US is well ahead of the UK in terms of on-road testing and there have been crashes. These high-profile incidents have dented consumer confidence and calls for greater oversight have now been met.

On 29 June 2021, The National Highway Traffic Safety Administration (NHTSA) announced that the manufacturers and operators of vehicles equipped with advanced driver assistance systems (ADAS), or higher SAE level automated driving systems, must report crashes.

Against this background, PAVE has a mission “To inform the public about automated vehicles and their potential so everyone can fully participate in shaping the future of transportation”.

Tara Andringa
Tara Andringa, Executive Director of PAVE

Executive Director of PAVE, Tara Andringa, explains: “PAVE was born at CES in Las Vegas in 2019 and unites industry, academia, non-profits and the public sector. PAVE aims to bridge the gap between the huge resources that industry is investing in AV technology, and opinion polls that show that the public is largely confused and distrustful. Our mission is to educate and engage the public.

“We don’t advocate for any particular policy. We are all about education, having a conversation and raising the level of understanding – we want to equip everyone to be part of the conversation. We started with 18 members at CES, and we’ve grown to over 80 members. There has been a lot of agreement about the need for this kind of effort, including many big industry players.”

Importantly, PAVE now has many of these big players on-board: vehicle manufacturers including Audi, Ford, Toyota and VW; AV specialists Cruise, Oxbotica and Waymo; IT and comms giants Intel and Blackberry; motoring bodies including the National Automobile Dealers Association (NADA); influential campaign groups like Mothers Against Drunk Driving (MADD); and charities such as The National Federation of the Blind.

Andringa continues: “Although our organisation includes very diverse members with diverse missions, we find that our efforts are more impactful if all of these groups come together.

“We like to put on demonstration events to demystify the technology and the good news is that knowledge and experience change attitudes. When we get people into AVs, they often say it is just like being in a human-driven car, and it’s almost boring. For us, that’s a success. It builds trust and understanding, which are universal concepts.

PAVE demo
PAVE AV demonstration event

“We also conduct surveys and have found a lot of confusion about the technology that’s on the road today – from people who say self-driving cars will never happen, to people who think their cars are already equipped to drive themselves.

“In particular, people confuse driver assistance with self-driving. We very much believe ADAS can improve safety, but we always emphasise that all cars for sale today require a responsible driver behind the wheel.

“Another way we have reached a lot of people is through our weekly panel discussions looking at all different aspects of AVs. These originally came about due to the pandemic, but they have gotten over 12,000 views on YouTube.

PAVE panel
PAVE panel discussion

“Recently we partnered with the State of Ohio to engage the public sector. Town and city authorities want to be ready, but they have lots of questions. We ran a workshop on how AVs work from the point of view of regulation, freight, law enforcement and linking with existing transport. The response was incredibly positive.”

For more information, including links to the panel discussions and other helpful resources, visit pavecampaign.org