Lucas Noldus Ph.D. details the latest high-tech ways to measure driver behaviour in ADAS-equipped and self-driving vehicles

Connected and self-driving car safety: Noldus keeps more than an eye on distracted driving

Isn’t LinkedIn marvellous? I met Lucas Noldus Ph.D., Founder & CEO of Netherlands-based Noldus Information Technology, after he liked my interview with his Global Partnership on Artificial Intelligence (GPAI) colleague, Inma Martinez.

A few messages flew back and forth, and it transpired that he’s an expert in measuring driver behaviour, particularly driver-vehicle interactions in ADAS-equipped and self-driving vehicles. That was music to my ears, so we arranged a Zoom. What follows is the highly insightful result.

Lucas Noldus Ph.D., Founder of Noldus Information Technology

LN: “The future starts here. The world is changing. We see people living longer and there are more and more interactive devices – telephones, tablets, dashboards – with which we can interact, leading to greater risk of distraction while driving. I know personally how tempting it is to use these devices, even while trying to keep your eyes on the road.

“We already have fascinating developments in connected driving and now, with self-driving, the role of the driver changes significantly. That has triggered research institutes, universities, OEMs and tier one suppliers to pay more attention to the user experience for both drivers and passengers.

“All these experiences are important because how people perceive safety and comfort will influence their buying decisions, and their recommendations to other potential users.

“For autonomous driving, how far will we go towards level five? What happens at the intermediate stages? Over the coming decades, driving tasks will gradually diminish but, until full autonomy, the driver will have to remain on standby, ready to take over in certain situations. How will the vehicle know the driver is available? How quickly can he take over? These are the topics we’re involved in as a technology company.

“We make tools to allow automotive researchers to keep the human in the loop. Traditionally, automotive research focused exclusively on improving the vehicle – better engines, drivetrains etc. Until recently, nobody paid much attention to the human being (with a brain, skeletal system, muscles, motor functions), who needs to process information through his sensory organs, draw the right conclusions and take actions.

“Now, these aspects are getting more attention, especially in relation to reduced capacity, whether due to a distracting device, drugs, alcohol or neurodegeneration. As you get older your response time becomes longer, your eyesight and hearing abilities reduce, as does the speed at which you can process information.

“These are the challenges that researchers in automotive are looking at concerning the role of the driver, now and in the future. If the automated or semi-automated system wants to give control back to the driver because its AI algorithms decide a situation is too complex, can the driver safely take over when he’s been reading or taking a nap? How many milliseconds does the brain need to be alert again?”

NK: “Draft legislation seems to be proceeding on a 10-second rule, but some studies say at least 30 seconds is required.”

LN: “Situational awareness – that’s a key word in this business. Not only where am I geographically, but in what situation. Oh, I’m in a situation where the road surface is very wet, there’s a vehicle just in front of me, the exit I need is near and I’m in the wrong lane. Understanding a situation like that takes time.

“If we take a helicopter view, from our perspective as a technology company, what should be measured to understand the driver behaviour? Which sensors should we use to pick up that information? If we use a microphone, a video camera, a heartbeat monitor and a link to the ECU, how do we synchronise that?

“That’s not trivial because one sensor may be sampling at 300Hz and another at 25 frames per second. That’s something my company has specialised in over the years. We’re very good at merging data from different sources, whether it’s a driving simulator continuously spitting out data, a real car, or sensors mounted in the infrastructure.
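
For readers who want to picture what that multi-rate merging involves, here is a minimal sketch in Python. It assumes hypothetical column names and uses pandas to align a 300Hz physiological stream with 25 frames-per-second video annotations on a shared timeline; it illustrates the general technique, not Noldus’s DriveLab implementation.

```python
import numpy as np
import pandas as pd

# Hypothetical example: align a 300 Hz ECG stream with 25 fps
# video-frame annotations on one common timeline.
ecg = pd.DataFrame({"t": np.arange(0, 10, 1 / 300)})        # timestamps in seconds
ecg["ecg_mv"] = np.sin(2 * np.pi * 1.2 * ecg["t"])          # stand-in ECG trace

video = pd.DataFrame({"t": np.arange(0, 10, 1 / 25)})       # 25 fps timestamps
video["gaze_region"] = np.random.choice(
    ["road", "left_mirror", "dashboard"], size=len(video)
)

# For every video frame, take the most recent ECG sample within 10 ms,
# so both streams can be analysed on the same clock.
merged = pd.merge_asof(
    video, ecg, on="t", direction="backward", tolerance=0.01
)
print(merged.head())
```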

“You then need to analyse that data and pull out meaningful quantitative units that give you actionable insights. Generating large matrices is no big deal; making sense of that information is the real challenge.

“For example, in dashboard design, a manufacturer might be comparing two or three displays of road quality. A driver behaviour study with our tools will give the designer a clear answer on which design leads to the least cognitive workload, the least confusion.

Noldus DriveLab

“This same technical challenge can be applied to a vast number of design objectives. The vehicle manufacturer might be looking to make incremental improvements to, say, the readability of the dashboard under certain light conditions. Or they might be working on a completely new feature, like an intelligent personal in-car assistant. A number of brands are working on that, but the concept is still relatively new.

“You cannot test every scenario on the road, it’s just too dangerous, so we work with simulator manufacturers too. On the road or in the lab, we can measure a driver’s actions with eye-tracker, audio, video, face-reader and physiology in one.”

NK: “Back to LinkedIn again, I saw a post by Perry McCarthy, the F1 driver and original Stig on Top Gear, who said something like: Simulators are getting so good these days, when you make a mistake they drop three tonnes of bricks on your legs!”

LN: “You have so-called high-fidelity and low-fidelity simulators – the higher the fidelity, the closer you get to the real vehicle behaviour on the road, and there are all sorts of metrics to benchmark responsiveness.

“You have simple fixed-base simulators right up to motion-based simulators which can rotate, pitch and roll, move forward, backwards, sideways and up and down. For the best ones you’re talking about 10 million euros.

“We work with OEMs, tier one suppliers, research institutes and simulator manufacturers to build in our DriveLab software platform. We also advise on what sensors are recommended depending on what aspects of driver behaviour they want to study.

“We try to capture all the driver-vehicle interactions, so if he pushes a pedal, changes gear or turns the steering wheel, that’s all recorded and fed into the data stream. We can also record their body motion, facial expression, what they’re saying and how they’re saying it – it all tells us something about their mental state.

Multi-camera eye tracker (Smart Eye)

“Eye tracking measures the point of gaze – what your pupils are focused on. In a vehicle, that might be the left, right and rear mirrors, through the windscreen or windows, around the interior, even looking back over your shoulders. To capture all that you need multiple eye-tracking cameras. If you just want to look at, for example, how the driver perceives distance to the car in front, you can do with just two cameras rather than six.

“Eye tracking generates all sorts of data. How long the eyes have been looking at something is called dwell time. Then there’s what direction the eyes are looking in and how fast the eyes move from one fixation to another – that’s the saccade. People doing eye tracking research measure saccades in milliseconds.

“Another important metric is pupil diameter. If the light intensity goes up, the pupil diameter decreases. Given a stable light condition, the diameter of your pupil says something about the cognitive load on your brain – the harder you have to think, the wider your pupils will open. If you’re tired, your blink rate will go up. There’s a normal, natural blink rate that refreshes the fluid on your eyes when you’re fully awake, but if you’re falling asleep the blink rate changes. It’s a very useful instrument.
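
As a concrete illustration of those metrics, the sketch below computes dwell time per area of interest and a blink rate from a made-up one-minute gaze log. The column names and 60Hz sample rate are assumptions for the example, not the output format of any particular eye tracker.

```python
import numpy as np
import pandas as pd

# Hypothetical 60 Hz gaze log: timestamp, area of interest (AOI),
# pupil diameter, and a flag for samples lost during blinks.
rng = np.random.default_rng(0)
n = 60 * 60                                   # one minute of samples
gaze = pd.DataFrame({
    "t": np.arange(n) / 60.0,
    "aoi": rng.choice(["windscreen", "left_mirror", "dashboard"], n, p=[0.8, 0.1, 0.1]),
    "pupil_mm": rng.normal(3.5, 0.3, n),
    "blink": rng.random(n) < 0.005,
})

# Dwell time: total time the gaze stayed on each area of interest.
dwell_seconds = gaze.groupby("aoi").size() / 60.0

# Blink rate: count the onsets of blink episodes over the minute.
blink_onsets = int((gaze["blink"].astype(int).diff() == 1).sum())

print(dwell_seconds)
print("blinks per minute:", blink_onsets)
print("mean pupil diameter (mm):", round(gaze["pupil_mm"].mean(), 2))
```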

“Then there are body-worn sensors that measure physiology. It’s harder to do in-car, but in a lab people don’t mind wearing electromyography (EMG) sensors to measure muscle tension. If you’re a designer and you want to know how easy it is for an 80-year-old lady to operate a gearshift, you need to know how much muscle power she has to exert.

“We also measure the pulse rate with a technique called photoplethysmography (PPG), like in a sports watch. From the PPG signal you can derive the heart rate (HR). However, a more accurate method is an electrocardiogram (ECG), which is based on the electrical activity of the heart.
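
For the curious, deriving heart rate from a PPG trace boils down to detecting the pulse peaks and averaging the intervals between them. The snippet below does this on a synthetic 72bpm signal using SciPy; real in-car signals would of course need filtering and artefact rejection first.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic 50 Hz PPG trace: a 72 bpm pulse plus measurement noise.
fs = 50
t = np.arange(0, 30, 1 / fs)
ppg = np.sin(2 * np.pi * (72 / 60) * t) + 0.1 * np.random.randn(len(t))

# Each systolic peak marks one heartbeat; require peaks at least 0.4 s apart.
peaks, _ = find_peaks(ppg, distance=int(0.4 * fs), prominence=0.5)

rr_intervals = np.diff(t[peaks])              # seconds between successive beats
heart_rate = 60.0 / rr_intervals.mean()       # beats per minute
print(f"estimated heart rate: {heart_rate:.1f} bpm")
```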


GSR (EDA) measurement

“Further still, we measure galvanic skin response (GSR), also called electrodermal activity (EDA), the level of sweating of your skin. The more nervous you get, the more you sweat. If you’re a bit late braking approaching a traffic jam, your GSR level will jump up. A few body parts are really good for capturing GSR – the wrist, palm, fingers, and the foot.

“We also measure oxygen saturation in the blood with near infrared spectroscopy (NIRS) and brain activity with an electroencephalogram (EEG). Both EEG and NIRS show which brain region is activated.

“Another incredibly useful technique is face reading. Simply by pointing a video camera at someone’s face we can plot 500 points – around the eyebrows, the eyelids, the nose, chin, mouth and lips. We feed this into a neural network model and classify it against a database of tens of thousands of annotated images, allowing us to identify basic emotions – happy, sad, angry, surprised, disgusted, scared or neutral. You can capture that from one photograph. For other states, like boredom or confusion, you need a series of images.
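
The landmark-to-emotion step can be pictured as a straightforward supervised classification problem. The sketch below trains a small neural network on random stand-in data purely to show the shape of the pipeline – 500 (x, y) landmark points flattened into a feature vector and mapped to one of seven emotion labels. It is not FaceReader’s model, and the data sizes are token values.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

EMOTIONS = ["happy", "sad", "angry", "surprised", "disgusted", "scared", "neutral"]
N_LANDMARKS = 500                                  # (x, y) points around eyes, nose, mouth...

# Stand-in for a labelled training set of annotated face images.
rng = np.random.default_rng(1)
X_train = rng.random((700, N_LANDMARKS * 2))       # flattened landmark coordinates
y_train = rng.choice(EMOTIONS, 700)

clf = MLPClassifier(hidden_layer_sizes=(128,), max_iter=300)
clf.fit(X_train, y_train)

# Landmarks extracted from one new video frame.
new_face = rng.random((1, N_LANDMARKS * 2))
print("predicted emotion:", clf.predict(new_face)[0])
```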

“These days we can even capture the heart rate just by looking at the face – tiny changes in colour resulting from the pulsation of the blood vessels in the skin. This field of face reading is evolving every year and I dare to claim that we are leading the pack with our tool.

“Doing this in the lab is one thing, doing it in a real car is another challenge – you have to keep the camera focused on the driver’s face and deal with variable backgrounds. Of course, cars also drive at night, so the next question is: can you do all this in darkness? We turned our company van into an instrumented vehicle and my sons agreed to be the guinea pigs.

“It took some work – overcoming the issue of light striking the face and causing sharp shadows, for instance – but we can now use infrared illuminators with our FaceReader software to make these measurements in full darkness.

“Head turning is also very important in studying distraction – for example, if the driver looks sideways for too long or nods their head in sleepiness. When something shocks someone, we see the face change and the blood pressure rise, and these readings are synchronised in DriveLab.

“It is well proven that even things like changing radio station can be very distracting. Taking your eyes off the road for just a few seconds is dangerous. As we move to more and more connected devices, touchscreens and voice commands, minimising distraction is vital to ensure safety.”

NK: “I absolutely love this tech but what I actually drive is a 7-year-old Suzuki Swift Sport with a petrol engine and a manual gearbox, and I quite like it that way.”

LN: “I’m doing research on cars of the future with my software but I am personally driving a 30-year-old soft-top Saab 900. That’s my ultimate relaxation, getting away from high tech for a moment.

“At Noldus, we’re constantly pushing the boundaries of research, working with top level organisations in automotive – Bosch, Cat, Daimler, Fiat, Honda, Isuzu, Land Rover, Mazda, Nissan, Scania, Skoda, Toyota, Valeo and Volvo, to name just a few – and also with the Netherlands Aerospace Centre (NLR) and the Maritime Research Institute Netherlands (MARIN).

“Our aim is to make it so that the client doesn’t have to worry about things like hardware-to-software connections – we do that for them so they can focus on their research or design challenge.”

For further info see noldus.com





Bill McKinley of Keysight Technologies explains how C-V2X and DSRC enable higher levels of self-driving

Keysight at forefront of self-driving safety standards and certification

Ahead of a flagship product launch later this week, Bill McKinley, Automotive Strategic Planner at Keysight Technologies, gives his thoughts on self-driving and the fast-changing connected and autonomous vehicle (CAV) landscape.

Avid readers may remember that Bill was on the panel I hosted at the Small Cells World Summit in May. He’s got 30+ years’ experience in wireless communications and his current focus is developing test solutions for the automotive sector.

BM: “The UK, in line with other nations around the world, is investing heavily in connectivity and electrification – both the vehicles themselves and the charging infrastructure. Connected vehicles have been demonstrated to enhance safety via cellular vehicle to everything (C-V2X) and dedicated short-range communication (DSRC).

“These technologies allow for more efficient driving, for example, by routing to avoid accidents or poor road conditions. They also enable higher levels of automation, all of which can lead to an improved overall driving experience.

“It is likely that the first fully automated vehicles will be delivery vehicles, controlled environment shuttle type services, and buses on specific routes. With the gradual introduction of robotaxis, we will no doubt start to see Mobility as a Service (MaaS) become more common over the next 10-15 years.

“From a Keysight perspective, we play a significant role at the very leading edge of connected and automated mobility. We participate in various global organisations developing the standards, test procedures and certification for the industry, including the 5G Automotive Association (5GAA), the Car 2 Car Communication Consortium (C2C CC), the China Academy of Information and Communications Technology (CAICT), the OmniAir Consortium and the Society of Automotive Engineers (SAE).

“Keysight was the first test and measurement company to be awarded Global Certification Forum (GCF) validation for C-V2X RF conformance. We have industry-leading validated test cases for the C-V2X protocol conformance test, and we were the first to be awarded OmniAir Qualified Test Equipment (OQTE) status.

“Cybersecurity will play a critical role in connected mobility and Keysight is working with leading companies and organisations in this space to develop solutions to ensure vehicular communications remain safe and robust against attacks. 

“Clearly, the main risks associated with self-driving vehicles are around the safety aspects, which in turn will heavily influence public acceptance of the technology. We are all very familiar with some of the headlines about Tesla vehicles.  

“It remains incredibly challenging to overcome the complexities of urban automated driving, but things are improving all the time. Our autonomous driving emulator (ADE) system is designed with this in mind – to test many autonomous drive systems in a rich and authentic environment within the lab, before moving testing out into the field.”

More on that to follow soon. For further info see keysight.com

Inma Martinez, author of new book The Future of the Automotive Industry, on self-driving and connected cars

Street smart cars of the future will drive like a local and diagnose Alzheimer’s

Described by Time magazine as “One of the best talents in human digital behaviour”, Inma Martinez advises business leaders and governments (including the UK’s Department of Culture, Media and Sport) on AI and digitisation. She’s just written a book called The Future of the Automotive Industry, so obviously we had to ask her about driverless cars.

How did you come to specialise in automotive?

IM: “I first got involved in the auto industry in the early 2000s, when BMW recognised that they had to attract female drivers and buyers. We made a series of short films with directors including Ridley Scott and John Woo, starring Clive Owen as The Driver. Guy Ritchie’s had Madonna in it. In those days, I was working as a human factors scientist, looking at how humans use technology.

“Previously, I had been a telecoms engineer specialising in internet protocols. Then, because Nokia bought two of my start-ups, I landed in their innovations department. Together with Intel, we came to the realisation that telecommunications companies had to create alliances with auto manufacturers for vehicle to everything (V2X) and vehicle to infrastructure (V2I) communications.

“I worked for Volkswagen Group designing cars with AI and met Mark Gallagher and all the Formula One crowd. I thought: I have to write about the future of this industry, because in the next five to ten years it will not look anything like today – the massive influence of the Internet of Things (IoT) and AI, sustainability and the green economy. I wrote the book during the pandemic and it came out in June.”

Setting EVs aside, how do you view the autonomous side of things?

IM: “I love the topic, firstly because it needs so much definition. People interchange ‘autonomous’ with ‘self-driving’, but they’re separate things. Unfortunately, the media is not very sophisticated in talking about that.

“For me, it’s something that’s been happening for 15 or 20 years, initially because the industry was pressed to improve safety. You got level one autonomous features, like cruise control and parking assistance, making things easier and safer. Now we’re at level three, and no one understands what on earth is going on!

“I hate it when Tesla put out press releases claiming full self-driving. The PR houses are doing a disservice to the industry because they’re confusing people. I delved into this for the book and came to the conclusion that we’re not going to see autonomous cars until the regulation is ready for them.

“The European Union put out a good first attempt to define self-driving in 2019, and Japan has changed a lot of its traffic laws to allow Honda to start putting level three cars on the road.

“This will only happen when the legal framework is defined. Otherwise, you have the massive legal issue of who’s at fault in a crash. There’s got to be an effort in the industry to help create these legal frameworks, and I don’t think it’s too complicated.

“The way I see it, we need to differentiate an autonomous car – a level five car which can do literally everything by itself – from self-driving cars which can drive and brake and accelerate and have situational awareness, but which can’t operate constantly by themselves and still need the driver to keep their eyes on the road.”

Proposed changes to the Highway Code talk of drivers not having to pay attention anymore. Is there a danger that regulators could jump the gun?

IM: “That is frightening. You can’t put vehicles on the road driving themselves with just computer vision, you need V2X, roadside units (RSUs), Vehicular Ad Hoc Networks (VANETs) – all the beacons that make roads smart. You need 5G infrastructure, so the car is actually guided by connectedness. This has to do with urban planning and smart cities, not with the automotive industry per se.

“The point is not just whether we can make cars autonomous, it is whether we can make them street smart. The way people drive is different in every country. In Rome, people brake all the time. In Kuala Lumpur, there are mopeds everywhere. So, the car of the future is going to have to be adaptive – the AI, computer vision, all the settings will be different depending on where it is.

“There’s a wonderful thesis that asks whether people are born street smart or whether they get it when they move to a big city. I began to think about autonomous cars driving around big urban centres – they’re going to have to get the pulse of how you drive in a certain city. We need to train the system to learn how to integrate itself.

“We’ve only just begun to consider what autonomous is, and we need to have a bold vision as to what it should be. In my view, we need to make cars smart, not just autonomous.”

What are the main risks in the shift to self-driving?

IM: “We need a legal framework. We need integration into the smart city infrastructure, including telecommunications. We also need definitions.

“Cars look fabulous at the Geneva Motor Show, but nobody talks about them in context. Should there be designated lanes for hands-free driving? How are we going to deal with a car parc that is not all digital, that still has a lot of older vehicles?

“Automotive is one of the hardest industries in which to create innovation because you have the pressure of safety, safety, safety at all costs. For example, nobody’s working on voice commands anymore because it turned out they were a distraction, a nuisance.”

Can you address the challenges specific to the UK?

IM: “Yes – your road network. In the UK you have a lot of 60mph rural roads where you can barely see what’s coming. I drive in Somerset and holy cow! It’s only because humans drive in such a super intuitive way that there aren’t more crashes.

“Perhaps it’s also because your driving test is so rigorous. I did my test at school in a small town in Pennsylvania. The police would make you drive around the car park and give you your licence. That was it.

“Then you have London, which is like no other city. It is a Dickensian city with 21st century vehicles running through it. It is a costly challenge to test smart road infrastructure without creating congestion. Where are the budgets going to come from?”

Anything else you’d like to mention?

IM: “I was speaking to a board member at Volkswagen recently and he said that one of the revelations of the pandemic was that it motivated people to own a car, rather than use public transport, for health and safety reasons, and a certain level of freedom and privacy. People have conversations when driving that they wouldn’t have on a train.

“It is also worth highlighting the prospect of the automotive industry partnering with healthcare companies on predictive medicine – keeping track of your vital biometrics to help detect serious diseases. If you’re going to be sitting in this highly technical environment for two hours a day, data such as the way you check your mirrors can reveal early symptoms of things like Alzheimer’s.

“Connected cars will add another layer of personal profiling and data authentication. Digital fingerprinting companies will be able to see that it’s me on my usual route, doing what I normally do. The cybersecurity will have to be very strong though. Imagine somebody hacking into the traffic management system of a future city – that’d be the ultimate hack.”

And on that very Italian Job note, our time is up. Inma Martinez’s book The Future of the Automotive Industry is out now, or visit inmamartinez.io

Pressing data privacy questions as car computer processing power increases.

Connected car data surge: welcome to the world of petabytes and exaFLOPS

The sheer volume of data being collected by connected cars is soaring. Forget megabytes (MB), gigabytes (GB) and even terabytes (TB), it’s time to start thinking in petabytes (PB) and exaflops (EFLOPS).

A petabyte is equal to one quadrillion (one thousand trillion) bytes. However, rather than looking at storage capacity, there’s now been a shift towards performance, measured in floating-point operations per second (FLOPS).

At the CVPR 2021 Workshop on Autonomous Driving event earlier this year, Tesla unveiled its new in-house supercomputer, boasting an eyewatering 1.8 EFLOPS.

The University Information Technology Services tells us: “To match what a one EFLOPS computer system can do in just one second, you’d have to perform one calculation every second for 31,688,765,000 years.”
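
That figure checks out with some quick arithmetic: one EFLOPS is 10^18 operations per second, and a year contains roughly 31.6 million seconds.

```python
# Rough check of the one-EFLOPS comparison quoted above.
ops_per_second = 1e18                      # 1 EFLOPS = 10^18 floating-point operations per second
seconds_per_year = 365.25 * 24 * 3600      # ≈ 31.6 million seconds

years = ops_per_second / seconds_per_year
print(f"{years:,.0f} years")               # ≈ 31.7 billion years, in line with the quoted figure
```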

Behind this unprecedented processing power sit important questions. Back in 2019 we asked Connected cars: whose data is it anyway? with Bill Hanvey, CEO of the Auto Care Association, warning that “carmakers have no incentive to release control of the data collected from our vehicles”.

Under the headline “Customer trust is essential to large-scale adoption of connected cars”, Engineering and Technology (E&T) recently highlighted a survey, by automotive engineering company Horiba MIRA, which asked 1,038 car owners from the UK, Germany and Italy about privacy in their connected vehicles. 42% said they were not made aware that they could withdraw their consent.

Garikayi Madzudzo, advanced cybersecurity research scientist at Horiba MIRA, commented: “Industry sources estimate that on average about 480 terabytes of data was collected by every automotive manufacturer in 2013, and it is expected that this will increase to 11.1 petabytes per year during the course of 2021.

“With such large volumes of personal information being collected, it is inevitable that privacy will be a challenge.”

This dovetails with a survey by Parkers which found that 86% of people wouldn’t be happy to share driving habit data with third-party companies.

Parkers.co.uk editor, Keith Adams, told Fleet News: “We’re agreeing to all manner of terms and conditions on a daily basis – I shudder to think what Google knows about me – but it comes as a surprise to see so few drivers are aware of what their cars know about them.”

Meanwhile, The Star Online has published some interesting thoughts on data privacy from Volkswagen Group chief executive, Herbert Diess.

“In Europe, data belongs to our customers first and foremost – they decide what happens with it,” he said.

“In China, data is considered a common good, available for the people’s good. In America, data is predominantly seen as an economic good, is not public, but remains with the companies, with Google, with Apple, in order to serve the business model there.”

Vehicle-to-everything (V2X) 4G and 5G connectivity via small cells can be a lifesaver.

Carsofthefuture.co.uk editor to host Automotive & Transportation session at Small Cells World Summit 2021

Carsofthefuture.co.uk has signed a media partnership agreement with The Small Cell Forum (SCF) for its three-day online Small Cells World Summit, The Future of Mobile Networks, on 11-13 May 2021.

Small Cells World Summit 2021 registration

As part of the deal, Carsofthefuture.co.uk editor Neil Kennett will moderate the Automotive & Transportation session from 11am on Wednesday 12 May, with high-profile speakers including: Peter Stoker, Chief Engineer for Connected and Autonomous Vehicles at Millbrook Proving Ground; Dr Maxime Flament, Chief Technology Officer at the 5G Automotive Association, one of the world’s leading authorities on Intelligent Transport Systems (ITS); Bill McKinley, Connected Car Business Lead at Keysight Technologies; and Mark Cracknell, Head of Connected and Automated Mobility at Zenzic, responsible for accelerating the self-driving revolution in the UK.

Neil Kennett, said: “We are delighted to partner with The Small Cell Forum for this exciting virtual event, which brings together mobile operators, vendors and regulators from around the globe. The Automotive & Transportation session will focus on connected and autonomous vehicle (CAV) opportunities, particularly vehicle-to-vehicle (V2V) and vehicle-to-everything (V2X) communications, in-vehicle payments, and the rival ITS-G5 and C-V2X 5G technologies.

“Small cells deliver high-quality, secure 4G and 5G coverage, so there are clearly a multitude of new use cases in the connected car world and the wider mobility ecosystem. Aside from supporting self-driving, they can facilitate everything from in-car infotainment and shopping, to fixing technical problems before they occur and pre-empting likely crash scenarios. It is no exaggeration to say they could be a lifesaver.”

Carsofthefuture.co.uk readers can benefit from a 40% discount on Small Cells World Summit 2021 tickets using the code SCWS2021. See www.smallcells.world/

The key to the future of self-driving is education, education, education, says Millbrook’s Stoker.

On track and in virtual space, Millbrook tests cars of the future

Our Zenzic CAM Creator series continues with Peter Stoker, Chief Engineer for Connected and Autonomous Vehicles at Millbrook.

Part of CAM Testbed UK, Millbrook Proving Ground in Bedford boasts 700 acres of private roads on which to develop and test connected and autonomous vehicle (CAV) technologies. As Chief Engineer, Peter Stoker is right at the forefront of self-driving in the UK.

Peter Stoker, Chief Engineer for Connected and Autonomous Vehicles at Millbrook

Please can you outline Millbrook’s work on connected and automated mobility?

“My primary role is to bring focus to two testbeds, our CAV testbed and our 5G testbed. We are not a purpose-built CAV testbed – we have safety, propulsion and conventional vehicle test facilities too – so CAV is something we’ve blended into the existing business.

“For the CAV testbed, we partnered with the UK Atomic Energy Authority (UKAEA), particularly the Remote Applications in Challenging Environments (RACE) division, to provide a controlled urban environment. We have three open source StreetDrone vehicles and miles of track with targets for very precise measurements, accurate to 1-2cm. We offer safety driver training and also have a simulation environment for driver-in-the-loop and hardware-in-the-loop testing. The whole idea is to fail in private, not in public, and to progress, to evolve out of the testbeds and on to open roads.

“The 5G testbed is a completely separate consortium, backed by the Department for Digital, Culture, Media and Sport (DCMS). We have 59 masts looking at all types of connectivity and I’d say the millimetre wave at 70GHz is currently the most interesting.”

Millbrook Proving Ground graphic

What major shifts in UK road transport do you expect over the next 10 years? 

“Getting the crystal ball out, I see increased use of connectivity in existing vehicles and some very interesting new use cases – buses connected to city networks, video analytics from cameras, smart ambulances streaming live data, autonomous deliveries on campuses. What I don’t see within 10 years is millions of privately owned driverless cars. That will start in the luxury sector but to begin with it will be more about transporting goods.”

How do you see the testing framework for CAVs developing?

“There’s a lot of simulation in the automotive world – crash testing, fatigue testing, computational fluid dynamics. These days, manufacturers are developing whole vehicles before building a prototype. You have to have a good simulation on a good simulator and there’s an interesting shift that needs to happen on regulation. It’s early days on that, but it’s essential.

“The strength of virtual space is that you can run hundreds of scenarios in machine time – not only set up complicated scenarios that would take days with real cars, but actually speed up the process so it runs faster than real time. The national scenario database is already really good and regulation will move to being a mixture of real and virtual certification – global, European, UK and perhaps even city-specific. We are happy to advise, but don’t set policy.”

What are the biggest challenges in the shift to self-driving and how can these risks be mitigated?

“The key to the future of self-driving is education, education, education – for everyone, the public, vehicle manufacturers, the aftermarket, recovery operators. We have to work on the terminology – autonomous, driverless, CAV, CAM – it’s confusing, even to people who know what they’re talking about.

“At the moment, we’re making it harder to understand, not easier. We’re in a really grey area of transition with different trade names for systems. There’s a lot of groundwork needed to prepare people and, for example, the brilliant website mycardoeswhat.org does a great job of trying to explain it.

“If you get into a hire car, you need to have the right expectation of what it does and what it doesn’t do. If you buy a new car, you should read the manual, but how many people do? Especially with Covid, more cars are being delivered with minimal interaction – it’s a case of ‘there’s the key, where’s the station?’. Too often, the customer handover just isn’t there.

“How are garages, the aftermarket and the amber light sector going to deal with all this? Basic questions like how do you put it in neutral? ADAS has already led to huge changes in training and skill sets – how to calibrate and monitor them.

“We haven’t talked about over-the-air (OTA) updates, cameras embedded in the tarmac or even electrification – there’s a huge amount of things! How do you learn about them? Hopefully in testing rather than in crash situations.”

For further info, visit www.millbrook.co.uk

UK government sparks global business sharing transport sector data.

Sharing data collected by connected cars

Our Zenzic CAM Creator series continues with Mika Rasinkangas, founder and President of Chordant.

Originally part of the global wireless and internet of things (IoT) research company, InterDigital, Chordant was spun out as a separate business in 2019, as “a dynamic data sharing expert”. The spark was a UK government initiative to test the hypothesis that regional transportation data has tremendous value, especially when shared between different parties. The results of this two-year public-private partnership were startling.

Please can you outline your work on connected and automated mobility?

MR: “First of all we looked at the mobility space. There’s the segment that maintains the road network and their supply chain, the mobility service providers – bus companies, train operators and new entrants such as Uber – then the whole automotive sector, OEMs and their supply chain partners. We sit right in the middle of all this and our role is data exchange – bringing together dynamic data sets from different sources to create data-driven solutions that solve problems.

“The hypothesis was that a lot of data in the transport segment was either underutilised, in really small silos, or not utilised at all. The idea was to work with different entities – organisations, companies and universities – to bring data together and make it more widely available, leading to innovation and efficiency.

“It was obvious from early on that this was not only a technical issue, there was a human element. Data is controlled by different entities and departments so the challenge was to get these different data owners comfortable with the idea that their data could be used for other purposes, and to get consumers comfortable with it too. The result was more usable and more reliable dynamic data.”

What major shifts in UK transport do you expect over the next 10-15 years?

MR: “Last-mile transport and micromobility solutions are ballooning and Covid-19 will only accelerate this. People are walking, scootering and biking more, making short trips by means which don’t involve public transport or being in close contact with others.

“In terms of automotive, we’re living through a massive change in how people perceive the need to own a car, and this shift in perception is changing the fundamental business models. Autonomous vehicle technology keeps developing, connected vehicles are everywhere already and electric cars represent an ever bigger proportion of the vehicle population. In all these segments data utilisation will continue to increase. New cars collect huge amounts of data for lots of purposes and this can be used for lots of things other than what it was originally collected for.”

Can you address the data privacy concerns surrounding connected cars?

MR: “Data privacy is a multifaceted topic. On the one hand, Europe has been at the forefront of it with GDPR. That puts businesses operating in Europe on a level playing field. In terms of connected and autonomous vehicles (CAVs), these regulations set limitations on what data can be harvested and what has to be anonymised in order for someone to use it. It fits the norms of today’s society, but you can see on social media that this kind of privacy seems less important to younger people. However, perspectives vary greatly and companies need to be transparent in their usage of people’s data.

“From a business perspective, we have to take privacy extremely seriously. The explosion of data usage can have unintended consequences but by and large the regulatory environment works quite reasonably.

“We typically deal with conservative entities which put privacy and security at the centre of everything – if there’s any uncertainty, the attitude is that it’s better not to do it. Think of all the sensitive personal data that entities like car companies and mobile telephone companies have. It can give an extremely accurate picture of people’s behaviour. There are well established procedures to anonymise data so customers can be comfortable that their personal data cannot be identified.”

What are the main risks in the shift to self-driving and how can these be mitigated?

MR: “One could talk about a lot of different challenges. What about the latency in connectivity in order to ensure processing takes place fast enough? There are a gazillion things, but to me these are technical nuts that will be cracked, if they haven’t been already. One of the biggest challenges is the interaction between human-controlled vehicles and automated vehicles. When you add in different levels of driver assistance, urban and rural, different weather conditions – all sorts of combinations can happen.

“The UK is at the forefront of CAV testing. There are government sponsored testbeds and companies are running trials on open roads, so the automotive industry can test in real-life environments. We cannot simulate everything, and the unpredictability of interactions is one of the biggest challenges. A traffic planner once told me that in his nightmares he sees a driverless car heading toward a granddad in a pick-up truck, because there’s just no telling how he might react!”

Is there anything else you’d like to mention?

MR: “I’d like to address the explosion of data usage in mobility and how dynamic data enables not only efficiency improvements but new business models. According to recent studies by companies like Inrix, congestion costs each American nearly 100 hours or $1,400 a year. Leveraging data-driven insights can drive change in both public policies and behaviours. In turn, these can result in reduced emissions, improved air quality and fewer pollution-caused illnesses.

“CAVs can be data sources providing tons of insight. Think about potholes – new vehicles with all these cameras and sensors can report them and have them fixed much more efficiently. This is just one example of entirely data-driven efficiency, much better than eyeballing and human reporting. There will be a multitude of fascinating uses.

“Organisations such as vehicle OEMs, transport authorities and insurance providers will require facilities for the secure and reliable sharing of data, and that’s where we come in. I would urge anyone interested in data driven solutions in the mobility space to visit chordant.io or our Convex service site at convexglobal.io.”

Dr Joanna White says Highways England is currently more focused on the connected bit of connected and automated mobility (CAM).

Highways England expert predicts Level 4 self-driving in towns before motorways

Our Zenzic CAM Creator series continues with Dr Joanna White, Head of Intelligent Transport Systems at Highways England.

As the body responsible for designing, building and maintaining our motorways and major A-roads, Highways England (HE) is a uniquely important player in the UK connected and automated mobility (CAM) ecosystem. Here, Head of Intelligent Transport Systems at Highways England, chartered engineer Dr Joanna White, outlines its work on CAM.

Dr Joanna White, Head of Intelligent Transport Systems at Highways England

JW: “A key aim in improving our service is to look at how we can safely use emerging technology to better connect the country – people and places, families and friends, businesses and customers. This includes what digital channels we might use, delivering a cleaner road environment and achieving net zero carbon.

“Our connected corridor project on the A2/M2 in Kent finished 10 months ago and we are just completing the evaluation. Collaboration is vital and this was a joint project with Kent County Council (KCC), Transport for London (TfL), the Department for Transport (DfT) and others. It was also part of a wider European project, Intercor.

“We are currently more focused on the connected bit of CAM, building on the services we already provide. This includes beaming information directly into vehicles (replicating what you see on the gantries) and also what data we can anonymously collect from vehicles’ positioning sensors. Can we maintain service from one part of the network to another? Can we do it in an accurate, timely and secure way? How do people feel about it?

“We try not to choose particular technologies – whether it’s radar, lidar, cellular – we are interested in all of it. It could be 5G and, via the DfT, we work closely with the Department for Digital, Culture, Media and Sport (DCMS), which leads on that. One of the most positive government actions was the requirement for mobile operators to provide 90% coverage of the motorway network by 2026.

Highways England in-car upcoming junction message

“We were very proud to be involved with the HumanDrive project in which a self-driving Nissan Leaf navigated 230 miles from Cranfield to Sunderland. It was a great learning experience in how to conduct these trials safely, underpinned by our safety risk governance. We had to identify all the risks of running such a vehicle on the strategic road network (SRN), and find ways to mitigate them. It was fascinating to see how it coped on different types of roads, kept to the lines and responded to road sign information.

“Then there’s our Connected and Autonomous Vehicles: Infrastructure Appraisal Readiness (CAVIAR) project, which has been slightly delayed due to Covid. We are building a simulation model of a section of the M1, a digital twin, and we have a real-world car equipped with all the tech which will start operating in 2021. That will collect a lot of data. This is one of our Innovation competition winning projects, run by InnovateUK.

“Within Highways England we have a designated fund for this kind of research, and that means we can invest in further trials and do the work needed to provide more vehicle-to-infrastructure (V2I) communications.

“Personally, I think that Level 4 self-driving, eyes off and mind off, is years away, perhaps decades, certainly in terms of motorway environments. However, we are constantly in discussion with government on these issues, for example, we contributed to the recent consultation on Automated Lane Keeping Systems (ALKS).

“Working closely with industry and academia, we have already started off-road freight platooning and are looking to move to on-road trials. We’ve had lots of discussions about freight-only lanes and the left lane is often suggested, but you have to consider the design of the road network. There are lots of junctions close to each other, so how would that work, especially at motorway speeds? At first, I see self-driving more for deliveries at slower speeds in urban areas but, as always, we will listen to consumer demand.”

For further info see highwaysengland.co.uk.

Creative technologist Ushigome on future vehicle-to-pedestrian (V2P) communications.

Self-driving news flash: flickering lights to replace eye contact in facilitating trust

Our Zenzic CAM Creator series continues with Yosuke Ushigome, Director at design innovation studio Takram.

Listing his primary interest as “emerging technologies”, London-based creative technologist, Yosuke Ushigome, has been working with Toyota on future car concepts for over 10 years. Here, he gives his thoughts on the key issues in driverless car design.

Yosuke Ushigome, director Takram

YU: “We come from a user experience (UX) background and over the years our projects with Toyota have got bigger and higher level. In 2018, with the e-Palette concept, we started taking a more holistic approach to mobility and automation – an on-the-ground people perspective on the entire system, rather than the UX of an interior, exterior or service.

“There’s going to be a trend in transparency and trust. How can designers help the systems, passengers, pedestrians and others to communicate? In the past, this has usually been based around the driver and passenger, but that’s got to expand. In cars of the future, pedestrians will not be able to look into the driver’s eyes – what’s driving might not even be on the car, it might be in the cloud.

“How can you communicate interactions that facilitate trust? That’s really interesting. People pick things up from little movements in their peripheral vision, so you come back to old school ideas like patterns of flickering lights. How fast it flashes, or flashing from left to right, could give people a little nudge, maybe help them to detect danger. This kind of experimentation will definitely increase.

“Level 5 autonomy seems to me to be very far off. Level 4, in areas where the road system is designed for self-driving, or on private roads where there’s more separation between vehicles and pedestrians, is coming rapidly – things like deliveries between factories. Starship delivery robots are already deployed in Milton Keynes and economics will drive adoption, especially with the pandemic.

“I would like to be part of this transformation, so long as it is inclusive. There’s an opportunity to meet the needs of people left behind by our existing transport, whether that’s physical disability or economic disadvantage.”

Toyota e-Palette concept, via Takram

Toyota had planned to showcase its e-Palette mobility solution at the Tokyo 2020 Olympic and Paralympic Games, so hopefully we’ll get to see it next summer.

For further info, visit Takram.com.

Thomas Sors says connectivity is the essential foundation for autonomous vehicles.

Putting the C in Connected and Automated Mobility

Our Zenzic CAM Creator series continues with Beam Connectivity CEO, Thomas Sors.

Having previously led Dyson’s Connected Vehicle programme, Thomas Sors launched Beam Connectivity in January this year. It might be one of the newest cogs in the UK automotive wheel, but its Connected Vehicle as a Service (CVaaS) product is already attracting interest from car, freight and public transport manufacturers.

TS: “When it comes to connected and automated mobility (CAM) and connected and autonomous vehicles (CAVs), we see a lot of focus on the ‘A’ part, but not so much about ‘C’, which is our focus. Connectivity is the essential foundation for automation later on, but at the moment it often doesn’t perform very well. For example, OEM apps sometimes get two point something star ratings due to problems with the initial connection and latency.

“Our CVaaS solution provides a better user experience and can unlock the value of data generated by vehicle fleets. It offers a new way of getting data from vehicles to the cloud and back-end, or to send data into the vehicle. Because we’re brand new, there are no issues with legacy software – privacy by design and security by design are embedded all the way through our process, not an afterthought or a bolt-on. That starts with ensuring that we fulfil General Data Protection Regulation (GDPR) access rights, including the right to be forgotten.

“I’ve seen quotes that by 2030 all cars will have some form of connectivity. eCall [the EU initiative to enable cars to automatically contact the emergency services in the event of a serious accident] is mandatory for new cars, and that’s just the start. It’s about transparency and explaining the benefits. If you give people the option to say ‘yes, take this data in order for me to get feature X’, then that builds trust.

“From the manufacturer or fleet operator perspective, prognostics is an interesting area – fixing things before they go wrong. Then there’s the ability to understand usage patterns and perform over the air (OTA) updates. Another thing we’re already seeing is support to improve the driving experience, for example, vehicle to infrastructure communications being used to reduce congestion. We expect that to build up quickly over the next 2-4 years.

“We’re only a few months in but we’ve already deployed an end-to-end system to several vehicles and we’re looking to do more and more. It’s not unusual for manufacturers to spend 12-18 months building a connected vehicle solution, so our platform can really speed up their development lifecycle. Why build a connectivity team when we’ve already done it very effectively?

“As to self-driving, the technology is leading the way and moving along quickly, so the focus needs to be on standards, legislation and public acceptance.”

For further info, visit beamconnectivity.com.