Malcolm Wilkinson, Head of Connected and Automated Vehicles (CAVs) and Energy at National Highways, talks future mobility.

National Highways: Making CAVs Work For The UK

Malcolm Wilkinson, Head of Connected and Automated Vehicles (CAVs) and Energy at National Highways, on intelligent infrastructure, freight platooning, hands-free zones and more…

National Highways has completed several major CAV studies recently – what are the most significant findings?

MW: “Our connected corridor project on the A2/M2 was very successful, certainly an important stepping stone. It was a joint project with Kent County Council (KCC), Transport for London (TfL), the Department for Transport (DfT) and others. We demonstrated that cellular and Wi-Fi connectivity can be used to put highway information into vehicles, for example, signage, warnings and green lights. We also demonstrated that data can transfer the other way – to us from vehicles. The project informed our Digital Roads vision and Connected Services roadmap, influencing elements of our Digital for Customer programme.

“The Connected and Autonomous Vehicles: Infrastructure Appraisal Readiness (CAVIAR) project used both simulations and real-world data collection. The number one recommendation was the need for further study to determine how CAVs can best navigate roadworks – that’s the next step. This potentially includes infrastructure-based solutions, such as smart traffic cones, and OEMs developing ‘cautious’ behaviours, to be triggered once a CAV enters a work zone.

“The HelmUK freight platooning trial, which we led in close partnership with the DfT, was another really valuable exercise. We demonstrated real-world use of platooning on the M5/M6, although the fuel savings were very modest and didn’t replicate what we were seeing on the test tracks. This was largely due to the geography and the need to break up the platoon at many of the junctions.

“We recognise the challenges with rolling out something like this, even the difficulties in ensuring that vehicles from different logistics companies – from the large suppliers to two-lorry outfits – were travelling at the same time. It is one of those technologies you can see working brilliantly on long outback roads in Australia, but the advantages of putting it into every cab in the UK are far less obvious. It’s important to learn from initiatives like the ENSEMBLE multi-brand truck platooning project in Europe.”

What are the most pressing CAV issues facing National Highways?

MW: “My feeling is that car manufacturers aren’t going to want to develop completely different models for the UK market, so we need to understand our role as a highway authority. What do we need to think about in terms of highway designs, data/information provision and maintenance standards? What do we need to be investigating and researching to make sure that we as the highway authority are playing our part, doing what motor manufacturers and the public expect of us?

“There’s been a lot of talk about the need for the white lines to be readable by automated vehicles. Is that still the case? If so, what does that mean for our maintenance schedules? Can we use the data from vehicles to inform our congestion management? Is there data we can use for asset management purposes?

“It’s understanding what we need to put into the equation and what we’re going to get back out. Particularly over the next few years, with a mixed fleet with different levels of autonomy, that’s going to present new scenarios, new risks. As a highway authority we need to be conscious of those – how they’re going to affect our operations and the safety of the travelling public.”

How did you identify which parts of the network could be hands-free blue zones?

MW: “The Centre for Connected and Autonomous Vehicles (CCAV) and the Vehicle Certification Agency (VCA) led the discussions with Ford regarding authorisation of their technology on public roads. Although we liaise closely with both, we weren’t involved in the detailed discussions with Ford, but to be clear, BlueCruise is an advanced driver assistance system, so the driver has to remain alert and able to take back control.

“Going forward, we need to move closer to organisations developing these systems to understand when they are coming to market and in what numbers. That’s part of our role as a highway authority – to keep our customers safe and to inform our traffic officers, so everyone knows what to do in the event of an incident.

“We’re reaching out to Ford, to see what data they can share with us and to develop a more collaborative relationship. These are very exciting times. We want people to embrace CAV technology and enjoy the benefits.

“We’re some way off self-driving vehicles, but my personal view is that they will probably be available more quickly than many people think.”

Please note: a shorter version of this article was first published in the Institute of the Motor Industry’s MotorPro magazine.

Tom Leggett of Thatcham Research did an epic round of media interviews to explain what BlueCruise is – assisted driving – and isn’t – self-driving.

Not self-driving: Thatcham media marathon to clear up BlueCruise capability confusion

Few were expecting it, but 13 April 2023 will go down in British motoring history. It was the day Ford announced that the Department for Transport (DfT) had approved the use of its BlueCruise assisted driving system on parts of the UK motorway network, making hands-free legal for the first time.

Initially, only a select few gained the ability to go ‘hands off, eyes on’ – drivers of 2023 Ford Mustang Mach-E cars who activate a subscription. Even then, use is restricted to 2,300 miles of pre-mapped motorways in England, Scotland and Wales – the new ‘Blue Zones’. Be in no doubt though, this is momentous.

One foot in the future

“It’s not every day you can say you’ve placed one foot in the future,” said Martin Sander, General Manager at Ford in Europe. “BlueCruise becoming the first hands-free driving system of its kind to receive approval for use in a European country is a significant step forward for our industry.”

UK Transport Minister, Jesse Norman, agreed: “I am delighted that this country is once more at the forefront of innovation. The latest advanced driver assistance systems (ADAS) make driving smoother and easier, but they can also help make roads safer by reducing scope for driver error.”

One of the main themes at the recent Zenzic Connected and Automated Mobility (CAM) Innovators event was the need to do more to establish the UK as a global leader. This embracing of hands-free will be noted around the world.

Ford describes BlueCruise as Level 2 driver assistance, with Lisa Brankin, managing director of Ford in Britain, telling the BBC’s Today programme that, in the case of an accident, the driver will still be responsible as the technology is “not autonomous driving”.

Ford BlueCruise graphic, 2023

BlueCruise combines intelligent adaptive cruise control and lane-centering with an in-cabin camera monitoring eye gaze and head position. If necessary, alerts in the instrument cluster and audible chimes will prompt the driver to return their eyes to the road.

Assisted not self-driving

Unfortunately, and rather predictably, much of the UK media again confused assisted driving and self-driving. The Guardian went with “First hands-free self-driving system approved for British motorways”, The Sun with “Huge car firm is launching the UK’s first-approved self-driving technology”.

Huge credit to Tom Leggett, vehicle technology specialist at Thatcham Research, for doing a marathon round of media interviews to explain what BlueCruise is – assisted driving – and what it isn’t – driverless or self-driving.

“The sudden introduction of this technology did catch the industry a little off-guard, as it was not anticipated that it would reach UK roads for another 18 months or maybe even two years,” he said.

“It has been approved by the Vehicle Certification Agency (VCA) under Article 39 for a new and innovative technology, albeit based on current technology. Basically, the VCA were convinced by evidence from Ford, and their own on-track and on-road testing, that BlueCruise is as safe as, and not fundamentally different to, existing assisted driving technologies.

“The key point to emphasise is that it is assisted driving. What makes it slightly different is that it permits the driver to take their hands off the steering wheel. However, the driver is always responsible for driving. Any input from the driver, such as braking or changing lane, and the system will essentially turn off.

“The hope is that the driver monitoring will make it even safer. It is a camera system which looks at the driver’s direction of gaze to ensure they’re concentrating on the road, not looking out of the window or checking their phone.

“At Thatcham Research, we believe direct driver monitoring will have a significant role in addressing drowsiness and distraction. Currently in the UK, about 25% of all accidents involve some sort of distraction.

“It is vital that drivers using BlueCruise are aware of their responsibilities, and we’ll also be very interested to understand how they feel about using it.”

Please note: a version of this article was first published in the Institute of the Motor Industry’s MotorPro magazine.

Related story: Barrister Alex Glassbrook says approval of hands-free driving is a radical development in UK motoring, and should be accompanied by effective official guidance, training and information to the public and affected organisations.

Motor law expert on hands-free – ‘hands off, eyes on’ – driving becoming legal in the UK.

Quiet regulation of a radical step: Barrister raises concerns about lack of guidance on hands-free driving

Alex Glassbrook, a barrister at Temple Garden Chambers, says that approval of hands-free driving is a radical development in UK motoring, and should be accompanied by effective official guidance, training and information to the public and affected organisations.

Where does the Ford hands-free announcement sit in the shift to self-driving in the UK?

AG: “The first question many of us asked was: Is this the first automated vehicle under the Automated and Electric Vehicles Act (AEVA) 2018? It appears that it’s not. First, because it hasn’t been listed under Section 1 of the Act by the Secretary of State for Transport. Second, because it seems not to fulfil the criterion of a system that does not need to be monitored by the driver, which is part of the legal definition under Section 1 and Section 8.

UK Government list of self-driving vehicles (3 May 2023)

“So, what we’re looking at is a vehicle with advanced driver assistance, but not a driverless vehicle. Equally, what we’re looking at is something that does represent a culture change, because the driver is allowed to remove their hands from the steering wheel. It’s described as a ‘hands off, eyes on’ system, although this hasn’t prevented the media reporting it as a driverless system, which has implications for safety.”

What do you note about the roads which have been designated ‘Blue Zones’?

AG: “A Blue Zone seems to be the marketing name for an area in which this system can work. I’m not an engineer and I’ve not seen the technical details of the permission that has been given by government for this to operate. However, I note the description of the system as being limited to pre-mapped motorways.

“In a regulatory sense, there is broad symmetry between this and the e-scooter trials, in that they both appear to be based upon government permissions on a set of conditions and restricted to certain areas. But there are plenty of dissimilarities too. For example, that motorised scooters and mopeds (as e-scooters are classified) have been with us for over 100 years, whereas computer mapping technology is relatively new.

“What’s new about a ‘hands off, eyes on’ system is the relinquishing of physical control of steering by the human driver, which is a radical step. The technology itself is a progression of cruise control, which was introduced in the 1950s and came to prominence in the 1970s during the fuel crisis in the US. But relinquishing control of steering at motorway speeds is different – a profound step in both regulatory and practical terms.”

What needs to be considered now that hands-free driving is a legal reality in the UK?

AG: “Let’s begin with some historical context. Driver assistance systems have been accumulating for some time, but the legal standard for driving has not really altered since 1971. It was then that Lord Denning, in the case of Nettleship v Weston, set what can be summarised as the standard of the reasonably prudent human driver.

“It’s a largely objective test, and there are some exceptions, but since established it has never been substantially altered. That’s quite surprising because cruise control is now in such common use that you might have expected the standard of care to have been particularised in relation to it. Now we have a system that explicitly allows the driver to let go of the steering wheel while the car is in motion at motorway speeds. In the coming years, a court might face the question of what standard of attention is required of a driver using a ‘hands off’ system.

“For good reasons, namely the need to plan future laws, we have become very focused on fully driverless vehicles. That’s not a complete strategy, as it can mean that we’re looking to the horizon rather than at what is actually in front of us. To go back to the history for a moment, it took quite some time after the introduction of the motor car for The Highway Code to be introduced. The first edition was published in 1931, written guidance which many of us will have looked at.

“The Highway Code isn’t meant to be specialist guidance to industry, it’s meant to be comprehensible guidance to the public. Advanced driver assistance systems (ADAS) have been regulated ‘quietly’, mainly settled by negotiation at international level and then applied as industrial standards by national approval authorities. ‘Hands free’ driving seems too significant a step for that trend to continue without better official education about advanced driver assistance systems, and what they can and cannot be relied upon to do.”

So how does the guidance need to change?

AG: “The number of driver assistance systems has increased over time, and the quantity of such systems alone can be confusing. I saw an article recently on the most irritating modern vehicle features! Meanwhile, The Highway Code is still largely a text document, not very friendly to mobile devices, and there are plenty of situations it simply doesn’t deal with.

“At the moment, the guidance on driver assistance systems, rule 150, says in essence that those systems are only assistive, that you have to be careful while using them and not let your attention be distracted. Is that guidance too general, for a ‘hands off, eyes on’ system which allows the driver to take their hands off the wheel while driving a car on a motorway? Then there’s rule 160 – “Once moving you should… drive or ride with both hands on the wheel or handlebars where possible” – which will presumably need revision.

Hands-free but Highway Code says “both hands on the wheel” (3 May 2023)

“We need to think practically about the information which people need to use these systems safely, and how best to communicate it. For example, a feature of this and other systems is that their announcement is often accompanied by explanatory YouTube videos. The Secretary of State for Transport has wide powers to provide guidance and road safety training and information, not only by the Highway Code, under sections 38 and 39 of the Road Traffic Act 1988. He is not limited to one means of providing that information.

“There’s also an argument that we focus too much upon the user of the system. Should road users around a vehicle be made aware that it might be being steered by a computer rather than a human?

“Others affected include those who enforce driving laws and who respond to road traffic collisions, particularly the police and National Highways officers. Then other public authorities, such as the judiciary, and businesses, such as driving instructors and insurance companies – those who form part of the wider motoring ecosystem. All of these people need to be aware.

“So, as well as the issue as to its content, I come back to the question of whether the Highway Code, coming up for its 100th birthday, and still a text document, represents the best or only available form of communication.”

Advanced, Automated and Electric Vehicle Law, 2023

Alex Glassbrook, author of 2017’s “The Law of Driverless Cars: An Introduction” and co-author of 2019’s “A Practical Guide to the Law of Driverless Cars”, has a new book, “Advanced, Automated and Electric Vehicle Law”, which is available for pre-order now.

BlueCruise is good, but it’s not self-driving.

Bolt from the blue oval: hands-free Ford is UK 1st but NOT self-driving

Big news! The Department for Transport has approved the use of Ford’s BlueCruise assisted driving system on parts of the UK motorway network. Be in no doubt, this is momentous – the first time UK drivers will legally be able to take their hands off the wheel. But what does it mean for self-driving?

The scope

As we sit here today, only a select few have gained the ability to sometimes go hands-free – drivers of 2023 Ford Mustang Mach-E cars who activate a subscription. They can then use the “hands-off, eyes-on” tech on 2,300 miles of pre-mapped motorways in England, Scotland and Wales – the new ‘Blue Zones’.

UK motorway blue zones – April 2023

The Ford video below explains how it works, with the voiceover saying: “BlueCruise combines with your intelligent adaptive cruise control and lane-centering systems, allowing you to take your hands off the steering wheel while it maintains cruising speed and keeps you in your current lane.

“An infrared camera monitors your eye gaze and head position to ensure that you’re paying due care and attention to the road ahead. If the system finds you’re not looking at the road it will notify you either with an alert message displayed in the instrument cluster or by sounding an audible chime to remind you to return your eyes to the road.

“If you do not react to the warnings the system will cancel, gently pump the brakes to get your attention and slow your vehicle down while maintaining steering control.”

Ford assisted driving video
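For the technically minded, here is a minimal sketch of how that kind of staged escalation could be structured in software, loosely following the sequence the voiceover describes (visual alert, then audible chime, then cancellation with gentle braking). It is purely illustrative: the thresholds, names and single eyes-off-road timer are assumptions, not Ford’s implementation.

```python
from enum import Enum, auto

class Alert(Enum):
    NONE = auto()              # driver's gaze is on the road
    VISUAL = auto()            # message shown in the instrument cluster
    AUDIBLE = auto()           # chime prompting the driver to look back at the road
    CANCEL_AND_SLOW = auto()   # system cancels, brakes gently, keeps steering control

# Hypothetical escalation thresholds in seconds of eyes-off-road time (not Ford's values).
VISUAL_AFTER = 2.0
AUDIBLE_AFTER = 4.0
CANCEL_AFTER = 6.0

def escalation(eyes_off_road_s: float) -> Alert:
    """Map continuous eyes-off-road time to an escalating alert level."""
    if eyes_off_road_s < VISUAL_AFTER:
        return Alert.NONE
    if eyes_off_road_s < AUDIBLE_AFTER:
        return Alert.VISUAL
    if eyes_off_road_s < CANCEL_AFTER:
        return Alert.AUDIBLE
    return Alert.CANCEL_AND_SLOW

if __name__ == "__main__":
    for t in (1.0, 3.0, 5.0, 7.0):
        print(f"eyes off road for {t:.0f}s -> {escalation(t).name}")
```

A production system would fuse gaze, head pose and other driver-state signals rather than rely on a single timer, but the ladder of escalating interventions is the key idea.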

The legalities

Last year the government seemed to be planning to class cars equipped with Automated Lane Keeping Systems (ALKS) as self-driving. That hasn’t happened, which is a very welcome shift.

The UK government’s website confirms: “At present, there are no self-driving vehicles listed for use in Great Britain”.

Ford itself describes BlueCruise as Level 2 driver assistance, and Transport Minister Jesse Norman made clear: “The latest advanced driver assistance systems make driving smoother and easier, but they can also help make roads safer.”

Jesse Norman, Minister of State in the Department for Transport

Lisa Brankin, managing director of Ford in Britain and Ireland, told the BBC‘s Today programme on Friday that, in the case of an accident, the driver will still be responsible as the technology is “not autonomous driving”.

The Beeb also noted that other vehicle manufacturers offer similar systems – Tesla has Autopilot and Mercedes has Drive Pilot. Interestingly, the latter announced last year that it will accept legal responsibility for accidents caused by its system.

One of the main themes at the recent Zenzic Connected and Automated Mobility Innovators event was the need to do more to establish the UK as a global leader in CAM. This embracing of hands-free will be noted around the world.

Self-driving headlines

Unfortunately, and rather predictably, much of the UK media has again confused assisted driving and self-driving.

The Guardian went with the headline “First hands-free self-driving system approved for British motorways”.

The Sun went with “HANDS OFF Huge car firm is launching the UK’s first-approved self-driving technology”.

Various outlets, including ITV, even regurgitated the line from the press release that BlueCruise can operate at up to 80mph. Not on UK roads, presumably, as that’s 10mph above the motorway speed limit!

Let’s be clear – this lack of clarity is dangerous. Lives are at stake and road safety should be paramount.

Eyes on the road

This Ford video shows a driver happily gazing out of the window and being warned to “watch the road”.

Ford hands-free video

As the All-Party Parliamentary Group on Connected and Automated Mobility stated in its red lines: “A statutory definition of self-driving must be established to distinguish this technology from assisted driving”.

The final word goes to Tom Leggett, of Thatcham, who emphasised: “For the first time ever drivers will be permitted to take their hands off the wheel. However, their eyes must remain on the road ahead. Crucially, the driver is not permitted to use their mobile, fall asleep or conduct any activity that takes attention away from the road.”

Cutting-edge radar for ADAS and self-driving

Revolutionary self-driving tech: Oxford RF’s solid-state 360-degree sensor

In this Cars of the Future exclusive, we talk solid-state 360-degree radar, ADAS, self-driving and Zenzic success with Dr Kashif Siddiq, founder of Oxford RF Solutions.

How did you come up with the 360-degree radar idea?

KS: “We’ve specialised in radar and sensor technologies for 15 years, creating a lot of tech for other businesses. Then it struck us that there’s a huge gap in the market.

“The problem we see is people taking off-the-shelf sensors and bolting them to vehicles to try and make them autonomous. This probably isn’t the right way of doing it. What we need is sensors designed specifically for autonomous vehicles. That was the idea behind Oxford RF.

“We’ve developed a prototype which solves some of the burning challenges in perception sensors for ADAS and self-driving. It also has drone, space and marine applications. It is the world’s first solid-state 360-degree sensor. Actually, we’ve already taken it to the next level by making it hemispherical, so it can see upwards in a dome as well as all-round.

“There are no moving parts and we have the capability to integrate multiple technologies within the same box, but we’re focusing mainly on radar for now.”

Oxford RF and the APC

Oxford RF has been supported by the Advanced Propulsion Centre (APC) via its Technology Developer Accelerator Programme (TDAP), including collaboration with the Warwick Manufacturing Group (WMG).

Self-driving investment: Oxford RF has been supported by the Advanced Propulsion Centre

It also won funding as one of 2022’s Zenzic CAM Scale-Up winners.

KS: “We applied last year but at that stage we only had an idea rather than a technology to test. Now we have a working prototype and are really leading the thought process when it comes to perception sensing.

“The current situation with advanced driver assistance systems (ADAS) is a mix of cameras, radars and lidars being used to effectively give a full 360-degree picture. There’s an architectural problem with this. First of all, the price.

“Each of those sensors is expensive and there’s so many of them. Then, obviously, all that data needs to be routed to a centralised computer, and that causes latency. Milliseconds are valuable when it comes to saving lives.

“Another issue is redundancy: what’s the backup if one sensor fails? All too often the answer is another sensor, which means yet more cost. And you start to run into the mutual interference problem.”

Self-driving winners: Zenzic CAM Scale-Up Programme (2022 cohort)

Safety-critical benefits

KS: “In a nutshell, we’ve reengineered sensor architecture. It doesn’t need to be radar, it can be any sensor. This allows us to reduce the sensor count.

“Initially we installed them on the car roof, but we’re moving them to the four corners, inside the bumpers. Fewer sensors means less latency in decision making, so it’s a faster system overall. It’s also inherently more resilient to interference.

“From a safety critical point of view, the four corners approach comes with redundancy built-in, because if one of the 360-degree sensors fails, two others are still looking at the same point.

“Delivering visibility in all conditions has to be seen as a deep tech problem and solved on a scientific basis. Are we able to reduce the mortality rate? That’s the real acid test.

“Further to that, from a finance point of view, can we reduce the cost of what I call the minimum viable sensor suite? Does that enable manufacturers to reduce car prices? Or insurers to reduce premiums due to fewer crashes?”
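To make the four-corner redundancy argument concrete, here is a minimal sketch (with invented vehicle geometry and sensor range, not Oxford RF’s design) that counts how many corner-mounted 360-degree sensors still see each point around a vehicle when any single sensor fails.

```python
import math
from itertools import product

# Hypothetical geometry: a 4.5m x 1.8m vehicle with a 360-degree sensor at each
# corner and a 100m range. These values are illustrative, not Oxford RF's specs.
CORNERS = [(-2.25, -0.9), (-2.25, 0.9), (2.25, -0.9), (2.25, 0.9)]
SENSOR_RANGE_M = 100.0

def sensors_covering(point, failed=None):
    """Count corner sensors (excluding one assumed failed) that can see a point."""
    x, y = point
    return sum(
        1
        for i, (sx, sy) in enumerate(CORNERS)
        if i != failed and math.hypot(x - sx, y - sy) <= SENSOR_RANGE_M
    )

# Sample a grid of points within ~50m of the vehicle and take the worst case
# across all single-sensor failures.
points = list(product(range(-50, 51, 5), repeat=2))
worst = min(
    sensors_covering(p, failed=f) for p in points for f in range(len(CORNERS))
)
print(f"Worst-case coverage with any one sensor failed: {worst} sensors")  # prints 3
```

With the assumed range, every sampled point remains visible to at least three sensors even with one out of action, which is the built-in redundancy Siddiq describes.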

ADAS first, then self-driving

KS: “We’re taking a beachhead approach and the first application will be ADAS. We’ll prove our technology there and then scale to full autonomy. Over the next year, we’re planning to produce about 100 of our solid-state 360-degree radars, to expand trials with our initial customers.

“We’re planning to start commercial production in 2024. From there, we’ll expand into other markets, as many as we practically can. For example, in drone applications, we’ll usually only need one sensor. For spacecraft, we’re looking at two front-facing sensors. For marine vessels, we’re talking about three sensors – one on the bow and two on the stern.

“It will take time to develop our business to a level where we can supply all of these markets, but it’s really good to see that there’s already significant interest.”

For further info, visit the Oxford RF website

Lidar sector thriving as established players and new start-ups push for safe self-driving.

Self-driving gives lidar billion dollar boost

Two new reports have highlighted assisted- and self-driving as key factors predicted to boost the global automotive light detection and ranging (lidar) market.

According to Polaris Market Research, it will reach US$4.14bn by 2026, increasing at a Compound Annual Growth Rate (CAGR) of more than 35%.

The report summary noted: “The solid-state/flash lidar market is expected to grow at a very high pace during the forecast period. Solid state sensor being low-cost, robust, as well as compact in size makes it ideal for potential large-scale production of level 3 and 4 cars in coming years. Further, mechanical sensors and other sensors also capture decent market share.”

A separate report, by Markets And Markets, largely concurs, projecting a CAGR of 21.6% to reach US$3.4bn by 2026. However, it focuses more on unmanned aerial vehicles (UAVs) – drones – and 4D lidar, with the prospect of new entrants making a big impact.

Lidar in self-driving

In March, Aeva announced that its Aeries 4D lidar sensors are now supported on the Nvidia Drive autonomous vehicle platform. As well as measuring distance and plotting the position of objects in x, y and z, 4D plots velocity as a fourth dimension.
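To illustrate what “velocity as a fourth dimension” can buy a perception stack, the sketch below uses a generic per-point structure (the field names and threshold are assumptions, not Aeva’s data format) and separates moving returns from the static scene using the velocity channel alone.

```python
from dataclasses import dataclass

@dataclass
class Point4D:
    x: float  # metres forward
    y: float  # metres left
    z: float  # metres up
    v: float  # metres/second radial (Doppler) velocity, assumed ego-motion compensated

# Hypothetical frame: three near-static returns and one approaching object.
frame = [
    Point4D(12.0, -1.5, 0.2, 0.0),
    Point4D(30.0, 3.1, 0.5, 0.1),
    Point4D(18.5, 0.4, 0.3, -6.7),  # closing at roughly 6.7 m/s
    Point4D(45.0, -2.0, 1.1, 0.0),
]

STATIC_THRESHOLD = 0.5  # m/s; anything faster is treated as a moving object

moving = [p for p in frame if abs(p.v) > STATIC_THRESHOLD]
print(f"{len(moving)} moving return(s) out of {len(frame)}")  # 1 moving return(s) out of 4
```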

Aeva CEO Soroush Salehian on self-driving

Both CEO Soroush Salehian and co-founder Mina Rezk previously worked in Apple’s Special Projects Group. “Bringing Aeva’s next generation 4D lidar to the Nvidia Drive platform is a leap forward for OEMs building the next generation of level 3 and 4 autonomous vehicles,” said Salehian.

“We believe Aeva’s sensors deliver superior capabilities that allow for autonomy in a broader operational design domain (ODD), and our unique features like Ultra Resolution surpass the sensing and perception capabilities of legacy sensors to help accelerate the realization of safe autonomous driving.”

You can always tell when a sector is thriving because dedicated events spring up. The fifth annual Automotive Lidar conference took place in September, while Lidar Magazine has documented the increasing crossover from surveying into car tech.

Its recent interview with Luis Dussan, founder of California-based AEye is well worth a read. “While at Northrop Grumman and Lockheed Martin, I was designing mission-critical targeting systems for our fighter jets and special ops units that searched for, identified and tracked incoming threats,” he said.

“I realized that a self-driving vehicle faces a similar challenge: it must be able to see, classify, and respond to an object – whether it’s a parked car or a child crossing the street – in real time and before it’s too late.”

Of course, the established players are also pouring money into lidar, and making huge strides. Polaris highlighted Bosch, Continental, Delphi, Denso and Velodyne, among others, with Bosch boasting “the first long-range lidar suitable for the automotive mass market”. It has a detection range of over 200m.

Dr. Mustafa Kamil of Bosch on self-driving

Dr. Mustafa Kamil, Bosch’s project manager for automated driving sensors, explained: “For automated driving to become a reality, the vehicle must perceive its surroundings more effectively than humans can, at all times. Alongside cameras, radar and ultrasonic, a further sensor principle is required in order to achieve this goal.

“For example, when the ambient light changes from bright to dark upon entering a tunnel, it can briefly pose a challenge for the camera. Meanwhile the lidar sensor remains majorly unimpeded by the change in light conditions, and can reliably recognize objects at the entrance to the tunnel in these critical milliseconds.”

He continued: “A former supervisor once told me that a lidar sensor is like a plate of spaghetti: As soon as you try to grab one piece, the others move as well. If you want to make the sensor smaller, this affects properties such as the visual field-of-view or detection range. Optimizing all components in such a way that they do not impede other variables is technically challenging.”

Please note: a version of this article was first published by the Institute of the Motor Industry’s MotorPro magazine.

EV to ADAS, Tesla has revolutionised the car industry at lightning speed

Tesla: With EV no longer a USP, ADAS is the new battleground

What company springs to mind when you think cutting-edge auto tech? Same here. Tesla. At the recent FT Future of the Car Summit, Elon Musk reminisced about the first Roadster.

“There were no start-ups doing electric cars, and the big car companies had really no electric car programmes,” he said. “Unless we tried, they were not going to be created. It wasn’t from a standpoint of thinking, hey, here’s a super lucrative idea.”

EV all the way: Tesla car line-up

Twenty years later, Tesla is the world’s most valuable car brand, and it’s not even close. In June 2022, Statista valued it at US$75.9 billion, up from a mere 40-odd billion in 2020, and substantially more than second-placed Toyota and third-placed Mercedes-Benz put together.

From drivetrains to marketing, it has shredded the vehicle manufacturing rulebook, and continues to do so. Consider just some of the key developments over the last six months.

Tesla to Twitter

In March, Musk entered into a Twitter spat with US president Joe Biden, after the latter praised Ford for investing $11billion to build EVs, creating 11,000 jobs, and GM for investing $7billion, creating 4,000 jobs. Musk retorted: “Tesla has created over 50,000 US jobs building electric vehicles and is investing more than double GM and Ford combined.”

Research by StockApps confirmed that Tesla spends miles more on R&D than rival carmakers, around $3,000 per vehicle produced, while Electrek highlighted that Tesla spends nothing on advertising, relying “almost entirely on word-of-mouth”.

It wasn’t all plain sailing. A court in Germany ordered Tesla to buy back a Model 3 from a customer who likened the Full Self-Driving (FSD) package to “a drunk first-time driver”. With EV no longer a USP, ADAS is the new battleground.

In May, a judge in California ruled that the driver of a Tesla operating in Autopilot must stand trial for a crash that killed two people. A Model S reportedly ran a red light and hit a Honda Civic at 74 mph. It could mark the first felony prosecution against a driver using a partially automated driving system.

More negative press followed when it emerged that hundreds of Tesla owners had complained about “phantom braking”, with cars stopping suddenly for no apparent reason.

Then, in June, the US National Highway Traffic Safety Administration (NHTSA) published the first of its new monthly reports into crashes involving vehicles with ADAS. Tesla had the most, followed by Honda and Subaru.

Cue the headlines, “Tesla Autopilot and Other Driver-Assist Systems Linked to Hundreds of Crashes” in The New York Times, and “Teslas running Autopilot involved in 273 crashes reported since last year” in The Washington Post.

Importantly, the US Public Interest Research Group clarified that: “Teslas are connected to the internet and automatically report if the car was in Autopilot. Honda asks its drivers if they were using ADAS, so it relies on hard-to-verify personal accounts. Everyone else leaves it up to the police report.”

Tesla went on the offensive, quoting some eye-catching statistics: “In 2021, we recorded 0.22 crashes for every million miles driven in which drivers were using Autopilot technology. For drivers who were not using Autopilot technology, we recorded 0.77 crashes for every million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there are 1.81 automobile crashes for every million miles driven.”
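Turning those per-million-mile rates into average miles between recorded crashes is simple arithmetic; the short sketch below uses only the figures quoted above (noting that Tesla’s own numbers are not strictly like-for-like with the national average).

```python
# Crashes per million miles, as quoted above (Tesla's own 2021 figures plus
# NHTSA's national average; road types and vehicle fleets differ, so the
# comparison is indicative rather than like-for-like).
rates = {
    "Tesla drivers using Autopilot": 0.22,
    "Tesla drivers not using Autopilot": 0.77,
    "US average (NHTSA)": 1.81,
}

for label, per_million_miles in rates.items():
    miles_per_crash = 1_000_000 / per_million_miles
    print(f"{label}: one recorded crash every {miles_per_crash:,.0f} miles")
```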

Its Impact Report also noted that, “In 2021, the global fleet of Tesla vehicles, energy storage and solar panels enabled its customers to avoid emitting 8.4 million metric tons of CO2e”, compared to an ICE vehicle with a real-world fuel economy of 24mpg. A timely reminder of the extent of its achievement.

That’s a whirlwind six months, and we haven’t even mentioned the Gigafactory in Texas, the Cybertruck pickup, the plans to launch a steering wheel-free robotaxi by 2024, June’s new car price hikes, or the off-the-chart used values.

The fact is Tesla has revolutionised the global motor industry at lightning speed, and shows no signs of slowing. 

Please note: a version of this article was first published by the Institute of the Motor Industry’s MotorPro magazine.

New survey on ADAS and self-driving by The Insurance Institute for Highway Safety in America raises questions for UK legislators and motorists

Who wants self-driving anyway? US survey finds 80% love ADAS but not hands-free

A new survey on full and partial self-driving by The Insurance Institute for Highway Safety (IIHS) in America has found significant mistrust of automated lane changing systems, with drivers preferring to stay hands-on and initiate the manoeuvre themselves.

The IIHS – a respected non-profit educational organization dedicated to reducing deaths from motor vehicle crashes – surveyed over 1,000 drivers on questions related to partial automation between September and October 2021, with the results published in June 2022.

The headline finding was that 80% wanted to use “at least some form of lane centering” – a strong endorsement for what we Brits call automated lane keeping systems (ALKS).

Report covers ADAS & ADS

IIHS report on consumer demand for ADAS and self-driving June 2022

36% preferred “hands-on-wheel” lane keeping, compared to 27% for “hands-free”, with 18% having no preference between the two types, 16% not wanting to use any form of lane keeping and 4% being unsure.

If you think that shows an appreciation of advanced driver assistance systems (ADAS) but a mistrust of conditionally automated driving systems (ADS), the next finding appears to confirm that.

Asked about lane changing assistance (as opposed to just lane keeping), 73% said they would use some form of auto lane change. However, 45% said they’d prefer to use driver-initiated auto lane change compared to only 14% for vehicle-initiated auto lane change. 23% said they wouldn’t use either type, 13% had no preference and 5% were unsure.

What’s more, on self-driving technology, 35% said they found it “extremely appealing” while 23% said it was “not at all appealing”.

Alexandra Mueller, the IIHS survey’s primary designer, commented: “Automakers often assume that drivers want as much technology as they can get in their vehicles. But few studies have examined actual consumer opinions about partial driving automation.

“It may come as a surprise to some people, but it appears that partially automated features that require the driver’s hands to be on the wheel are actually closer to one-size-fits-all than hands-free designs.”

Another eye-catching finding was the high number of people “at least somewhat comfortable” with in-cabin driver monitoring to support such systems: 70% for steering wheel sensors, 59% for camera monitoring of driver hands and 57% for camera monitoring of driver gaze.

“The drivers who were the most comfortable with all types of driver monitoring tended to say they would feel safer knowing that the vehicle was monitoring them to ensure they were using the feature properly,” said Mueller.

“That suggests that communicating the safety rationale for monitoring may help to ease consumers’ concerns about privacy or other objections.”

Self-driving questions

For us, the study is particularly interesting in terms of the UK government’s plan to list vehicles approved under the Automated Lane Keeping System (ALKS) Regulation as self-driving.

For the drivers of certain new high-tech cars, this could be the first time that any hands-free driving becomes legal on UK roads. The current suggestion is for this to be restricted to slow motorway traffic (max 37mph), initially at least.

Further still, the acceptance of driver monitoring seems relevant to point four of the All-Party Parliamentary Group (APPG) on Connected and Automated Mobility’s seven expert-recommended red lines: “Establish minimum standards for data sharing and handling to ensure transparency and effective governance”.

The full IIHS report is available here.

Reaction to first monthly NHTSA data on crashes involving vehicles with ADAS and ADS.

US National Highway Traffic Safety Administration publishes first monthly report into ADAS and ADS crashes

On 15 June, the US Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) published the first of what will be monthly reports into crashes involving vehicles with advanced driver assistance systems (ADAS) and more advanced automated driving systems (ADS).

ADAS

In brief, for SAE Level 2 ADAS equipped vehicles, 367 crashes were reported from July 2021 to 15 May 2022, resulting in six fatalities and five cases of serious injury. Tesla reported the most, followed by Honda and Subaru.

Cue the headlines, “Tesla Autopilot and Other Driver-Assist Systems Linked to Hundreds of Crashes” in the New York Times and “Teslas running Autopilot involved in 273 crashes reported since last year” in the Washington Post.

However, the United States Public Interest Research Group (US PIRG) shed light on this, explaining that “Teslas are connected to the internet and automatically report if the car was in Autopilot when it crashed. Honda asks its drivers if they were using ADAS, so it relies on hard-to-verify personal accounts. Everyone else leaves it up to the police report.”

ADS

For ADS, nearly all the data comes from California. 130 crashes were reported from July 2021 to 15 May 2022. One resulted in serious injury. Waymo reported the most incidents, followed by Transdev Alternative Solutions and then Cruise.

Reaction

Dr. Steven Cliff, NHTSA’s Administrator, said: “The data released today are part of our commitment to transparency, accountability and public safety.

“New vehicle technologies have the potential to help prevent crashes, reduce crash severity and save lives, and the Department is interested in fostering technologies that are proven to do so; collecting this data is an important step in that effort.

“As we gather more data, NHTSA will be able to better identify any emerging risks or trends and learn more about how these technologies are performing in the real world.”

Autonomous vehicle safety consultant Philip Koopman welcomed the new data, commenting: “This is an excellent first step for transparency. All of us safety advocates can wish for more data and for less redaction, but this is a crucial step forward.

“If I had one wish, it would be to divide the narrative data field into two sections: public narrative and confidential narrative, and put huge pressure on the reporting companies to minimize things put into the confidential narrative.”

On this side of the pond, The Law Commission has recommended that automated vehicles must be able to record and store data necessary for incident investigation.

New reports predict self-driving will massively boost the global LiDAR market, with Aeva’s Aeries 4D LiDAR highlighted.

Self-driving to supercharge global LiDAR market to US$3bn+ within 5 years

Two new reports have highlighted self-driving as one of the main factors predicted to boost the global LiDAR market to at least US$3.4 billion a year by 2026.

According to Polaris Market Research, the global automotive LiDAR market is anticipated to reach US$4.14bn by 2026, increasing at a Compound Annual Growth Rate (CAGR) of more than 35%.

LiDAR for self-driving

The report summary noted: “The Automotive LiDAR market growth is attributed to the increasing demand of autonomous vehicles for active safety and self-driving. As advanced driver assistance systems (ADAS) and autonomous vehicles are expected to witness growth at significant rates, it is expected to have a direct positive impact on the growth in the Automotive LiDAR market.

“These automated vehicles provide opportunities for a large number of firms to access a range of untapped facts, creating new revenue-generating opportunities, which will boost the market growth.

“The solid-state/flash LiDAR market is expected to grow at a very high pace during the forecast period. Solid state sensor being low-cost, robust, as well as compact in size makes it ideal for potential large-scale production of level 3 and level 4 cars in coming years. Further, mechanical sensors and other sensors also capture decent market share.”

Polaris highlights leading industry players including Scans, Velodyne LIDAR, Quanergy Systems, LeddarTech, First Sensor, Novariant, Delphi, Continental, Robert Bosch and Denso.

A separate report, by Markets And Markets, largely concurs with these findings, projecting that the LiDAR market will grow at a CAGR of 21.6% from 2021 to 2026 to reach US$3.4 billion by 2026.
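As a quick sanity check on what a 21.6% CAGR implies, the compound-growth formula can be run backwards to estimate the 2021 base-year figure from the quoted 2026 projection; this is only an inference from the numbers above, not a figure taken from the report.

```python
# Markets And Markets projection quoted above: US$3.4bn by 2026, growing at a
# 21.6% CAGR from 2021. Running FV = PV * (1 + r)^n backwards gives the
# implied 2021 base-year value (an inference, not a figure from the report).
future_value_bn = 3.4
cagr = 0.216
years = 2026 - 2021

implied_2021_bn = future_value_bn / (1 + cagr) ** years
print(f"Implied 2021 market size: about US${implied_2021_bn:.2f}bn")  # ≈ US$1.28bn
```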

LiDAR for UAVs

However, it focuses more on unmanned aerial vehicles (UAVs) – drones – and 4D LiDAR specifically.

“The rising adoption of LiDAR systems in UAVs, increasing adoption of LiDAR in engineering and construction applications, use of LiDAR in geographical information systems (GIS) applications, the emergence of 4D LiDAR, and easing of regulations related to the use of commercial drones in different applications are among the factors driving the growth of the LiDAR market,” it says.

“However, safety threats related to UAVs and autonomous cars and the easy availability of low-cost and lightweight photogrammetry systems are restraining the growth of the market.

“The market for 4D LiDAR is projected to grow at the highest CAGR from 2021 to 2026. This growth is attributed to the high adoption of 4D LiDAR in applications such as self-driving cars, robots, and other autonomous systems.

“Apart from automobiles, 4D LiDAR has applications in the architecture, engineering, and construction (AEC) industry, entertainment, and AR/VR. Some of the major companies offering 4D LiDAR are Aeva and TetraVue.”

In March, sensing systems developer Aeva announced that its Aeries 4D LiDAR sensors are now supported on the Nvidia Drive autonomous vehicle platform.

As well as measuring distance and plotting the position of objects in x, y and z, Aeva’s 4D-LiDAR plots velocity as a fourth dimension.

Aeva CEO Soroush Salehian on self-driving

Soroush Salehian, Co-Founder and CEO at Aeva (formerly of Apple’s Special Projects Group), said: “Bringing Aeva’s next generation 4D LiDAR to the Nvidia Drive platform is a leap forward for OEMs building the next generation of Level 3 and Level 4 autonomous vehicles.

“We believe Aeva’s sensors deliver superior capabilities that allow for autonomy in a broader operational design domain (ODD), and our unique features like Ultra Resolution surpass the sensing and perception capabilities of legacy sensors to help accelerate the realization of safe autonomous driving.”

Gary Hicok, Senior Vice President of Engineering at Nvidia, added: “Aeva delivers a unique advantage for perception in automated vehicles because it leverages per-point instant velocity information to detect and classify objects with higher confidence across longer ranges.

“With Aeva as part of our Drive ecosystem network, we can provide customers access to this next generation of sensing capabilities for safe autonomous driving.”