Bosch self-driving sensors

Lidar sector thriving as established players and new start-ups push for safe self-driving.

Self-driving gives lidar billion-dollar boost


Two new reports have highlighted assisted- and self-driving as key factors predicted to boost the global automotive light detection and ranging (lidar) market.

According to Polaris Market Research, the market will reach US$4.14bn by 2026, growing at a compound annual growth rate (CAGR) of more than 35%.

The report summary noted: “The solid-state/flash lidar market is expected to grow at a very high pace during the forecast period. Solid state sensor being low-cost, robust, as well as compact in size makes it ideal for potential large-scale production of level 3 and 4 cars in coming years. Further, mechanical sensors and other sensors also capture decent market share.”

A separate report, by Markets And Markets, largely concurs, projecting that the market will grow at a CAGR of 21.6% to reach US$3.4bn by 2026. However, it focuses more on unmanned aerial vehicles (UAVs) – drones – and 4D lidar, with the prospect of new entrants making a big impact.
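As a rough illustration of how such projections work, compound annual growth simply applies the same percentage increase every year. Neither report states a base year or base-year market size, so the starting value and six-year horizon in this sketch are made-up assumptions, not figures from either report:

```python
# Minimal sketch of a CAGR projection. The base value (US$1.0bn) and the
# six-year horizon are illustrative assumptions, not numbers from the reports.

def project(base_value, cagr, years):
    """Project a market size forward at a compound annual growth rate."""
    return base_value * (1 + cagr) ** years

def implied_cagr(start_value, end_value, years):
    """Back out the CAGR implied by a start value, end value and period."""
    return (end_value / start_value) ** (1 / years) - 1

# Hypothetical example: a US$1.0bn market growing at 21.6% a year for six years.
print(round(project(1.0, 0.216, 6), 2))           # ~3.23 (US$bn)
print(round(implied_cagr(1.0, 3.4, 6) * 100, 1))  # ~22.6 (% per year)
```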

Lidar in self-driving

In March, Aeva announced that its Aeries 4D lidar sensors are now supported on the Nvidia Drive autonomous vehicle platform. As well as measuring distance and plotting the position of objects in x, y and z, 4D lidar records the velocity of each point as a fourth dimension.
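To make the "4D" idea concrete, here is a purely illustrative sketch of what such a point might look like in code. The field names, the moving-object filter and the sample values are assumptions for illustration, not Aeva's or Nvidia's actual data format:

```python
# Illustrative sketch only: a toy "4D" lidar return, i.e. position in x, y, z
# plus per-point radial velocity. Names and values are assumptions, not any
# vendor's real API.
import math
from dataclasses import dataclass

@dataclass
class LidarPoint4D:
    x: float         # metres, forward
    y: float         # metres, left
    z: float         # metres, up
    velocity: float  # metres/second along the line of sight (the 4th dimension)

    def range(self) -> float:
        """Straight-line distance from the sensor to the point."""
        return math.sqrt(self.x**2 + self.y**2 + self.z**2)

def moving_points(cloud, threshold=0.5):
    """Keep only points whose measured radial speed exceeds a threshold."""
    return [p for p in cloud if abs(p.velocity) > threshold]

cloud = [
    LidarPoint4D(40.0, 1.5, 0.2, 0.0),    # e.g. a parked car
    LidarPoint4D(25.0, -3.0, 0.1, -1.4),  # e.g. a pedestrian walking towards the sensor
]
print([round(p.range(), 1) for p in cloud])  # distances in metres
print(len(moving_points(cloud)))             # one moving object detected
```

Part of the appeal of measuring velocity per point is that moving objects can be separated from the static background within a single scan, rather than inferred by comparing successive frames.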

Aeva CEO Soroush Salehian on self-driving

Both CEO Soroush Salehian and co-founder Mina Rezk previously worked in Apple’s Special Projects Group. “Bringing Aeva’s next generation 4D lidar to the Nvidia Drive platform is a leap forward for OEMs building the next generation of level 3 and 4 autonomous vehicles,” said Salehian.

“We believe Aeva’s sensors deliver superior capabilities that allow for autonomy in a broader operational design domain (ODD), and our unique features like Ultra Resolution surpass the sensing and perception capabilities of legacy sensors to help accelerate the realization of safe autonomous driving.”

You can always tell when a sector is thriving because dedicated events spring up. The fifth annual Automotive Lidar conference took place in September, while Lidar Magazine has documented the increasing crossover from surveying into car tech.

Its recent interview with Luis Dussan, founder of California-based AEye, is well worth a read. “While at Northrop Grumman and Lockheed Martin, I was designing mission-critical targeting systems for our fighter jets and special ops units that searched for, identified and tracked incoming threats,” he said.

“I realized that a self-driving vehicle faces a similar challenge: it must be able to see, classify, and respond to an object – whether it’s a parked car or a child crossing the street – in real time and before it’s too late.”

Of course, the established players are also pouring money into lidar, and making huge strides. Polaris highlighted Bosch, Continental, Delphi, Denso and Velodyne, among others, with Bosch boasting “the first long-range lidar suitable for the automotive mass market”. It has a detection range of over 200m.

Dr. Mustafa Kamil of Bosch on self-driving

Dr. Mustafa Kamil, Bosch’s project manager for automated driving sensors, explained: “For automated driving to become a reality, the vehicle must perceive its surroundings more effectively than humans can, at all times. Alongside cameras, radar and ultrasonic, a further sensor principle is required in order to achieve this goal.

“For example, when the ambient light changes from bright to dark upon entering a tunnel, it can briefly pose a challenge for the camera. Meanwhile the lidar sensor remains majorly unimpeded by the change in light conditions, and can reliably recognize objects at the entrance to the tunnel in these critical milliseconds.”

He continued: “A former supervisor once told me that a lidar sensor is like a plate of spaghetti: As soon as you try to grab one piece, the others move as well. If you want to make the sensor smaller, this affects properties such as the visual field-of-view or detection range. Optimizing all components in such a way that they do not impede other variables is technically challenging.”

Please note: a version of this article was first published by the Institute of the Motor Industry’s MotorPro magazine.


Author: Neil Kennett

Neil is MD of Featurebank Ltd. He launched Carsofthefuture.co.uk in 2019.