Steven Spieczny of Kognic gives us the lowdown on sensor-fusion annotation for ADAS and self-driving…

Self-driving safety accelerator: Kognic turns sensor-fusion into datasets you can trust

Launched five years ago and already working with household-name OEMs, Gothenburg-based Kognic is very much one to watch in the fast-growing self-driving perception software sector.

Cars of the Future spoke to Vice President of Marketing, Steven Spieczny, to find out about the sensor-fusion annotation platform everyone’s talking about…

Steven Spieczny is Vice President of Marketing at Kognic

“Kognic was founded by two technologists from the machine learning space to address the need for accurate training data for Advanced Driver Assistance Systems (ADAS) and Automated Driving Systems (ADS).

“We have quickly built up a diverse customer base, including global vehicle manufacturers such as Volvo Cars/Zenseact, Tier 1 suppliers like Bosch, Continental and Qualcomm, and some very innovative start-ups such as Kodiak, a leader in autonomous commercial trucking.

“We process data from cars, vans, trucks, drones and robots, and feed it into our cloud-based software platform. Information from cameras, lidar and radar gets pulled into one comprehensive dataset.

“From there, everything that can be sensed is defined and labelled – that’s a road sign, that’s a pedestrian sitting on a bench, that’s a big truck straight ahead!
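To make that concrete, here is a minimal sketch of what one fused, annotated frame might look like as a data structure. The field names and units are our own illustration, not Kognic’s actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Illustrative only: these field names are our invention, not Kognic's schema.

@dataclass
class Annotation:
    label: str                  # e.g. "pedestrian", "truck", "road_sign"
    bbox_3d: Tuple[float, ...]  # (x, y, z, length, width, height, yaw)

@dataclass
class FusedFrame:
    timestamp_ns: int
    camera_image: str                                       # path to the camera frame
    lidar_points: List[Tuple[float, float, float, float]]   # (x, y, z, intensity)
    radar_returns: List[Tuple[float, float, float]]         # (range m, azimuth rad, doppler m/s)
    annotations: List[Annotation] = field(default_factory=list)

frame = FusedFrame(
    timestamp_ns=1_700_000_000_000_000_000,
    camera_image="frames/000042.jpg",
    lidar_points=[(12.3, -0.4, 0.9, 0.57)],
    radar_returns=[(12.5, -0.03, -4.2)],
)
frame.annotations.append(Annotation("truck", (12.4, -0.4, 1.6, 8.0, 2.5, 3.2, 0.0)))
```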

Kognic annotation

“This tagging process is called annotation. You see it quite a bit in healthcare, for example, to automatically flag broken bones on scans. In automotive, the level of complexity is much higher because the data we’re capturing is constantly changing, literally a moving picture.

“We help our customers to manage and curate this data so they can, in turn, use these datasets to power their AI products through model validation and tracking of performance.

“For ADAS, it started with lane marking recognition, where there are a lot of variables. Then you expand the data domain, which gets you these rare occurrences. For instance, light source object detection (LSOD) is a crucial use case where the reflection of a vehicle must be distinguished from an actual vehicle on the road.

Kognic point cloud with pedestrians and vehicles

“Obviously, in the AV industry, there’s been a fair bit of turmoil over the last few years for consumer vehicle applications. This gave rise to a parallel focus on commercial trucking.

“One of the early assessments was that long-haul trucking was the perfect use case – long straight roads, no pedestrians. It turns out this is actually a really hard dynamic to get right – high speeds, sensor range limitations and long stopping distances, especially when fully loaded, contribute to a similarly complex situation.

“Kodiak is one of our trucking customers in the US. They’re doing about 70,000 autonomous miles a month now, all the way from California down through the Southwest into Texas.

“They’re a success story in a sector which, like robotaxis, has seen a lot of ups and downs. Kodiak supplies a top-to-bottom autonomous stack, and we sit behind that, pushing and pulling all this sensor data to enable their machine learning to make better decisions.

Kognic pre-annotation

“The ascension of the data scientist is important here, along with the new depth of technology around self-supervised learning, all these very geeky things.

“Data is the fuel for Machine Learning Operations (MLOps) – this idea of programming with data, rather than traditional coding. Our software enables the fusion of data from various sensors and the way we pre-annotate helps to make the whole process more efficient and cost-effective. That’s our USP.
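As a rough illustration of why pre-annotation cuts cost, the sketch below routes machine-generated label proposals either to a quick human check or to full manual annotation, depending on model confidence. The names, threshold and toy model are invented; this is not Kognic’s pipeline.

```python
from dataclasses import dataclass

@dataclass
class Proposal:
    label: str      # proposed class, e.g. "vehicle"
    score: float    # model confidence, 0..1

def pre_annotate(frames, predict, threshold=0.8):
    """Route machine proposals: confident ones need only a quick human check,
    the rest go to full manual annotation."""
    accepted, review = [], []
    for frame in frames:
        for proposal in predict(frame):
            bucket = accepted if proposal.score >= threshold else review
            bucket.append((frame, proposal))
    return accepted, review

def demo_predict(frame):
    """Toy stand-in for a real detection model."""
    return [Proposal("vehicle", 0.93), Proposal("pedestrian", 0.41)]

accepted, review = pre_annotate(["frame_0"], demo_predict)
print(len(accepted), "to verify,", len(review), "to annotate manually")
```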

Kognic pre-annotation for ADAS and self-driving

“In the future, we believe, as many do, that everything that moves will have some form of autonomy. The whole world of AI is very dynamic, particularly with regards to self-driving cars because of the vast amount of data involved.

“For something like ChatGPT, 80% accuracy might be OK. For something as safety critical as self-driving, it has to be 99.9%. The principle is the same though – more inputs in order for the machine to learn on its own and be smarter about the outputs.

Sensor-fusion for self-driving

“We agree with Wayve that Embodied AI is the great North Star, but we’re not there yet. It’s unrealistic for the market to assume that we’re quickly going to jump to level 5 autonomy. We’re going to have to build up the capability, and that’s a big challenge.

“Concurrent to all this is the transition to the software defined vehicle (SDV). There’s a Mercedes model which is approved for level 3 in certain conditions in Germany. In the UK, Ford’s BlueCruise assisted driving system enables hands-off on some motorways.

“In a nutshell, the better your data, the better your models will be, and that will ultimately result in better user experiences. We call this alignment to expectation, and safety is the biggest issue.

“The self-driving industry needs a way to accurately calibrate and merge sensor data to provide the machine with a very specific picture about what it is seeing at any given time. Kognic has that annotation platform to produce what is needed.”

For further info see Kognic.com

FocalPoint’s automotive Global Navigation Satellite System (GNSS) tech for ADAS and self-driving

European Space Agency supports FocalPoint in pioneering London automotive GNSS project

Supported by the European Space Agency (ESA), FocalPoint has developed a new Global Navigation Satellite System (GNSS) receiver to demonstrate its S-GNSS Auto software.

Last year we covered how the Cambridge-based company’s S-GNSS Auto solution helps to improve positioning accuracy in urban environments, as well as being more resilient to radio frequency (RF) spoofing attacks – with clear benefits for self-driving. Now, thanks to ESA’s Navigation Innovation and Support Programme (NAVISP), it can evidence this in real-time.

Gonzalo Martin de Mercado, NAVISP Element 2 Manager at ESA, said: “We are very proud to have supported FocalPoint in developing their S-GNSS receiver. We are confident this technology will have significant growth potential in Advanced Driver Assistance Systems (ADAS), and even in consumer markets like smartphones and wearables.”

GNSS for self-driving

Cars of the Future spoke to FocalPoint CEO, Scott Pomerantz, and VP of Business Development, Manuel del Castillo, to find out more…

Automotive GNSS expert and CEO of FocalPoint, Scott Pomerantz

SP: “ESA’s NAVISP is a key enabler for innovation in the European positioning, navigation and timing (PNT) landscape. This newly developed receiver will support our commercialisation strategy, underpinning IP development and providing that much-needed proof of impact.

“Being able to rely on the accuracy of GNSS is key for ADAS and automated driving systems. Our Supercorrelation technology has already won multiple awards, including the Business Innovation Award from the Institute of Physics in 2023.

“By determining the arrival angle of satellite signals, permitting only the line-of-sight signals and ignoring reflections, it will help reduce the number of accidents worldwide. All the major automotive manufacturers are interested.”
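FocalPoint has not published its algorithm here, but the core idea of angle-of-arrival gating can be sketched simply: a reflected signal arrives from a direction that disagrees with where the satellite actually is in the sky, as predicted from ephemeris data. A toy version, with an invented tolerance:

```python
import math

def is_line_of_sight(measured_az, measured_el, predicted_az, predicted_el,
                     tolerance_deg=5.0):
    """Gate a received signal on its angle of arrival: a reflection arrives
    from a direction that disagrees with the satellite's known sky position
    (computed from ephemeris data). The tolerance is invented for illustration."""
    d_az = abs((measured_az - predicted_az + 180.0) % 360.0 - 180.0)  # wrap at 360
    d_el = abs(measured_el - predicted_el)
    return math.hypot(d_az, d_el) <= tolerance_deg

# Direct signal: measured direction matches the prediction.
print(is_line_of_sight(41.0, 35.2, 40.0, 34.8))   # True -> keep
# Bounce off a tall building: direction is badly wrong.
print(is_line_of_sight(155.0, 12.0, 40.0, 34.8))  # False -> reject
```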

VP of Business Development at FocalPoint, Manuel del Castillo

MdC: “Applying for ESA NAVISP support and funding proved to be extremely successful for FocalPoint. The project ran for two years. We hit milestones throughout the period, and we have just submitted our final paper and made our closing presentation.

“The main goal was to develop our own software defined receiver to prove the commercial implementation of our S-GNSS software – our patented, chip-set level technology which enhances the positioning performance of a consumer-grade GNSS receiver.

“Our Supercorrelation technology has several functions in GNSS receivers for the automotive autonomy industry. A core role of GNSS is to cross-check the accuracy and reliability of the other sensors – the cameras and radars – which can become very challenging, particularly in urban areas.

“GNSS is the only sensor capable of determining the vehicle’s absolute position anywhere on earth, and it is typically used to discipline inertial sensors, so enhancing the reliability of the GNSS receiver itself is a logical first step to help overcome the typical challenges facing today’s traditional GNSS receivers.

“Needless to say, in the automotive sector we want GNSS to be accurate to the lane at least, but often that doesn’t happen without our technology. On many routes in inner-city London, around Canary Wharf, where lots of very tall buildings create a huge number of reflections, testing without Supercorrelation often computed positions on the pavement or even inside buildings.

“The first baseline of performance for our tests was a dual frequency L1 and L5 commercial receiver, which represents the state-of-the-art. The second baseline was our own GNSS software defined radio (SDR) without Supercorrelation. When applying Supercorrelation, the results were always within the correct lane, clearly demonstrating a level of accuracy that’s essential for advancing ADAS functionality.

FocalPoint testing: true trajectory (green) versus results with S-GNSS (blue) and without (red).

“Another core benefit we have been able to demonstrate is the enhanced urban accuracy irrespective of the quality of the antenna used. We can compute very accurate positions even with lower quality antennas.

“In many new cars the manufacturers try to embed the antennas for aesthetic reasons, which unfortunately can compromise the performance. The enhancement of Supercorrelation facilitates a sensitivity boost in the GNSS receiver.

Resistance to spoofing

“Finally, for advancing ADAS and self-driving vehicles, it is critical that the GPS is resistant to spoofing attacks, so rejecting those is the third major benefit that the addition of Supercorrelation brings to the chip.

“We’re now discussing a phase two with ESA to further develop new Supercorrelation technologies, including its application to Real Time Kinematic (RTK) and Precise Point Positioning (PPP) GNSS correction services. Early research has shown dramatic improvement of these services, but with ESA’s backing, we will bring these to market even sooner.

“Resistance to spoofing attacks could be hugely valuable, as all cars from a given manufacturer could constantly monitor the environment and share this data between them – a crowdsourcing effort to increase the reliability of GNSS for automotive.”

Resistance to spoofing increases the reliability of GNSS.

SP: “You covered our strategic investment from General Motors last year, and we will definitely be doing more testing in London because that’s interesting for many car manufacturers, particularly JLR. We’ve been testing in Seoul, on Teheran-ro in Gangnam, an area renowned for its modern skyscrapers and office buildings – a challenging landscape that is local to Hyundai and Kia.

“We’ve also been testing in Tokyo’s skyscraper landscape, producing results that will be of interest to brands including Suzuki, Subaru, Nissan and Toyota, and in Frankfurt and the Black Forest for brands including BMW, Mercedes and Audi. Of course we’ll keep testing in Michigan and San Francisco, home of the US automotive OEMs. It is all commercially driven, to prove that we can overcome the challenges associated with GNSS for these manufacturers in their own test environments.

“Open sky is relatively easy, but the minute you move into urban areas, motorways with big barriers, or roads with deep foliage, you can often find yourself with reduced GPS accuracy. Similarly, if, say, L1 were spoofed and the network went down, and you didn’t have an L5 acquisition capability, that could be a problem, so we need to be multi-band.

“The whole spoofing and jamming piece has been getting a lot of press and our three buzzwords are mitigation, identification and localization. This will be a theme for every car maker – to establish whether there was an error on their side that requires correcting, or whether it truly was due to a bad actor. In the case of the latter, how did that play out and what are the liabilities? FocalPoint provides the fundamental data.”

Malcolm Wilkinson, Head of Connected and Automated Vehicles (CAVs) and Energy at National Highways, talks future mobility.

National Highways: Making CAVs Work For The UK

Malcolm Wilkinson, Head of Connected and Automated Vehicles (CAVs) and Energy at National Highways, on intelligent infrastructure, freight platooning, hands-free zones and more…

National Highways has completed several major CAV studies recently – what are the most significant findings?

MW: “Our connected corridor project on the A2/M2 was very successful, certainly an important stepping stone. It was a joint project with Kent County Council (KCC), Transport for London (TfL), the Department for Transport (DfT) and others. We demonstrated that cellular and WiFi connectivity can be used to put highway information into vehicles, for example, signage, warnings and green lights. We also demonstrated that data can transfer the other way – to us from vehicles. The project informed our Digital Roads vision and Connected Services roadmap, influencing elements of our Digital for Customer programme.

“The Connected and Autonomous Vehicles: Infrastructure Appraisal Readiness (CAVIAR) project used both simulations and real-world data collection. The number one recommendation was the need for further study to determine how CAVs can best navigate roadworks – that’s the next step. This potentially includes infrastructure-based solutions, such as smart traffic cones, and OEMs developing ‘cautious’ behaviours, to be triggered once a CAV enters a work zone.

“The HelmUK freight platooning trial, that we led, working closely with DfT, was another really valuable exercise. We demonstrated real-world use of platooning on the M5/M6, although the fuel savings were very modest, and didn’t replicate what we were seeing on the test tracks. This was largely due to the geography and the need to break up the platoon at many of the junctions.

“We recognise the challenges with rolling out something like this, even the difficulties in ensuring that vehicles from different logistics companies – from the large suppliers to two-lorry outfits – were travelling at the same time. It is one of those technologies you can see working brilliantly on long outback roads in Australia, but the advantages of putting it into every cab in the UK are far less obvious. It’s important to learn from initiatives like the ENSEMBLE multi-brand truck platooning project in Europe.”

What are the most pressing CAV issues facing National Highways?

MW: “My feeling is that car manufacturers aren’t going to want to develop completely different models for the UK market, so we need to understand our role as a highway authority. What do we need to think about in terms of highway designs, data/information provision and maintenance standards? What do we need to be investigating and researching to make sure that we as the highway authority are playing our part, doing what motor manufacturers and the public expect of us?

“There’s been a lot of talk about the need for the white lines to be readable by automated vehicles. Is that still the case? If so, what does that mean for our maintenance schedules? Can we use the data from vehicles to inform our congestion management? Is there data we can use for asset management purposes?

“It’s understanding what we need to put into the equation and what we’re going to get back out. Particularly over the next few years, with a mixed fleet with different levels of autonomy, that’s going to present new scenarios, new risks. As a highway authority we need to be conscious of those – how they’re going to affect our operations and the safety of the travelling public.”

How did you identify which parts of the network could be hands-free blue zones?

MW: “The Centre for Connected and Autonomous Vehicles (CCAV) and the Vehicle Certification Agency (VCA) led the discussions with Ford regarding authorisation of their technology on public roads. Although we liaise closely with both, we weren’t involved in the detailed discussions with Ford, but to be clear, BlueCruise is an advanced driver assistance system, so the driver has to remain alert and able to take back control.

“Going forward, we need to move closer to organisations developing these systems to understand when they are coming to market and in what numbers. That’s part of our role as a highway authority – to keep our customers safe and to inform our traffic officers, so everyone knows what to do in the event of an incident.

“We’re reaching out to Ford, to see what data they can share with us and to develop a more collaborative relationship. These are very exciting times. We want people to embrace CAV technology and enjoy the benefits.

“We’re some way off self-driving vehicles, but my personal view is that they will probably be available more quickly than many people think.”

Please note: a shorter version of this article was first published in the Institute of the Motor Industry’s MotorPro magazine.

Tom Leggett of Thatcham Research did an epic round of media interviews to explain what BlueCruise is – assisted driving – and isn’t – self-driving.

Not self-driving: Thatcham media marathon to clear up BlueCruise capability confusion

Few were expecting it, but 13 April 2023 will go down in British motoring history. It was the day Ford announced that the Department for Transport (DfT) had approved the use of its BlueCruise assisted driving system on parts of the UK motorway network, making hands-free legal for the first time.

Initially, only a select few gained the ability to go ‘hands off, eyes on’ – drivers of 2023 Ford Mustang Mach-E cars who activate a subscription. Even then, use is restricted to 2,300 miles of pre-mapped motorways in England, Scotland and Wales – the new ‘Blue Zones’. Be in no doubt though, this is momentous.

One foot in the future

“It’s not every day you can say you’ve placed one foot in the future,” said Martin Sander, General Manager at Ford in Europe. “BlueCruise becoming the first hands-free driving system of its kind to receive approval for use in a European country is a significant step forward for our industry.”

UK Transport Minister, Jesse Norman, agreed: “I am delighted that this country is once more at the forefront of innovation. The latest advanced driver assistance systems (ADAS) make driving smoother and easier, but they can also help make roads safer by reducing scope for driver error.”

One of the main themes at the recent Zenzic Connected and Automated Mobility (CAM) Innovators event was the need to do more to establish the UK as a global leader. This embracing of hands-free will be noted around the world.

Ford describes BlueCruise as Level 2 driver assistance, with Lisa Brankin, managing director of Ford in Britain, telling the BBC’s Today programme that, in the case of an accident, the driver will still be responsible as the technology is “not autonomous driving”.

Ford BlueCruise graphic, 2023

BlueCruise combines intelligent adaptive cruise control and lane-centering with an in-cabin camera monitoring eye gaze and head position. If necessary, alerts in the instrument cluster and audible chimes will prompt the driver to return their eyes to the road.

Assisted not self-driving

Unfortunately, and rather predictably, much of the UK media again confused assisted driving and self-driving. The Guardian went with “First hands-free self-driving system approved for British motorways”, The Sun with “Huge car firm is launching the UK’s first-approved self-driving technology”.

Huge credit to Tom Leggett, vehicle technology specialist at Thatcham Research, for doing a marathon round of media interviews to explain what BlueCruise is – assisted driving – and what it isn’t – driverless or self-driving.

“The sudden introduction of this technology did catch the industry a little off-guard, as it was not anticipated that it would reach UK roads for another 18 months or maybe even two years,” he said.

“It has been approved by the Vehicle Certification Agency (VCA) under Article 39 for a new and innovative technology, albeit based on current technology. Basically, the VCA were convinced by evidence from Ford, and their own on-track and on-road testing, that BlueCruise is as safe as, and not fundamentally different to, existing assisted driving technologies.

“The key point to emphasise is that it is assisted driving. What makes it slightly different is that it permits the driver to take their hands off the steering wheel. However, the driver is always responsible for driving. Any input from the driver, such as braking or changing lane, and the system will essentially turn off.

“The hope is that the driver monitoring will make it even safer. It is a camera system which looks at the driver’s direction of gaze to ensure they’re concentrating on the road, not looking out of the window or checking their phone.

“At Thatcham Research, we believe direct driver monitoring will have a significant role in addressing drowsiness and distraction. Currently in the UK, about 25% of all accidents involve some sort of distraction.

“It is vital that drivers using BlueCruise are aware of their responsibilities, and we’ll also be very interested to understand how they feel about using it.”

Please note: a version of this article was first published in the Institute of the Motor Industry’s MotorPro magazine.

Related story: Barrister Alex Glassbrook says approval of hands-free driving is a radical development in UK motoring, and should be accompanied by effective official guidance, training and information to the public and affected organisations.

Motor law expert on hands-free – ‘hands off, eyes on’ – driving becoming legal in the UK.

Quiet regulation of a radical step: Barrister raises concerns about lack of guidance on hands-free driving

Alex Glassbrook, a barrister at Temple Garden Chambers, says that approval of hands-free driving is a radical development in UK motoring, and should be accompanied by effective official guidance, training and information to the public and affected organisations.

Where does the Ford hands-free announcement sit in the shift to self-driving in the UK?

AG: “The first question many of us asked was: Is this the first automated vehicle under the Automated and Electric Vehicles Act (AEVA) 2018? It appears that it’s not. First, because it hasn’t been listed under Section 1 of the Act by the Secretary of State for Transport. Second, because it seems not to fulfil the criterion of a system that does not need to be monitored by the driver, which is part of the legal definition under Section 1 and Section 8.

UK Government list of self-driving vehicles (3 May 2023)

“So, what we’re looking at is a vehicle with advanced driver assistance, but not a driverless vehicle. Equally, what we’re looking at is something that does represent a culture change, because the driver is allowed to remove their hands from the steering wheel. It’s described as a ‘hands off, eyes on’ system, although this hasn’t prevented the media reporting it as a driverless system, which has implications for safety.”

What do you note about the roads which have been designated ‘Blue Zones’?

AG: “A Blue Zone seems to be the marketing name for an area in which this system can work. I’m not an engineer and I’ve not seen the technical details of the permission that has been given by government for this to operate. However, I note the description of the system as being limited to pre-mapped motorways.

“In a regulatory sense, there is broad symmetry between this and the e-scooter trials, in that they both appear to be based upon government permissions on a set of conditions and restricted to certain areas. But there are plenty of dissimilarities too. For example, that motorised scooters and mopeds (as e-scooters are classified) have been with us for over 100 years, whereas computer mapping technology is relatively new.

“What’s new about a ‘hands off, eyes on’ system is the relinquishing of physical control of steering by the human driver, which is a radical step. The technology itself is a progression of cruise control, which was introduced in the 1950s and came to prominence in the 1970s during the fuel crisis in the US. But relinquishing control of steering at motorway speeds is different – a profound step in both regulatory and practical terms.”

What needs to be considered now that hands-free driving is a legal reality in the UK?

AG: “Let’s begin with some historical context. Driver assistance systems have been accumulating for some time, but the legal standard for driving has not really altered since 1971. It was then that Lord Denning, in the case of Nettleship v Weston, set what can be summarised as the standard of the reasonably prudent human driver.

“It’s a largely objective test, and there are some exceptions, but since established it has never been substantially altered. That’s quite surprising because cruise control is now in such common use that you might have expected the standard of care to have been particularised in relation to it. Now we have a system that explicitly allows the driver to let go of the steering wheel while the car is in motion at motorway speeds. In the coming years, a court might face the question of what standard of attention is required of a driver using a ‘hands off’ system.

“For good reasons, namely the need to plan future laws, we have become very focused on fully driverless vehicles. That’s not a complete strategy, as it can mean that we’re looking to the horizon rather than at what is actually in front of us. To go back to the history for a moment, it took quite some time after the introduction of the motor car for The Highway Code to be introduced. The first edition was published in 1931, written guidance which many of us will have looked at.

“The Highway Code isn’t meant to be specialist guidance to industry, it’s meant to be comprehensible guidance to the public. Advanced driver assistance systems (ADAS) have been regulated ‘quietly’, mainly settled by negotiation at international level and then applied as industrial standards by national approval authorities. ‘Hands free’ driving seems too significant a step for that trend to continue without better official education about advanced driver assistance systems, and what they can and cannot be relied upon to do.”

So how does the guidance need to change?

AG: “The number of driver assistance systems has increased over time, and the quantity of such systems alone can be confusing. I saw an article recently on the most irritating modern vehicle features! Meanwhile, The Highway Code is still largely a text document, not very friendly to mobile devices, and there are plenty of situations it simply doesn’t deal with.

“At the moment, the guidance on driver assistance systems, rule 150, says in essence that those systems are only assistive, that you have to be careful while using them and not let your attention be distracted. Is that guidance too general, for a ‘hands off, eyes on’ system which allows the driver to take their hands off the wheel while driving a car on a motorway? Then there’s rule 160 – “Once moving you should… drive or ride with both hands on the wheel or handlebars where possible” – which will presumably need revision.

Hands-free but Highway Code says "both hands on the wheel" (3 May 2023)
Hands-free but Highway Code says “both hands on the wheel” (3 May 2023)

“We need to think practically about the information which people need to use these systems safely, and how best to communicate it. For example, a feature of this and other systems is that their announcement is often accompanied by explanatory YouTube videos. The Secretary of State for Transport has wide powers to provide guidance and road safety training and information, not only by the Highway Code, under sections 38 and 39 of the Road Traffic Act 1988. He is not limited to one means of providing that information.

“There’s also an argument that we focus too much upon the user of the system. Should road users around a vehicle be made aware that it might be being steered by a computer rather than a human?

“Others affected include those who enforce driving laws and who respond to road traffic collisions, particularly the police and National Highways officers. Then other public authorities, such as the judiciary, and businesses, such as driving instructors and insurance companies – those who form part of the wider motoring ecosystem. All of these people need to be aware.

“So, as well as the issue as to its content, I come back to the question of whether the Highway Code, coming up for its 100th birthday, and still a text document, represents the best or only available form of communication.”

Advanced, Automated and Electric Vehicle Law, 2023

The author of 2017’s “The Law of Driverless Cars: An Introduction” and co-author of 2019’s “A Practical Guide to the Law of Driverless Cars”, Alex Glassbrook’s new book “Advanced, Automated and Electric Vehicle Law” is available for pre-order now.

BlueCruise is good, but it’s not self-driving.

Bolt from the blue oval: hands-free Ford is UK 1st but NOT self-driving

Big news! The Department for Transport has approved the use of Ford’s BlueCruise assisted driving system on parts of the UK motorway network. Be in no doubt, this is momentous – the first time UK drivers will legally be able to take their hands off the wheel. But what does it mean for self-driving?

The scope

As we sit here today, only a select few have gained the ability to sometimes go hands-free – drivers of 2023 Ford Mustang Mach-E cars who activate a subscription. They can then use the “hands-off, eyes-on” tech on 2,300 miles of pre-mapped motorways in England, Scotland and Wales – the new ‘Blue Zones’.

UK motorway blue zones – April 2023

The Ford video below explains how it works, with the voiceover saying: “BlueCruise combines with your intelligent adaptive cruise control and lane-centering systems, allowing you to take your hands off the steering wheel while it maintains cruising speed and keeps you in your current lane.

“An infrared camera monitors your eye gaze and head position to ensure that you’re paying due care and attention to the road ahead. If the system finds you’re not looking at the road it will notify you either with an alert message displayed in the instrument cluster or by sounding an audible chime to remind you to return your eyes to the road.

“If you do not react to the warnings the system will cancel, gently pump the brakes to get your attention and slow your vehicle down while maintaining steering control.”

Ford assisted driving video
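That escalation ladder is easy to picture as a tiny state machine. A sketch follows, with invented thresholds – Ford does not publish its timings:

```python
def attention_escalation(seconds_eyes_off_road):
    """Toy escalation ladder modelled on the behaviour Ford describes.
    The thresholds are invented for illustration."""
    if seconds_eyes_off_road < 2:
        return "ok"                 # eyes on the road, no action
    if seconds_eyes_off_road < 5:
        return "visual_alert"       # message in the instrument cluster
    if seconds_eyes_off_road < 8:
        return "audible_chime"
    return "cancel_and_slow"        # disengage, pulse brakes, reduce speed

for t in (1, 3, 6, 10):
    print(t, attention_escalation(t))
```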

The legalities

Last year the government seemed to be planning to class cars equipped with Automated Lane Keeping Systems (ALKS) as self-driving. That hasn’t happened, which is a very welcome shift.

The UK government’s website confirms: “At present, there are no self-driving vehicles listed for use in Great Britain”.

Ford itself describes BlueCruise as Level 2 driver assistance, and Transport Minister Jesse Norman made clear: “The latest advanced driver assistance systems make driving smoother and easier, but they can also help make roads safer.”

Jesse Norman, Minister of State in the Department for Transport

Lisa Brankin, managing director of Ford in Britain and Ireland, told the BBC‘s Today programme on Friday that, in the case of an accident, the driver will still be responsible as the technology is “not autonomous driving”.

The Beeb also noted that other vehicle manufacturers offer similar systems – Tesla has Autopilot and Mercedes has Drive Pilot. Interestingly, the latter announced last year that it will accept legal responsibility for accidents caused by its system.

One of the main themes at the recent Zenzic Connected and Automated Mobility Innovators event was the need to do more to establish the UK as a global leader in CAM. This embracing of hands-free will be noted around the world.

Self-driving headlines

Unfortunately, and rather predictably, much of the UK media has again confused assisted driving and self-driving.

The Guardian went with the headline “First hands-free self-driving system approved for British motorways”.

The Sun went with “HANDS OFF Huge car firm is launching the UK’s first-approved self-driving technology”.

Various outlets, including ITV, even regurgitated the line from the press release that BlueCruise can operate up to 80mph. Not on UK roads presumably as that’s 10mph above the motorway speed limit!

Let’s be clear – this lack of clarity is dangerous. Lives are at stake and road safety should be paramount.

Eyes on the road

This Ford video shows a driver happily gazing out of the window and being warned to “watch the road”.

Ford hands-free video

As the All-Party Parliamentary Group on Connected and Automated Mobility stated in its red lines: “A statutory definition of self-driving must be established to distinguish this technology from assisted driving”.

The final word goes to Tom Leggett, of Thatcham, who emphasised: “For the first time ever drivers will be permitted to take their hands off the wheel. However, their eyes must remain on the road ahead. Crucially, the driver is not permitted to use their mobile, fall asleep or conduct any activity that takes attention away from the road.”

Cutting-edge radar for ADAS and self-driving

Revolutionary self-driving tech: Oxford RF’s solid-state 360-degree sensor

In this Cars of the Future exclusive, we talk solid-state 360-degree radar, ADAS, self-driving and Zenzic success with Dr Kashif Siddiq, founder of Oxford RF Solutions.

How did you come up with the 360-degree radar idea?

KS: “We’ve specialised in radar and sensor technologies for 15 years, creating a lot of tech for other businesses. Then it struck us that there’s a huge gap in the market.

“The problem we see is people taking off-the-shelf sensors and bolting them to vehicles to try and make them autonomous. This probably isn’t the right way of doing it. What we need is sensors designed specifically for autonomous vehicles. That was the idea behind Oxford RF.

“We’ve developed a prototype which solves some of the burning challenges in perception sensors for ADAS and self-driving. It also has drone, space and marine applications. It is the world’s first solid-state 360-degree sensor. Actually, we’ve already taken it to the next level by making it hemispherical, so it can see upwards in a dome as well as all-round.

“There are no moving parts and we have the capability to integrate multiple technologies within the same box, but we’re focusing mainly on radar for now.”

Oxford RF and the APC

Oxford RF has been supported by the Advanced Propulsion Centre (APC) via its Technology Developer Accelerator Programme (TDAP), including collaboration with the Warwick Manufacturing Group (WMG).

Self-driving investment: Oxford RF has been supported by the Advanced Propulsion Centre

It also won funding as one of 2022’s Zenzic CAM Scale-Up winners.

KS: “We applied last year but at that stage we only had an idea rather than a technology to test. Now we have a working prototype and are really leading the thought process when it comes to perception sensing.

“The current situation with advanced driver assistance systems (ADAS) is a mix of cameras, radars and lidars being used to effectively give a full 360-degree picture. There’s an architectural problem with this. First of all, the price.

“Each of those sensors is expensive and there’s so many of them. Then, obviously, all that data needs to be routed to a centralised computer, and that causes latency. Milliseconds are valuable when it comes to saving lives.

“Another issue is redundancy: what’s the backup if one sensor fails? All too often the answer is another sensor, which means yet more cost. And you start to run into the mutual interference problem.”

Self-driving winners: Zenzic CAM Scale-Up Programme (2022 cohort)

Safety-critical benefits

KS: “In a nutshell, we’ve reengineered sensor architecture. It doesn’t need to be radar, it can be any sensor. This allows us to reduce the sensor count.

“Initially we installed them on the car roof, but we’re moving them to the four corners, inside the bumpers. Fewer sensors mean less latency in decision making, so it’s a faster system overall. It’s also inherently more resilient to interference.

“From a safety critical point of view, the four corners approach comes with redundancy built-in, because if one of the 360-degree sensors fails, two others are still looking at the same point.
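A toy coverage check illustrates the claim. Here we assume each corner unit sees a 270-degree outward arc, the car body blocking the rest – a simplification of ours, not Oxford RF’s specification:

```python
# Toy redundancy check for a four-corner layout. The 270-degree field of view
# is an assumed simplification, not Oxford RF's specification.
SENSORS = {"front_left": 315, "front_right": 45,
           "rear_right": 135, "rear_left": 225}   # centre bearing, degrees
FOV = 270

def covers(centre, bearing):
    """True if a sensor centred on `centre` can see the given bearing."""
    return abs((bearing - centre + 180) % 360 - 180) <= FOV / 2

def min_coverage(failed=()):
    """Minimum number of live sensors seeing any bearing around the vehicle."""
    live = [c for name, c in SENSORS.items() if name not in failed]
    return min(sum(covers(c, b) for c in live) for b in range(360))

print(min_coverage())                        # 3: triple coverage when healthy
print(min_coverage(failed=("front_left",)))  # 2: still double coverage
```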

“Delivering visibility in all conditions has to be seen as a deep tech problem and solved on a scientific basis. Are we able to reduce the mortality rate? That’s the real acid test.

“Further to that, from a finance point of view, can we reduce the cost of what I call the minimum viable sensor suite? Does that enable manufacturers to reduce car prices? Or insurers to reduce premiums due to fewer crashes?”

ADAS first, then self-driving

KS: “We’re taking a beachhead approach and the first application will be ADAS. We’ll prove our technology there and then scale to full autonomy. Over the next year, we’re planning to produce about 100 of our solid-state 360-degree radars, to expand trials with our initial customers.

“We’re planning to start commercial production in 2024. From there, we’ll expand into other markets, as many as we practically can. For example, in drone applications, we’ll usually only need one sensor. For spacecraft, we’re looking at two front-facing sensors. For marine vessels, we’re talking about three sensors – one on the bow and two on the stern.

“It will take time to develop our business to a level where we can supply all of these markets, but it’s really good to see that there’s already significant interest.”

For further info, visit the Oxford RF website

Lidar sector thriving as established players and new start-ups push for safe self-driving.

Self-driving gives lidar billion dollar boost

Two new reports have highlighted assisted- and self-driving as key factors predicted to boost the global automotive light detection and ranging (lidar) market.

According to Polaris Market Research, it will reach US$4.14bn by 2026, increasing at a Compound Annual Growth Rate (CAGR) of more than 35%.

The report summary noted: “The solid-state/flash lidar market is expected to grow at a very high pace during the forecast period. Solid state sensor being low-cost, robust, as well as compact in size makes it ideal for potential large-scale production of level 3 and 4 cars in coming years. Further, mechanical sensors and other sensors also capture decent market share.”

A separate report, by Markets And Markets, largely concurs, projecting a CAGR of 21.6% to reach US$3.4bn by 2026. However, it focuses more on unmanned aerial vehicles (UAVs) – drones – and 4D lidar, with the prospect of new entrants making a big impact.
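Neither summary states its base year, but the CAGR arithmetic is easy to sanity-check. Assuming, purely for illustration, that both forecasts run from 2019 to 2026, the implied starting market sizes would be:

```python
def implied_base(final_value_bn, cagr, years):
    """Starting value implied by a final value, a growth rate and a horizon."""
    return final_value_bn / (1 + cagr) ** years

# Base year of 2019 is assumed here; neither report summary states one.
print(f"Polaris: ${implied_base(4.14, 0.35, 7):.2f}bn")              # ~$0.51bn
print(f"Markets And Markets: ${implied_base(3.40, 0.216, 7):.2f}bn")  # ~$0.87bn
```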

Lidar in self-driving

In March, Aeva announced that its Aeries 4D lidar sensors are now supported on the Nvidia Drive autonomous vehicle platform. As well as measuring distance and plotting the position of objects in x, y and z, 4D plots velocity as a fourth dimension.
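In practice, the fourth dimension means each return carries an instantaneous radial velocity, so moving objects can be separated from the static background point by point, rather than by comparing successive frames. A minimal sketch with hypothetical field names:

```python
from dataclasses import dataclass

@dataclass
class LidarPoint4D:
    x: float          # position, metres
    y: float
    z: float
    v_radial: float   # per-point radial velocity, m/s (the "fourth dimension")

def moving_points(points, threshold=0.5):
    """Split movers from static background per point, with no need to
    difference successive frames."""
    return [p for p in points if abs(p.v_radial) > threshold]

cloud = [LidarPoint4D(10.0, 1.2, 0.4, 0.0),    # parked car
         LidarPoint4D(22.5, -3.1, 0.8, -6.7)]  # approaching vehicle
print(moving_points(cloud))                    # only the mover survives
```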

Aeva CEO Soroush Salehian on self-driving

Both CEO Soroush Salehian and co-founder Mina Rezk previously worked on Apple’s Special Projects Group. “Bringing Aeva’s next generation 4D lidar to the Nvidia Drive platform is a leap forward for OEMs building the next generation of level 3 and 4 autonomous vehicles,” said Salehian.

“We believe Aeva’s sensors deliver superior capabilities that allow for autonomy in a broader operational design domain (ODD), and our unique features like Ultra Resolution surpass the sensing and perception capabilities of legacy sensors to help accelerate the realization of safe autonomous driving.”

You can always tell when a sector is thriving because dedicated events spring up. The fifth annual Automotive Lidar conference took place in September, while Lidar Magazine has documented the increasing crossover from surveying into car tech.

Its recent interview with Luis Dussan, founder of California-based AEye, is well worth a read. “While at Northrop Grumman and Lockheed Martin, I was designing mission-critical targeting systems for our fighter jets and special ops units that searched for, identified and tracked incoming threats,” he said.

“I realized that a self-driving vehicle faces a similar challenge: it must be able to see, classify, and respond to an object – whether it’s a parked car or a child crossing the street – in real time and before it’s too late.”

Of course, the established players are also pouring money into lidar, and making huge strides. Polaris highlighted Bosch, Continental, Delphi, Denso and Velodyne, among others, with Bosch boasting “the first long-range lidar suitable for the automotive mass market”. It has a detection range of over 200m.

Dr. Mustafa Kamil of Bosch on self-driving

Dr. Mustafa Kamil, Bosch’s project manager for automated driving sensors, explained: “For automated driving to become a reality, the vehicle must perceive its surroundings more effectively than humans can, at all times. Alongside cameras, radar and ultrasonic, a further sensor principle is required in order to achieve this goal.

“For example, when the ambient light changes from bright to dark upon entering a tunnel, it can briefly pose a challenge for the camera. Meanwhile the lidar sensor remains majorly unimpeded by the change in light conditions, and can reliably recognize objects at the entrance to the tunnel in these critical milliseconds.”

He continued: “A former supervisor once told me that a lidar sensor is like a plate of spaghetti: As soon as you try to grab one piece, the others move as well. If you want to make the sensor smaller, this affects properties such as the visual field-of-view or detection range. Optimizing all components in such a way that they do not impede other variables is technically challenging.”

Please note: a version of this article was first published by the Institute of the Motor Industry’s MotorPro magazine.

EV to ADAS, Tesla has revolutionised the car industry at lightning speed

Tesla: With EV no longer a USP, ADAS is the new battleground

What company springs to mind when you think cutting-edge auto tech? Same here. Tesla. At the recent FT Future of the Car Summit, Elon Musk reminisced about the first Roadster.

“There were no start-ups doing electric cars, and the big car companies had really no electric car programmes,” he said. “Unless we tried, they were not going to be created. It wasn’t from a standpoint of thinking, hey, here’s a super lucrative idea.”

EV all the way: Tesla car line-up

Twenty years later, Tesla is the world’s most valuable car brand, and it’s not even close. In June 2022, Statista valued it at US$75.9 billion, up from a mere 40-odd billion in 2020, and substantially more than second-placed Toyota and third-placed Mercedes-Benz put together.

From drivetrains to marketing, it has shredded the vehicle manufacturing rulebook, and continues to do so. Consider just some of the key developments over the last six months.

Tesla to Twitter

In March, Musk entered into a Twitter spat with US president Joe Biden, after the latter praised Ford for investing $11 billion to build EVs, creating 11,000 jobs, and GM for investing $7 billion, creating 4,000 jobs. He retorted: “Tesla has created over 50,000 US jobs building electric vehicles and is investing more than double GM and Ford combined.”

Research by StockApps confirmed that Tesla spends miles more on R&D than rival carmakers, around $3,000 per vehicle produced, while Electrek highlighted that Tesla spends nothing on advertising, relying “almost entirely on word-of-mouth”.

It wasn’t all plain sailing. A court in Germany ordered Tesla to buy back a Model 3 from a customer who likened the Full Self-Driving (FSD) package to “a drunk first-time driver”. With EV no longer a USP, ADAS is the new battleground.

In May, a judge in California ruled that the driver of a Tesla operating on Autopilot must stand trial for a crash that killed two people. A Model S reportedly ran a red light and hit a Honda Civic at 74 mph. It could mark the first felony prosecution against a driver using a partially automated driving system.

More negative press followed when it emerged that hundreds of Tesla owners had complained about “phantom braking”, with cars stopping suddenly for no apparent reason.

Then, in June, the US National Highway Traffic Safety Administration (NHTSA) published the first of its new monthly reports into crashes involving vehicles with ADAS. Tesla had the most, followed by Honda and Subaru.

Cue the headlines, “Tesla Autopilot and Other Driver-Assist Systems Linked to Hundreds of Crashes” in The New York Times, and “Teslas running Autopilot involved in 273 crashes reported since last year” in The Washington Post.

Importantly, the US Public Interest Research Group clarified that: “Teslas are connected to the internet and automatically report if the car was in Autopilot. Honda asks its drivers if they were using ADAS, so it relies on hard-to-verify personal accounts. Everyone else leaves it up to the police report.”

Tesla went on the offensive, quoting some eye-catching statistics: “In 2021, we recorded 0.22 crashes for every million miles driven in which drivers were using Autopilot technology. For drivers who were not using Autopilot technology, we recorded 0.77 crashes for every million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there are 1.81 automobile crashes for every million miles driven.”
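Taken at face value, the quoted rates make a striking comparison, although they are not strictly like-for-like – Autopilot miles skew towards motorways. The arithmetic:

```python
# Crashes per million miles, as quoted above.
autopilot, tesla_no_autopilot, us_average = 0.22, 0.77, 1.81
print(f"Tesla without vs with Autopilot: {tesla_no_autopilot / autopilot:.1f}x")  # 3.5x
print(f"US average vs Autopilot: {us_average / autopilot:.1f}x")                  # 8.2x
```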

Its Impact Report also noted that, “In 2021, the global fleet of Tesla vehicles, energy storage and solar panels enabled its customers to avoid emitting 8.4 million metric tons of CO2e”, compared to an ICE vehicle with a real-world fuel economy of 24mpg. A timely reminder of the extent of its achievement.

That’s a whirlwind six months, and we haven’t even mentioned the Gigafactory in Texas, the Cybertruck pickup, the plans to launch a steering-wheel-free robotaxi by 2024, June’s new car price hikes, or the off-the-chart used values.

The fact is Tesla has revolutionised the global motor industry at lightning speed, and shows no signs of slowing. 

Please note: a version of this article was first published by the Institute of the Motor Industry’s MotorPro magazine.

New survey on ADAS and self-driving by The Insurance Institute for Highway Safety in America raises questions for UK legislators and motorists

Who wants self-driving anyway? US survey finds 80% love ADAS but not hands-free

A new survey on full and partial self-driving by The Insurance Institute for Highway Safety (IIHS) in America has found significant mistrust of automated lane changing systems, with drivers preferring to stay hands-on and initiate the manoeuvre themselves.

The IIHS – a respected non-profit educational organization dedicated to reducing deaths from motor vehicle crashes – surveyed over 1,000 drivers on questions related to partial automation between September and October 2021, with the results published in June 2022.

The headline finding was that 80% wanted to use “at least some form of lane centering” – a strong endorsement for what we Brits call automated lane keeping systems (ALKS).

Report covers ADAS & ADS

IIHS report on consumer demand for ADAS and self-driving June 2022

36% preferred “hands-on-wheel” lane keeping, compared to 27% for “hands-free”, with 18% having no preference between the two types, 16% not wanting to use any form of lane keeping and 4% being unsure.

If you think that shows an appreciation of advanced driver assistance systems (ADAS) but a mistrust of conditionally automated driving systems (ADS), the next finding appears to confirm that.

Asked about lane changing assistance (as opposed to just lane keeping), 73% said they would use some form of auto lane change. However, 45% said they’d prefer to use driver-initiated auto lane change compared to only 14% for vehicle-initiated auto lane change. 23% said they wouldn’t use either type, 13% had no preference and 5% were unsure.

What’s more, on self-driving technology, 35% said they found it “extremely appealing” while 23% said it was “not at all appealing”.

Alexandra Mueller, the IIHS survey’s primary designer, commented: “Automakers often assume that drivers want as much technology as they can get in their vehicles. But few studies have examined actual consumer opinions about partial driving automation.

“It may come as a surprise to some people, but it appears that partially automated features that require the driver’s hands to be on the wheel are actually closer to one-size-fits-all than hands-free designs.”

Another eye-catching finding was the high number of people “at least somewhat comfortable” with in-cabin driver monitoring to support such systems: 70% for steering wheel sensors, 59% for camera monitoring of driver hands and 57% for camera monitoring of driver gaze.

“The drivers who were the most comfortable with all types of driver monitoring tended to say they would feel safer knowing that the vehicle was monitoring them to ensure they were using the feature properly,” said Mueller.

“That suggests that communicating the safety rationale for monitoring may help to ease consumers’ concerns about privacy or other objections.”

Self-driving questions

For us, the study is particularly interesting in terms of the UK government’s plan to list vehicles approved under the Automated Lane Keeping System (ALKS) Regulation as self-driving.

For the drivers of certain new high-tech cars, this could be the first time that any hands-free driving becomes legal on UK roads. The current suggestion is for this to be restricted to slow motorway traffic (max 37mph), initially at least.

Further still, the acceptance of driver monitoring seems relevant to point four of the All-Party Parliamentary Group (APPG) on Connected and Automated Mobility’s seven expert-recommended red lines: “Establish minimum standards for data sharing and handling to ensure transparency and effective governance”.

The full IIHS report is available here.