Vehicle-to-everything (V2X) 4G and 5G connectivity via small cells can be a lifesaver.

Carsofthefuture.co.uk editor to host Automotive & Transportation session at Small Cells World Summit 2021

Carsofthefuture.co.uk has signed a media partnership agreement with The Small Cell Forum (SCF) for its three-day online Small Cells World Summit, The Future of Mobile Networks, on 11-13 May 2021.

Small Cells World Summit 2021 registration

As part of the deal, Carsofthefuture.co.uk editor Neil Kennett will moderate the Automotive & Transportation session from 11am on Wednesday 12 May, with high-profile speakers including: Peter Stoker, Chief Engineer for Connected and Autonomous Vehicles at Millbrook Proving Ground; Dr Maxime Flament, Chief Technology Officer at the 5G Automotive Association, one of the world’s leading authorities on Intelligent Transport Systems (ITS); Bill McKinley, Connected Car Business Lead at Keysight Technologies; and Mark Cracknell, Head of Connected and Automated Mobility at Zenzic, responsible for accelerating the self-driving revolution in the UK.

Neil Kennett said: “We are delighted to partner with The Small Cell Forum for this exciting virtual event, which brings together mobile operators, vendors and regulators from around the globe. The Automotive & Transportation session will focus on connected and autonomous vehicle (CAV) opportunities, particularly vehicle-to-vehicle (V2V) and vehicle-to-everything (V2X) communications, in-vehicle payments, and the rival ITS-G5 and C-V2X 5G technologies.

“Small cells deliver high-quality, secure 4G and 5G coverage, so there are clearly a multitude of new use cases in the connected car world and the wider mobility ecosystem. Aside from supporting self-driving, they can facilitate everything from in-car infotainment and shopping, to fixing technical problems before they occur and pre-empting likely crash scenarios. It is no exaggeration to say they could be a lifesaver.”

Carsofthefuture.co.uk readers can benefit from a 40% discount on Small Cells World Summit 2021 tickets using the code SCWS2021. See www.smallcells.world/

“We will be running autonomous buses this year. That’s an incredible milestone.”

The future is here, 2021: CAVForth buses will put UK on the driverless map

Our Zenzic CAM Creator series continues with Jim Hutchinson, CEO of Fusion Processing.

As a partner in the ambitious CAVForth project, predicted by the Scottish Mail on Sunday to make Edinburgh “the most ‘driverless’ city in the world”, Fusion Processing is delivering on its promise to design and build world-leading systems for the automation of vehicles. Here, CEO Jim Hutchinson talks ADAS, cyclist detection and autonomous vehicle safety, explaining how CAVForth is set to make a major mark on the global self-driving map.

Jim Hutchinson, CEO of Fusion Processing

Please can you outline Fusion Processing’s work on connected and automated mobility?

JH: “We’ve been going since 2012. We set up to develop automated vehicle systems with the ultimate goal of being fully autonomous – able to do anything that a human-driven vehicle can. We knew from the start there were a lot of steps along the way and, for essentially a commercial company, we needed to have products along those steps rather than trying for a ‘Level 5 or nothing’ approach.

“We developed the CAVstar platform as a scalable solution – a drive system we could put into pretty much any vehicle, from small cars up to HGVs. Along the way we’ve been involved in some great schemes like the Venturer project, one of the original three UK AV projects.

“Then, more or less in parallel with that, we were involved in the Gateway project in London. We provided the autonomous drive system for the pods that drove along the Thames path. That was a big trial with random members of the public – some who came along specifically to experience it, and many others who just wanted to get from the O2 to the other end of the route. The pods encountered various other people on the route – the vehicles had to be mindful of dog walkers and cyclists. The feedback was by and large very positive, and it was a good proof point for us of how our system can be used off-highway.

“It also led to other things, notably our partnerships with Stagecoach and Alexander Dennis. First off, we were exploring using autonomy in bus depots. Every night a lot of operations have to happen involving the movement of vehicles – they have to be fuelled, washed, made ready for the morning – so we put together a system which could automate that. The concept was based on a fleet manager directing all this from a control tower once the bus arrives back at the depot.

“The system proved very successful, demonstrating operating efficiency and improved safety for those working in the depot, so that led to CAVForth – an autonomous bus service. Again, we’re working with Stagecoach and Alexander Dennis, joined by Transport Scotland, Bristol Robotics Laboratory and Napier University.

Fusion Processing’s CAVstar

“The intent is to put into service a number of Level 4 autonomous buses between the Fife Park & Ride and the Hermiston Gait Interchange. It’s a commuter route so we’re expecting a large number of daily commuters who want to travel to the Hermiston Gait Interchange, where they can transfer on to trams for the city centre, the airport or the rail network. We expect tourists will want to use it too to reach the Forth Road Bridge, a UNESCO heritage site.

“It’s a useful service, running every day of the week, and the hope is that it will go from a pilot service to a full service. It’s being registered as a new route, providing a service that wasn’t previously there, and Stagecoach anticipate around 10,000 journeys a week.

“The route includes a mix of road environments – motorway, bus lanes, roundabouts, signalled interchanges – so from our point of view it makes for a great demonstration of capability. There’s the technology side, which Fusion is focussed on, but there’s also key research around public acceptance and uptake. That’s really exciting too.

“The launch date isn’t set in stone due to Covid uncertainties, and the point at which they start taking passengers is still to be determined, but we will be running autonomous buses this year. That’s an incredible milestone, absolutely huge. It will be a very significant achievement to demonstrate a Level 4 capability on that class of vehicle – a big thing for the UK which will be noticed around the world.

“There are one or two other groups working on similar projects, but I haven’t seen anything with this level of ambition, this level of complexity, or length of route. It’ll obviously be fantastic for us and our CAVForth partners, but also for the UK autonomous vehicle industry as a whole. It will really put us on the worldwide map.”

Please can you outline Fusion Processing’s work on driver assistance?

JH: “CycleEye is an important product for us. We identified a need for collision avoidance technology. There are lots of collisions with cyclists and quite often they occur because the bus driver doesn’t know the cyclist is there. CycleEye is like a subsystem of CAVstar in a lot of ways – one of those steps to get some proof points on bits of the technology. It recognises and classifies different types of vehicle, and the driver gets an alert when there’s a cyclist in the danger zone. It is currently being used in a few cities around the UK, including on the Bristol Metrobus. It’s a good system. Whenever it has been evaluated against other cyclist detection systems it has always come out on top.

“We’re particularly excited about the next incarnation of CycleEye, evolving it to become a camera mirror system. It’s legal now to use cameras instead of mirrors, so we can provide that functionality too – monitors in the driver’s cab instead of mirrors. That has several benefits. Mirrors, on buses particularly, can be a bit of a liability – they quite often get knocked and sometimes they knock people. They stick out and head strikes are unfortunately quite common. They also get smashed, putting the bus out of service, which is an inconvenience and an operational cost. We think that being able to offer this camera mirror with CycleEye functionality is going to prove attractive to a lot of operators.”
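To make the alert logic concrete, here is a minimal sketch of the kind of check Hutchinson describes – a detection classified as a cyclist falling inside a nearside “danger zone” triggers a driver warning. The zone dimensions, detection format and function names are illustrative assumptions, not Fusion Processing’s implementation.

```python
# Minimal sketch of the alert behaviour described above. Zone dimensions and
# the detection format are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # e.g. "cyclist", "car", "pedestrian"
    x_m: float        # longitudinal offset from the front axle (m, positive ahead)
    y_m: float        # lateral offset (m, positive to the nearside)

def cyclist_alert(detections, zone_length_m=8.0, zone_width_m=2.5):
    """True if any detected cyclist is inside the nearside danger zone."""
    return any(
        d.label == "cyclist"
        and -zone_length_m <= d.x_m <= 2.0
        and 0.0 <= d.y_m <= zone_width_m
        for d in detections
    )

frame = [Detection("car", 15.0, -3.0), Detection("cyclist", -2.5, 1.2)]
print(cyclist_alert(frame))  # -> True: a cyclist is alongside the nearside of the bus
```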

Van with Fusion Processing technology

Over what timescale do you expect Level 4 and 5 autonomy to be achieved in the UK and which sectors will be early adopters?

JH: “With CAVForth we’ll be running Level 4 autonomous vehicles – where you’ve got a restricted operational design domain (ODD) – in the UK this year. Restricting these vehicles to particular routes or environments lends itself very well to public service, where the vehicles are maintained by an operator. That’s very achievable right now. As well as passenger service vehicles, other service vehicle fleets are easy wins, along with off-highway applications like industrial sites. Then you’ve got delivery vehicles.

“When it comes to true Level 5 – go anywhere, do anything vehicles – repair and maintenance is an issue. We know that with privately owned cars, some people maintain them exactly as they should, and other people don’t. There are other complications too – things that people perhaps don’t do that often but like their vehicles to be able to do, like parking in a farmer’s field at a festival – that’s a little bit further out still.

“If you just roll back slightly from true Level 5, if people want a city car or a comfortable car for a long motorway journey, nothing off-road, there’s a case for vehicles which have an autonomous mode. That certainly appeals to me.”

Can you address the concerns about ADAS, particularly handover of control, driver concentration levels and driver deskilling?

JH: “I’m not a big fan of Level 3. If you haven’t been driving for an hour and are suddenly asked to take the wheel because the car has encountered something it can’t handle, it’s just unrealistic. Whereas a Level 4 system, which can put itself into a safe state when it reaches the limits of its ODD – perhaps ready to be restarted in a manual mode when the driver wants to take control – that’s much more practical.

“If there are circumstances when the driver needs to take over then clearly the driver needs to be of a standard that they can drive safely. Once you have widespread adoption of autonomous systems, and people are not driving routinely, there is a risk of driver deskilling. If that were the case you’d really need to look at greater regulation of drivers.

“That said, you can sometimes envisage problems that don’t really transpire. We’ve had cruise control and adaptive cruise control for a while now and I don’t think they’ve had the effect of particularly deskilling drivers. So, with Lane Keep, maybe it’s not such a big deal. Once you get to the point where cars are properly self-driving, there is a danger. If you haven’t got anything to do, your mind will wander, that’s human nature, so it is a concern.”

For further info, visit fusionproc.com

AI and IoT expert Karim Jaser presents a resolute defence of the trolley problem.

Ethics in self-driving: The trolley problem strikes back

The trolley problem – the question of who to save, or kill, in no-win crash situations – continues to divide opinion like no other subject in the driverless world.

I must admit to flip-flopping on it myself. From being quite taken with it in 2018’s Autonomous now: the shift to self-driving, through 2019’s The driverless dilemma: touchstone or red herring?, to last month’s Self-driving experts across the world agree: the trolley problem is a nonsense.

That, I thought, was the conclusion reached, the end of the matter. Far from it! In response to the latter article, Karim Jaser, Senior Product Manager specialising in artificial intelligence (AI) and internet of things (IoT) for blue chip companies, posted a resolute defence of the much-maligned thought experiment on our LinkedIn page.

“I do agree that humans don’t go through the trolley problem evaluation in a split-second decision, however I also think not all experts agree it is a nonsense,” he said. “It is for society as a whole to discuss these ethical problems. From the point of view of self-driving technology, this can be solved in many ways, with probability theory and estimations on minimal loss, but it is not up to developers or self-driving experts alone to decide how to tackle the point. It needs the involvement of regulators, governments and the industry.”

Karim Jaser, Senior Product Manager specialising in AI and IoT.

Well, with our mission to encourage debate about all aspects of autonomous vehicles, how could we resist? We asked Karim if he’d be up for an interview. He kindly agreed and here we present his thoughtful and cohesive opinion.

KJ: “I was always fascinated by robot intelligence, so at university I studied telecommunication engineering. There were lots of exams on probability theory, system control, software engineering. I was also involved in coding in my spare time, and later did it as a job.

“Self-driving is a control problem first and foremost. There are elements of robotics, including perception, state estimation and trajectory planning, but also software, hardware and AI working together.

“The interest grew stronger when I started studying machine learning and AI about four years ago. When I was at university in the 90s, AI was not really a popular subject. It was a topic I picked up later in my career. As a senior product manager at a high-technology company, I see AI everywhere now – it’s an essential part of the skills necessary to perform and innovate, from biometric scans and image recognition to automated travel.

“AI has a lot of potential to have a beneficial impact on society – fewer accidents, better mobility, less pollution, more autonomy for people with disabilities – but it doesn’t come without challenges, for example, cyber threats, and also ethical and regulatory issues, which is why I got involved in the trolley problem debate.”

“It’s not straightforward. If we take a step back, we need to understand how self-driving cars take decisions. They’re using supervised learning, reinforcement learning, convolutional neural networks (CNN) and recurrent neural networks (RNN), and deep learning for computer vision and prediction. Specifically, reinforcement and inverse reinforcement learning are very tightly linked to the way driverless vehicles behave through means of policies.

“Policies are related to the distribution of probabilities, but the trolley problem is an ethical choice, so I understand why a lot of people in the industry dismiss it. It’s not the way autonomous vehicles take decisions, going through philosophical considerations in a split second, so it might seem irrelevant, right? Like the Turing Test and Asimov’s Robot Rules, the trolley problem can be perceived as a distraction from more practical considerations.

“It can be distracting for two reasons: first, these considerations are corner cases – there are other priorities, more likely scenarios still to be addressed; second, autonomous vehicles will not be given ethical guidelines to link with probabilities.

“With regards to the first objection, as Patrick Lin (director of the Ethics and Emerging Sciences Group at California Polytechnic State University) has pointed out, it shouldn’t matter if these scenarios are impossible, because the job of these thought experiments is to force us to think more carefully about ethical priorities, not to simulate the realities.

“The second objection is related to self-driving cars taking decisions through distribution of probabilities. The actions of these vehicles are linked not to hard coding but to statistical contextual information, and that makes each scenario difficult to interpret. You can potentially have millions of mini trolley problems in different contexts.
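To picture what Jaser means by decisions driven by probabilities and “estimations on minimal loss”, here is a deliberately simple sketch: a planner choosing between candidate manoeuvres by minimum expected loss. The actions, probabilities and costs are invented for illustration; this is not how any particular developer’s stack is built.

```python
# Minimal sketch: pick the manoeuvre with the lowest probability-weighted cost.
# All numbers below are made up for the example.

candidate_actions = {
    # action: list of (probability, cost) pairs over possible outcomes
    "brake_hard":  [(0.90, 0.0), (0.10, 5.0)],   # likely safe stop, small risk of a rear impact
    "swerve_left": [(0.70, 0.0), (0.30, 20.0)],  # avoids obstacle, larger risk of a secondary collision
    "maintain":    [(0.40, 0.0), (0.60, 50.0)],  # high chance of the primary collision
}

def expected_loss(outcomes):
    """Probability-weighted cost of an action."""
    return sum(p * cost for p, cost in outcomes)

best_action = min(candidate_actions, key=lambda a: expected_loss(candidate_actions[a]))
print(best_action)  # -> "brake_hard" for these made-up numbers
```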

“The trolley problem is a reminder that corner cases and autonomous vehicle behaviours are not a technical irrelevance. This is an issue that belongs to society and should be discussed in the same way as other AI pitfalls like privacy and bias.

“Actually, the trolley problem is more related to the third pitfall of AI, replicability. When trying to understand why and how an autonomous vehicle takes a decision, it is important to note that most autonomous vehicle developers are taking ethical considerations into account.

“In 2017 in America, Apple commended the National Highway Traffic Safety Administration (NHTSA) for including ethical considerations in its Federal Automated Vehicles Policy. It even highlighted three particular areas: 1) the implications of algorithmic decisions for the safety, mobility and legality of automated vehicles and their occupants; 2) the challenge of ensuring privacy and security in the design of automated vehicles; and 3) the impact of automated vehicles on the public good, including their consequences for employment and public spaces.

“The automotive industry has also approached the issue of accidents caused by autonomous vehicles in relation to ethics. For example, Volvo stated in 2015 that it would take responsibility for all Volvo self-driving car accidents. This is an ethical decision, because it did so without regulation forcing it to do so.

“We will see what happens. If there are no ethical decisions by the industry, the regulators will step in. On a fun note, looking to the past, horses were not considered responsible for their actions; the rider was. Whereas in this case, responsibility for the autonomous vehicle will lie not with the owner but with the carmaker.

“So, to conclude, automakers and AV developers are taking ethical and regulatory matters into account, which underlines the importance of these discussions. We cannot just dismiss the trolley problem because it’s not the way an autonomous vehicle decides, or because it distracts from technical development.

“The way to deal with this is to discuss the implications in the right context, being aware of how autonomous vehicles are developed without scaring the public with sensationalist articles. The trolley problem might be perceived as a Terminator-style situation, and that’s where it gets on the nerves of a lot of people that are developing and testing AI. It’s not black and white, it’s a grey area, and that takes us to the path of discussions.

“The trolley problem forces us to consider ethics in vehicle development and confront the fact that ethical principles differ around the world, as documented by the Massachusetts Institute of Technology (MIT) simulation.

“Are we at the point where discussing the trolley problem should be a priority? I believe that would be beneficial to the success of the self-driving industry, guiding us in the thinking process of building the right mix of safeguards and transparency.”

CGA’s simulations train autonomous vehicles to deal with environments specific to the UK.

Self-driving and smart cities: stop wishcasting and get real with predictive simulation

Our Zenzic CAM Creator series continues with Liverpool-based Jon Wetherall, Managing Director of CGA Simulation, and Max Zadow, Director of Future Coders.

By applying gaming knowledge to real-world mobility questions, CGA has created engaging simulations to study autonomous driving and smart city solutions.

JW: “My background is gaming. I used to work for the company that did Wipeout and F1 games. We made a racing game called Space Ribbon and one day, about five years ago, we got a call from The Department for Transport (DfT). They were doing a research project on virtual reality (VR) in the testing and training of drivers, specifically hazard awareness.

“We turned it into a game and it worked – people said their attitudes changed as a result of our simulations. The hardest scenario came early in the game – a parked lorry with a big blind spot – and a lot of people crashed. VR feels so visceral, the experience can be quite vivid and shocking. Of course, smarter cars will hopefully fix these types of situations.”

CGA Simulation junction and forecourt

To pursue this goal, CGA received a grant from Innovate UK to create an artificial learning environment for autonomous driving (ALEAD).

JW: “The aim was to make these cars safer and we stayed true to our computer game history. We didn’t have the resources to lidar scan the whole area, so we did our own thing using mapping data. We made a digital twin of Conwy in north Wales and unlike other simulations we kept all the ‘noise’ in – things like rain. This was important because it is now well-understood that noise is a big challenge for autonomous vehicles (AVs).

“Modern autonomous driving stacks have 20 different subsystems and we generally focus on only one or two, to do with perception. There’s been massive progress in this area over recent years, to the extent that artificial intelligence (AI) can identify an individual by their gait. What’s more, you can now do this on a computer you can put in a car – this is one of the cornerstones of driverless.

“It’s not the first time people have been excited about AI. In the 50s they were saying it was only a few years away. It has taken much longer than people thought, but major problems have now been solved.

“We are lucky to have one of the world’s leading experts in radar on our doorstep, Professor Jason Ralph of The University of Liverpool, and he helped us develop the simulation. You have to feed the car’s brain, a computer, all the information it will need – from sensors, cameras, GNSS – and you can do all that in the software.”
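The idea of keeping the “noise” in can be sketched very simply: take clean ranges from the digital twin and degrade them with rain-dependent dropouts and jitter before the perception system sees them. The noise model and parameters below are our own assumptions, not CGA’s code.

```python
# Minimal sketch of weather "noise" injection for simulated lidar ranges.
# The dropout and jitter parameters are invented for illustration.
import random

def apply_rain_noise(ranges_m, rain_rate_mm_h, drop_prob_per_mm=0.005, jitter_m=0.05):
    """Return lidar ranges with rain-dependent dropouts and range jitter."""
    noisy = []
    for r in ranges_m:
        if random.random() < rain_rate_mm_h * drop_prob_per_mm:
            noisy.append(None)  # return lost in heavy rain
        else:
            noisy.append(r + random.gauss(0.0, jitter_m * (1 + rain_rate_mm_h / 10)))
    return noisy

clean_scan = [12.0, 11.8, 35.2, 4.6]  # metres, from the digital twin
print(apply_rain_noise(clean_scan, rain_rate_mm_h=8.0))
```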

MZ: “In particular, The University of Liverpool were interested in how weather affects things, right down to different types of rain and mist. In California, if an AV encounters conditions it can’t handle, like heavy rain, it pulls to the side of the road. That’s ok for San Francisco but not for Manchester!

“A few years ago, everyone seemed to be using the example of an AV encountering a kangaroo. How would it cope? The point is you can use our simulations to train cars, to create algorithm antibodies for once-in-a-lifetime events and regular things in different environments. That remains an essential part of what’s needed to make AVs a reality.

“We picked Conwy partly because it has very different patterns of land use to America. An early use case for AVs is predicted to be taxis, but in the UK these are most frequently used by people who don’t own their own car, and they often live in high density housing or narrow streets. The operational design domains (ODDs) are going to have to deal with environments specific to this country – steep hills, roads which twist and turn, and changeable weather.”

Mobility Mapper

Wetherall and Zadow’s latest collaboration is Mobility Mapper, a project to create greener and more intelligently designed transport hubs. The technology underpinning Mobility Mapper has been used previously by the team to model Covid-19 spread and autonomous vehicle technology, and by the Liverpool 5G Create project (funded by DCMS as part of its 5G Testbeds and Trials Programme).

JW: “E-hubs are basically an extension of what used to be called transport hubs – train or bus stations. They’ll provide charging facilities and access to different modes of transport, for example, you can drop off an e-scooter and hop into a shared autonomous car.

“Here in Liverpool, there was a big trial of e-scooters, big in international terms not just UK. The worry was that a lot of them would end up in the canal, but that didn’t happen. The trial was incredibly successful. It’s all about linking that movement and nudging people away from car ownership.”

MZ: “We were already thinking about how Jon’s technology could be used for mobility as a service (MaaS) when we attended a virtual future transport conference in LA with the Centre for Connected and Autonomous Vehicles (CCAV).

“That was an influence, as was an Intelligent Transportation Systems (ITS) trade show in Copenhagen, where we saw an autonomous tram system designed to take bicycles. It was a small step from there to imagining autonomous trams carrying autonomous delivery pods.

“This is classic smart city stuff but you need to know how these e-hubs are likely to be used, with no track record, nothing to go on. We need simulated environments to make best guesses in. That’s Mobility Mapper.”

JW: “It is early days, still in the development phase, but the authorities in both Manchester and Liverpool have agreed there’s a need for such a predictive simulation tool.”

As we wrap up a thoroughly enjoyable interview, Max dons his Director of Digital Creativity in Disability hat: “Autonomous delivery bots are basically electric wheelchairs without a person, so there’s clearly a potential benefit, but there needs to be less wishcasting and more real work on how accessibility will be affected.”

For further info, visit CGAsimulation.com

Mitchell Gingrich on the Elaine Herzberg tragedy and why the future will be autonomous.

Self-driving experts across the world agree: the trolley problem is a nonsense

Thanks to LinkedIn, self-driving experts from the UK and New Zealand have united to decry the trolley problem in relation to driverless cars.

Mitchell Gingrich, President of Autonomous Consulting in Christchurch, New Zealand, responded to our interview with Professor John McDermid, Director of the Assuring Autonomy International Programme at the University of York, saying: “Spot on about the trolley problem.”

Professor McDermid had asserted that: “The trolley problem is a nonsense… all these elaborate versions require self-driving vehicles to make distinctions that you or I could not.”

The trolley problem is a thought experiment which runs like this: imagine there’s a runaway trolley and, ahead, five people are tied to the track. You are standing some distance off, next to a lever. If you pull it, the trolley will switch to a track to which only one person is tied. What do you do?

Or, as Professor McDermid puts it: “Who do you save, a child or an older person? The child because they can be expected to live longer and benefit more. However, this is based on false assumptions. I don’t believe in the split second of a crash you go into that sort of thought process – you focus on controlling the vehicle and in most cases the best option is to (try to) stop.” 

I explained to Gingrich that my own opinion on the trolley problem has changed dramatically. When I wrote “Autonomous now: the shift to self-driving” in 2018 I was quite taken with it. In 2019, I wrote “The driverless dilemma: touchstone or red herring?”. Now, I am much more with Professor McDermid.

Gingrich opined that the March 2018 fatal accident involving an Uber Advanced Technology Group (Uber ATG) self-driving vehicle can aid in evaluating the trolley problem. The National Transportation Safety Board (NTSB) in the US recently completed an 18-month-long investigation and concluded there were 20 contributing factors. Some of those concerned the software misclassifying a pedestrian. A significant contributing factor was the safety driver’s inattentiveness.

The trolley problem assumes that a person or system is not only aware of the task of driving but also of the present and future merits of the lives of road users, he says. However, experience demonstrates that, sadly and all too frequently, road users pay the price for a lack of vigilance.

It turns out that Gingrich, a lawyer by trade, has been on quite a journey with autonomous vehicles himself. From working for Uber ATG in Phoenix, seeing first-hand the fallout from the Elaine Herzberg tragedy, to relocating to New Zealand and setting up Autonomous Consulting to push the case for driverless transport.

“I’m convinced that the future will be autonomous,” he says. “Whether it’s on public roads, in the air or on the seas, we will be utilising autonomous technology to transport our people and goods. That’s what autonomy is promising, but we’re in an interim period.

“New cars have advanced driver assistance systems (ADAS) like lane keep assist and automatic emergency braking. Some of us have been using cruise control for a long time, now it is adaptive – the car will keep its distance. These are autonomous features but not autonomy and we need to educate the public about the difference.

“Autonomy is about safety, resources and the environment. These ADAS systems expect me to pay attention to the road and the robot, and that’s not a recipe for safety. 93-94% of accidents are caused by human error, usually distraction – we think we’re paying attention, but we aren’t. There are repair and maintenance issues too, for example, around the correct calibration of sensors.

“In terms of resources, my personal car is a depreciating asset that isn’t used 90% of the time. Autonomous vehicles will also have a tremendous impact on town planning. An architect in the US imagined Manhattan pedestrianised and it freed up 60% of space.

“My freedom is not challenged by not having a personal vehicle. I’d have more money in my pocket and could use my smartphone to access different vehicles for different purposes.”

For further info, check out the white paper “The Driverless Revolution: What Next? The Future of Autonomous Vehicles in New Zealand”, by Mitchell Gingrich and Steven Moe, and this related podcast.

Carsofthefuture.co.uk is media partner for event boasting most senior collection of technology, AV, EV and ADAS leaders ever seen.

Carsofthefuture.co.uk is media partner for Car of the Future 2021

Carsofthefuture.co.uk has signed a media partnership agreement with Reuters Events for the two-day Car of the Future 2021 online event in June.

Intended to drive vehicle change to create a safer and more sustainable world, the event boasts the most senior collection of technology, autonomous vehicles (AV), electric vehicle (EV) and advanced driver-assistance system (ADAS) leaders ever seen.

High profile speakers include: Michelle Avary, Head of Automotive and Autonomous Mobility at The World Economic Forum; Carla Gohin, Research & Innovation Senior Vice President at Stellantis; Henrik Green, Chief Technology Officer at Volvo Cars; Sajjad Khan, Member of the Board of Management at Mercedes-Benz AG; José Muñoz, Global Chief Operating Officer at Hyundai Motor Company; and Dr Ken Washington, Chief Technology Officer at Ford Motor Company.

Carsofthefuture.co.uk founder, Neil Kennett, said: “We’re delighted to be a media partner for this exciting Reuters event which fits perfectly with our mission to chart the development of, and encourage sensible debate about, driverless cars in the UK. Full self-driving is a way off yet but as ever more advanced driver assistance systems become available, notably Automated Lane Keeping (ALK), it is vital that the public understands where we are with the technology and what it can and can’t do.”

Car of the Future 2021 will take place on 14-15 June. See reutersevents.com 

Ahead of this, Reuters Events will host a free webinar, Connectivity: Smarter and Safer Vehicles, on 24 March. Confirmed speakers include: Michelle Avary; Szabi Patay, Head of Automotive at Commsignia; Prashant Tiwari, Director of Intelligent Connected Systems at Toyota North America; and Frank Weith, Director of Connected and Mobility Services at Volkswagen Group America. Register here.

#ReutersEventsAutomotive

The key to the future of self-driving is education, education, education, says Millbrook’s Stoker.

On track and in virtual space, Millbrook tests cars of the future

Our Zenzic CAM Creator series continues with Peter Stoker, Chief Engineer for Connected and Autonomous Vehicles at Millbrook.

Part of CAM Testbed UK, Millbrook Proving Ground in Bedford boasts 700 acres of private roads on which to develop and test connected and autonomous vehicle (CAV) technologies. As Chief Engineer, Peter Stoker is right at the forefront of self-driving in the UK.

Peter Stoker, Chief Engineer for Connected and Autonomous Vehicles at Millbrook

Please can you outline Millbrook’s work on connected and automated mobility?

“My primary role is to bring focus to two testbeds, our CAV testbed and our 5G testbed. We are not a purpose-built CAV testbed – we have safety, propulsion and conventional vehicle test facilities too – so CAV is something we’ve blended into the existing business.

“For the CAV testbed, we partnered with the UK Atomic Energy Authority (UKAEA), particularly the Remote Applications in Challenging Environments (RACE) division, to provide a controlled urban environment. We have three open source StreetDrone vehicles and miles of track with targets for very precise measurements, accurate to 1-2cm. We offer safety driver training and also have a simulation environment for driver-in-the-loop and hardware-in-the-loop testing. The whole idea is to fail in private, not in public, and to progress, to evolve out of the testbeds and on to open roads.

“The 5G testbed is a completely separate consortium, backed by the Department for Digital, Culture, Media and Sport (DCMS). We have 59 masts looking at all types of connectivity and I’d say the millimetre wave at 70GHz is currently the most interesting.”

Millbrook Proving Ground graphic

What major shifts in UK road transport do you expect over the next 10 years? 

“Getting the crystal ball out, I see increased use of connectivity in existing vehicles and some very interesting new use cases – buses connected to city networks, video analytics from cameras, smart ambulances streaming live data, autonomous deliveries on campuses. What I don’t see within 10 years is millions of privately owned driverless cars. That will start in the luxury sector but to begin with it will be more about transporting goods.”

How do you see the testing framework for CAVs developing?

“There’s a lot of simulation in the automotive world – crash testing, fatigue testing, computational fluid dynamics. These days, manufacturers are developing whole vehicles before building a prototype. You have to have a good simulation on a good simulator and there’s an interesting shift that needs to happen on regulation. It’s early days on that, but it’s essential.

“The strength of virtual space is that you can run hundreds of scenarios in machine time – not only set up complicated scenarios that would take days with real cars, but actually speed up the process so it runs faster than real time. The national scenario database is already really good and regulation will move to being a mixture of real and virtual certification – global, European, UK and perhaps even city-specific. We are happy to advise, but don’t set policy.”
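As a rough illustration of what running scenarios “in machine time” looks like, the sketch below sweeps a toy stopping scenario across speeds and gaps and tallies pass/fail results in a fraction of a second. The vehicle model and thresholds are invented for the example; a real testbed would drive a full simulator per case rather than a closed-form check.

```python
# Minimal sketch of batch scenario execution: sweep a stopped-lead-vehicle
# scenario over initial speeds and gaps and record pass/fail. The toy vehicle
# model and deceleration figure are assumptions for illustration.
from itertools import product

def run_scenario(ego_speed_ms, gap_m, decel_ms2=6.0):
    """Does the ego vehicle stop before closing the gap to a stopped lead vehicle?"""
    stopping_distance = ego_speed_ms ** 2 / (2 * decel_ms2)
    return stopping_distance < gap_m

results = {
    (v, gap): run_scenario(v, gap)
    for v, gap in product(range(5, 35, 5), range(10, 60, 10))  # speeds (m/s) x gaps (m)
}
failures = [case for case, passed in results.items() if not passed]
print(f"{len(results)} scenarios run, {len(failures)} failures: {failures}")
```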

What are the biggest challenges in the shift to self-driving and how can these risks be mitigated?

“The key to the future of self-driving is education, education, education – for everyone, the public, vehicle manufacturers, the aftermarket, recovery operators. We have to work on the terminology – autonomous, driverless, CAV, CAM – it’s confusing, even to people who know what they’re talking about.

“At the moment, we’re making it harder to understand, not easier. We’re in a really grey area of transition with different trade names for systems. There’s a lot of groundwork needed to prepare people and, for example, the brilliant website mycardoeswhat.org does a great job of trying to explain it.

“If you get into a hire car, you need to have the right expectation of what it does and what it doesn’t do. If you buy a new car, you should read the manual, but how many people do? Especially with Covid, more cars are being delivered with minimal interaction – it’s a case of “there’s the key, where’s the station?”. Too often, the customer handover just isn’t there.

“How are garages, the aftermarket and the amber light sector going to deal with all this? Basic questions like how do you put it in neutral? ADAS has already led to huge changes in training and skill sets – how to calibrate and monitor them.

“We haven’t talked about over-the-air (OTA) updates, cameras embedded in the tarmac or even electrification – there’s a huge amount of things! How do you learn about them? Hopefully in testing rather than in crash situations.”

For further info, visit www.millbrook.co.uk

IPG expert says simulations can be better than real world testing.

The road to self-driving: Vehicle Certification Agency urged to accept simulation

Our Zenzic CAM Creator series continues with Elliot Hemes and Will Snyder of IPG Automotive UK.

Chartered engineer and self-proclaimed simulation evangelist, Elliot Hemes, previously worked in global product marketing at Jaguar Land Rover (JLR), covering future automotive trends. Now managing director at IPG Automotive UK, he works with big-hitters including Ford and JLR to provide virtual test driving environments. Here, in discussion with IPG Automotive sales engineer Will Snyder, he explains how simulation will be vital for the shift to self-driving.

EH: “As vehicle systems become more complex and interconnected, we ensure that manufacturers can virtually test their systems in realistic traffic situations, using an approach that is quick and accurate.”

WS: “IPG Automotive started in vehicle dynamics, then advanced driver assistance (ADAS) was the next big thing, now it is autonomous vehicles (AVs). The amount of testing required to achieve true autonomy is impossible to do in the real world. I believe we will get to Level 5 autonomy, but there are some big hurdles such as accounting for human drivers in other vehicles – it would be much easier if every vehicle on the road was autonomous and connected.”

EH: “We might see it first in a city environment, restricted to less than 20mph. People put up lots of reasons why full autonomy can’t happen, but a blanket statement of “it’s too hard” just isn’t good enough. You could say, for example, you can’t use the M6 Toll unless you have vehicle-to-vehicle (V2V) communications. That would enable platooning – if one vehicle brakes, they all know about it. 99% of the time, great brakes will get you out of trolley problem scenarios.”

WS: “You cannot say AVs will never crash. The question should be: are they safer than human drivers? And the answer is yes, they definitely will be. When people talk about ADAS deskilling drivers, my response is: what skills?! It is well proven that concentration is badly affected by holding a conversation with someone else in the car, let alone fiddling with the radio or holding a hands-free phone call. We all get defensive about our driving prowess, but it needs to be recognised that the bar for driving is very low. You don’t even learn how to drive on a motorway – that’s not part of the driving test, which is one reason you get so many middle lane sitters.”

EH: “At the moment none of the major vehicle manufacturers are taking the leap to Level 4/5, partly because they’re worried about litigation. Once the legislation is in place you will see truck platooning very quickly because of the enormous cost savings. It will require vehicle-to-everything (V2X) and V2V communications. The current ADAS technology is great but the systems are very digital and can have issues with poor light and bad weather. It will improve over time.”

WS: “We could even skip Level 3 as it is safer to move straight to Level 4. In my opinion, the driver needs to be either active or not – expecting them to retake control in time in an emergency situation is just not realistic.”

EH: “Over the next decade you will see the gradual adoption of ADAS technologies. Adaptive cruise control (ACC) will become standard and that will avert so many crashes, particularly rear-end shunts. It doesn’t take away from the driver, it just intervenes. However, there is a concern about the performance of these systems in low light conditions – we need much more focus on the edge cases.

“OEMs engineer to perfect Euro NCAP test conditions. In the real world, what happens if the sun is low in the sky, or the pedestrian steps out more quickly? You cannot practically test these kinds of things on a track, which is why you have simulations. You can study that edge case over and over. We’ve had customers ask us to recreate exactly the same environment as the test track, including noise that’s nothing to do with the question in hand. Our advice is not to try to simulate the real world – design the simulation to study exactly the question you want to answer.

“In this way simulation can be better than the real world. Say, for example, you want to test a pedestrian Autonomous Emergency Braking (AEB) function in the early stage of development. You just want to know if, in the CarMaker environment, it produces the right output – applying enough braking to stop before it hits the dummy pedestrian. The next step is to put that software into an ECU. You can do all that with hardware-in-the-loop testing, improving the capability step-by-step without building a prototype vehicle or driving billions of real-world miles.

“Further still, under heavy braking, the front camera might well point to the floor, maybe the car might start to drift. You can do all that in simulation, to prove that your algorithms hold up and the car does what you think it will do.”
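The early-stage AEB check Hemes describes can be approximated in a few lines: step a toy ego vehicle towards a dummy pedestrian and verify that the braking output stops it in time. The controller, trigger distance and deceleration figures below are assumptions for illustration and bear no relation to the CarMaker API.

```python
# Minimal sketch of a pedestrian AEB pass/fail check. The vehicle model,
# trigger time-to-collision and deceleration are invented for illustration.

def aeb_controller(distance_m, speed_ms, trigger_ttc_s=1.5, max_decel=8.0):
    """Return requested deceleration (m/s^2) based on time-to-collision."""
    ttc = distance_m / speed_ms if speed_ms > 0 else float("inf")
    return max_decel if ttc < trigger_ttc_s else 0.0

def simulate(initial_gap_m=30.0, speed_ms=13.9, dt=0.01):
    """Step the ego vehicle towards the dummy pedestrian until it stops or hits."""
    gap, v = initial_gap_m, speed_ms
    while v > 0 and gap > 0:
        v = max(0.0, v - aeb_controller(gap, v) * dt)
        gap -= v * dt
    return gap  # > 0 means the car stopped short of the pedestrian

margin = simulate()
print(f"{'PASS' if margin > 0 else 'FAIL'}: stopped {margin:.2f} m from the pedestrian")
```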

WS: “Another problem with running prototype vehicles on test tracks is that you spend an awful lot of time fixing thousands of other small faults before you get on with what you’re supposed to be testing. We can get all these edge cases done before you get to the test track. By using simulations you get so much more out of the valuable test track time.”

EH: “The ‘systems engineering V’ has all the theoretical stuff on the left, then hardware on the right and validation at the top. Ideally we’ll get to the stage where only validation happens in the physical world. Until the homologation and certification authorities are able to accept simulation results you can’t do enough testing to get AVs on the road. That’s why it is such a vital part of the Zenzic CAM Roadmap.”

For further info, visit ipg-automotive.com

The UK’s National Physical Laboratory is working on a framework for virtual sensor testing.

Developing test frameworks which build a bridge of trust to driverless cars in the UK

Our Zenzic CAM Creator series continues with Andre Burgess, digital sector strategy leader at the National Physical Laboratory (NPL).

NPL is the UK’s National Metrology Institute, responsible for developing and maintaining the national primary measurement standards. For over a century, it has worked to translate scientific expertise into economic prosperity, skilled employment and improved quality of life, covering everything from cancer treatments to quantum computing. In the self-driving sector, Andre Burgess’s focus is test frameworks to support the deployment of safe and reliable autonomous transport on land, sea and air.

Andre Burgess, digital sector strategy leader at NPL.

AB: “We’re all about measurement and how it can be applied to the autonomous vehicle space. Artificial intelligence (AI) and machine learning represent a great transformation. Whereas in the past we’ve developed tests for whether a human is fit to do something, in this new world we need a new set of tests to assure autonomous systems and build a bridge of trust. This is not a one-off test, it is ongoing work to develop new methodologies and support the development of new standards.

“One of the key things this country has developed is Testbed UK, a collaboration between government and industry which has delivered a formidable testing environment – a network of safe, highly controlled environments increasingly linked to virtual testing.

“Working with the Met Office on behalf of the Centre for Connected and Autonomous Vehicles (CCAV) over the last year, we have focused on the usability and reliability of sensors in different weather conditions. How do you know if sensors are performing well? How do you validate the decision making? How do you apply metrics and KPIs to this? Having undertaken a proof of concept for a testing framework, we are confident this can be delivered and deployed throughout the industry.
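As an illustration of the kind of output such a framework might produce, the sketch below buckets detection performance by weather condition and reports recall and false positives per bucket. The record format and figures are invented; this is not NPL’s methodology.

```python
# Minimal sketch of weather-bucketed sensor metrics. All data is made up
# for illustration.
from collections import defaultdict

# (weather, detections_reported, true_objects_present, correct_detections)
runs = [
    ("clear", 48, 50, 47),
    ("rain",  41, 50, 38),
    ("fog",   30, 50, 26),
]

metrics = defaultdict(dict)
for weather, reported, truth, correct in runs:
    metrics[weather]["recall"] = correct / truth
    metrics[weather]["false_positives"] = reported - correct

for weather, m in metrics.items():
    print(f"{weather:>5}: recall={m['recall']:.2f}, false positives={m['false_positives']}")
```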

“There is much talk about pass/fail tests but our focus is confidence, improving confidence in the outputs and building confidence in the system. We collaborate across the board, with regulators, testers, developers – engaging with them to understand their requirements.  Our approach is to provide tools which help reduce the barriers to innovation without compromising regulation and safety assurance.  Striking the right balance between reliability and usability is key. Our work will support validation and help the UK to influence international standards.

“The biggest transformation in road transport over the next decade will be emissions reduction, and self-driving vehicles and smart mobility systems will be key drivers. It will require changes to infrastructure and changes in habits – batteries or hydrogen will be critical, perhaps a need to drive more slowly, maybe less private car ownership. The impact of Covid has led to a move away from trains and buses, so a resurgence of public transport is vital.

“In terms of self-driving, I envisage there will be personally driven vehicles and on-demand vehicles. Increasingly I expect we’ll see a transition into smaller public transport vehicles, perhaps for 8-10 people, in continuous use. There’s real value in getting to places that don’t have bus stops and there’ll be benefits from autonomous safety features too. It won’t be everywhere but I hope within 10 years there’ll be good examples of that in the UK. The question is will we be ahead or behind the curve? In some more authoritarian countries implementation might be faster but maybe not better.

“We’ll also start to see autonomous low level aviation and autonomous shipping, for example, short cargo sea freight. Combined, these things will make roads less congested. Key transport stakeholders have expressed the need to integrate, to pursue the most efficient way to get goods into and around the UK.

“For our part, we are focused on the framework for virtual sensor testing, and also integration between virtual and physical testing. To give an accurate level of confidence requires understanding the common metrics and the areas of uncertainty. The human factor is so important, for example, what about the people with cars that don’t have this tech – how do they respond?”

For further info visit www.npl.co.uk.

Autonomous vehicle software specialist set to become a major UK success story.

Oxbotica secures huge BP investment and targets anything that moves people or goods

Oxford University spin-out, Oxbotica, has been on our must-speak-to list for a while, and on Friday we got some Zoom time with the top people – CEO, Ozgur Tohumcu, and co-founder and CTO, Professor Paul Newman.

It’s three weeks since the autonomous vehicle software specialist announced a US$47m Series B investment led by bp ventures. Yes, that BP. The press release asserts that this will accelerate the deployment of Oxbotica’s platform “across multiple industries and key markets”, but Prof. Newman is quick to emphasise this is not about robotaxis, not even about cars.

Prof Paul Newman, Oxbotica co-founder and CTO.

“We’ve been deploying our software in industrial settings – mines, airports – for six years now, and not only in the UK, in Europe, North America, Australia,” he says. “Everyone talks about cars but all vehicles are game for us – anything that requires moving people or goods. That’s the advantage of being pure software.

“We’re a global business and raising this kind of money during a pandemic speaks volumes. We have clear water behind and blue sky ahead. Having these new investors and strategic partners will really allow us to drive home the opportunities that came last year. Vehicles are common but software of our standard is not. We’re showing that great IP can be generated everywhere, not just Silicon Valley, and that’s very refreshing.”

While Prof. Newman focuses on the vision, Tohumcu provides the detail. “Since the funding announcement, the exchange rate means it’s actually worth closer to $50m, so that’s not bad,” he says. “We’ve just conducted a review of the business and it was pleasing to see that we achieved exactly what we said we’d do two years ago – delivering results against measurable goals.

Ozgur Tohumcu, Oxbotica CEO.

“We’ve done a lot of planning recently – some well-defined, other things we’re still making choices about. We’ve been approached by new companies interested in using our tech and there are exciting deals in the pipeline, deals that come with investment. We’ll be making further announcements over the coming weeks and months.”

Make no mistake, Oxbotica is set to become a major UK success story… just don’t mention driverless cars!