The UK’s National Physical Laboratory is working on a framework for virtual sensor testing.

Developing test frameworks which build a bridge of trust to driverless cars in the UK

Our Zenzic CAM Creator series continues with Andre Burgess, digital sector strategy leader at the National Physical Laboratory (NPL).

NPL is the UK’s National Metrology Institute, responsible for developing and maintaining the national primary measurement standards. For over a century, it has worked to translate scientific expertise into economic prosperity, skilled employment and improved quality of life, covering everything from cancer treatments to quantum computing. In the self-driving sector, Andre Burgess’s focus is test frameworks to support the deployment of safe and reliable autonomous transport on land, sea and air.

Andre Burgess, digital sector strategy leader at NPL.

AB: “We’re all about measurement and how it can be applied to the autonomous vehicle space. Artificial intelligence (AI) and machine learning represent a great transformation. Whereas in the past we’ve developed tests for whether a human is fit to do something, in this new world we need a new set of tests to assure autonomous systems and build a bridge of trust. This is not a one-off test; it is ongoing work to develop new methodologies and support the development of new standards.

“One of the key things this country has developed is Testbed UK, a collaboration between government and industry which has delivered a formidable testing environment – a network of safe, highly controlled environments increasingly linked to virtual testing.

“Working with the Met Office on behalf of the Centre for Connected and Autonomous Vehicles (CCAV) over the last year we have focused on the usability and reliability of sensors in different weather conditions. How do you know if sensors are performing well? How do you validate the decision making? How do you apply metrics and KPIs to this? Having undertaken a proof of concept for a testing framework, we are confident this can be delivered and deployed throughout the industry.

“There is much talk about pass/fail tests but our focus is confidence – improving confidence in the outputs and building confidence in the system. We collaborate across the board with regulators, testers and developers, engaging with them to understand their requirements. Our approach is to provide tools which help reduce the barriers to innovation without compromising regulation and safety assurance. Striking the right balance between reliability and usability is key. Our work will support validation and help the UK to influence international standards.

“The biggest transformation in road transport over the next decade will be emissions reduction, and self-driving vehicles and smart mobility systems will be key drivers. It will require changes to infrastructure and changes in habits – batteries or hydrogen will be critical, perhaps a need to drive more slowly, maybe less private car ownership. The impact of Covid has led to a move away from trains and buses, so a resurgence of public transport is vital.

“In terms of self-driving, I envisage there will be personally driven vehicles and on-demand vehicles. Increasingly I expect we’ll see a transition into smaller public transport vehicles, perhaps for 8-10 people, in continuous use. There’s real value in getting to places that don’t have bus stops and there’ll be benefits from autonomous safety features too. It won’t be everywhere but I hope within 10 years there’ll be good examples of that in the UK. The question is: will we be ahead of or behind the curve? In some more authoritarian countries implementation might be faster but maybe not better.

“We’ll also start to see autonomous low level aviation and autonomous shipping, for example, short cargo sea freight. Combined, these things will make roads less congested. Key transport stakeholders have expressed the need to integrate, to pursue the most efficient way to get goods into and around the UK.

“For our part, we are focused on the framework for virtual sensor testing, and also integration between virtual and physical testing. To give an accurate level of confidence requires understanding the common metrics and the areas of uncertainty. The human factor is so important, for example, what about the people with cars that don’t have this tech – how do they respond?”

For further info visit www.npl.co.uk.

Autonomous vehicle software specialist set to become a major UK success story.

Oxbotica secures huge BP investment and targets anything that moves people or goods

Oxford University spin-out, Oxbotica, has been on our must-speak-to list for a while, and on Friday we got some Zoom time with the top people – CEO, Ozgur Tohumcu, and co-founder and CTO, Professor Paul Newman.

It’s three weeks since the autonomous vehicle software specialist announced a US$47m Series B investment led by bp ventures. Yes, that BP. The press release asserts that this will accelerate the deployment of Oxbotica’s platform “across multiple industries and key markets”, but Prof. Newman is quick to emphasise this is not about robotaxis, not even about cars.

Prof Paul Newman, Oxbotica co-founder and CTO.

“We’ve been deploying our software in industrial settings – mines, airports – for six years now, and not only in the UK, in Europe, North America, Australia,” he says. “Everyone talks about cars but all vehicles are game for us – anything that requires moving people or goods. That’s the advantage of being pure software.

“We’re a global business and raising this kind of money during a pandemic speaks volumes. We have clear water behind and blue sky ahead. Having these new investors and strategic partners will really allow us to drive home the opportunities that came last year. Vehicles are common but software of our standard is not. We’re showing that great IP can be generated everywhere, not just Silicon Valley, and that’s very refreshing.”

While Prof. Newman focuses on the vision, Tohumcu provides the detail. “Since the funding announcement, the exchange rate means it’s actually worth closer to $50m, so that’s not bad,” he says. “We’ve just conducted a review of the business and it was pleasing to see that we achieved exactly what we said we’d do two years ago – delivering results against measurable goals.

Ozgur Tohumcu, Oxbotica CEO.

“We’ve done a lot of planning recently – some well-defined, other things we’re still making choices about. We’ve been approached by new companies interested in using our tech and there are exciting deals in the pipeline, deals that come with investment. We’ll be making further announcements over the coming weeks and months.”

Make no mistake, Oxbotica is set to become a major UK success story… just don’t mention driverless cars!

Humanising Autonomy uses behavioural psychology and computer algorithms to make cities safer for pedestrians and cyclists.

Using cameras and AI to protect vulnerable road users

Our Zenzic CAM Creator series continues with Raunaq Bose, co-founder of Humanising Autonomy.

Before establishing predictive artificial intelligence (AI) company Humanising Autonomy in 2017, Raunaq Bose studied mechanical engineering at Imperial College London and innovation design engineering at the Royal College of Art. Focusing on the safety of vulnerable road users, Humanising Autonomy aims to redefine how machines and people interact, making cities safer for pedestrians, cyclists and drivers alike.

RB: “Our model is a novel mix of behavioural psychology, deep learning and computer algorithms. We work with OEMs and Tier 1 suppliers on the cameras on vehicles, with the aftermarket on retrofitted dashcams, and also with infrastructure. Our software works on any camera system to look for interactions between vulnerable road users, vehicles and infrastructure in order to prevent accidents and near misses. While most AI companies use black box systems where you can’t understand why decisions are made, we set out to make our models more interpretable, ethically compliant and safety friendly.

“When it comes to questions like ‘Is this pedestrian going to cross the road?’, we look at body language and factors like how close they are to the edge of the pavement. We then put a percentage on the intention. Take distraction, for example: we cannot see it, but we can infer it. Are they on the phone? Are they looking at the oncoming vehicle? Is their view blocked? These are all behaviours you can see and our algorithm identifies them and puts a numerical value on them. So we can say, for example, we’re 60% sure that this pedestrian is going to cross. This less binary approach is important in building trust – you don’t want lots of false positives, for the system to be pinging all the time.
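The cue-weighting approach Bose outlines could be sketched roughly as below. The cue names, weights and baseline are illustrative assumptions made for the example, not Humanising Autonomy’s actual model:

```python
# Illustrative cue-based intention scoring. The cues, weights and
# baseline are hypothetical, not Humanising Autonomy's actual model.

def crossing_probability(cues: dict) -> float:
    """Combine observable behavioural cues into a crossing likelihood."""
    weights = {
        "near_kerb_edge": 0.35,       # close to the edge of the pavement
        "facing_road": 0.25,          # body oriented toward the road
        "looking_at_vehicle": -0.20,  # aware of the oncoming vehicle
        "on_phone": 0.15,             # distracted, may step out
        "view_blocked": 0.10,         # cannot see approaching traffic
    }
    baseline = 0.10  # prior probability that any pedestrian crosses
    score = baseline + sum(w for cue, w in weights.items() if cues.get(cue))
    return min(max(score, 0.0), 1.0)  # clamp to [0, 1]

# A pedestrian near the kerb edge and on the phone:
p = crossing_probability({"near_kerb_edge": True, "on_phone": True})
print(f"{p:.0%} sure this pedestrian is going to cross")
```

A real system would learn such weights from data rather than hand-setting them; the point is the graded, non-binary output rather than a pass/fail decision.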

“One of the main things we’re likely to see over the next decade is increased use of micromobility, such as cycling and e-scootering. At the same time you will see more communication between these different types of transportation, and also with vehicles and infrastructure. The whole point of ADAS is to augment the driver’s vision, to reduce blind spots and, if necessary, take control of the vehicle to avoid a shunt. Then there’s the EU agreement that by 2022 all buses and trucks must have safety features to detect and warn of vulnerable road users.

“We currently only look at what’s outside the vehicle, but with self-driving there will be monitoring of the cabin. In terms of privacy, we have a lot of documentation about our GDPR processes and how we safeguard our data. Importantly, we never identify people, for example, we never watch for a particular individual between camera streams. We look to the future with autonomous cars but for now we’re focused on what’s on the road today.”

For further info visit humanisingautonomy.com.

UK government sparks global business sharing transport sector data.

Sharing data collected by connected cars

Our Zenzic CAM Creator series continues with Mika Rasinkangas, founder and President of Chordant.

Originally part of the global wireless and internet of things (IoT) research company, InterDigital, Chordant was spun out as a separate business in 2019, as “a dynamic data sharing expert”. The spark was a UK government initiative to test the hypothesis that regional transportation data has tremendous value, especially when shared between different parties. The results of this two-year public-private partnership were startling.

Please can you outline your work on connected and automated mobility?

MR: “First of all we looked at the mobility space. There’s the segment that maintains the road network and their supply chain, the mobility service providers – bus companies, train operators and new entrants such as Uber – then the whole automotive sector, OEMs and their supply chain partners. We sit right in the middle of all this and our role is data exchange – bringing together dynamic data sets from different sources to solve problems with data-driven solutions.

“The hypothesis was that a lot of data in the transport segment was either underutilised, in really small silos, or not utilised at all. The idea was to work with different entities – organisations, companies and universities – to bring data together and make it more widely available, leading to innovation and efficiency.

“It was obvious from early on that this was not only a technical issue, there was a human element. Data is controlled by different entities and departments so the challenge was to get these different data owners comfortable with the idea that their data could be used for other purposes, and to get consumers comfortable with it too. The result was more usable and more reliable dynamic data.”

What major shifts in UK transport do you expect over the next 10-15 years?

MR: “Last mile transport and micromobility solutions are ballooning, and Covid-19 will only accelerate this. People are walking, scootering and biking more, making short trips by means which don’t involve public transport or being in close contact with others.

“In terms of automotive, we’re living through a massive change in how people perceive the need to own a car, and this shift in perception is changing the fundamental business models. Autonomous vehicle technology keeps developing, connected vehicles are everywhere already and electric cars represent an ever bigger proportion of the vehicle population. In all these segments data utilisation will continue to increase. New cars collect huge amounts of data for lots of purposes and this can be used for lots of things other than what it was originally collected for.”

Can you address the data privacy concerns surrounding connected cars?

MR: “Data privacy is a multifaceted topic. On the one hand, Europe has been at the forefront of it with GDPR. That puts businesses operating in Europe on a level playing field. In terms of connected and autonomous vehicles (CAVs), these regulations set limitations on what data can be harvested and what has to be anonymised in order for someone to use it. It fits the norms of today’s society, but you can see in social media that this kind of privacy seems less important to younger people. Perspectives vary greatly, however, and companies need to be transparent in their usage of people’s data.

“From a business perspective, we have to take privacy extremely seriously. The explosion of data usage can have unintended consequences but by and large the regulatory environment works quite reasonably.

“We typically deal with conservative entities which put privacy and security in the middle of everything – if there’s any uncertainty it’s better to not do it, is the attitude. Think of all the sensitive personal data that entities like car companies and mobile telephone companies have. It can give an extremely accurate picture of peoples’ behaviour. There are well established procedures to anonymise data so customers can be comfortable that their personal data cannot be identified.”

What are the main risks in the shift to self-driving and how can these be mitigated?

MR: “One could talk about a lot of different challenges. What about the latency in connectivity in order to ensure processing takes place fast enough? There are a gazillion things, but to me these are technical nuts that will be cracked, if they haven’t been already. One of the biggest challenges is the interaction between human-controlled vehicles and automated vehicles. When you add in different levels of driver assistance, urban and rural, different weather conditions – all sorts of combinations can happen.

“The UK is at the forefront of CAV testing. There are government sponsored testbeds and companies are running trials on open roads, so the automotive industry can test in real-life environments. We cannot simulate everything, and the unpredictability of interactions is one of the biggest challenges. A traffic planner once told me that in his nightmares he sees a driverless car heading toward a granddad in a pick-up truck, because there’s just no telling how he might react!”

Is there anything else you’d like to mention?

MR: “I’d like to address the explosion of data usage in mobility and how dynamic data enables not only efficiency improvements but new business models. According to recent studies by companies like Inrix, congestion costs each American nearly 100 hours or $1,400 a year. Leveraging data-driven insights can drive change in both public policies and behaviours. In turn, these can result in reduced emissions, improved air quality and fewer pollution-caused illnesses.

“CAVs can be data sources providing tons of insight. Think about potholes – new vehicles with all these cameras and sensors can report them and have them fixed much more efficiently. This is just one example of entirely data-driven efficiency, much better than eyeballing and human reporting. There will be a multitude of fascinating uses.

“Organisations such as vehicle OEMs, transport authorities and insurance providers will require facilities for the secure and reliable sharing of data, and that’s where we come in. I would urge anyone interested in data driven solutions in the mobility space to visit chordant.io or our Convex service site at convexglobal.io.”

Dr Charlie Wartnaby says there’s an industry consensus that Level3 self-driving is not reasonable if it requires quick driver intervention.

Self-driving world first: multi-car cooperative crash avoidance

Our Zenzic CAM Creator series continues with Dr Charlie Wartnaby, chief engineer at Applus IDIADA.

Way back in 2019 we covered IDIADA’s role in the construction of the new CAVWAY testing facility, and that investment continued with a large new venture. With a PhD in physical chemistry from the University of Cambridge, Charlie Wartnaby was technical lead for the ground-breaking Multi-Car Collision Avoidance (MuCCA) project.

Charlie Wartnaby, chief engineer at Applus IDIADA

CW: “Certainly the funding from the Centre for Connected and Autonomous Vehicles (CCAV) for MuCCA and CAVWAY represented big wins for us. Traditionally, we’d focused on automotive electrics and engine management, but we could see there was all this exciting CAV work. Now we’re working with an OEM I can’t name to run a field operational test using our IDAPT development tool – a high performance computer with GPS and car-to-car communications – as a spin-off from MuCCA.

“With the MuCCA project, we think we achieved a world first by having multiple full-sized vehicles do real-time cooperative collision avoidance. We still have the cars for further R&D when time, budget and Covid allow.

IDIADA’s Multi-Car Collision Avoidance (MuCCA) project

“In the UK, we’re focused on building a new proving ground (CAVWAY) near Oxford, which should open in 2021. There’s also our CAVRide Level4 taxi project, at our headquarters near Barcelona. CAVRide shares some of the technology developed for MuCCA and they’ve done some really interesting vehicle-in-the-loop testing, having the real vehicle avoid virtual actors in a simulation environment.

“In the short term, we’re really working hard on the C in CAV. Connected vehicles offer massive safety and efficiency improvements, for example, by warning about stopped vehicles or advising on speed to get through traffic lights on green. There’s a bit of a VHS versus Betamax situation, with both WiFi-based short-range communications and the C-V2X 5G-based protocol, so we’ve upgraded IDAPT to support both.
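Supporting both stacks, as described for the upgraded IDAPT, amounts to putting one application interface over two radios. Everything below – the class names, the message payload – is a hypothetical sketch, not IDAPT’s actual API:

```python
# Hypothetical sketch of one application interface over both V2X stacks;
# class names and the payload are illustrative, not IDAPT's actual API.

from abc import ABC, abstractmethod

class V2XRadio(ABC):
    @abstractmethod
    def broadcast(self, message: bytes) -> int:
        """Send a message; return the number of bytes queued."""

class Dsrc80211pRadio(V2XRadio):
    """WiFi-based short-range communications (802.11p / ITS-G5)."""
    def broadcast(self, message: bytes) -> int:
        return len(message)  # real hardware would transmit here

class Cv2xRadio(V2XRadio):
    """Cellular C-V2X (PC5 sidelink)."""
    def broadcast(self, message: bytes) -> int:
        return len(message)  # real hardware would transmit here

def warn_stopped_vehicle(radios: list) -> int:
    """Send a stopped-vehicle warning on every fitted stack."""
    payload = b"DENM:stationary-vehicle-ahead"
    # Application code stays protocol-agnostic: it never names a stack.
    return sum(radio.broadcast(payload) for radio in radios)

# With both radios fitted, the warning goes out on both stacks:
total = warn_stopped_vehicle([Dsrc80211pRadio(), Cv2xRadio()])
```

Keeping the application above a protocol-agnostic interface is exactly what lets a unit hedge the VHS-versus-Betamax outcome: whichever stack wins, only the radio classes change.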

“Personally I think that while heroic work by the big players shows robotaxi applications are feasible, economic viability is a long way off, 2030 maybe. Watch the latest Zoox and Waymo videos from America, they’re mesmerising! No way is that kind of tech going to be installed in private cars any time soon because it’s eye-wateringly expensive. Think about the costs involved in making every taxi driverless – they’d be out of all proportion to the driver salaries saved, especially once you factor in backup teleoperators and maintenance and charging personnel.

“These big self-driving companies aren’t operating in the UK yet, but we do have very successful smaller players with intellectual property to sell. The UK government has been supporting a good number of R&D projects, via the CCAV and UK Research and Innovation (UKRI), and the regulatory environment has been reasonably friendly so far.

“I feel the first practical applications are likely to be low-speed shuttle buses and small autonomous delivery droids, but trucking is a very important area. If lorry drivers were permitted to stop their tachographs while napping in the back of the cab once on the motorway – only clocking up hours for parts of long journeys – that would make a viable economic case for a Level4 operating design domain (ODD) of ‘just motorways’, which is harder to justify merely as a convenience feature in private cars.

“In terms of current tech, emergency lane keeping systems (ELK), to stop drifting, are a major breakthrough, requiring cameras, sensors and autonomous steering. I welcome the road safety benefits; however, if drivers engage automation systems like ALKS (automated lane keeping) out of habit, their skills will surely be affected. Perhaps there’s a case for the system enforcing some periods of manual driving, just as airline pilots perform manual landings to stay in practice even in planes that can land themselves.

“Concerns about timely handover are well-founded and I think there’s an industry consensus now that Level3 is not reasonable if it requires quick driver intervention. We see up to 20 seconds before some unprepared drivers are properly in control when asked to resume unexpectedly. It really requires that the vehicle can get itself into (or remain in) a safe state by itself, or at least there needs to be a generous takeover period. The difference between L3 and L4 is that the latter must always be able to achieve that safe state.”
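The takeover logic Wartnaby describes can be pictured as a small state machine: the vehicle requests a takeover, waits a generous window, and falls back to a minimum-risk manoeuvre if the driver does not respond. The names are hypothetical; the 20-second window is the figure from the quote above:

```python
# Sketch of the takeover logic described above. Names are hypothetical;
# the 20-second window is the observation quoted in the interview.

from enum import Enum, auto

class Mode(Enum):
    AUTOMATED = auto()
    TAKEOVER_REQUESTED = auto()
    DRIVER_IN_CONTROL = auto()
    MINIMUM_RISK_MANOEUVRE = auto()  # e.g. slow and stop in a safe place

TAKEOVER_WINDOW_S = 20.0  # generous window for an unprepared driver

def next_mode(mode: Mode, seconds_since_request: float, driver_ready: bool) -> Mode:
    """One step of a simplified L3/L4 takeover state machine."""
    if mode is Mode.TAKEOVER_REQUESTED:
        if driver_ready:
            return Mode.DRIVER_IN_CONTROL
        if seconds_since_request >= TAKEOVER_WINDOW_S:
            # No timely response: the vehicle must reach a safe state itself.
            return Mode.MINIMUM_RISK_MANOEUVRE
    return mode
```

On this view, the L3/L4 distinction is whether the MINIMUM_RISK_MANOEUVRE branch is always available: an L4 system must be able to take it in every situation, while an L3 system that instead demands quick driver intervention is not reasonable.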

For further info, visit www.idiada.com

Prof John McDermid says the trolley problem is a nonsense, requiring self-driving vehicles to make distinctions that you or I could not.

Why assuring machine learning is crucial to self-driving

Our Zenzic CAM Creator series continues with Professor John McDermid OBE FREng, Director of the Assuring Autonomy International Programme at the University of York.

Professor John McDermid has been Director of the Assuring Autonomy International Programme, a partnership between Lloyd’s Register Foundation and the University of York, since 2018. He advises government and industry on safety and software standards, working with organisations including Five and the Ministry of Defence, and was awarded an OBE in 2010. The author of 400 published papers, his 2019 article, Self-driving cars: why we can’t expect them to be ‘moral’, was highly critical of the oft-quoted trolley problem in relation to driverless vehicles.

Professor John McDermid, University of York

PJM: “I’ve been at York for 30 years working on the safety of complex computer-controlled systems. What you define as complex changes all the time. In January 2018 we started a new programme, looking at the assurance of robots and autonomous systems, including automated mobility, but also robots in factories, healthcare and mining.

“It’s important to demonstrate the safety and security of novel technologies like machine learning, but there’s often a trade-off involved, because you can make things so secure they become unusable. If I open my car with the remote key I have a couple of minutes before it automatically locks again, and there’s a small possibility that someone could get their finger trapped if they try to open the door just as it automatically re-locks. We encounter these types of trade-offs all the time.”

What major shifts in UK transport do you expect over the next 10-15 years?

PJM: “Over the next decade we will get to Level4 autonomous driving, so in defined parts of the road network cars will drive themselves. We will solve the safety problems of that technology, but I’d be surprised if it is within five years. Despite the rhetoric, Tesla’s approach is not on track for safe autonomous driving within the year.

“At the same time, there will be a trend towards Mobility as a Service (MAAS). I love my car, but I’ve had it for 18 months and have only driven 7,000 miles. I sometimes ask myself why I have this expensive piece of machinery. A recent study showed that the average car in the UK is only used for 53 minutes a day. Mostly, they sit doing nothing, which, considering the huge environmental impact of manufacturing all these vehicles, is very wasteful.

“If I could call upon a reliable autonomous vehicle and be 99% certain that it would arrive in a timely manner, say within five minutes, I’d probably give up my car. It should also be noted that the two trends go hand-in-hand. Having Level4 is critical to achieving MAAS, delivering all the convenience of having your own car without any of the hassle.”

Can you address some of the data privacy concerns surrounding connected cars?

PJM: “We are back to this issue of trade-offs again. I want my MAAS so I’ve called it up and given the service provider some information about where I am. If they delete that information after I’ve paid then I’m prepared to accept that. What if the company wants to keep the information but won’t allow access except for law enforcement – would that be acceptable to the public? What can government agencies require this company to do?

“Another example: What if your 10-year-old daughter needs MAAS to take her to school? A reasonable concerned parent should be able to track that. What if the parents are divorced, can they both access that data? There’s clearly a privacy issue and there needs to be a legislative framework, but it’s a balance. For the purposes of getting from A to B, most people would accept it, so long as their data is normally kept private.”

Can you address concerns about the trolley problem in relation to self-driving cars?

PJM: “My basic feeling is that the trolley problem is a nonsense, a distraction. All these elaborate versions require self-driving vehicles to make distinctions that you or I could not.

“The big Massachusetts Institute of Technology (MIT) study sets a higher standard for autonomous vehicles than any human can manage. Who do you save, a child or an older person? The child because they can be expected to live longer and benefit more. However, this is based on false assumptions. I don’t believe in the split second of a crash you go into that sort of thought process – you focus on controlling the vehicle and in most cases the best option is to (try to) stop.

“I don’t know why people find the trolley problem so compelling, why they waste so much energy on it. I really wish it would go away. Fortunately, most people seem to be coming to that conclusion, although one of our philosophy lecturers strongly disagrees with me.”

Which sectors do you think will adopt self-driving first?

PJM: “Farming applications might come first, as agriculture is short of people and the problems are simpler to overcome. If you geofence a field where you wish to use a combine harvester and equip it with technology so it doesn’t run over a dog lying asleep in the field – there’s already tech which is getting quite close to that – then that’s an attractive solution.
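The geofencing step McDermid mentions reduces, at its simplest, to a point-in-polygon test against the field boundary. Below is a generic ray-casting sketch with an illustrative rectangular field, not any particular vendor’s implementation:

```python
# Generic geofence check: is the harvester's position inside the
# permitted field boundary? Ray-casting point-in-polygon test; the
# field coordinates are purely illustrative.

def inside_geofence(point, polygon):
    """Return True if (x, y) point lies inside the polygon of (x, y) vertices."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from the point cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

field = [(0, 0), (100, 0), (100, 60), (0, 60)]  # illustrative field boundary
print(inside_geofence((50, 30), field))   # harvester inside the field
print(inside_geofence((120, 30), field))  # outside: refuse the motion plan
```

In practice the boundary would come from GNSS survey data and the check would gate every planned manoeuvre, alongside the obstacle detection (the sleeping dog) handled by perception.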

“Last mile freight via small delivery robots (like Nuro in the US and Starship here in the UK) might also come quickly, but longer distance freight will probably require a segregated lane. Even last mile robots come with risks, like people tripping over them.

“There’s a lot of commercial desire for robotaxis, and this is potentially a very big market. There are already genuine driverless taxis in the US now, but they have a much simpler road structure than here in the UK.

“The crucial technical bit is finding accepted ways of assuring the machine learning. I would say that, I work on it, but without that regulators and insurers won’t allow it.”

For further info, visit www.york.ac.uk/assuring-autonomy

Dr Joanna White says Highways England is currently more focused on the connected bit of connected and automated mobility (CAM).

Highways England expert predicts Level4 self-driving in towns before motorways

Our Zenzic CAM Creator series continues with Dr Joanna White, Head of Intelligent Transport Systems at Highways England.

As the body responsible for designing, building and maintaining our motorways and major A-roads, Highways England (HE) is a uniquely important player in the UK connected and automated mobility (CAM) ecosystem. Here, Head of Intelligent Transport Systems at Highways England, chartered engineer Dr Joanna White, outlines its work on CAM.

Dr Joanna White, Head of Intelligent Transport Systems at Highways England

JW: “A key aim in improving our service is to look at how we can safely use emerging technology to better connect the country – people and places, families and friends, businesses and customers. This includes what digital channels we might use, delivering a cleaner road environment and achieving net zero carbon.

“Our connected corridor project on the A2/M2 in Kent finished 10 months ago and we are just completing the evaluation. Collaboration is vital and this was a joint project with Kent County Council (KCC), Transport for London (TfL), the Department for Transport (DfT) and others. It was also part of a wider European project, InterCor.

“We are currently more focused on the connected bit of CAM, building on the services we already provide. This includes beaming information directly into vehicles (replicating what you see on the gantries) and also what data we can anonymously collect from vehicles’ positioning sensors. Can we maintain service from one part of the network to another? Can we do it in an accurate, timely and secure way? How do people feel about it?

“We try not to choose particular technologies – whether it’s radar, lidar, cellular – we are interested in all of it. It could be 5G and, via the DfT, we work closely with the Department for Digital, Culture, Media and Sport (DCMS), which leads on that. One of the most positive government actions was the requirement for mobile operators to provide 90% coverage of the motorway network by 2026.

Highways England in-car upcoming junction message

“We were very proud to be involved with the HumanDrive project in which a self-driving Nissan Leaf navigated 230 miles from Cranfield to Sunderland. It was a great learning experience in how to conduct these trials safely, underpinned by our safety risk governance. We had to identify all the risks of running such a vehicle on the strategic road network (SRN), and find ways to mitigate them. It was fascinating to see how it coped on different types of roads, kept to the lines and responded to road sign information.

“Then there’s our Connected and Autonomous Vehicles: Infrastructure Appraisal Readiness (CAVIAR) project, which has been slightly delayed due to Covid. We are building a simulation model of a section of the M1, a digital twin, and we have a real-world car equipped with all the tech which will start operating in 2021. That will collect a lot of data. This is one of our Innovation competition winning projects, run by Innovate UK.

“Within Highways England we have a designated fund for this kind of research, and that means we can invest in further trials and do the work needed to provide more vehicle-to-infrastructure (V2I) communications.

“Personally, I think that Level4 self-driving, eyes off and mind off, is years away, perhaps decades, certainly in terms of motorway environments. However, we are constantly in discussion with government on these issues, for example, we contributed to the recent consultation on Automated Lane Keeping Systems (ALKS).

“Working closely with industry and academia, we have already started off-road freight platooning and are looking to move to on-road trials. We’ve had lots of discussions about freight-only lanes and the left lane is often suggested, but you have to consider the design of the road network. There are lots of junctions close to each other, so how would that work, especially at motorway speeds? At first, I see self-driving more for deliveries at slower speeds in urban areas but, as always, we will listen to consumer demand.”

For further info see highwaysengland.co.uk.

Why digital twins are crucial to the development of ADAS and CAV.

This is no game: how driving simulations save lives

Our Zenzic CAM Creator series continues with Josh Wreford, automotive manager at driving simulation software provider, rFpro.

With digital twins so crucial to the development of advanced driver assistance systems (ADAS), carmakers including Ferrari, Ford, Honda and Toyota have turned to driving simulation software provider, rFpro. Here, automotive manager Josh Wreford explains the company’s cutting-edge work.

Josh Wreford of rFpro

JW: “While others use gaming engines, our simulation engine has been designed specifically for the automotive industry, and particularly connected and autonomous vehicles (CAVs). That’s a big difference because gaming software can use clever tricks to make things seem more realistic, whereas our worlds are all about accuracy.

“We use survey-grade laser scanning to create highly detailed virtual models and have an array of customers testing many different ADAS and CAV features, everything from Level 1 right up to Level 5. We can go into incredible detail; for example, with different render modes for lidar, radar and camera sensors, it is possible to simulate different wavelengths of the electromagnetic spectrum for detailed sensor modelling. It is up to the customer to decide when their system is ready for production, but we save them a lot of time and money in development.

rFpro simulation: Coventry town centre

“Safety-critical situations are extremely difficult to test in the real world because it’s dangerous and crashing cars is expensive! That’s why digital twins are great for things like high-speed, safety-critical scenarios – you can test human inputs in any situation in complete safety. Whenever you have a human in play you’re going to have problems because we’re great at making mistakes and are very unpredictable! rFpro provides high-quality graphics running at high frame rates to immerse the human in the loop as much as possible. This allows accurate human inputs for test scenarios like handover to a remote driver. We can even allow multiple humans to interact by driving in the same world.

rFpro simulation: Holyhead

“Before joining rFpro, I worked at McLaren Automotive on gearbox control software, which involved very similar control coding to ADAS. Ethical questions are always interesting, but ultimately a control engineer has to decide what the next action should be based on the exact situation. Our simulations drive robust engineering and better algorithms, so you get the best reaction no matter what occurs.”

For further info, visit rfpro.com.

Creative technologist Ushigome on future vehicle-to-pedestrian (V2P) communications.

Self-driving news flash: flickering lights to replace eye contact in facilitating trust

Our Zenzic CAM Creator series continues with Yosuke Ushigome, Director at design innovation studio Takram.

Listing his primary interest as “emerging technologies”, London-based creative technologist, Yosuke Ushigome, has been working with Toyota on future car concepts for over 10 years. Here, he gives his thoughts on the key issues in driverless car design.

Yosuke Ushigome, director at Takram

YU: “We come from a user experience (UX) background and over the years our projects with Toyota have got bigger and higher level. In 2018, with the e-Palette concept, we started taking a more holistic approach to mobility and automation – an on-the-ground people perspective on the entire system, rather than the UX of an interior, exterior or service.

“There’s going to be a trend in transparency and trust. How can designers help the systems, passengers, pedestrians and others to communicate? In the past, this has usually been based around the driver and passenger, but that’s got to expand. In cars of the future, pedestrians will not be able to look into the driver’s eyes – what’s driving might not even be on the car, it might be in the cloud.

“How can you communicate interactions that facilitate trust? That’s really interesting. People pick things up from little movements in their peripheral vision, so you come back to old school ideas like patterns of flickering lights. How fast it flashes, or flashing from left to right, could give people a little nudge, maybe help them to detect danger. This kind of experimentation will definitely increase.

“Level 5 autonomy seems to me to be very far off. Level 4, in areas where the road system is designed for self-driving, or on private roads where there’s more separation between vehicles and pedestrians, is coming rapidly – things like deliveries between factories. Starship delivery robots are already deployed in Milton Keynes and economics will drive adoption, especially with the pandemic.

“I would like to be part of this transformation, so long as it is inclusive. There’s an opportunity to meet the needs of people left behind by our existing transport, whether that’s physical disability or economic disadvantage.”

Toyota e-Palette concept, via Takram

Toyota had planned to showcase its e-Palette mobility solution at the Tokyo 2020 Olympic and Paralympic Games, so hopefully we’ll get to see it next summer.

For further info, visit Takram.com.

Vivacity Labs founder backs the citizen first vision of 21st century privacy.

Time for a grown-up conversation about cameras, AI, traffic flow and privacy

Our Zenzic CAM Creator series continues with the founder of Vivacity Labs, Mark Nicholson.

Vivacity uses sensors, cameras and artificial intelligence (AI) to provide “up-to-the-minute data on urban movement”, helping local councils to promote active travel, improve safety and reduce congestion. Big Brother you say? Well, it’s 2020 not 1984 and CEO Mark Nicholson is very happy to have that debate.

MN: “As the transport network becomes more complicated, local authorities need more powerful tools. Tech giants have invaded the ecosystem, and when you’re dealing with Uber and driverless cars, sending someone out with a clipboard just isn’t going to cut it. We bring new technology which tells them about their transport, so they can adapt and gain control over the ecosystem.

“We started with sensors and then video-based sensors, generating huge data sets and better quality data. We’ve looked at everything from cyclists undertaking to lockdown journey times and asked: how can we use this data to make the road system more efficient? The next phase is autonomous vehicles, because that ecosystem needs to work with both infrastructure and other road users.

“Privacy is not just a key issue in self-driving but in the whole smart city. There are basically two visions – the Chinese and the European. The Chinese vision is very invasive, it’s 1984 and that’s the point. The alternative is the European vision, with the General Data Protection Regulation (GDPR). For a while it looked like there might be a third, a corporate American vision. Google were running a smart city project in Canada, but it didn’t work out so we’re back to two models.”

If you don’t know about the Quayside project in Toronto, a much-shared Guardian article from 2019 warned of surveillance capitalism, data harvesting and the possibility that algorithms could be used to nudge behaviour in ways that favour certain businesses. You can read it here or, er, Google it.

MN: “We’re very much on the European, privacy-centric, citizen first side – an ecosystem that gives the benefits of mass data without the costs to privacy. All our data is anonymised at source, everything. Each camera or sensor unit has its own processor on board which uses AI to extract information, for example, what are the road users? The imagery is discarded within a few milliseconds, all we keep is the data. We recently looked at how socially distanced people were in Kent and, although no personal data was collected, it caused a bit of controversy.”

It did indeed. “Big Brother is watching your social distancing: Fury as traffic flow cameras are secretly switched to monitor millions of pedestrians in government-backed Covid project”, screamed the headline in the Daily Mail. We’d better get back to self-driving.
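As a rough illustration of the “anonymised at source” pipeline Nicholson describes – the frame is analysed on the unit itself, the imagery is discarded within milliseconds, and only aggregate data leaves the device – here is a minimal Python sketch. The detector, class labels and output format are hypothetical stand-ins for illustration, not Vivacity’s actual stack:

```python
# Hypothetical sketch of edge anonymisation: analyse a frame on-device,
# keep only aggregate counts of road users, discard the imagery itself.
# The detector and labels below are illustrative assumptions.
from collections import Counter
from dataclasses import dataclass


@dataclass
class Detection:
    label: str         # e.g. "car", "cyclist", "pedestrian"
    confidence: float


def detect_road_users(frame) -> list[Detection]:
    """Stand-in for an optimised on-device neural network detector."""
    return [Detection("car", 0.97), Detection("cyclist", 0.88)]


def process_frame(frame) -> dict:
    """Extract anonymous aggregate data; the frame never leaves this function."""
    detections = detect_road_users(frame)
    counts = Counter(d.label for d in detections if d.confidence > 0.5)
    del frame  # imagery discarded immediately; only the counts survive
    return {"road_user_counts": dict(counts)}


summary = process_frame(object())  # placeholder for a camera frame
print(summary)
```

The point of the design is that personal data never exists outside the sensor unit: nothing identifiable is transmitted or stored, so there is nothing to leak or subpoena.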

MN: “Over the last couple of years the hype around driverless cars has died down. There’s been a recognition that autonomous vehicles are not as close as Elon Musk promised. The technology is progressing though. They can drive quite well on motorways and in quiet areas, but in busy, congested areas they struggle.

“What would happen if you rolled out driverless cars today? My suspicion is they would probably perform to about the same level as human drivers. The question is: Are we happy with systemic risk rather than personal risk? Can we engineer out that risk? Can we make the infrastructure intelligent enough so it works with vehicles in even the most challenging situations?

“The best way to end the no-win scenario is to have enough data to dodge it. Most of these incidents come about due to an unforeseen element, such as a pedestrian stepping out, a cyclist skipping a red light or someone speeding round a corner. If the vehicle knows about it in advance, the trolley problem never occurs. For me it’s about having the data earlier, and how we, as representatives of infrastructure, can help to give cars that information.”
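Nicholson’s point about “having the data earlier” can be shown with a toy calculation: if infrastructure reports a hazard at a known distance ahead, the vehicle can cap its speed so that it can always stop short of it, and the no-win moment never arises. The braking rate, message shape and numbers below are assumptions for illustration, not any real V2I protocol:

```python
# Toy illustration: cap vehicle speed so it can always stop before a
# hazard reported by infrastructure. All names and numbers are assumed.
from dataclasses import dataclass


@dataclass
class HazardAlert:
    distance_m: float   # distance to the hazard ahead of the vehicle
    kind: str           # e.g. "pedestrian_stepping_out"


def target_speed(current_mps: float, alerts: list[HazardAlert]) -> float:
    """Highest speed from which the vehicle can still stop before every hazard."""
    MAX_DECEL = 3.0  # m/s^2, an assumed comfortable braking rate
    speed = current_mps
    for alert in alerts:
        # v^2 = 2 * a * d  =>  maximum speed that still stops within d
        stoppable = (2 * MAX_DECEL * alert.distance_m) ** 0.5
        speed = min(speed, stoppable)
    return speed


# A pedestrian reported 60 m ahead: ease off from 20 m/s well in advance.
print(target_speed(20.0, [HazardAlert(60.0, "pedestrian_stepping_out")]))
```

The earlier the alert arrives, the gentler the speed adjustment needs to be – which is exactly the argument for intelligent infrastructure feeding vehicles information before their own sensors can see the problem.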

For further info, visit vivacitylabs.com.