Self-driving-level visual realism – a look at rFpro’s new Ray Tracing simulation software

CCAV turn to F1’s rFpro for super realistic self-driving simulation software

A partner in not one but two of the major government-backed self-driving projects announced by CCAV in September 2023, Hampshire-based simulation software specialist rFpro is branching out from its traditional motorsport and automotive roots. MD Peter Daley explains how and why.

Peter Daley, Managing Director of rFpro

PD: “Yes, we’re a consortium partner in two of the Commercialising Connected and Automated Mobility Supply Chain projects – DeepSafe and Sim4CAMSens.

“DeepSafe will develop simulation-based training to help automated vehicles handle edge cases, supporting verification and validation (V&V). Project leader dRISK bring a way of analysing the full range of unexpected driving scenarios, and other partners include Imperial College London, Claytex Services and DG Cities.

“Claytex, with whom we work closely, are also taking the lead in the Sim4CAMSens project, which has a core focus on sensor modelling and evaluation. Other partners here include the University of Warwick, National Physical Laboratory, Syselek, Compound Semiconductor Applications Catapult, Oxford RF and Techworkshub.

Self-driving environments

“At rFpro, we’ve been investing in driving simulation technology for years, allowing our customers to develop, test and optimise their vehicles more quickly, efficiently and effectively than they could by relying on real-world testing alone. We create highly detailed, large-scale digital models of real-world environments, and offer high-performance software that allows people to interact with them.

rFpro day/night in Tokyo simulation

“Our real-time simulation software is used by many leading OEMs and professional motorsport teams (including in F1) for vehicle dynamics, human factors and other use cases. However, the level of visual realism from images rendered in real time using rasterising technology still wasn’t high enough to be used on its own for the training and testing of automated vehicle (AV) perception systems. Our new Ray Tracing technology addresses this.

Self-driving realism

“With Ray Tracing, we can reliably simulate the huge number of reflections created by multiple light sources in a scene, even taking into account the properties of the materials the light is hitting, and apply this to every element in the scene as perceived by a vehicle-mounted sensor moving through it.
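The core operation behind the reflections described above can be illustrated in a few lines. This is a generic sketch of how any ray tracer computes a mirror bounce, not rFpro's implementation; the function name and example vectors are illustrative.

```python
import numpy as np

def reflect(direction, normal):
    """Reflect an incoming ray direction about a surface normal.

    Both vectors are assumed normalised. A ray tracer repeats this
    for every bounce of every light path: r = d - 2(d . n) n.
    """
    return direction - 2.0 * np.dot(direction, normal) * normal

# A ray travelling diagonally down onto a flat road surface (normal = +z):
incoming = np.array([1.0, 0.0, -1.0]) / np.sqrt(2.0)
bounced = reflect(incoming, np.array([0.0, 0.0, 1.0]))
# The z-component flips sign: the ray now travels diagonally upwards.
```

In a full renderer, the material properties the quote mentions determine how much energy follows this mirror direction versus being scattered or absorbed at each bounce.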

“Ray Tracing can be applied to the modelling of cameras, radar and lidar sensors. Our solution accurately replicates things like camera shutter effects, depth of field, lens distortion and light saturation across different weather and light conditions.
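Lens distortion, one of the camera effects listed above, is commonly modelled with a radial polynomial (the Brown-Conrady model). The sketch below is a generic illustration of that idea; the coefficient values are made up for the example and are not from any real camera, nor from rFpro's sensor models.

```python
def radial_distort(x, y, k1, k2):
    """Apply Brown-Conrady radial distortion to a point in normalised
    image coordinates. k1 and k2 are illustrative coefficients only."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

# A point near the image edge is displaced far more than one near the
# centre, which is why straight lines bow at the frame edges:
xd_edge, yd_edge = radial_distort(0.8, 0.0, k1=-0.2, k2=0.05)
xd_mid, yd_mid = radial_distort(0.1, 0.0, k1=-0.2, k2=0.05)
```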

“Sensor vibration caused by the vehicle moving across an uneven road surface is also accounted for, as is the motion blur arising from relative motion between the sensor and objects such as other vehicles, pedestrians, road signs and markings.
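The camera shutter slant effect pictured below comes from rolling-shutter readout: each image row is exposed slightly later than the one above it, so a sideways-moving object appears sheared. A minimal sketch of the geometry, with purely illustrative timing and speed values:

```python
def rolling_shutter_offsets(num_rows, row_time_s, object_speed_px_s):
    """Horizontal pixel offset of a sideways-moving object in each row.

    Row 0 is read first; every subsequent row is exposed row_time_s
    later, so the object has moved further by the time it is sampled.
    All parameter values here are illustrative, not real sensor specs.
    """
    return [row * row_time_s * object_speed_px_s for row in range(num_rows)]

# 1080 rows read out over ~33 ms, object crossing the frame at 3000 px/s:
offsets = rolling_shutter_offsets(1080, 33e-3 / 1080, 3000.0)
# The bottom row is offset by roughly 99 px relative to the top row,
# producing the characteristic slant of vertical edges.
```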

Self-driving level of visual realism: Motion blur
Self-driving level of visual realism: camera shutter slant effect

“In effect, the new technology accurately replicates what cameras and sensors really ‘see’ and presents it in ultra-high definition (UHD). It is a big leap forward and, taken together with rFpro’s renowned real-time solution, unique in the marketplace.

“The creation and use of synthetic test and training data, on a massive scale, to supplement the real-world testing of AV perception and control systems is now realistically achievable. We are excited to be continually finding new ways to support our customers in reaching their goals in this area.”