In late March, rFpro announced that it had joined the prestigious Association for Standardization of Automation and Measuring Systems (ASAM).
Working with OEMs, Tier 1 suppliers, research institutes and engineering service providers, the Germany-based non-profit has a mission to establish common standards for the development and testing of all automotive systems.
Of particular relevance to the self-driving sector is the OpenMATERIAL project. Initiated by BMW, it aims to create a complete set of standards for the simulation-based testing of automated driving functions.
Peter Daley, Managing Director of rFpro, said: “Defining material properties is a key strength for rFpro so we are keen to be involved in OpenMATERIAL to help direct and progress this standard.
“Material definitions have been loosely structured to date, so standardising this would bring huge benefits, particularly for the development of virtual sensor models.”
ASAM CEO Marius Dupuis added: “We are pleased to accept rFpro as a member and welcome their active participation in the OpenMATERIAL project.”
A partner in not one but two of the major government-backed self-driving projects announced by CCAV in September 2023, Hampshire-based simulation software specialist rFpro is branching out from its traditional motorsport and automotive roots. MD Peter Daley explains how and why.
PD: “Yes, we’re a consortium partner in two of the Commercialising Connected and Automated Mobility Supply Chain projects – DeepSafe and Sim4CAMSens.
“DeepSafe will develop simulation-based training to help automated vehicles handle edge cases, supporting verification and validation (V&V). Project leader dRISK bring a way of analysing the full range of unexpected driving scenarios, and other partners include Imperial College London, Claytex Services and DG Cities.
“Claytex, with whom we work closely, are also taking the lead in the Sim4CAMSens project, which has a core focus on sensor modelling and evaluation. Other partners here include the University of Warwick, National Physical Laboratory, Syselek, Compound Semiconductor Applications Catapult, Oxford RF and Techworkshub.
Self-driving environments
“At rFpro, we’ve been investing in driving simulation technology for years, allowing our customers to develop, test and optimise their vehicles more quickly, efficiently and effectively than they could by relying on real-world testing alone. We create highly detailed, large-scale digital models of real-world environments, and offer high-performance software that allows people to interact with them.
“Our real-time simulation software is used by many leading OEMs and professional motorsport teams (including in F1), in vehicle dynamics, human factors and other use cases. However, the level of visual realism from images rendered in real time using rasterising technology still wasn’t high enough to be used on its own for the training and testing of automated vehicle (AV) perception systems. Our new Ray Tracing technology addresses this.
Self-driving realism
“With Ray Tracing, we can reliably simulate the huge number of reflections created by multiple light sources in a scene, even taking into account the properties of the materials the light is hitting, and apply this to every element in the scene as perceived by a vehicle-mounted sensor moving through it.
“Ray Tracing can be applied to the modelling of cameras, radar and lidar sensors. Our solution accurately replicates things like camera shutter effects, depth of field, lens distortion and light saturation across different weather and light conditions.
“Sensor vibrations caused by the vehicle moving over an uneven road surface are accounted for, as is the motion blur arising from relative motion between the sensor and objects such as other vehicles, pedestrians or road signs and markings.
“In effect, the new technology accurately replicates what cameras and sensors really ‘see’ and presents it in ultra-high definition (UHD). It is a big leap forward and, taken together with rFpro’s renowned real-time solution, unique in the marketplace.
“The creation and use of synthetic test and training data, on a massive scale, to supplement the real-world testing of AV perception and control systems is now realistically achievable. We are excited to be continually finding new ways to support our customers in reaching their goals in this area.”