
ADAS and autonomous vehicle testing: what does the future hold?

Blog | April 25, 2019

It has become an almost unavoidable cliché to say ADAS and autonomous vehicle development is evolving at an accelerated rate. Self-driving car trials are under way in many cities around the world as governments and the automotive industry itself strive to make roads safer. But as the automated systems and sensor arrays become more complex, so too have the methods required to develop, test and evaluate them.

So what does the future hold for ADAS and autonomous vehicle developers? Some of today’s production cars are already sold with bundled ADAS functionality offering Level 2 and, under certain conditions, even Level 3 autonomy, and the evaluation of some ADAS functions has already become a key part of NCAP and NHTSA testing.

Embraced by legislation

The recent confirmation that the European Commission is mandating a range of safety features for new vehicles sold in the EU from 2022, among them forward collision warning/autonomous emergency braking (FCW/AEB) and lane-keeping assist, will further accelerate the development and testing workload. In the United States, meanwhile, 20 automotive manufacturers have so far pledged to equip all their vehicles with standard AEB by the same date.

The engineers developing those systems already have a number of tools at their disposal, not least OxTS’ RT-series of GNSS/INS products. Alongside those, driving robots and guided platforms, such as those produced by AB Dynamics and widely adopted across the automotive industry, have become essential pieces of equipment.

Key to gathering accurate, reliable validation data is the execution of precise, repeatable test scenarios. Tests that require one or more vehicles or mobile targets to follow a prescribed course can be performed with centimetre-level accuracy time and again using path-following steering robots and soft targets mounted on guided platforms.
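To make that repeatability concrete, one common way to quantify how closely a run followed the prescribed course is to measure each logged position’s cross-track deviation from the reference path. The short Python sketch below does this for a piecewise-linear reference line; it is purely illustrative, and none of the function names belong to any OxTS or AB Dynamics tooling.

```python
import numpy as np

def cross_track_errors(path_xy, actual_xy):
    """For each logged position, the distance (m) to the nearest point
    on the piecewise-linear reference path. Positions are local
    east/north coordinates in metres (e.g. projected from GNSS)."""
    errors = []
    for p in actual_xy:
        best = np.inf
        for a, b in zip(path_xy[:-1], path_xy[1:]):
            ab = b - a
            # Parameter of the closest point on segment a-b, clamped to [0, 1]
            t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
            best = min(best, np.linalg.norm(p - (a + t * ab)))
        errors.append(best)
    return np.asarray(errors)

# Hypothetical example: a straight 100 m reference line and a run
# with ~2 cm of lateral noise, roughly the level a steering robot
# with GNSS/INS feedback can hold
path = np.array([[0.0, 0.0], [100.0, 0.0]])
run = np.column_stack([np.linspace(0.0, 100.0, 50),
                       np.random.normal(0.0, 0.02, 50)])
err = cross_track_errors(path, run)
print(f"mean deviation: {err.mean()*100:.1f} cm, max: {err.max()*100:.1f} cm")
```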

As testing becomes increasingly complex to address the enormous number of potential scenarios a partially or fully autonomous car could face in urban or highway situations, the use of driving robots and guided platforms is set to expand.

Sensor fusion

Another development on the horizon is the creation of multiple-target vulnerable road user (VRU) test scenarios. A series of cyclist and pedestrian detection AEB tests already forms part of the NCAP ADAS protocols. For now, sensor technology is limited in its ability to detect multiple or complex VRU targets, and the current tests are carried out using single child, adult and cyclist target dummies, both stationary and mounted on controllable robotic platforms. In future, however, fully autonomous vehicles will be required to deal with far more challenging urban scenarios, so more complex multiple-target VRU test protocols will need to be developed to evaluate the systems fully.
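As an illustration of what a multiple-target protocol might look like in machine-readable form, here is a hypothetical scenario description. The schema, field names and values are invented for the example and do not reflect any published NCAP format or platform control interface.

```python
from dataclasses import dataclass, field

@dataclass
class VruTarget:
    kind: str              # e.g. "pedestrian_adult", "pedestrian_child", "cyclist"
    start_east_m: float    # initial position in the test-track frame
    start_north_m: float
    heading_deg: float     # direction of travel
    speed_mps: float
    trigger_time_s: float  # when the guided platform starts moving

@dataclass
class Scenario:
    name: str
    ego_speed_kph: float
    targets: list[VruTarget] = field(default_factory=list)

# Hypothetical urban scenario: a cyclist crossing ahead while a
# pedestrian steps out 1.5 s later
crossing = Scenario(
    name="urban_multi_vru_01",
    ego_speed_kph=40.0,
    targets=[
        VruTarget("cyclist", 30.0, -5.0, 90.0, 4.2, trigger_time_s=0.0),
        VruTarget("pedestrian_adult", 45.0, 3.0, 270.0, 1.4, trigger_time_s=1.5),
    ],
)
```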

The sensor technologies currently in use – primarily radar, camera and ultrasonic, with short-range LiDAR appearing in some applications – each have their limitations. Radar, for example, can struggle to differentiate between overlapping objects and has limited angular resolution, while cameras can struggle in low light and poor visibility, and LiDAR’s performance can be degraded by bad weather.

In order to mitigate those weaknesses and provide the full redundancy that a driverless vehicle will require, inputs from different kinds of sensors are being combined – the process known as sensor fusion – to provide a consistent and accurate picture of potential hazards and the surrounding environment.
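At its simplest, fusing two independent measurements of the same quantity can be done with an inverse-variance weighted average: the more confident sensor contributes more, and the fused estimate is more certain than either input. Production systems use far more sophisticated filters (Kalman filters and their variants), but this minimal Python sketch, with made-up numbers, shows the principle.

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of independent measurements.
    `estimates` is a list of (value, variance) pairs; returns the
    fused value and its (smaller) variance."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    return fused, 1.0 / sum(weights)

# Hypothetical range-to-target readings: radar is precise in range,
# the camera less so in this geometry (values are illustrative only)
radar = (25.3, 0.04)    # metres, variance in m^2
camera = (24.8, 0.25)
value, var = fuse([radar, camera])
print(f"fused range: {value:.2f} m (sigma {var**0.5:.2f} m)")
```

Note how the fused variance (the reciprocal of the summed weights) is smaller than either sensor’s alone – the statistical expression of the redundancy a driverless vehicle needs.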

Sensor fusion is already happening – combined radar and camera sensors are becoming commonplace – and as compact, cost-effective solid-state LiDAR becomes more readily available for production vehicle applications, it too is set to be widely adopted. These more complex systems will require rigorous testing and validation – something OxTS is already addressing with its recently launched Multiple Sensor Points feature – and will place further demands on developers and engineers.
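Conceptually, reporting a navigation solution at several points on the vehicle comes down to a lever-arm transformation: rotate each sensor’s body-frame offset by the vehicle’s attitude and add it to the INS position. The sketch below is a simplified illustration of that geometry, not OxTS’s implementation, and assumes a ZYX roll/pitch/yaw convention.

```python
import numpy as np

def rotation_from_rpy(roll, pitch, yaw):
    """Body-to-local rotation matrix from roll/pitch/yaw in radians
    (ZYX convention -- an assumption made for this sketch)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    return np.array([
        [cy*cp, cy*sp*sr - sy*cr, cy*sp*cr + sy*sr],
        [sy*cp, sy*sp*sr + cy*cr, sy*sp*cr - cy*sr],
        [-sp,   cp*sr,            cp*cr],
    ])

def sensor_position(ins_pos_local, rpy, lever_arm_body):
    """Translate the INS position to a sensor mounting point using the
    sensor's body-frame lever arm (metres, x forward, y right, z down)."""
    return ins_pos_local + rotation_from_rpy(*rpy) @ lever_arm_body

# Hypothetical front-bumper radar 2.1 m ahead of the INS,
# vehicle yawed 30 degrees
pos = sensor_position(np.array([10.0, 5.0, 0.0]),
                      (0.0, 0.0, np.radians(30.0)),
                      np.array([2.1, 0.0, 0.0]))
print(pos)  # approx [11.82, 6.05, 0.0]
```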

A collaborative approach

Given the enormous challenges the industry faces and the infrastructure changes that will be required to support fully self-driving cars, more collaboration and knowledge-sharing are needed between automotive developers and technology companies, governments, legislators and highways agencies, among many others.

This collaborative approach is already well under way. US-based automated vehicle technology researcher VSI Labs, for example, has been collating and disseminating technologies and research from across the autonomous vehicle spectrum, with the goal of facilitating and accelerating development. VSI’s own research includes the analysis of key automated systems, such as the aforementioned sensor fusion, high-definition map-based precision localisation and the by-wire control systems that self-driving vehicles will require. The scope for growth in testing and validation is enormous.

Finally, ADAS testing will increasingly take place in virtual environments as simulator technology develops to accommodate the necessary driving scenarios. While this work will be invaluable and will further accelerate the advancement of ADAS and autonomous technologies, there will always be a place for testing and validation in the real world.

For more information on how OxTS can help with your ADAS and autonomous vehicle testing, both now and in the future, click here.
