
ADAS sensors

Industry Articles November 5, 2020

Sensor technology is a key driver of ADAS development. ADAS and autonomous driving functions feed off a continuous stream of information about the environment surrounding the vehicle, and it’s the sensors’ job to provide that.

The sensors are required to detect not only everything the driver can see but also that which the driver can’t – or hasn’t noticed. There are a number of different kinds of sensor already in use, each with its own strengths and weaknesses in terms of capability, cost and packaging, and it is increasingly the case that more than one type of sensor is used for each ADAS function: by combining technologies with complementary strengths and weaknesses it’s possible to refine the ADAS functions. This fusion of sensor technology is rapidly becoming the norm – the task then is to process the influx of data from multiple sources both accurately and quickly.
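
As a rough illustration of the principle, rather than of any particular production system, the short Python sketch below fuses two independent distance estimates of the same object using inverse-variance weighting, one simple way of letting the more trustworthy sensor dominate. The sensor names and noise figures are purely hypothetical.

```python
# Minimal illustration of sensor fusion: combine two independent distance
# estimates of the same object, weighting each by how much we trust it
# (inverse-variance weighting). Noise figures are hypothetical.

def fuse_estimates(d1_m, var1, d2_m, var2):
    """Return the inverse-variance-weighted distance and its variance."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * d1_m + w2 * d2_m) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Example: radar reports 42.0 m (low noise), camera reports 44.5 m (higher noise)
distance, variance = fuse_estimates(42.0, 0.25, 44.5, 2.0)
print(f"fused distance: {distance:.1f} m (variance {variance:.2f})")
```

The fused estimate sits much closer to the lower-noise sensor, which is the behaviour a fusion scheme is after when sensors of differing quality observe the same object.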

Another factor to consider is the robustness and durability of the sensors. While some sensors can be inside the vehicle’s cabin, many require mounting externally towards the vehicle’s extremes, in vulnerable areas such as the corners of bumpers and behind the grille, and these can be hostile environments for high-tech equipment. The automotive insurance and repair industry has also raised concerns about the issue of expensive sensor replacement or recalibration if the vehicle is involved in an accident.

The growing uptake of ADAS and the ongoing development of autonomous vehicles is driving the advance of sensor technology at an accelerated rate. In terms of object detection and classification, many systems already in use are still operating at a relatively basic level and there is a long way to go before ADAS functionality can make the jump to fully autonomous applications. Current systems can, for example, struggle to identify pedestrians beyond a very specific form. They may fail to recognise a person wearing clothing that significantly alters their outline, if they are carrying a large object or if they are below a certain height. As the technology develops, however, these limitations will inevitably be addressed.

Current ADAS sensor technology can be divided into four main categories, which we’ll look at in a bit more detail.

Radar

Perhaps the best-recognised of all the technologies currently adopted for ADAS sensors is radar. Radar – an acronym for radio detection and ranging – is a well-established technology that detects objects by measuring the time it takes for transmitted radio waves to reflect back off any objects in their path. Radar was first developed concurrently by several nations for military use in the lead-up to the Second World War, but today it has many applications on land, in the sea, in the air and in space. Radar has been in use in automotive systems for some years now, so the hardware is well developed and relatively affordable, making it attractive to car manufacturers.
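
The underlying arithmetic is straightforward time-of-flight measurement. The sketch below, using an illustrative round-trip time rather than real sensor output, shows how a reflection delay converts into a range.

```python
# Back-of-the-envelope time-of-flight calculation: radar estimates range
# from the round-trip time of a reflected radio pulse.

SPEED_OF_LIGHT_M_S = 299_792_458  # radio waves travel at the speed of light

def range_from_round_trip(t_seconds):
    """Distance to target = (speed of light * round-trip time) / 2."""
    return SPEED_OF_LIGHT_M_S * t_seconds / 2.0

# A reflection arriving 1 microsecond after transmission puts the target
# at roughly 150 metres.
print(f"{range_from_round_trip(1e-6):.0f} m")
```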

For ADAS applications radar can be divided into three categories: short-range radar (SRR), mid-range radar (MRR) and long-range radar (LRR).

SRR systems traditionally used microwaves in the region of 24 GHz, but there has been an industry shift towards 77 GHz due to, among other things, the 24 GHz frequency’s limited bandwidth and changing regulatory requirements. SRRs have a useful range of around 10 metres, extending to 30 metres in some cases, making them suitable for blind spot detection, lane-change assist, park assist and cross-traffic monitoring systems.

MRR and LRR ADAS functions already use the 77 GHz frequency, which offers higher resolution (relatively speaking) and greater accuracy for speed and distance measurements. MRR operates between 30 metres and 80 metres, while LRR systems have a range extending up to 200 metres in some cases, making them suitable for systems such as adaptive cruise control, forward collision warning and automatic emergency braking. One of LRR’s disadvantages is that its measurement angle decreases with range, so some functions, such as adaptive cruise control, combine inputs from both SRR and LRR sensors.
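
To see why adaptive cruise control pairs the two radar types, the sketch below compares how much width a narrow long-range beam and a wide short-range beam cover at different distances. The half-angles are assumed values for illustration, not the specification of any actual sensor.

```python
import math

# Rough illustration of why functions such as adaptive cruise control combine
# SRR and LRR: a narrow long-range beam covers very little width close to the
# car. The half-angles below are hypothetical.

def lateral_coverage_m(range_m, half_angle_deg):
    """Width of the area swept by the beam at a given range."""
    return 2.0 * range_m * math.tan(math.radians(half_angle_deg))

for r in (10, 50, 150):
    print(f"at {r:>3} m: LRR ±10° covers {lateral_coverage_m(r, 10):5.1f} m, "
          f"SRR ±65° covers {lateral_coverage_m(r, 65):6.1f} m")
```

At 10 metres the narrow beam spans only a few metres of road width, which is why the wide short-range sensors are used to fill in the near field.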

Aside from being a proven technology, radar’s other key advantages for ADAS use are its ability to function effectively in poor weather, such as rain, snow and fog, and at night. Its limitations, however, are equally well acknowledged by the industry, namely that radar doesn’t offer sufficient resolution to identify what an object is, only to say that it’s there. It also has a limited field of view in automotive applications, so a number of sensors are required on the vehicle in order to provide appropriate coverage. Additionally, SRR using the 24 GHz frequency struggles to differentiate between multiple targets.

Ultrasonic

Ultrasonic sensors use reflected sound waves to calculate the distance to objects. Of all the ADAS sensor technologies, ultrasonics is the oldest and most well-established – bats have been using it for around 50 million years – and ultrasound systems have an enormous range of applications across industry, scientific research and medicine.

Ultrasonic sensors, also known as ultrasonic transducers, have a relatively short effective operating range – around 2 metres – so they are typically used in low-speed systems. Their use in parking sensors has been widespread for some time, but they have also found a place in more complex ADAS functions such as park assist, self-parking and some blind-spot monitoring applications. Ultrasonic sensors are cost-effective and relatively robust and reliable, plus they are unaffected by night-time or other challenging light conditions, such as bright, low sunlight.
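
The calculation is the same time-of-flight idea as radar, only with sound instead of radio waves. A minimal sketch, assuming roughly 343 m/s for the speed of sound in air:

```python
# Ultrasonic ranging: an ultrasonic transducer measures the echo's
# round-trip time. The speed of sound is roughly 343 m/s in air at 20 °C,
# so it varies slightly with temperature.

SPEED_OF_SOUND_M_S = 343.0

def distance_from_echo(t_seconds):
    """Distance to obstacle = (speed of sound * round-trip time) / 2."""
    return SPEED_OF_SOUND_M_S * t_seconds / 2.0

# An echo returning after ~6 ms corresponds to an obstacle about 1 m away,
# comfortably inside a parking sensor's ~2 m working range.
print(f"{distance_from_echo(0.006):.2f} m")
```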

Given the limited range of established ultrasonic sensors, however, some manufacturers are abandoning them in favour of short-range radar. This is particularly the case with the latest rear cross-traffic/pedestrian alert systems that combine existing parking sensor technology with additional blind-spot detection, although recent developments in ultrasonic technology have seen the ranges of some sensors extend to 8-10 metres or so, making them suitable for such applications.

Lidar

Lidar (a contraction of ‘laser’ and ‘radar’, or an acronym for, variously, ‘light detection and ranging’ or ‘laser imaging, detection and ranging’ – take your pick) works on essentially the same principle as radar but swaps radio waves for laser light to generate a high-resolution 3D image of the surrounding environment. Lidar was first developed in the 1960s for meteorological, surveying and mapping use but has more recently been adopted for ADAS and autonomous vehicle development applications. Broadly speaking the automotive industry – with the exception of Tesla – is betting that Lidar is the best solution for ADAS and autonomous applications.

There are two basic types of Lidar but both adopt the same fundamental principle of measuring reflected laser light. In the first instance, a pulsed laser is emitted onto a rotating mirror which radiates the laser beam in multiple directions. These systems are extremely effective, with a range of 300 metres or more and, if roof-mounted, offer a clear, 360° field of view. Their size, however, prohibits their use for ADAS functions on production vehicles and they are also expensive. A more compact and ADAS-friendly variation of the same theme uses a microelectromechanical systems (MEMS) technology-based rotating mirror to radiate the laser beam.

The second type is known as solid-state Lidar, of which a couple of variations are being developed. One fires a single laser through an optical phased array in order to direct the beam in multiple directions, while the other, so-called flash Lidar, uses a single pulse, or flash, of laser light to create its image.

Each of the two main systems has its advantages and disadvantages. Solid-state Lidar is preferable for automotive use, not least because it is more robust. In each case, the emitted laser is reflected back off any objects within range and received by a highly sensitive photodetector, after which the information is converted into a 3D model of the immediate environment.
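
Conceptually, each lidar return is an angle pair plus a range, and converting those spherical measurements into Cartesian coordinates is what turns raw returns into a point cloud. The sketch below uses made-up returns purely to show the conversion step.

```python
import math

# Each lidar return is (azimuth, elevation, range). Converting to Cartesian
# coordinates builds up the 3D point cloud. Sample returns are illustrative.

def to_xyz(azimuth_deg, elevation_deg, range_m):
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return x, y, z

returns = [(0.0, 0.0, 25.0), (15.0, -1.0, 40.0), (-30.0, 2.0, 12.5)]
point_cloud = [to_xyz(az, el, r) for az, el, r in returns]
for p in point_cloud:
    print(f"x={p[0]:6.2f}  y={p[1]:6.2f}  z={p[2]:5.2f}")
```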

It is the detail and resolution of that 3D model that give Lidar the potential to be such a powerful tool. With the appropriate analytical algorithms, a Lidar system has the ability to detect objects, differentiate between them and accurately track them, all in high-resolution 3D. Lidar also works well in rain and snow, although it can be adversely affected by fog, and its function is unaffected at night.

Historically Lidar has been prohibitively expensive for use in production automotive applications, but it is slowly becoming more common in ADAS development as the technology is refined and costs come down. Prototype fully autonomous cars have already made use of the bulky roof-mounted Lidar systems to good effect, but such a set-up is impractical and prohibitively expensive for commercial ADAS applications. For the time being the Lidar systems that are compact enough – and affordable enough – to be packaged out of sight on production vehicles have a relatively limited range measured in the tens of metres rather than hundreds and are therefore only effective at lower speeds.

Cameras

Camera-based solutions have gained traction as the ADAS developer’s sensor technology of choice. They have their limitations – namely their susceptibility to compromised performance in poor weather and low or challenging light conditions – but the technology, while relatively new compared with, say, radar or ultrasonic sensors, is already capable and versatile. Cameras are the only sensors here able to identify colour and contrast information, which makes them ideally suited to capturing road sign and road marking information, and they also offer the resolution to classify objects such as pedestrians, cyclists and motorcyclists. Cameras are also extremely cost-effective, which makes them particularly attractive to volume-selling vehicle manufacturers. Due to the limitations of the technology, the data from camera sensors is increasingly being combined with radar to provide a more robust and reliable data stream across a wider variety of conditions.
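
One very simplified way to picture that camera/radar combination: the camera provides the object class and a bearing, the radar provides a bearing and an accurate range, and detections are paired by closeness in bearing. The detections and threshold below are entirely hypothetical.

```python
# Highly simplified sketch of camera/radar fusion: each camera detection is
# paired with the radar target closest in bearing, so the object keeps the
# camera's classification and gains the radar's range. Values are made up.

camera_detections = [
    {"label": "pedestrian", "bearing_deg": -12.0},
    {"label": "car",        "bearing_deg":   3.5},
]
radar_targets = [
    {"bearing_deg": 3.0,   "range_m": 48.2},
    {"bearing_deg": -11.5, "range_m": 15.7},
]

MAX_BEARING_GAP_DEG = 2.0  # tolerance for treating them as the same object

for det in camera_detections:
    best = min(radar_targets,
               key=lambda t: abs(t["bearing_deg"] - det["bearing_deg"]))
    if abs(best["bearing_deg"] - det["bearing_deg"]) <= MAX_BEARING_GAP_DEG:
        print(f'{det["label"]} at ~{best["range_m"]} m')
    else:
        print(f'{det["label"]}: no radar confirmation')
```

A production system would do far more (tracking over time, handling occlusion and missed detections), but the basic idea of associating detections from two sensors and keeping the best attribute from each is the same.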

Cameras are used in both monocular and, increasingly, binocular ADAS applications. Forward-facing monocular camera systems feature in medium- to long-range functions such as lane-keeping assistance, cross-traffic alert and traffic sign recognition systems. Rear-facing cameras have enjoyed widespread adoption primarily as a reversing aid for the driver. A mirror-image view of the area behind the car is displayed on a dashboard-mounted screen, in some cases augmented with positional graphics relative to steering wheel movement to provide parking guidance.

Forward-facing binocular, or stereo, cameras are a more recent development. A pair of cameras is able to present an essentially 3D image that provides the information necessary to calculate complex depth information such as the distance to a moving object, making them suitable for adaptive cruise control and forward collision warning applications.
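
The depth calculation behind a stereo pair reduces to the classic relation depth = focal length × baseline ÷ disparity. A minimal sketch with placeholder values, not the figures of any real camera module:

```python
# Classic stereo depth estimation: depth = focal length * baseline / disparity.
# Focal length, baseline and disparity here are placeholders for illustration.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Distance to the object along the camera axis, in metres."""
    return focal_px * baseline_m / disparity_px

# e.g. 1200 px focal length, 20 cm between the two cameras, 8 px disparity
print(f"{depth_from_disparity(1200, 0.20, 8):.1f} m")  # 30.0 m
```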

Another branch of camera technology that has established a foothold in ADAS development is thermal imaging. Instead of relying on visible light, or what little of it might be available, thermal imaging cameras detect the infrared radiation emitted by warm objects, making them ideally suited to detecting humans and animals, particularly in conditions of poor visibility or at night, or simply in an otherwise busy and cluttered driving environment. Again, the technology is well-established and widely in use across the automotive industry, first appearing as passive night vision assist systems on premium-brand models around 10 years ago.

Thermal imaging cameras have a range of up to 300 metres or so and are unaffected by fog, dust, glare from low sun and, of course, complete darkness, and they have a valuable role to play in the ADAS developer’s arsenal of sensor technologies.

This article is part of the ‘What is ADAS?’ series.

Read the next section in the ‘What is ADAS?’ series: Advanced driver assistance systems on the road 