Why it is necessary to integrate an inertial measurement unit with imaging systems on an autonomous vehicle

February 23, 2016

For more than ten years there has been growing momentum behind the development of autonomous vehicles.  Autonomous vehicle developers and enthusiasts believe they can provide a safer, more efficient mode of transportation to a wide range of citizens.  Autonomous vehicles are enabled by a wide range of technologies.

One of the key technology components of an autonomous vehicle is the sensor system used to help the vehicle understand its environment and its context within it.  Typical sensor systems adorning the outside of the vehicle include a variety of cameras, lasers and ultrasonic systems.  Inside the vehicle, and not visible to the lay passenger, is a suite of data processing and analysis capabilities that interpret input from the sensors and use the outputs of this analysis to make decisions in the place of a human operator.  A key component of this centralized system is the inertial measurement unit (IMU), the system that monitors the dynamically changing movements of the vehicle.

Oxford Technical Solutions (OxTS) has been providing inertial measurement technologies to the automotive market for nearly 20 years.  Over this period, OxTS has supplied inertial systems to various autonomous vehicle development projects, such as the DARPA Challenges (in 2005 and 2007), the self-driving Volkswagen Golf 53+1 in 2006 and the unmanned vehicle ‘Wildcat’ from BAE Systems, together with numerous systems for the military and several leading automotive manufacturers.  OxTS is therefore well placed to illustrate why an inertial measurement unit is essential to safe and successful autonomous operation.

Maturity of mechanisms for driver assistance

Companies such as Google are designing, from the ground up, fully autonomous vehicles for riding rather than driving.  Many of the sensing capabilities deployed are already being implemented in the advanced driver assistance systems (ADAS) being developed by many of the major automotive manufacturers.  Imaging systems (such as cameras and LiDAR) are typically used to recognize objects on the horizon, whether simply to identify upcoming hard structures or to sense the presence of pedestrians and other potential collision hazards in the trajectory of the vehicle.  As the ‘eyes’ of the vehicle, the imaging systems deployed on an individual vehicle are increasing in both number and sophistication.  In a recent market-sizing report, Yole Développement (Yole) predicts that the value of the market for sensors on autonomous vehicles will be $2.6B by the end of 2015, increasing to $36B (with $12B being imaging sensors alone) by 2030.

However, while imaging systems enable the assessment of the shape and form of objects and the distance between the vehicle and sensed objects, an autonomous vehicle still needs to know where it is located within its environment and how it is moving through it, so that decisions and actions can follow in a timely fashion.  Taking account of the vehicle’s movements (vehicle dynamics) not only validates information from the imaging sensors, but also ensures safe passage of the autonomous vehicle between the obstructions that are seen and along its intended route.

Context, Context, Context

An autonomous vehicle has to keep track of a number of things as it travels safely along a route from start to finish.  At the highest level, the autonomous vehicle needs to keep track of overall progress along the route.  This can be supported through global navigation satellite systems (GNSS).  GNSS can provide input pertaining to vehicle speed; however, it is not reliable enough on its own to provide continuously accurate information about exact location and speed.

At the next level down is a more local context, which includes the things that a driver would be able to observe with their eyes from the driver’s seat.  At this level, the autonomous vehicle must understand the immediate environment surrounding the vehicle in terms of fixed and moving features that are nearby or soon will be.  Fixed features include things such as lane markings, curbs, traffic controls and parked cars.  Moving features could be other road users and pedestrians in motion.  Typically, the sensor systems mounted on the top of the autonomous vehicle look outward to understand this context and include cameras, LiDAR and ultrasonic systems.

Finally, the context of the vehicle itself, and how it is moving through the immediate environment, needs to be understood.  There are a variety of sensors on board the vehicle to assist with this, including one very critical system, the IMU.  The IMU is a sensor in its own right, providing data on the vehicle’s dynamics so that changes in vehicle trajectory can be calculated and used to anticipate safety issues (for example, changes in slip angle).

The IMU also provides information that helps locate the other sensor systems in the context of the vehicle’s current position.  It provides some of the glue needed to correctly correlate data feeds from the various sensors, in space and time, with the vehicle’s dynamics at the instant a reading is sent to the autonomous vehicle’s real-time analytics systems.  Being unable to correlate and compute the various sensor readings in combination, based on the vehicle’s dynamics, is dangerous: it can lead to safety issues and contribute to a poor ride experience.
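As a simple illustration of this spatial correlation, the sketch below (a hypothetical planar helper written for this article, not OxTS code) uses the vehicle pose reported at the instant of a reading to place a range-and-bearing detection from a body-mounted sensor into a common world frame:

```python
import math

def sensor_to_world(vehicle_x, vehicle_y, vehicle_heading,
                    sensor_offset, range_m, bearing):
    """Place a range/bearing detection from a body-mounted sensor into the
    world frame, using the vehicle pose at the instant of the reading.
    sensor_offset is the sensor's (forward, left) position in metres
    relative to the vehicle's reference point; angles are in radians."""
    # Detection expressed in the vehicle body frame.
    bx = sensor_offset[0] + range_m * math.cos(bearing)
    by = sensor_offset[1] + range_m * math.sin(bearing)
    # Rotate by the vehicle heading, then translate to world coordinates.
    c, s = math.cos(vehicle_heading), math.sin(vehicle_heading)
    return (vehicle_x + c * bx - s * by,
            vehicle_y + s * bx + c * by)

# A sensor mounted 2 m forward sees an object 5 m dead ahead while the
# vehicle sits at (10, 0): the object is 7 m ahead of the reference point.
obj = sensor_to_world(10.0, 0.0, 0.0, (2.0, 0.0), 5.0, 0.0)
```

The same reading taken while the vehicle is turning lands somewhere quite different in the world frame, which is exactly why the pose at the timestamp of each reading matters.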

Consequences of not understanding vehicle dynamics

Autonomous vehicles must not only understand vehicle dynamics in terms of the position, orientation, direction and velocity of the vehicle, but also whether changes in the relationship between these factors are leading to an unsafe situation for occupants of the vehicle, bystanders or other road users.  The safest ride is the most stable ride.  To ensure overall stability in vehicle motion, one indicative measure that can trigger preventative measures early enough for safe travel is slip angle.

Slip angle is the angle between the forward velocity vector and the heading; that is, the angle between a rolling wheel’s actual direction of travel and the direction in which it is pointing.  Increases in slip angle can be caused by continual changes in the relationship between the vehicle, the road’s surface and structure, or interactions with other road users.
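To make the geometry concrete, here is a minimal sketch (a hypothetical helper, not OxTS code) that computes slip angle as the wrapped difference between the track angle (direction of travel) and the heading, assuming both are supplied in degrees:

```python
import math

def slip_angle_deg(track_deg, heading_deg):
    """Angle between the direction of travel (track) and the direction the
    vehicle is pointing (heading), wrapped to the range (-180, 180] degrees."""
    return (track_deg - heading_deg + 180.0) % 360.0 - 180.0

# Example: the vehicle points due north (heading 0) while travelling at
# 30 m/s north with a 0.26 m/s sideways drift to the east.
track = math.degrees(math.atan2(0.26, 30.0))
beta = slip_angle_deg(track, 0.0)   # ~0.5 degrees of slip
```

A sideways drift of only 0.26 m/s at motorway speed already produces the 0.5° slip angle discussed below, which shows how fine the measurement needs to be.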

A 0.5° change in slip angle can be significant enough to trigger skidding, spins or rollover (especially in the case of sport utility vehicles or taller trucks) if corrective action is not taken in sufficient time.  The safe travel of the vehicle is usually compromised once it starts travelling in a direction different from the one intended.  Consequences can include (in addition to skidding, spins and rollover) an increased likelihood of collision with obstructions other than those identified on the route by the imaging system, and impacts on parts of the vehicle, such as doors, that are not well designed to take an impact.

Monitoring slip angle is one way that an ADAS or completely autonomous driving system can make an assessment about whether preventative measures such as changes to the steer angle or emergency braking mechanisms should be deployed.  There may be imaging-based routing and collision avoidance systems on the vehicle, but if the dynamics of the vehicle become too extreme, an incident such as a skid or a spin could take place prior to alerts from the imaging sensors.  Once an incident is triggered, it may be too late to recover, regardless of whether the location of obstructions is known.

Inertial measurement units for monitoring vehicle dynamics

As described, the safe functioning of any automated driving system depends on continuously knowing the dynamics of the vehicle: its location, position on the road, direction, orientation and velocity.  Measuring these characteristics necessitates the use of an inertial measurement unit (IMU).

Comprising an assembly of gyroscopes and accelerometers, the IMU provides a continuous stream of data on the linear acceleration of the vehicle along three principal axes, together with its rotation about those axes (pitch, roll and heading).  Data from the IMU provides additional measurements, related to the distance travelled by the autonomous vehicle, that:

  • Account for the most likely position of the vehicle when GNSS data is unavailable in challenging GNSS environments.
  • Provide data on the velocity and acceleration of the vehicle towards obstructions sensed by the on-board imaging systems.
  • Measure the angle between the direction in which the vehicle is pointing (heading) and the direction in which it is actually travelling (track).
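The first of these points, bridging GNSS outages by integrating inertial data, can be sketched in deliberately simplified planar form.  The function below is a hypothetical illustration that assumes bias-corrected, body-frame forward-acceleration and yaw-rate samples; a production INS mechanisation works in three dimensions and must also handle sensor bias, gravity and Earth rotation:

```python
import math

def dead_reckon(samples, dt, x=0.0, y=0.0, heading=0.0, speed=0.0):
    """Integrate (forward_accel m/s^2, yaw_rate rad/s) samples into a
    planar position estimate.  Illustrative only: a real inertial
    navigation system works in 3D and corrects for sensor bias,
    gravity and Earth rate before integrating."""
    for accel, yaw_rate in samples:
        speed += accel * dt               # integrate acceleration -> speed
        heading += yaw_rate * dt          # integrate yaw rate -> heading
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return x, y, heading, speed

# One second of constant 1 m/s^2 straight-line acceleration at 100 Hz:
# the vehicle reaches ~1 m/s and covers ~0.5 m with no GNSS input at all.
x, y, heading, speed = dead_reckon([(1.0, 0.0)] * 100, dt=0.01)
```

Because small sensor errors accumulate in this kind of integration, the estimate drifts over time, which is why the blended GNSS corrections described below remain essential whenever satellites are visible.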

Why use an IMU on an autonomous vehicle

Most importantly, the IMU provides the data that enables an automated driving system to know not only where it is, but also how it is moving.  This is necessary for any sensing technology to identify routes and obstructions, and to provide the feedback the driving system requires to continually adjust its parameters and drive a route safely.

To fulfil these requirements, OxTS inertial measurement systems provide the real-time capability to continuously measure and stream orientation, position to centimetre-level accuracy, and velocity to 2 cm/s accuracy.  By employing algorithms that blend inertial data with available raw GNSS information into a single uninterrupted stream of reliable and accurate navigation messages, the autonomous system continually receives the required information on position, orientation, direction and velocity.  Importantly, when GNSS is unavailable or there are gaps in the information received from imaging sensors, OxTS technologies bridge those gaps and still provide the necessary continuous data stream to the autonomous driving system.  Used in conjunction with the imaging sensors on board the autonomous vehicle, the inertial data provided by the OxTS system makes it possible to:

  • Provide navigation data to image-processing algorithms at a sampling frequency of at least 100 Hz.
  • Help the algorithms used in ADAS determine whether the vehicle is actually moving towards an obstruction or obstacle, and at what speed.
  • Inform the vehicle’s emergency systems of dynamic situations requiring corrective action (for example, skids) before an alert arrives from the imaging-based sensors on board.

OxTS technologies are deployed either as complete turn-key inertial and GNSS navigation solutions, such as the RT3000, or as a custom integration utilizing the same xOEM IMU that is at the core of OxTS’s systems.
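The blending behaviour described above can be illustrated with a deliberately simplified one-dimensional sketch.  This is a hypothetical complementary-style update written for this article, not the OxTS algorithm (production systems typically run a full inertial/GNSS Kalman filter); it shows only the core idea of propagating with inertial data on every cycle and correcting with GNSS whenever a fix exists:

```python
def fuse(est, ins_delta, gnss_pos, gain=0.02):
    """Propagate a 1-D position estimate with the INS-measured displacement,
    then nudge it toward the GNSS fix when one is available (gnss_pos is
    None during an outage, so the inertial data bridges the gap alone)."""
    est += ins_delta                      # inertial propagation (always runs)
    if gnss_pos is not None:              # GNSS correction (when available)
        est += gain * (gnss_pos - est)
    return est

# Three cycles: good fix, outage, good fix - the output stream never stops.
est = 0.0
for ins_delta, gnss in [(1.0, 1.05), (1.0, None), (1.0, 3.1)]:
    est = fuse(est, ins_delta, gnss)
```

The key property, mirrored from the text, is that the navigation output is uninterrupted: the estimate keeps updating through the outage cycle and is gently pulled back toward GNSS once fixes return.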

Regardless of the type of deployment, with an OxTS system on board, the autonomous driving system is receiving the information that it needs for safe “go-anywhere” operations.