Self-driving cars require a variety of sensor systems to safely navigate roads and handle the wide range of objects and conditions they encounter. Two competing technologies used to identify and locate objects in a scene are radar and LiDAR.
Radar locates objects by transmitting radio waves; LiDAR bounces laser beams off of objects. Each has its shortcomings. With radar, only a small fraction of the transmitted signal is reflected back to the sensor, so there is frequently insufficient data to fully characterize a scene. LiDAR is an optical system that does not work well in fog, dust, rain, or snow. It is also much more expensive than radar.
Researchers at the University of California San Diego have developed a new system that they describe as a LiDAR-like radar. The system consists of two radar sensors placed on a car’s hood and spaced about 1.5 meters apart. This configuration enables the system to see more space and detail than a single radar sensor.
Having two radars at different vantage points with overlapping fields of view creates a high-resolution region with a high probability of detecting the objects that are present. The system also overcomes the noise problems of conventional radar systems.
The researchers developed new algorithms that can fuse the information from two different radar sensors and produce a new image free of noise.
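The article does not describe the researchers' actual algorithms, but the core idea of cross-validating two radars can be sketched in a few lines: a detection is kept only when the second sensor reports a point nearby, so random noise that appears in one sensor but not the other is rejected. The sensor data, geometry, and matching threshold below are illustrative assumptions, not the published method.

```python
import math

def fuse_detections(points_a, points_b, radius=0.5):
    """Keep points from sensor A that are confirmed by a nearby point
    from sensor B (within `radius` meters). Points seen by only one
    sensor are treated as noise and discarded. (Hypothetical sketch,
    not the researchers' actual fusion algorithm.)"""
    fused = []
    for (xa, ya) in points_a:
        if any(math.hypot(xa - xb, ya - yb) <= radius
               for (xb, yb) in points_b):
            fused.append((xa, ya))
    return fused

# Two radars observe the same vehicle from slightly different vantage
# points; each also reports one spurious point the other does not see.
sensor_a = [(10.0, 2.0), (10.3, 2.1), (4.0, -7.0)]   # last point: noise
sensor_b = [(10.1, 2.0), (10.3, 2.2), (-3.0, 9.0)]   # last point: noise
print(fuse_detections(sensor_a, sensor_b))
```

In this toy example only the two mutually confirmed points survive; the isolated noise points from each sensor are dropped.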
Self-driving cars have to combine detection technologies like radar with cameras and ultrasonic sensors. Duplicating the capabilities that people use to safely drive a car is a complex problem requiring a combination of multiple sensors and sophisticated software.
Photo, posted January 2, 2014, courtesy of Bradley Gordon via Flickr.