Researchers ask industry to develop 3D infrared sensors precise enough for unmanned vehicle navigation
The Invisible Headlights program seeks to exploit ambient thermal light with a passive 3D sensor accurate and fast enough for autonomous navigation.
ARLINGTON, Va. – U.S. military researchers are asking for industry’s help in developing computationally intensive 3D infrared sensors that use triangulation and ambient signals in thermal images to achieve accuracy sufficient for unmanned vehicle navigation.
Officials of the U.S. Defense Advanced Research Projects Agency (DARPA) in Arlington, Va., issued a broad agency announcement on Friday (HR001120S0045) for the Invisible Headlights project.
Invisible Headlights seeks to quantify the measurable information in ambient thermal emissions to create 3D vision. The project aims to help researchers understand the useful information contained in those emissions and to enable passive 3D vision for unmanned vehicle navigation.
Autonomous and semi-autonomous systems require active illumination to navigate at night or underground, which can make them vulnerable because that illumination can be detected by adversaries miles away.
The Invisible Headlights program seeks to eliminate this vulnerability by discovering how to exploit ambient thermal light using a totally passive 3D sensor that is accurate enough and fast enough to support autonomous navigation.
The Invisible Headlights approach is fundamentally different from previous efforts because conventional infrared sensors fail, by design, to collect almost all available information from ambient thermal emissions.
Instead, the Invisible Headlights electro-optical approach seeks to use non-target artifacts in the scene that previously might have been considered clutter to provide the signal necessary to enable 3D vision. It will quantify the available information in ambient thermal emissions, determine how much of that information is useful for building a 3D model of a scene, define the trade space of sensor designs, develop new sensors for increased measurement diversity, and validate all this data in field tests.
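The triangulation the program relies on is, at its core, standard stereo geometry: two sensors separated by a known baseline observe the same scene feature, and the feature's apparent shift between the two images (its disparity) determines its range. The sketch below is a minimal illustration of that relationship, assuming a calibrated two-camera rig; the focal length, baseline, and disparity values are hypothetical, not parameters from the program.

```python
# Minimal sketch of passive triangulation from stereo disparity.
# Assumes a calibrated rig; all numeric values are illustrative.

def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Range (meters) to a scene point from its stereo disparity (pixels).

    Uses the pinhole stereo relation: depth = focal * baseline / disparity.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: 1000-pixel focal length, 0.5 m baseline, 10-pixel disparity
print(depth_from_disparity(10.0, 1000.0, 0.5))  # 50.0 (meters)
```

The same relation shows why low-contrast thermal imagery is hard for 3D vision: disparity can only be measured where the two images contain matchable signal, which is why the program treats scene "clutter" as a potential source of that signal.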
The project has two primary goals: understand the useful information in ambient thermal emissions; and enable passive 3D vision for autonomous navigation.
Quantifying the measurable information from ambient thermal emissions is non-trivial and very environment-dependent, DARPA researchers say. It depends on the temperature, chemical composition, geometry, and atmosphere surrounding the objects in a scene.
Moreover, it depends on the aperture, position, measurement modes, and performance of the sensor. In practice, these environments are so complex that measurement, rather than modeling, is necessary to characterize the actual signal variation.
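The temperature and wavelength dependence of that ambient signal is governed by Planck's law of thermal radiation. The back-of-envelope sketch below, which is an illustration rather than anything from the DARPA announcement, evaluates the spectral radiance of an ideal blackbody near room temperature in the long-wave infrared band where such sensors operate.

```python
# Back-of-envelope illustration: Planck spectral radiance of a blackbody.
# Real scenes deviate from this via emissivity, geometry, and atmosphere.
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m: float, temp_k: float) -> float:
    """Spectral radiance B(lambda, T) in W * m^-3 * sr^-1."""
    num = 2.0 * H * C**2 / wavelength_m**5
    den = math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0
    return num / den

# A 300 K scene at 10 micrometers (long-wave infrared):
b = planck_radiance(10e-6, 300.0)
print(b * 1e-6)  # radiance per micrometer of bandwidth, ~10 W/(m^2 sr um)
```

Evaluating this across wavelengths and temperatures gives a first-order sense of how much signal is even present before sensor aperture and noise are considered, which is why the program pairs modeling with direct measurement.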
An ideal sensor might be able to extract many orders of magnitude more data about the environment than is attainable using conventional infrared sensors.
Enabling passive 3D vision for autonomous navigation will require near-zero noise and orders of magnitude greater measurement diversity than conventional sensors — particularly at high speeds. More than likely this will require completely new types of infrared sensors.
The project has two technical areas: near-term passive 3D vision; and sensors for measurement hyperdiversity.
Contractors involved in the project will modify and extend existing sensors to enable passive 3D vision sensors with the potential for low-speed applications up to 25 miles per hour.
Contractors also will develop near-zero-noise sensors capable of orders of magnitude more measurements per second than conventional sensors to enable fast, high-spatial-resolution, high-spectral-resolution measurement of an environment and high-speed 3D vision. If successful, these new sensors will enable 3D vision for speeds greater than 25 miles per hour.
The project has three phases: an 18-month effort to determine if thermal emissions contain sufficient information to enable autonomous driving at night or underground; a 21-month effort to refine models, create experimental designs, and conduct tests to show that real systems can measure the information necessary for 3D vision; and an 18-month effort to build and test passive demonstration systems.
Dr. Hans C. Mumm