With their patent-pending technique that enhances conventional machine vision and perception, Purdue University researchers are making strides in the fields of robotics and autonomy.
Zubin Jacob, the Elmore Associate Professor of Electrical and Computer Engineering in the Elmore Family School of Electrical and Computer Engineering, and research scientist Fanglin Bao have developed HADAR, or heat-assisted detection and ranging. Their research was featured on the cover of the July 26 issue of the peer-reviewed journal Nature.
A video about HADAR is available on YouTube. Nature also has released a podcast episode that includes an interview with Jacob.
Jacob said that by 2030, one in 10 vehicles is expected to be automated and 20 million robot helpers are expected to serve people.
“Each of these agents will collect information about its surrounding scene through advanced sensors to make decisions without human intervention,” Jacob said. “However, simultaneous perception of the scene by numerous agents is fundamentally prohibitive.”
LiDAR, or light detection and ranging, radar, and sonar are examples of traditional active sensors that produce and then receive signals in order to gather 3D data about a scene. The disadvantages of these techniques, such as signal interference and threats to people’s eye safety, get worse as they are scaled up.
Video cameras that rely on sunlight or other sources of illumination, by contrast, are useful, but low-light conditions such as night, fog, or rain pose a significant challenge.
Traditional thermal imaging is a fully passive sensing technique that gathers the heat radiation, invisible to the naked eye, emitted by all the objects in a scene. It can sense through darkness, inclement weather, and solar glare. But Jacob said fundamental challenges hinder its use today.
“Objects and their environment constantly emit and scatter thermal radiation, leading to textureless images famously known as the ‘ghosting effect,’” Bao said. “Thermal pictures of a person’s face show only contours and some temperature contrast; there are no features, making it seem like you have seen a ghost. This loss of information, texture, and features is a roadblock for machine perception using heat radiation.”
HADAR combines thermal physics, infrared imaging, and machine learning to pave the way to fully passive and physics-aware machine perception.
“Our work builds the information-theoretic foundations of thermal perception to show that pitch darkness carries the same amount of information as broad daylight. Evolution has made human beings biased toward the daytime. Machine perception of the future will overcome this long-standing dichotomy between day and night,” Jacob said.
Bao said, “HADAR vividly recovers the texture from the cluttered heat signal and accurately disentangles temperature, emissivity and texture, or TeX, of all objects in a scene. It sees texture and depth through the darkness as if it were day and also perceives physical attributes beyond RGB, or red, green, and blue, visible imaging or conventional thermal sensing. It is surprising that it is possible to see through pitch darkness like broad daylight.”
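To make the TeX idea more concrete, the sketch below is a toy forward model in the spirit of that decomposition: each infrared measurement of a pixel is treated as an emissivity-weighted blend of the pixel’s own blackbody emission and environmental radiation scattered toward the camera. The constants, function names, and simple mixing rule are illustrative assumptions for this article, not the HADAR algorithm published in Nature; recovering temperature, emissivity, and texture for every pixel from such measurements is the kind of inverse problem HADAR’s physics-aware machine learning is built to solve.

```python
import numpy as np

# Toy illustration only: a simple forward model in the spirit of the TeX
# decomposition (Temperature, emissivity, teXture). The constants, function
# names, and linear mixing rule are assumptions for this sketch, not the
# HADAR algorithm reported in Nature.

H_PLANCK = 6.626e-34   # Planck constant, J*s
C_LIGHT = 2.998e8      # speed of light, m/s
K_BOLTZ = 1.381e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temperature_k):
    """Blackbody spectral radiance B(lambda, T) from Planck's law."""
    a = 2.0 * H_PLANCK * C_LIGHT**2 / wavelength_m**5
    b = H_PLANCK * C_LIGHT / (wavelength_m * K_BOLTZ * temperature_k)
    return a / (np.exp(b) - 1.0)

def thermal_signal(wavelengths_m, temperature_k, emissivity, scattered):
    """Per-pixel heat signal at several infrared 'colors': the pixel's own
    emission plus scattered environmental radiation, blended by emissivity."""
    emitted = planck_radiance(wavelengths_m, temperature_k)
    return emissivity * emitted + (1.0 - emissivity) * scattered

# Example pixel: a 300 K surface with emissivity 0.9, viewed at 8-14 micrometers,
# against an assumed 290 K environment that supplies the scattered "texture."
wavelengths = np.linspace(8e-6, 14e-6, 7)
scattered_env = planck_radiance(wavelengths, 290.0)
measurement = thermal_signal(wavelengths, 300.0, 0.9, scattered_env)
print(measurement)
```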
The team tested HADAR TeX vision using an off-road nighttime scene.
“HADAR TeX vision recovered textures and overcame the ghosting effect,” Bao said. “It recovered fine textures such as water ripples, bark wrinkles and culverts in addition to details about the grassy land.”
Ongoing work on HADAR focuses on reducing the size of the hardware and increasing the speed of data collection.
“The current sensor is large and heavy since HADAR algorithms require many colors of invisible infrared radiation,” Bao said. “To apply it to self-driving cars or robots, we need to bring down the size and price while also making the cameras faster. The current sensor takes around one second to create one image, but for autonomous cars we need a frame rate of around 30 to 60 hertz, or frames per second.”
HADAR TeX vision’s initial applications are automated vehicles and robots that interact with humans in complex environments. The technique could also be adapted for applications in geosciences, agriculture, defense, healthcare, and wildlife monitoring.
Jacob and Bao disclosed HADAR TeX to the Purdue Innovates Office of Technology Commercialization, which has applied for a patent on intellectual property. Industry partners seeking to further develop the innovations should contact Dipak Narula.