Austin, TX, September 24, 2008 — University of Texas professor Hao Ling and Ph.D. candidate Shobha Ram are one step closer to making X-ray vision a reality. They are perfecting radar systems that can detect human activities through barriers and convert the signals into virtual renderings similar to those of a video game.
"There are several ongoing research programs in through-wall imaging, but they focus on building hardware sensors with very specific capabilities, says Ling. "That's expensive. What we want to do in this project is to first understand how human movements are manifested in radar data. Then utilize this knowledge to generate an image of a human."
Radar signals, on the left, are turned into an animation of a person walking, on the right. In the radar signals, the torso, which moves less, is shown in thicker orange; the arms and legs, which move more, appear in thinner yellow.
(Photo Credit: Hao Ling)
Doppler-based radio frequency radar systems are particularly suited to tracking moving humans. They suppress background clutter from stationary objects and provide enough detail to reveal the dynamic movements of different body parts in the form of "microDopplers".
"A human has very complex motion dynamics. When walking, the arms and legs move very differently than the torso, and these subtle, minute movements translate into unique microDoppler signatures," Ling says.
Ling and Ram built a physics-based Doppler radar simulator using computer animation data of human motions. They then incorporated barrier characteristics into the simulation model and validated the results against measurements from a previously developed Doppler radar testbed, with live human movements both in line-of-sight situations and behind barriers. Several former and present graduate students, including Youngwook Kim, Craig Christianson, Nick Whitelonis, and Yang Li, also contributed to the project.
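As a rough illustration of the barrier step (a deliberate simplification, not the group's physics-based model), a wall can be approximated by a two-way amplitude loss plus the extra phase delay of propagating through a dielectric slab twice. The thickness, permittivity, and loss values in this sketch are assumptions.

# Minimal sketch of layering a barrier onto a free-space simulation
# (hypothetical simplification, not the authors' model).

import numpy as np

C = 3e8
FC = 2.4e9            # assumed carrier frequency, Hz

def apply_wall(signal, thickness_m=0.2, rel_permittivity=4.0, loss_db=8.0):
    """Scale and phase-shift a simulated free-space radar return to mimic a wall.

    thickness_m:      wall thickness (assumed)
    rel_permittivity: relative permittivity of the wall material (assumed)
    loss_db:          two-way transmission loss through the wall (assumed)
    """
    n = np.sqrt(rel_permittivity)               # refractive index of the wall
    extra_path = 2 * thickness_m * (n - 1)      # two-way excess electrical length
    phase = np.exp(-1j * 2 * np.pi * FC * extra_path / C)
    amplitude = 10 ** (-loss_db / 20.0)
    return amplitude * phase * signal

# Example on a toy constant-Doppler return
t = np.arange(0, 1.0, 1e-3)
free_space = np.exp(-1j * 2 * np.pi * 30.0 * t)          # 30 Hz Doppler tone
behind_wall = apply_wall(free_space)
print(abs(behind_wall[0]) / abs(free_space[0]))           # ~0.4 amplitude ratio for 8 dB loss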
"MicroDoppler signatures could become important tools for monitoring human activities over long durations," says Ram. "The radar simulator, in particular, is a flexible, inexpensive tool we can use to optimize the sensor configurations and signal processing algorithms needed for generating an accurate virtual image of a human behind different types of barriers."
Ultimately, this technology could have important applications in search and rescue missions, law enforcement operations, and physical surveillance.
Source: University of Texas at Austin, Electrical & Computer Engineering
University of Texas researchers Dr. Hao Ling and Shobha Ram in the acoustics lab.
(Photo Credit: David Liu)