Penn
Engineers have developed a system that lets robots see around corners using
radio waves processed by AI, a capability that could improve the safety and
performance of driverless cars as well as robots operating in cluttered indoor
settings like warehouses and factories.
The system, called HoloRadar, enables robots to reconstruct three-dimensional
scenes outside their direct line of sight, such as pedestrians rounding a
corner. Unlike previous approaches to non-line-of-sight (NLOS) perception that rely on visible light,
HoloRadar works reliably in darkness and under variable lighting conditions.
"Robots and autonomous vehicles need to see beyond what's directly in front of them," says Mingmin Zhao, Assistant Professor in Computer and Information Science (CIS) and senior author of a paper describing HoloRadar, presented at the 39th annual Conference on Neural Information Processing Systems (NeurIPS). "This capability is essential to help robots and autonomous vehicles make safer decisions in real time."
HoloRadar allows robots to see around corners in
varied lighting conditions by relying on radio signals and AI. Credit: Sylvia
Zhang and WAVES Lab, Penn Engineering
Turning walls into mirrors
At the heart of HoloRadar is a
counterintuitive insight into radio waves. Compared to visible light, radio
signals have much longer wavelengths, a property traditionally seen as a
disadvantage for imaging because it limits resolution. Zhao's team realized
that, for peering around corners, those longer wavelengths are actually an
advantage.
"Because radio waves are so
much larger than the tiny surface variations in walls," says Haowen Lai, a
doctoral student in CIS and co-author of the new paper, "those surfaces
effectively become mirrors that reflect radio signals in predictable
ways."
In practical terms, this means that
flat surfaces like walls, floors, and ceilings can bounce radio signals around
corners, carrying information about hidden spaces back to a robot. HoloRadar
captures these reflections and reconstructs what lies beyond direct view.
"It's similar to how human drivers sometimes rely on mirrors stationed at blind intersections," says Lai. "Because HoloRadar uses radio waves, the environment itself becomes full of mirrors, without actually having to change the environment."
Designed for in-the-wild operation
In recent years, other researchers
have demonstrated systems with similar capabilities, typically by using visible light. Those systems
analyze shadows or indirect reflections, making them highly dependent on
lighting conditions. Other attempts to use radio signals have relied on slow and
bulky scanning equipment, limiting real-world applications.
"HoloRadar is designed to work
in the kinds of environments robots actually operate in," says Zhao.
"This system is mobile, runs in real time, and doesn't depend on
controlled lighting."
HoloRadar augments the safety of autonomous robots by complementing existing sensors rather than replacing them. While autonomous vehicles already use LiDAR, a sensing system that uses lasers to detect objects in the vehicles' direct line of sight, HoloRadar adds an additional layer of perception by revealing what those sensors cannot see, giving machines more time to react to potential hazards.
Interpreting those reflections is not simple, however. A single radio pulse can bounce
multiple times before returning to the sensor, producing a tangled web of
reflections that is difficult to disentangle with traditional signal-processing
methods alone.
To solve this problem, the team
developed a custom AI system that combines machine learning with physics-based
modeling. In the first stage, the system enhances the resolution of raw radio
signals and identifies multiple "returns" corresponding to different
reflection paths. In the second stage, the system uses a physics-guided model
to trace those reflections backward, undoing the mirror-like effects of the
environment and reconstructing the actual 3D scene.
"In some sense, the challenge
is similar to walking into a room full of mirrors," says Zitong Lan, a
doctoral student in Electrical and Systems Engineering (ESE) and co-author of
the paper. "You see many copies of the same object reflected in different
places, and the hard part is figuring out where things really are. Our system
learns how to reverse that process in a physics-grounded way."
By explicitly modeling how radio
waves bounce off surfaces, the AI can distinguish between direct and indirect
reflections and determine the correct physical locations of a variety of
objects, including people.
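The geometric core of that back-tracing step can be sketched simply: a target seen via a single wall bounce appears at its mirror image on the far side of the wall, so reflecting the apparent "ghost" detection across the wall plane recovers the true position. The snippet below is a minimal illustration of that one geometric idea, with invented coordinates; HoloRadar's actual learned, physics-guided pipeline handles many paths and surfaces at once.

```python
import numpy as np


def unmirror(ghost, wall_point, wall_normal):
    """Reflect a 3D 'ghost' detection across a wall plane, given a point
    on the wall and the wall's normal vector, to recover the true position
    of a target seen via one specular bounce."""
    ghost = np.asarray(ghost, dtype=float)
    n = np.asarray(wall_normal, dtype=float)
    n = n / np.linalg.norm(n)                 # normalize the wall normal
    d = np.dot(ghost - wall_point, n)         # signed distance to the plane
    return ghost - 2 * d * n                  # mirror across the plane


# Wall is the plane x = 0; the radar reports a ghost "behind" the wall.
ghost = np.array([-2.0, 4.0, 0.0])
true_pos = unmirror(ghost, wall_point=np.zeros(3), wall_normal=[1.0, 0.0, 0.0])
print(true_pos)  # -> [2. 4. 0.]
```

Distinguishing which detections are direct and which are mirrored, and across which surface, is exactly where the learned component of the system comes in.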
From the lab to the real world
The researchers tested HoloRadar on
a mobile robot in real indoor environments, including hallways and building
corners. In these settings, the system successfully reconstructed walls,
corridors, and hidden human subjects located outside the robot's line of sight.
Future work will explore outdoor
scenarios, such as intersections and urban streets, where longer distances and
more dynamic conditions introduce additional challenges.
"This is an important step toward giving robots a more complete understanding of their surroundings," says Zhao. "Our long-term goal is to enable machines to operate safely and intelligently in the dynamic and complex environments humans navigate every day."
Source: Robots use radio signals and AI to see around corners


