Duke University’s General Robotics Lab is pushing autonomous navigation to new frontiers with its most advanced creation yet: a quadruped robot guided by the lab’s WildFusion framework. More than just a visually guided machine, this four-legged marvel integrates sight, touch, sound, and internal motion sensing to confidently traverse unpredictable outdoor terrain.
Imagine a robot that doesn’t just see a forest but feels its contours and hears its textures. WildFusion’s sensor suite includes RGB cameras, LiDAR, inertial measurement units (IMUs), contact microphones, and tactile foot sensors. As the robot steps, it listens to the crunch of leaves, gauges subtle vibrations, and senses uneven ground, all in service of smarter path selection. Fusing these multimodal inputs builds a richer, continuous environmental map, which is crucial when camera- or LiDAR-based navigation alone falls short.
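The article does not spell out how WildFusion combines these sensor streams internally, but the general idea of multimodal fusion for terrain assessment can be illustrated with a minimal sketch: each modality gets its own small encoder, the resulting features are concatenated, and a shared head scores how traversable the terrain ahead is. Everything below (class names, feature sizes, the flattened-vector inputs) is a hypothetical simplification for illustration, not WildFusion’s actual architecture.

```python
# Minimal sketch (NOT WildFusion's real model): per-modality encoders whose
# features are fused into one embedding that scores terrain traversability.
import torch
import torch.nn as nn


class ModalityEncoder(nn.Module):
    """Maps one raw sensor stream (flattened to a vector) to a fixed-size feature."""

    def __init__(self, in_dim: int, feat_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                 nn.Linear(128, feat_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class TraversabilityFusion(nn.Module):
    """Concatenates per-modality features and predicts a 0-1 traversability score."""

    def __init__(self, sensor_dims: dict, feat_dim: int = 64):
        super().__init__()
        self.encoders = nn.ModuleDict(
            {name: ModalityEncoder(dim, feat_dim) for name, dim in sensor_dims.items()})
        self.head = nn.Sequential(nn.Linear(feat_dim * len(sensor_dims), 64),
                                  nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid())

    def forward(self, readings: dict) -> torch.Tensor:
        feats = [self.encoders[name](readings[name]) for name in self.encoders]
        return self.head(torch.cat(feats, dim=-1))


# Hypothetical input sizes for the sensor suite named in the article
# (camera, LiDAR, IMU, contact microphone, tactile foot sensors).
dims = {"rgb": 512, "lidar": 256, "imu": 6, "audio": 128, "tactile": 16}
model = TraversabilityFusion(dims)
batch = {name: torch.randn(1, dim) for name, dim in dims.items()}
print(model(batch))  # e.g. tensor([[0.5...]]) -- higher = easier terrain
```

Late fusion of this kind (encode each sensor separately, then merge) is one common way to let very different signals, such as audio vibrations and depth points, contribute to a single decision; the actual WildFusion design may differ.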
In field trials at Eno River State Park, WildFusion’s robot displayed remarkable agility, steering clear of obstacles such as logs and mud patches that typically stump vision-only systems. “These real-world tests proved WildFusion’s remarkable ability to accurately predict traversability,” noted lead PhD student Yanbaihui Liu.
Led by Dickinson Family Assistant Professor Boyuan Chen, who heads the General Robotics Lab, the initiative represents a conceptual leap: robots with embodied intelligence that integrate body and brain. Chen emphasizes the need for cohesive body-and-brain systems that can adapt and learn within unstructured environments.
The team recently presented their work at the IEEE International Conference on Robotics and Automation (ICRA 2025), where WildFusion garnered attention for its multisensory, implicit 3D mapping, marking a new chapter in autonomous navigation.
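“Implicit” mapping here refers to representing the environment as a learned continuous function that can be queried at any 3D point, rather than as a fixed grid of cells. The toy sketch below shows only that querying idea; the network, its inputs, and its output are hypothetical and are not drawn from the WildFusion paper.

```python
# Hypothetical illustration of an implicit map: a small coordinate network that
# returns a score in [0, 1] for any continuous (x, y, z) query point.
import torch
import torch.nn as nn


class ImplicitMap(nn.Module):
    def __init__(self, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(3, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1), nn.Sigmoid())

    def forward(self, xyz: torch.Tensor) -> torch.Tensor:
        # xyz: (N, 3) query coordinates; output: (N, 1) scores, no voxel grid stored.
        return self.net(xyz)


field = ImplicitMap()
queries = torch.tensor([[1.0, 0.5, 0.0], [2.3, -0.7, 0.1]])  # arbitrary points
print(field(queries).shape)  # torch.Size([2, 1])
```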
Looking forward, Duke researchers are already planning expansions: equipping the robot with thermal sensors or humidity detectors to heighten environmental awareness, and exploring applications in disaster response and rugged infrastructure inspection.
WildFusion and its quadruped platform may well rewrite expectations for outdoor robots. Instead of stumbling through rough terrain, robots will feel their way forward, thanks to Duke’s innovative fusion of senses. It’s a bold step toward robots that truly sense the world, often in ways humans take for granted, and adapt in real time with grace and intelligence. Whether guarding against hazards in disaster zones or exploring remote wilderness, Duke’s cutting-edge robotics are turning science fiction into field reality.