We exploit the complementary strengths of vision and proprioception to build VP-Nav, a point-goal navigation system for legged robots. Legged systems can traverse more complex terrain than wheeled robots, but to fully exploit this capability, the high-level path planner in the navigation system must be aware of the walking capabilities of the low-level locomotion policy across varying environments. We achieve this with proprioceptive feedback, which ensures the safety of the planned path by sensing unexpected obstacles such as glass walls, terrain properties such as slipperiness or softness of the ground, and robot properties such as extra payload, all of which vision is likely to miss. The navigation system uses onboard cameras to generate an occupancy map and a corresponding cost map for reaching the goal. A fast marching planner then computes a target path, and a velocity command generator converts it into desired velocities for the walking policy. A safety advisor module adds proprioceptively sensed obstacles to the occupancy map and supplies environment-dependent speed limits to the velocity command generator. VP-Nav outperforms wheeled-robot baselines as well as ablations with disjoint high-level planning and low-level control. We also demonstrate real-world deployment of VP-Nav on a quadruped robot using only onboard sensing and computation.
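To make the pipeline concrete, below is a minimal sketch of the fast-marching planning and velocity-command steps, assuming the open-source scikit-fmm package for the fast marching solve; the names (`fmm_cost_to_go`, `next_waypoint`, `command_velocity`, `v_max`) are illustrative and not from the released VP-Nav code.

```python
import numpy as np
import skfmm  # pip install scikit-fmm

def fmm_cost_to_go(occupancy, goal, cell_size=0.05, speed=None):
    """Fast-marching cost-to-go from every free cell to the point goal.

    occupancy : bool grid, True where an obstacle blocks the robot
                (vision-built map plus proprioceptively sensed cells).
    goal      : (row, col) index of the point goal.
    speed     : optional per-cell speed field; a safety advisor could
                lower it on e.g. slippery or soft ground.
    """
    phi = np.ones_like(occupancy, dtype=float)
    phi[goal] = -1.0                          # zero level set at the goal
    phi = np.ma.MaskedArray(phi, occupancy)   # obstacles are impassable
    if speed is None:
        return skfmm.distance(phi, dx=cell_size)
    return skfmm.travel_time(phi, speed, dx=cell_size)

def next_waypoint(cost, pos):
    """Follow the target path by stepping to the cheapest free
    8-connected neighbor of the current cell."""
    blocked = np.ma.getmaskarray(cost)
    r, c = pos
    nbrs = [(r + dr, c + dc)
            for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]
    nbrs = [(i, j) for i, j in nbrs
            if 0 <= i < cost.shape[0] and 0 <= j < cost.shape[1]
            and not blocked[i, j]]
    return min(nbrs, key=lambda ij: cost[ij])

def command_velocity(waypoint, pos, heading, v_max):
    """Convert the next waypoint into (linear, angular) velocity commands,
    capped by an environment-dependent speed limit v_max such as the one
    the safety advisor supplies."""
    dr, dc = waypoint[0] - pos[0], waypoint[1] - pos[1]
    desired = np.arctan2(dr, dc)
    ang = np.arctan2(np.sin(desired - heading), np.cos(desired - heading))
    lin = min(v_max, float(np.hypot(dr, dc)))
    return lin, ang
```

In this sketch, replanning around an obstacle that only proprioception can detect (a glass wall, say) amounts to flipping the corresponding `occupancy` cells to True and recomputing the cost field, while terrain-dependent slowdowns enter through `speed` or `v_max`.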
RMA: Rapid Motor Adaptation for Legged Robots. Ashish Kumar, Zipeng Fu, Deepak Pathak, Jitendra Malik. RSS 2021. PDF | Video | Project Page
Minimizing Energy Consumption Leads to the Emergence of Gaits in Legged Robots. Zipeng Fu, Ashish Kumar, Jitendra Malik, Deepak Pathak. CoRL 2021. PDF | Video | Project Page
@inproceedings{fu2022coupling,
  title     = {Coupling Vision and Proprioception for Navigation of Legged Robots},
  author    = {Zipeng Fu and Ashish Kumar and Ananye Agarwal and Haozhi Qi and Jitendra Malik and Deepak Pathak},
  booktitle = {CVPR},
  year      = {2022}
}