TerraPN: Unstructured Terrain Navigation through Online Self-Supervised Learning

Adarsh Jagan Sathyamoorthy, Kasun Weerakoon, Tianrui Guan, Jing Liang, Dinesh Manocha

We present TerraPN, a novel method that learns the surface characteristics (texture, bumpiness, deformability, etc.) of complex outdoor terrains for autonomous robot navigation. Our method predicts navigability cost maps for different surfaces from patches of RGB images together with odometry and IMU data, and it dynamically varies the resolution of the output cost map based on the scene to improve computational efficiency. We also present DWA-O, a novel extension of the Dynamic Window Approach that accounts for a surface's navigability cost while computing robot trajectories; DWA-O additionally modulates the robot's acceleration limits based on the variation in robot-terrain interactions. In terms of perception, our method learns to predict navigability costs in ∼20 minutes for five different surfaces, compared with the 3-4 hours required by previous scene-segmentation methods, and it reduces inference time. In terms of navigation, our method outperforms previous works in terms of vibration cost and generates robot velocities suited to each surface.
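The two planning ideas summarized above, scoring candidate trajectories with a surface-navigability term and shrinking acceleration limits on rough terrain, can be illustrated with a minimal sketch. This is not the paper's implementation: the grid cost map, the scoring weights `alpha`/`beta`, and the `modulated_accel_limit` scaling rule are illustrative assumptions.

```python
import math

def rollout(pose, v, w, dt=0.1, steps=10):
    """Forward-simulate a unicycle trajectory for one (v, w) sample."""
    x, y, th = pose
    traj = []
    for _ in range(steps):
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        th += w * dt
        traj.append((x, y))
    return traj

def surface_cost(traj, costmap, cell=0.5):
    """Mean navigability cost of the grid cells a trajectory crosses.
    costmap[i][j] in [0, 1]: 0 = easy surface, 1 = hard to traverse."""
    total = 0.0
    for x, y in traj:
        i = min(max(int(y / cell), 0), len(costmap) - 1)
        j = min(max(int(x / cell), 0), len(costmap[0]) - 1)
        total += costmap[i][j]
    return total / len(traj)

def score(pose, v, w, goal, costmap, alpha=1.0, beta=1.0):
    """DWA-style objective: progress toward the goal minus a
    surface-navigability penalty (the extra term in this sketch)."""
    traj = rollout(pose, v, w)
    gx, gy = goal
    ex, ey = traj[-1]
    progress = -math.hypot(gx - ex, gy - ey)  # closer endpoint = better
    return alpha * progress - beta * surface_cost(traj, costmap)

def modulated_accel_limit(a_max, cost_ahead):
    """Illustrative rule: lower the acceleration limit as the
    navigability cost of the terrain ahead increases."""
    return a_max * (1.0 - 0.8 * min(max(cost_ahead, 0.0), 1.0))
```

For example, with a cost map whose upper rows are rough terrain, a straight trajectory along the smooth strip scores higher than one curving into the rough region, and the acceleration limit is tighter when the cost ahead is high.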