Texture-Aware SLAM Using Stereo Imagery And Inertial Information

Abstract

We present a gaze control method that augments an existing stereo-inertial Simultaneous Localization And Mapping (SLAM) system by directing the stereo camera towards feature-rich regions of the scene. Our integrated active SLAM system builds on careful triangulation of visual features and on established nonlinear optimization and visual loop-closing frameworks. It relies on the tight coupling of IMU measurements with constraints imposed by visual correspondences from both stereo and motion. Alongside the SLAM system, the gaze control module runs in real time and includes an efficient online classifier that segments the scene into texture classes, assigning each class a quality score that correlates with the availability of reliable features for tracking. Based on this quality score, the gaze selection module controls a pan-tilt unit that directs the camera towards high-reward texture classes. We validate our system in both indoor and outdoor spaces and show that active gaze control substantially improves the robustness and long-term operation of the localization system.
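The core of the gaze selection step can be illustrated with a minimal sketch: pick the scene region whose texture class has the highest quality score, then convert its camera-frame centroid into pan/tilt commands. The data structure, function names, and the assumed camera axis convention (x right, y down, z forward) are illustrative, not taken from the paper.

```python
import math

def select_gaze_target(regions):
    """Pick the region with the highest texture-quality score.

    `regions` is a hypothetical list of dicts, each holding the
    classifier's 'score' for a texture class and the 3D 'centroid'
    (x, y, z) of that region in the camera frame.
    """
    return max(regions, key=lambda r: r["score"])

def pan_tilt_angles(centroid):
    """Convert a camera-frame 3D point into pan/tilt angles (radians).

    Assumes x right, y down, z forward; pan rotates about the vertical
    axis, tilt about the horizontal axis.
    """
    x, y, z = centroid
    pan = math.atan2(x, z)
    tilt = math.atan2(-y, math.hypot(x, z))
    return pan, tilt

# Example: two candidate regions; the second is richer in texture.
regions = [
    {"score": 0.3, "centroid": (1.0, 0.0, 2.0)},   # weak texture, off-axis
    {"score": 0.9, "centroid": (0.0, -0.5, 3.0)},  # strong texture
]
best = select_gaze_target(regions)
pan, tilt = pan_tilt_angles(best["centroid"])
```

In the full system these angles would be sent to the pan-tilt unit's controller; the sketch only covers the target selection and angle computation.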

Publication
Proceedings of the Conference on Computer and Robot Vision (CRV)