Collaborative Localization and Mapping with Heterogeneous Depth Sensors

This is a Master's thesis from KTH/School of Electrical Engineering and Computer Science (EECS)

Author: David Villagrá Guilarte; [2020]


Abstract: Simultaneous Localization and Mapping (SLAM) is the process by which a robot or other device navigates an environment by simultaneously building a map of its surroundings and localizing itself within that map. SLAM for single agents with specific sensors has matured over the last two decades. Nevertheless, the increasing demand for applications that require a large number of devices working together, with different types of sensors, has initiated and accelerated interest in collaborative SLAM and in SLAM with heterogeneous sensors. This thesis proposes a collaborative SLAM framework that works with heterogeneous depth-based sensors, in particular 3D LiDARs and stereo cameras. The framework is based on the SegMap framework, which uses a structural 3D segment representation of the map and has a centralized architecture that enables online multi-robot applications. Stereo-LiDAR support is enabled in the framework by a Stereo Estimation sub-module, which obtains a 3D point cloud from a stereo camera. The stereo 3D point cloud is filtered and parameters are optimized in order to improve the matching of segments extracted from the stereo camera and the 3D LiDAR. The system was evaluated offline on the KITTI dataset across its possible configurations. The results show that a vehicle carrying a 3D LiDAR can be localized on a map created by a stereo camera, and vice versa, enabling the successful generation of loop closures in a heterogeneous SLAM scenario. Furthermore, the influence of the system configuration and the framework's parameters on heterogeneous localization performance is presented.
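The Stereo Estimation step described above rests on standard stereo triangulation: given a disparity map and the camera's intrinsics, each pixel is back-projected to a 3D point via Z = f·B/d. The sketch below illustrates this conversion in plain NumPy; the function name, parameters, and the toy disparity map are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def disparity_to_pointcloud(disparity, f, baseline, cx, cy):
    """Hypothetical helper: back-project a disparity map to 3D points
    using the pinhole stereo model Z = f * B / d (camera frame)."""
    h, w = disparity.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    valid = disparity > 0          # zero disparity = no stereo match
    d = disparity[valid]
    z = f * baseline / d           # depth from disparity
    x = (us[valid] - cx) * z / f   # lateral offset from principal point
    y = (vs[valid] - cy) * z / f
    return np.stack([x, y, z], axis=1)  # (N, 3) point cloud

# Toy example: a 2x2 disparity map with one invalid pixel
disp = np.array([[4.0, 0.0],
                 [2.0, 8.0]])
pts = disparity_to_pointcloud(disp, f=100.0, baseline=0.5, cx=1.0, cy=1.0)
```

In a real pipeline the resulting cloud would then be filtered (e.g. outlier removal and downsampling) before segment extraction, since stereo depth is noisier than LiDAR, which is the motivation for the filtering stage mentioned in the abstract.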
