Efficient 2D SLAM for a Mobile Robot with a Downwards Facing Camera

This is a Master's thesis from Lunds universitet/Matematik LTH

Abstract: As digital cameras become cheaper and better, computers more powerful, and robots more abundant, the merging of these three technologies becomes both more common and more capable. Their combination is often inspired by the human visual system and strives to give machines capabilities that humans already have, such as object identification, navigation, limb coordination, and event detection. One particularly popular field is SLAM, or Simultaneous Localization and Mapping, which has high-profile applications in self-driving cars and delivery drones. This thesis proposes and describes an online SLAM algorithm for a specific scenario: a robot with a downwards facing camera exploring a flat surface (e.g., a floor). The method builds homographies from robot odometry data and uses them to rectify the images so that the tilt of the camera with respect to the floor is eliminated, thereby reducing the problem from 3D to 2D. The 2D pose of the robot in the plane is estimated by registering SURF features, and a bundle adjustment algorithm then consolidates the most recent measurements with the older ones in order to optimize the map. The algorithm is implemented and tested with an AR.Drone 2.0 quadcopter. The results are mixed, but hardware seems to be the limiting factor: the algorithm performs well and runs at 5-20 Hz on an i5 desktop computer, but the poor quality, high compression, and low resolution of the drone's bottom camera make the algorithm unstable, and this cannot be overcome even with several tiers of outlier filtering.
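The tilt-rectification step described in the abstract rests on a standard result from multi-view geometry: when two views differ only by a camera rotation R, they are related by the homography H = K R K⁻¹, where K is the camera's intrinsic matrix. Below is a minimal NumPy sketch of building such a rectifying homography from roll and pitch angles, as odometry might supply them. The function names, angle convention, and intrinsic values are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def rotation_from_roll_pitch(roll, pitch):
    """Rotation of the tilted camera relative to an ideal
    downwards-facing pose, as Rx(roll) @ Ry(pitch) (assumed convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    return Rx @ Ry

def rectifying_homography(K, roll, pitch):
    """Homography that warps the tilted image to the view an
    untilted (fronto-parallel) camera at the same position would see."""
    R = rotation_from_roll_pitch(roll, pitch)
    return K @ R.T @ np.linalg.inv(K)

# Example: a hypothetical 640x480 camera with focal length 500 px.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
H = rectifying_homography(K, roll=0.1, pitch=-0.05)
```

With zero roll and pitch the homography reduces to the identity, so nothing is warped; in practice H would be passed to an image-warping routine (e.g., OpenCV's `cv2.warpPerspective`) before feature registration.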
