Pose estimation of a VTOL UAV using IMU, Camera and GPS

This is a Master's thesis from Reglerteknik (Automatic Control)

Abstract: When an autonomous vehicle has a mission to perform, it is of great importance that the robot has good knowledge of its position. Without good knowledge of its position it cannot navigate properly, and the data it gathers, which could be important for the mission, might not be usable. A helicopter could, for example, be used to collect laser data of the terrain beneath it, from which a 3D map of the terrain could be produced. If the knowledge about the position and orientation of the helicopter is poor, the collected laser data will be useless, since it is unknown what the laser actually measures. A successful solution to position and orientation (pose) estimation of an autonomous helicopter, using an inertial measurement unit (IMU), a camera and a GPS, is proposed in this thesis. The problem is to estimate the unknown pose using sensors that measure different physical attributes and give noisy readings. An extended Kalman filter solution to the simultaneous localisation and mapping (SLAM) problem is used to fuse data from the different sensors and estimate the pose of the robot. The scale invariant feature transform (SIFT) is used for feature extraction, and the unified inverse depth parametrisation (UIDP) model is used to parametrise the landmarks. The orientation of the robot is described by quaternions. To evaluate the performance of the filter, an ABB industrial robot has been used as reference: the pose of the robot's end tool is known with high accuracy and provides a good ground truth against which the estimates can be evaluated. The results show that the algorithm performs well and that the pose is estimated with good accuracy.
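To make the sensor-fusion step concrete, below is a minimal, illustrative sketch of an extended Kalman filter predict/update cycle of the kind described in the abstract: a position/velocity/quaternion state is propagated with IMU readings and corrected with GPS position fixes. This is not the thesis implementation; the class name PoseEKF, the noise levels and the simplified covariance propagation are placeholder assumptions, and the camera/SIFT/UIDP measurement update is omitted.

import numpy as np

def quat_to_rot(q):
    # Rotation matrix from a unit quaternion q = [w, x, y, z].
    w, x, y, z = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])

class PoseEKF:
    # State: position (3), velocity (3), orientation quaternion (4).
    def __init__(self):
        self.x = np.zeros(10)
        self.x[6] = 1.0                       # identity quaternion [w, x, y, z]
        self.P = np.eye(10) * 0.1             # state covariance
        self.Q = np.eye(10) * 1e-3            # process noise (placeholder value)
        self.R_gps = np.eye(3) * 2.0          # GPS position noise (placeholder value)
        self.g = np.array([0.0, 0.0, -9.81])  # gravity in the world frame

    def predict(self, acc, gyro, dt):
        # Propagate the state with accelerometer and gyro readings.
        p, v, q = self.x[0:3], self.x[3:6], self.x[6:10]
        a_world = quat_to_rot(q) @ acc + self.g
        p_new = p + v * dt + 0.5 * a_world * dt ** 2
        v_new = v + a_world * dt
        # First-order quaternion integration of the angular rate.
        wx, wy, wz = gyro
        Omega = np.array([[0, -wx, -wy, -wz],
                          [wx,  0,  wz, -wy],
                          [wy, -wz,  0,  wx],
                          [wz,  wy, -wx,  0]])
        q_new = q + 0.5 * dt * Omega @ q
        q_new /= np.linalg.norm(q_new)
        self.x = np.concatenate([p_new, v_new, q_new])
        # Simplified covariance propagation (identity Jacobian assumed
        # here purely to keep the sketch short).
        self.P = self.P + self.Q

    def update_gps(self, z):
        # Correct the state with a GPS position measurement z (3,).
        H = np.zeros((3, 10))
        H[:, 0:3] = np.eye(3)                 # GPS observes position directly
        y = z - H @ self.x                    # innovation
        S = H @ self.P @ H.T + self.R_gps
        K = self.P @ H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.x[6:10] /= np.linalg.norm(self.x[6:10])
        self.P = (np.eye(10) - K @ H) @ self.P

ekf = PoseEKF()
ekf.predict(acc=np.array([0.0, 0.0, 9.81]), gyro=np.array([0.0, 0.0, 0.01]), dt=0.01)
ekf.update_gps(np.array([0.05, -0.02, 0.0]))
print(ekf.x[0:3])  # estimated position after one predict/update cycle

In the thesis setting, the camera-based SLAM part would add landmark states (in the unified inverse depth parametrisation) to the filter and supply additional measurement updates from SIFT feature observations; the sketch above only covers the IMU/GPS part of the fusion.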
