Camera Calibration for Zone Positioning and 2D-SLAM: Autonomous Warehouse Solutions for Toyota Material Handling

This is a Bachelor's thesis from Linköpings universitet / Institutionen för systemteknik

Abstract: The aim of this thesis is to investigate how well a generic monocular camera, mounted on the vehicle, can be used to localize an autonomous vehicle in a warehouse setting. The main function is to determine which zone the vehicle is currently in and to update that status when it enters a new zone. Two zones are defined: one with a lower permitted top velocity and one with a higher. For this purpose, ArUco markers are used to signal to the system where it currently is. Markers are placed strategically around the laboratory area to saturate the environment with possible detections. Multiple sequences are recorded while varying camera placement, angles, and paths to determine the optimal number and placement of markers. In addition, a SLAM solution is tested to explore what benefits it can offer: the idea is to provide fine-grained localization as well as a map of the warehouse environment, opening up more options for further development. To solve the SLAM problem, a particle filter is implemented that initializes a set of particles uniformly distributed within the world frame. For each frame, the particles undergo pose prediction, weight assignment based on likelihood, and resampling; this iterative process gradually converges the particles toward the camera's true position. Visual odometry techniques are used to estimate the camera's ego-motion. The process involves acquiring a sequence of images, detecting distinctive features, matching features between consecutive frames, estimating camera motion, and optionally applying local optimization techniques for further refinement. The implementation shows promise, and all zone-localization test cases performed during the project were successful. The SLAM solution can detect and track specific features or landmarks over consecutive frames, and by triangulating the positions of these features their depth and distance can be determined. However, the planned visualization of these features on a top-down map has not yet been completed, even though the particle filter implementation is finished.
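The marker-based zone step described above lends itself to a short illustration. The sketch below is not the thesis implementation: it assumes OpenCV's aruco module with the OpenCV 4.7+ ArucoDetector API (older versions expose cv2.aruco.detectMarkers instead), and the dictionary choice (DICT_4X4_50) and the ZONE_BY_MARKER_ID mapping are purely illustrative assumptions.

    import cv2

    # Hypothetical mapping from ArUco marker IDs to speed zones; the real IDs
    # and zone layout used in the thesis are not specified in the abstract.
    ZONE_BY_MARKER_ID = {0: "low_speed", 1: "low_speed", 2: "high_speed", 3: "high_speed"}

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

    def current_zone(frame_bgr, previous_zone=None):
        """Return the zone implied by the markers visible in this frame,
        keeping the previous zone when no marker is detected."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        corners, ids, _rejected = detector.detectMarkers(gray)
        if ids is None:
            return previous_zone
        for marker_id in ids.flatten():
            zone = ZONE_BY_MARKER_ID.get(int(marker_id))
            if zone is not None:
                return zone
        return previous_zone

Running current_zone on every camera frame and comparing the result against the last known zone is one straightforward way to trigger the status update (and the associated velocity limit) when the vehicle crosses a zone boundary.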

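The particle filter loop mentioned in the abstract (uniform initialization, pose prediction, likelihood-based weighting, resampling) can be sketched roughly as follows. The motion-noise levels, the range/bearing measurement model against a marker with known world position, and the sigma values are assumptions made for illustration; the abstract does not specify them.

    import numpy as np

    rng = np.random.default_rng(0)

    def init_particles(n, x_range, y_range):
        """Particles (x, y, heading) spread uniformly over the world frame."""
        x = rng.uniform(x_range[0], x_range[1], n)
        y = rng.uniform(y_range[0], y_range[1], n)
        theta = rng.uniform(-np.pi, np.pi, n)
        return np.column_stack([x, y, theta])

    def predict(particles, d_trans, d_rot, noise=(0.05, 0.02)):
        """Propagate every particle with the odometry increment plus noise."""
        n = len(particles)
        particles[:, 2] += d_rot + rng.normal(0.0, noise[1], n)
        step = d_trans + rng.normal(0.0, noise[0], n)
        particles[:, 0] += step * np.cos(particles[:, 2])
        particles[:, 1] += step * np.sin(particles[:, 2])
        return particles

    def update_weights(particles, meas_range, meas_bearing, marker_xy, sigma=(0.3, 0.1)):
        """Gaussian likelihood of a range/bearing observation of a marker
        whose world position is known (a simplified measurement model)."""
        dx = marker_xy[0] - particles[:, 0]
        dy = marker_xy[1] - particles[:, 1]
        pred_range = np.hypot(dx, dy)
        bearing_err = meas_bearing - (np.arctan2(dy, dx) - particles[:, 2])
        bearing_err = (bearing_err + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi]
        w = np.exp(-0.5 * ((meas_range - pred_range) / sigma[0]) ** 2)
        w *= np.exp(-0.5 * (bearing_err / sigma[1]) ** 2)
        return w + 1e-300  # keep weights strictly positive for resampling

    def resample(particles, weights):
        """Systematic resampling: survival probability proportional to weight."""
        n = len(particles)
        positions = (rng.random() + np.arange(n)) / n
        cumulative = np.cumsum(weights / weights.sum())
        return particles[np.searchsorted(cumulative, positions)]

One iteration per camera frame would then consist of predict, update_weights, and resample, with the pose estimate taken, for example, as the weighted mean of the particle set.
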
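For the visual-odometry step (feature detection, matching between consecutive frames, and motion estimation), a minimal monocular sketch might look like the following. The choice of ORB features, the brute-force Hamming matcher, and the RANSAC parameters are assumptions rather than details taken from the thesis, and K denotes the calibrated camera intrinsic matrix.

    import cv2
    import numpy as np

    orb = cv2.ORB_create(nfeatures=2000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def relative_motion(prev_gray, curr_gray, K):
        """Estimate rotation R and unit-length translation t between two frames."""
        kp1, des1 = orb.detectAndCompute(prev_gray, None)
        kp2, des2 = orb.detectAndCompute(curr_gray, None)
        if des1 is None or des2 is None:
            return None, None
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]
        pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
        # Essential-matrix estimation with RANSAC rejects outlier matches
        # before the relative pose is recovered.
        E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                          prob=0.999, threshold=1.0)
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
        return R, t

In a monocular setup the translation is only recovered up to scale; chaining these per-frame increments would give the kind of ego-motion estimate that the particle filter's prediction step can consume, which appears to be how the abstract intends the two components to fit together.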