Local Feature Correspondence on Side-Scan Sonar Seafloor Images

This is a Master's thesis from KTH/School of Electrical Engineering and Computer Science (EECS)

Author: Li Ling; [2021]


Abstract: In underwater environments, perception and navigation systems rely heavily on acoustic-wave-based sonar technology. Side-scan sonar (SSS) provides high-resolution, photo-realistic images of the seafloor at relatively low cost. These images are potential candidates for place recognition and navigation of autonomous underwater vehicles (AUVs). Local feature correspondence matching, i.e. the detection, description and matching of keypoints in overlapping images, is a necessary building block for AUV navigation. Recent deep learning based research has produced state-of-the-art local correspondence models for camera images. For SSS images, however, deep learning based studies are limited, and handcrafted methods such as SIFT and RootSIFT still dominate the field. In this study, SSS images taken from a seafloor area with bottom trawling marks were used for correspondence matching. D2-Net, a detect-and-describe VGG16-based network architecture designed for and tested on camera image correspondence, was fine-tuned for SSS image correspondence. Using a triplet margin ranking loss, the network was trained to simultaneously detect salient keypoints and produce similar descriptors for corresponding pixels and dissimilar descriptors for non-corresponding pixels. When evaluated on the nontrivial SSS image pairs in the test dataset, the best performing D2-Net based network outperformed the RootSIFT baseline in terms of number of detected keypoints, keypoint repeatability, and mean matching accuracy at pixel thresholds above 10.
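The triplet margin ranking objective described above can be sketched as follows. This is a minimal illustrative PyTorch implementation, not the thesis's exact formulation: the function name, the batch layout (row i of each descriptor matrix is assumed to be a corresponding pixel pair), and the hardest-in-batch negative mining strategy are all assumptions for the sake of the example.

```python
import torch

def triplet_margin_loss(desc_a, desc_b, margin=1.0):
    """Hedged sketch of a triplet margin ranking loss on descriptors.

    desc_a, desc_b: (N, D) descriptors of N corresponding pixels in two
    overlapping images; row i of desc_a matches row i of desc_b.
    """
    # Pairwise Euclidean distances between all descriptors, shape (N, N).
    dist = torch.cdist(desc_a, desc_b)
    n = dist.size(0)
    # Distance of each positive (corresponding) pair: the diagonal.
    pos = dist.diag()
    # Mask the diagonal, then take the hardest (closest) negative per
    # anchor, searching over both rows and columns.
    masked = dist + torch.eye(n) * 1e6
    hardest_neg = torch.minimum(masked.min(dim=1).values,
                                masked.min(dim=0).values)
    # Hinge: positives should be closer than negatives by at least `margin`.
    return torch.relu(margin + pos - hardest_neg).mean()
```

In D2-Net-style training this per-pair term is additionally weighted by the detection scores of the two pixels, so that keypoint detection and description are optimized jointly; the sketch above shows only the descriptor ranking part.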
