Download: PDF.
“SONIC: Sonar Image Correspondence using Pose Supervised Learning for Imaging Sonars” by S. Gode, A. Hinduja, and M. Kaess. In Proc. IEEE Intl. Conf. on Robotics and Automation, ICRA, (Yokohama, Japan), May 2024, pp. 3766-3772.
In this paper, we address the challenging problem of data association for underwater SLAM through a novel method for sonar image correspondence using learned features. We introduce SONIC (SONar Image Correspondence), a pose-supervised network designed to yield robust feature correspondences capable of withstanding viewpoint variations. The inherent complexity of the underwater environment stems from dynamic and frequently limited visibility conditions, restricting vision to a few meters of often featureless expanses. This makes camera-based systems suboptimal in most open-water application scenarios. Consequently, multibeam imaging sonars emerge as the preferred perception sensors. However, they too are not without their limitations. While imaging sonars offer superior long-range visibility compared to cameras, their measurements can appear different from varying viewpoints. This inherent variability presents formidable challenges for data association, particularly for feature-based methods. Our method demonstrates significantly better performance in generating correspondences for sonar images, paving the way for more accurate loop closure constraints and sonar-based place recognition. Code as well as simulated and real-world datasets are made public at https://github.com/rpl-cmu/sonic to facilitate further development in the field.
BibTeX entry:
@inproceedings{Gode24icra,
  author    = {S. Gode and A. Hinduja and M. Kaess},
  title     = {{SONIC}: Sonar Image Correspondence using Pose Supervised Learning for Imaging Sonars},
  booktitle = {Proc. IEEE Intl. Conf. on Robotics and Automation, ICRA},
  pages     = {3766-3772},
  address   = {Yokohama, Japan},
  month     = may,
  year      = {2024}
}