“Long-term Visual Map Sparsification with Heterogeneous GNN” by M.-F. Chang, Y. Zhao, R. Shah, J.J. Engel, M. Kaess, and S. Lucey. In Proc. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), (New Orleans, LA, USA), June 2022, pp. 2406-2415.
We address the problem of map sparsification for long-term visual localization. Map sparsification commonly assumes that the pre-built map and the localization queries captured later are consistent. However, this assumption is easily violated in a dynamic world. Additionally, the map size grows as new data accumulates over time, causing large data overhead in the long term. In this paper, we aim to overcome environmental changes and reduce the map size at the same time by selecting points that are valuable for future localization. Inspired by recent progress in Graph Neural Networks (GNNs), we propose the first work that models SfM maps as heterogeneous graphs and predicts 3D point importance scores with a GNN, which enables us to directly exploit the rich information in the SfM map graph. Two novel supervisions are proposed: 1) a data-fitting term for selecting points valuable to future localization, based on training queries; 2) a K-Cover term for selecting sparse points with full-map coverage. Experiments show that our method selects map points on stable and widely visible structures and outperforms baselines in localization performance.
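For illustration only, not the authors' released code: a minimal sketch of how an SfM map could be represented as a heterogeneous graph and scored with a GNN, assuming PyTorch Geometric. The node and edge type names, feature dimensions, and the build_sfm_graph helper are hypothetical; the paper's data-fitting and K-Cover supervision terms are not shown here.

# Hypothetical sketch: 3D points and keyframe images as node types,
# point-image visibility as edge types, and a GNN head that outputs a
# per-point importance score for sparsification.
import torch
from torch_geometric.data import HeteroData
from torch_geometric.nn import SAGEConv, to_hetero

def build_sfm_graph(point_feats, image_feats, visibility):
    # point_feats: [P, Dp], image_feats: [I, Di]
    # visibility: [2, E] with row 0 = point index, row 1 = image index
    data = HeteroData()
    data['point'].x = point_feats
    data['image'].x = image_feats
    data['point', 'seen_by', 'image'].edge_index = visibility
    data['image', 'observes', 'point'].edge_index = visibility.flip(0)
    return data

class PointScorer(torch.nn.Module):
    # Small message-passing backbone; to_hetero replicates it per edge type.
    def __init__(self, hidden=64):
        super().__init__()
        self.conv1 = SAGEConv((-1, -1), hidden)
        self.conv2 = SAGEConv((-1, -1), hidden)

    def forward(self, x, edge_index):
        x = self.conv1(x, edge_index).relu()
        return self.conv2(x, edge_index)

# Toy example with random features and random visibility edges.
P, I, E = 1000, 50, 4000
visibility = torch.stack([torch.randint(0, P, (E,)), torch.randint(0, I, (E,))])
data = build_sfm_graph(torch.randn(P, 8), torch.randn(I, 16), visibility)

encoder = to_hetero(PointScorer(), data.metadata(), aggr='sum')
head = torch.nn.Linear(64, 1)

out = encoder(data.x_dict, data.edge_index_dict)
scores = torch.sigmoid(head(out['point'])).squeeze(-1)  # [P] importance scores

In a sketch like this, the scores would be trained with the two supervision terms described above and then thresholded (or top-k selected) to keep a sparse, well-covering subset of map points.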
Download: PDF.
BibTeX entry:
@inproceedings{Chang22cvpr,
  author    = {M.-F. Chang and Y. Zhao and R. Shah and J.J. Engel and M. Kaess and S. Lucey},
  title     = {Long-term Visual Map Sparsification with Heterogeneous {GNN}},
  booktitle = {Proc. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR)},
  pages     = {2406--2415},
  address   = {New Orleans, LA, USA},
  month     = jun,
  year      = {2022}
}