Information Sparsification in Visual-Inertial Odometry

Download: PDF.

“Information Sparsification in Visual-Inertial Odometry” by J. Hsiung, M. Hsiao, E. Westman, R. Valencia, and M. Kaess. In Proc. IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems, IROS, (Madrid, Spain), Oct. 2018, pp. 1146-1153. Best conference paper finalist (one of six).

Abstract

In this paper, we present a novel approach to tightly couple visual and inertial measurements in a fixed-lag visual-inertial odometry (VIO) framework using information sparsification. To bound computational complexity, fixed-lag smoothers typically marginalize out variables, but consequently introduce a densely connected linear prior which significantly degrades accuracy and efficiency. Current state-of-the-art approaches address the issue by selectively discarding measurements and marginalizing additional variables. However, such strategies are sub-optimal from an information-theoretic perspective. Instead, our approach performs a dense marginalization step and preserves the information content of the dense prior. Our method sparsifies the dense prior with a nonlinear factor graph by minimizing the information loss. The resulting factor graph maintains information sparsity, structural similarity, and nonlinearity. To validate our approach, we conduct real-time drone tests and perform comparisons to current state-of-the-art fixed-lag VIO methods on the EuRoC visual-inertial dataset. The experimental results show that the proposed method achieves accuracy competitive with or superior to the state of the art in almost all trials. We include a detailed run-time analysis to demonstrate that the proposed algorithm is suitable for real-time applications.
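The two steps the abstract refers to, dense marginalization followed by a sparse approximation chosen to minimize information loss, can be illustrated with a short sketch. The Python snippet below is not the paper's implementation: it marginalizes old states with a Schur complement and then, as a stand-in for the paper's VIO-specific factor topology, projects the dense Gaussian prior onto a Chow-Liu tree, the tree structure that minimizes KL divergence to the dense prior. All function names here are illustrative.

    # Minimal sketch (assumed, not the paper's code) of dense marginalization
    # followed by a KL-minimizing tree-structured sparsification of the prior.
    import numpy as np

    def marginalize(Lambda, keep_idx, marg_idx):
        """Schur complement: information matrix of the kept variables after
        marginalizing out the others. This is the dense prior."""
        Lkk = Lambda[np.ix_(keep_idx, keep_idx)]
        Lkm = Lambda[np.ix_(keep_idx, marg_idx)]
        Lmm = Lambda[np.ix_(marg_idx, marg_idx)]
        return Lkk - Lkm @ np.linalg.solve(Lmm, Lkm.T)

    def chow_liu_sparsify(Lambda_dense):
        """KL-optimal tree-structured approximation of a zero-mean Gaussian.
        Edges are chosen by maximum mutual information, and the sparse
        information matrix is assembled from exact pairwise marginals, so the
        approximation matches the dense prior on every retained edge."""
        Sigma = np.linalg.inv(Lambda_dense)
        n = Sigma.shape[0]
        # Pairwise mutual information I(i;j) = -0.5 * log(1 - rho_ij^2).
        d = np.sqrt(np.diag(Sigma))
        rho = Sigma / np.outer(d, d)
        mi = -0.5 * np.log(np.clip(1.0 - rho**2, 1e-12, 1.0))
        # Greedy maximum spanning tree (Prim's algorithm) on mutual information.
        in_tree, edges = {0}, []
        while len(in_tree) < n:
            best = max(((i, j, mi[i, j]) for i in in_tree for j in range(n)
                        if j not in in_tree), key=lambda e: e[2])
            edges.append((best[0], best[1]))
            in_tree.add(best[1])
        # Assemble the sparse information matrix from edge and node marginals.
        Lambda_sparse = np.zeros_like(Sigma)
        deg = np.zeros(n)
        for i, j in edges:
            pair_info = np.linalg.inv(Sigma[np.ix_([i, j], [i, j])])
            Lambda_sparse[np.ix_([i, j], [i, j])] += pair_info
            deg[i] += 1
            deg[j] += 1
        for i in range(n):
            Lambda_sparse[i, i] -= (deg[i] - 1) / Sigma[i, i]
        return Lambda_sparse

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.standard_normal((8, 8))
        Lambda = A @ A.T + 8 * np.eye(8)          # a well-conditioned information matrix
        dense_prior = marginalize(Lambda, list(range(5)), [5, 6, 7])
        sparse_prior = chow_liu_sparsify(dense_prior)

In the paper the sparse topology is instead chosen to reflect the connectivity of the remaining VIO states, and the recovered factors are nonlinear rather than a fixed linear Gaussian prior; the sketch only shows why, for a given structure, matching the marginals on the retained edges minimizes the information lost in the sparsification.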

BibTeX entry:

@inproceedings{Hsiung18iros,
   author = {J. Hsiung and M. Hsiao and E. Westman and R. Valencia and M.
	Kaess},
   title = {Information Sparsification in Visual-Inertial Odometry},
   booktitle = {Proc. IEEE/RSJ Intl. Conf. on Intelligent Robots and
	Systems, IROS},
   pages = {1146-1153},
   address = {Madrid, Spain},
   month = oct,
   year = {2018},
   note = {Best conference paper finalist (one of six).}
}
Last updated: November 10, 2024