Download: PDF.
“EDPLVO: Efficient Direct Point-Line Visual Odometry” by L. Zhou, G. Huang, Y. Mao, S. Wang, and M. Kaess. In Proc. IEEE Intl. Conf. on Robotics and Automation, ICRA, (Philadelphia, PA, USA), May 2022, pp. 7559-7565. Outstanding Navigation Paper Award.
This paper introduces an efficient direct visual odometry (VO) algorithm using points and lines. Direct methods commonly sample pixels on lines, but the original photometric error is defined only for points and is not straightforward to extend to lines. In previous works, the collinearity constraints for points on lines are either ignored [1] or introduce a heavy computational load into the resulting optimization [2]. This paper extends the photometric error to lines. We prove that the 3D points corresponding to the pixels on a 2D line are determined by the inverse depths of the endpoints of that line, and derive a closed-form solution for this relationship. This property significantly reduces the number of variables, which speeds up the optimization, and makes the collinearity constraint exactly satisfied. Furthermore, we introduce a two-step method to further accelerate the optimization and prove its convergence. Experimental results show that our algorithm outperforms state-of-the-art direct VO algorithms.
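The key property above, that the 3D points of all pixels on a 2D line are fixed by the inverse depths of the line's two endpoints, can be illustrated with a short worked derivation. The sketch below is not taken from the paper; it assumes normalized homogeneous image coordinates (last component 1), writes rho for inverse depth, and introduces the symbols alpha, beta, s purely for illustration.

% Hedged sketch (assumptions as stated above, not the paper's exact derivation):
% the inverse depth of any pixel on the projected line is an affine combination
% of the two endpoint inverse depths.
\begin{align*}
\mathbf{x} &= \alpha\,\mathbf{x}_1 + \beta\,\mathbf{x}_2, \qquad \alpha + \beta = 1,
  && \text{pixel on the 2D line,}\\
\frac{\mathbf{x}}{\rho} &= (1-s)\,\frac{\mathbf{x}_1}{\rho_1} + s\,\frac{\mathbf{x}_2}{\rho_2},
  && \text{its back-projection lies on the 3D line,}\\
\frac{\alpha}{\rho} &= \frac{1-s}{\rho_1}, \quad \frac{\beta}{\rho} = \frac{s}{\rho_2},
  && \text{matching coefficients of } \mathbf{x}_1, \mathbf{x}_2,\\
1 &= (1-s) + s = \frac{\alpha\,\rho_1 + \beta\,\rho_2}{\rho}
  \;\Longrightarrow\; \rho = \alpha\,\rho_1 + \beta\,\rho_2.
\end{align*}

Under these assumptions, once the endpoint inverse depths rho_1 and rho_2 are known, the inverse depth rho (and hence the 3D point x/rho) of every sampled pixel on the line follows in closed form, which is why only the two endpoint inverse depths need to enter the optimization.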
BibTeX entry:
@inproceedings{Zhou22icra,
  author    = {L. Zhou and G. Huang and Y. Mao and S. Wang and M. Kaess},
  title     = {{EDPLVO}: Efficient Direct Point-Line Visual Odometry},
  booktitle = {Proc. IEEE Intl. Conf. on Robotics and Automation, ICRA},
  pages     = {7559-7565},
  address   = {Philadelphia, PA, USA},
  month     = may,
  year      = {2022},
  note      = {Outstanding Navigation Paper Award.}
}