Download: PDF.
“GPU Accelerated Robust Scene Reconstruction” by W. Dong, J. Park, Y. Yang, and M. Kaess. In Proc. IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems, IROS, (Macao), Nov. 2019, pp. 7863-7870.
We propose a fast and accurate 3D reconstruction system that takes a sequence of RGB-D frames and produces a globally consistent camera trajectory and dense 3D geometry. We redesign the core modules of a state-of-the-art offline reconstruction pipeline to fully exploit the power of the GPU, introducing GPU-accelerated modules for RGB-D odometry, geometric feature extraction and matching, point cloud registration, volumetric integration, and mesh extraction. As a result, while reproducing the results of the high-fidelity offline reconstruction system, our system runs more than 10 times faster on average. It reaches nearly 10 Hz in medium-sized indoor scenes, making our offline system comparable in speed even to online Simultaneous Localization and Mapping (SLAM) systems. Experimental results show that our system produces more accurate results than several state-of-the-art online systems. The system is open source at https://github.com/theNded/Open3D.
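Among the modules listed above, volumetric integration fuses each incoming depth frame into a truncated signed distance function (TSDF) volume by a per-voxel weighted running average. The following is a minimal NumPy sketch of that update rule only; the function name, variable names, and volume layout are illustrative, and it is not the paper's GPU implementation (which lives in the linked Open3D fork).

```python
import numpy as np

def integrate_tsdf(tsdf, weight, depth, K, T_wc, voxel_size, trunc):
    """Fuse one depth frame into a TSDF volume (weighted running average).

    tsdf, weight: (N, N, N) arrays; depth: (H, W) metric depth image;
    K: 3x3 camera intrinsics; T_wc: 4x4 camera-to-world pose.
    Illustrative sketch: the volume is anchored at the world origin.
    """
    N = tsdf.shape[0]
    # Voxel centers in world coordinates.
    idx = np.indices((N, N, N)).reshape(3, -1).T
    pts_w = (idx + 0.5) * voxel_size
    # Transform voxel centers into the camera frame.
    T_cw = np.linalg.inv(T_wc)
    pts_c = pts_w @ T_cw[:3, :3].T + T_cw[:3, 3]
    z = pts_c[:, 2]
    # Project into the depth image.
    uv = pts_c @ K.T
    u = np.round(uv[:, 0] / z).astype(int)
    v = np.round(uv[:, 1] / z).astype(int)
    H, W = depth.shape
    ok = (z > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    d = np.zeros_like(z)
    d[ok] = depth[v[ok], u[ok]]
    sdf = d - z                       # positive in front of the surface
    ok &= (d > 0) & (sdf > -trunc)    # drop voxels far behind the surface
    tsdf_new = np.clip(sdf / trunc, -1.0, 1.0)
    # Weighted running average per observed voxel.
    flat = idx[ok, 0] * N * N + idx[ok, 1] * N + idx[ok, 2]
    t, w = tsdf.ravel(), weight.ravel()
    t[flat] = (t[flat] * w[flat] + tsdf_new[ok]) / (w[flat] + 1.0)
    w[flat] += 1.0
    return t.reshape(N, N, N), w.reshape(N, N, N)
```

After all frames are fused, a mesh can be extracted from the zero crossing of the TSDF (e.g. with marching cubes), which is what the mesh-extraction module does on the GPU.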
BibTeX entry:
@inproceedings{Dong19iros,
  author    = {W. Dong and J. Park and Y. Yang and M. Kaess},
  title     = {{GPU} Accelerated Robust Scene Reconstruction},
  booktitle = {Proc. IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems, IROS},
  pages     = {7863--7870},
  address   = {Macao},
  month     = nov,
  year      = {2019}
}