Mallory Lindahl | Monday, July 22, 2024
Ji Zhang, a systems scientist in the Robotics Institute (RI), received the Robotics: Science and Systems (RSS) 2024 Test of Time Award for his work on LOAM: Lidar Odometry and Mapping in Real-Time.
RSS introduced the Test of Time Award to acknowledge papers published at least 10 years ago that had the greatest impact on robotics design or new approaches to problem solving. The award aims to cultivate discussion about robotics developments by reflecting on the past and looking forward to future endeavors in the field. As this year's awardee, Zhang presented a virtual keynote during the conference and led a Test of Time panel session devoted to his work on LOAM. RSS was held July 15–19 in the Netherlands.
Zhang's work on LOAM began in 2013, when he joined his Ph.D. advisor, Sanjiv Singh, on an agricultural robotics project. Their goal was to use simultaneous localization and mapping (SLAM) to automate an electric utility vehicle and have it drive between rows of trees in orchards. To achieve this goal, Singh and Zhang converted a 2D lidar into a 3D sensor by mounting it on a spinning mechanism that swept the surrounding environment.
The next year, their first LOAM paper appeared at RSS, presenting a real-time method for odometry and mapping that estimates the sensor's motion in six degrees of freedom using a two-axis lidar. The method addressed the challenge of real-time operation by splitting the problem in two: a high-frequency step that tracks the lidar's motion, and a lower-frequency step that registers the resulting point cloud into a consistent map of the full 360-degree surroundings. The paper has remained foundational to SLAM research, leading to its Test of Time Award.
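A central step in the paper is selecting geometric feature points by a local smoothness measure: points on sharp edges score high, points on planar patches score low, and the two sets are used to constrain the motion estimate. The sketch below illustrates that classification step in Python under simple assumptions (an ordered scan line of 3D points); the function names, thresholds, and neighborhood size `k` are illustrative placeholders, not the authors' implementation.

```python
import numpy as np

def scan_smoothness(points: np.ndarray, k: int = 5) -> np.ndarray:
    """Smoothness value for each point in one ordered lidar scan line.

    points: (N, 3) array of points, ordered by scan angle.
    k: number of neighbors used on each side of a point.

    Mirrors the curvature measure described in the LOAM paper:
    high values indicate sharp edges, low values planar patches.
    """
    n = len(points)
    c = np.full(n, np.nan)  # endpoints lack a full neighborhood
    for i in range(k, n - k):
        # Sum of difference vectors (X_i - X_j) over the 2k neighbors.
        diff = (2 * k) * points[i] \
            - points[i - k:i].sum(axis=0) \
            - points[i + 1:i + k + 1].sum(axis=0)
        c[i] = np.linalg.norm(diff) / (2 * k * np.linalg.norm(points[i]))
    return c

def classify_features(points, k=5, edge_thresh=0.1, plane_thresh=1e-3):
    """Split a scan into edge and planar points (thresholds are illustrative)."""
    c = scan_smoothness(points, k)
    valid = ~np.isnan(c)
    edges = points[valid & (c > edge_thresh)]
    planes = points[valid & (c < plane_thresh)]
    return edges, planes
```

In the paper's pipeline, edge points are matched to edge lines and planar points to planar patches from the previous sweep, and those correspondences constrain the six-degree-of-freedom motion estimate.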
Zhang has continued to develop, improve and refine the approach over the years. He is currently working with Wenshan Wang, also an RI systems scientist, to extend the fully built system toward the intersection of computer vision, natural language understanding and autonomous navigation. The pair is developing high-level AI modules that understand environmental scenes and natural language while autonomously guiding the robot.
"We're trying to build a high-level AI module that can understand people's natural language and understand the scene with camera vision technologies to guide the vehicle to navigate autonomously," Zhang said.
The 2014 paper is available on the RI website, where you can also read the full story about the award.
Aaron Aupperlee | 412-268-9068 | aaupperlee@cmu.edu