PatchGraph: In-hand Tactile Tracking with Learned Surface Normals

Download: PDF.

“PatchGraph: In-hand Tactile Tracking with Learned Surface Normals” by P. Sodhi, M. Kaess, M. Mukadam, and S. Anderson. In Proc. IEEE Intl. Conf. on Robotics and Automation, ICRA, (Philadelphia, PA, USA), May 2022, pp. 2164-2170.

Abstract

We address the problem of tracking 3D object poses from touch during in-hand manipulation. Specifically, we look at tracking small objects using vision-based tactile sensors that provide high-dimensional tactile image measurements at the point of contact. While prior work has relied on a priori information about the object being localized, we remove this requirement. Our key insight is that an object is composed of several local surface patches, each informative enough to achieve reliable object tracking. Moreover, we can recover the geometry of this local patch online by extracting local surface normal information embedded in each tactile image. We propose a novel two-stage approach. First, we learn a mapping from tactile images to surface normals using an image translation network. Second, we use these surface normals within a factor graph to both reconstruct a local patch map and use it to infer 3D object poses. We demonstrate reliable object tracking for over 100 contact sequences across unique shapes with four objects in simulation and two objects in the real world.


BibTeX entry:

@inproceedings{Sodhi22icra,
   author = {P. Sodhi and M. Kaess and M. Mukadam and S. Anderson},
   title = {Patch{G}raph: In-hand Tactile Tracking with Learned Surface
	Normals},
   booktitle = {Proc. IEEE Intl. Conf. on Robotics and Automation, ICRA},
   pages = {2164--2170},
   address = {Philadelphia, PA, USA},
   month = may,
   year = {2022}
}
Last updated: November 10, 2024