Learning Tactile Models for Factor Graph-based State Estimation

12/07/2020
by Paloma Sodhi, et al.

We address the problem of estimating object pose from touch during manipulation under occlusion. Vision-based tactile sensors provide rich, local measurements at the point of contact. A single such measurement, however, contains limited information, and multiple measurements are needed to infer latent object state. We solve this inference problem using a factor graph. To incorporate tactile measurements in the graph, we need local observation models that map high-dimensional tactile images onto a low-dimensional state space. Prior work has relied on low-dimensional force measurements or hand-designed functions to interpret tactile measurements; these approaches, however, can be brittle and difficult to scale across objects and sensors. Our key insight is to directly learn tactile observation models that predict the relative pose of the sensor given a pair of tactile images. These relative poses can then be incorporated as factors within a factor graph. We propose a two-stage approach: first, we learn local tactile observation models supervised with ground-truth data, and then we integrate these models, along with physics and geometric factors, within a factor graph optimizer. We demonstrate reliable object tracking using only tactile feedback for over 150 real-world planar pushing sequences with varying trajectories across three object shapes. Supplementary video: https://youtu.be/gp5fuIZTXMA
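
The abstract describes a two-stage pipeline: a learned observation model that regresses the relative sensor pose from a pair of tactile images, and a factor graph that fuses these predictions with other factors. The sketch below illustrates that structure under stated assumptions only; it uses PyTorch for the regressor and the GTSAM Python bindings as one possible factor-graph backend, and every name (TactileRelativePoseNet, add_tactile_factors, the noise sigmas, image sizes) is illustrative rather than the authors' implementation, which also includes physics and geometric factors not shown here.

```python
# Hypothetical sketch: a learned tactile observation model whose relative-pose
# predictions are added to a factor graph as between-factors.
# Assumes PyTorch and the GTSAM Python bindings; all names are illustrative.
import numpy as np
import torch
import torch.nn as nn
import gtsam


class TactileRelativePoseNet(nn.Module):
    """CNN mapping a pair of tactile images to a relative planar pose (dx, dy, dtheta)."""

    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, 3)  # (dx, dy, dtheta)

    def forward(self, img_t, img_tp1):
        # Stack the two grayscale tactile images along the channel dimension.
        x = torch.cat([img_t, img_tp1], dim=1)
        return self.head(self.encoder(x))


def add_tactile_factors(graph, model, tactile_images, sigmas=(0.01, 0.01, 0.05)):
    """Add a BetweenFactorPose2 between consecutive sensor poses for each image pair."""
    noise = gtsam.noiseModel.Diagonal.Sigmas(np.array(sigmas))
    model.eval()
    with torch.no_grad():
        for t in range(len(tactile_images) - 1):
            dx, dy, dtheta = model(tactile_images[t], tactile_images[t + 1])[0].tolist()
            graph.add(gtsam.BetweenFactorPose2(
                gtsam.symbol('x', t), gtsam.symbol('x', t + 1),
                gtsam.Pose2(dx, dy, dtheta), noise))


# Usage: combine the learned tactile factors with a pose prior and optimize.
graph = gtsam.NonlinearFactorGraph()
graph.add(gtsam.PriorFactorPose2(
    gtsam.symbol('x', 0), gtsam.Pose2(0, 0, 0),
    gtsam.noiseModel.Diagonal.Sigmas(np.array([1e-3, 1e-3, 1e-3]))))
images = [torch.zeros(1, 1, 64, 64) for _ in range(5)]  # placeholder tactile frames
add_tactile_factors(graph, TactileRelativePoseNet(), images)

initial = gtsam.Values()
for t in range(5):
    initial.insert(gtsam.symbol('x', t), gtsam.Pose2(0, 0, 0))
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
```

In this sketch the network is untrained and the images are placeholders; in the paper's setup the model would be trained with ground-truth relative poses before its predictions are trusted as factors alongside physics and geometric constraints.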

Related research

PatchGraph: In-hand tactile tracking with learned surface normals (11/15/2021)
Ground Encoding: Learned Factor Graph-based Models for Localizing Ground Penetrating Radar (03/29/2021)
FingerSLAM: Closed-loop Unknown Object Localization and Reconstruction from Visuo-tactile Feedback (03/14/2023)
Tactile SLAM: Real-time inference of shape and pose from planar pushing (11/13/2020)
LEO: Learning Energy-based Models in Graph Optimization (08/04/2021)
Joint Inference of Kinematic and Force Trajectories with Visuo-Tactile Sensing (03/08/2019)
Ground truth force distribution for learning-based tactile sensing: a finite element approach (09/09/2019)
