Tactile-Filter: Interactive Tactile Perception for Part Mating

by Kei Ota et al.

Humans rely heavily on touch for dexterous manipulation: tactile sensing conveys rich information about contact formations and about object geometry during interaction. Motivated by this, vision-based tactile sensors are now widely used for robotic perception and control. In this paper, we present a method for interactive perception with vision-based tactile sensors for multi-object assembly. In particular, we study tactile perception during part mating, where a robot uses tactile sensing and a particle-filter feedback loop to incrementally refine its estimate of which objects fit together for assembly. To this end, we first train a deep neural network that uses tactile images to predict a probabilistic correspondence between arbitrarily shaped objects that fit together. The trained model is used to design a particle filter that serves two purposes. First, given one partial (or non-unique) observation of the hole, it incrementally improves the estimate of the correct peg by sampling additional tactile observations. Second, it selects the robot's next action, i.e., the next touch (and thus tactile image) that maximally reduces uncertainty, so as to minimize the number of interactions required during perception. We evaluate our method on several part-mating tasks for assembly using a robot equipped with a vision-based tactile sensor, and show that the proposed action selection is more efficient than a naive strategy. See the supplementary video at https://www.youtube.com/watch?v=jMVBg_e3gLw .
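The filtering loop described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the discrete belief over candidate pegs, the tabulated observation models standing in for the trained network's likelihoods, and the helper names (`update_belief`, `select_next_touch`) are all assumptions. The next touch is chosen by minimizing expected posterior entropy, one common way to formalize "maximum uncertainty reduction."

```python
import numpy as np

def update_belief(belief, likelihoods):
    """Bayesian update: multiply the prior belief by observation likelihoods."""
    posterior = belief * likelihoods
    return posterior / posterior.sum()

def entropy(belief):
    """Shannon entropy (in nats) of a discrete belief."""
    p = belief[belief > 0]
    return -np.sum(p * np.log(p))

def select_next_touch(belief, obs_models):
    """Pick the touch action whose expected posterior entropy is lowest.

    obs_models[a][o, k] = P(observation o | peg k, touch action a) -- a
    hypothetical discretization of the likelihoods a trained network provides.
    """
    best_action, best_expected_h = None, np.inf
    for a, model in enumerate(obs_models):
        p_obs = model @ belief  # P(o | a) = sum_k P(o | k, a) * belief[k]
        expected_h = 0.0
        for o in range(model.shape[0]):
            if p_obs[o] == 0:
                continue
            expected_h += p_obs[o] * entropy(update_belief(belief, model[o]))
        if expected_h < best_expected_h:
            best_action, best_expected_h = a, expected_h
    return best_action

# Toy example: 3 candidate pegs, 2 candidate touch locations.
belief = np.array([1 / 3, 1 / 3, 1 / 3])
obs_models = [
    np.array([[0.5, 0.5, 0.5], [0.5, 0.5, 0.5]]),  # uninformative touch
    np.array([[0.9, 0.1, 0.1], [0.1, 0.9, 0.9]]),  # informative touch
]
action = select_next_touch(belief, obs_models)  # selects the informative touch
```

The loop then executes the chosen touch, updates the belief with `update_belief` using the observed tactile image's likelihoods, and repeats until the belief concentrates on one peg.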

