Active Clothing Material Perception using Tactile Sensing and Deep Learning

11/02/2017
by Wenzhen Yuan, et al.

Humans discriminate and understand objects within a category by their properties, and an intelligent robot should be able to do the same. In this work, we propose a robot system that automatically perceives object properties through touch, focusing on the common object category of clothing. The robot moves under the guidance of an external Kinect sensor and squeezes the clothes with a GelSight tactile sensor; it then recognizes 11 properties of the clothing from the tactile data. The target properties include physical properties, such as thickness, fuzziness, softness, and endurance, as well as semantic properties, such as wearing season and preferred washing method. We collect a dataset of 153 varied pieces of clothing and perform 6616 exploration iterations on them. To extract useful information from the high-dimensional sensory output, we apply Convolutional Neural Networks (CNNs) to the tactile data for recognizing the clothing properties, and to the Kinect depth images for selecting exploration locations. Experiments show that, using the trained neural networks, the robot can autonomously explore unknown clothes and learn their properties. This work proposes a new architecture for active tactile perception that combines vision and touch, and has the potential to enable robots to help humans with a variety of clothing-related housework.
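The core of the perception pipeline described above, a CNN over GelSight tactile images that predicts several clothing properties at once, can be sketched roughly as follows. This is a minimal illustration in PyTorch, not the authors' implementation: the ResNet-18 backbone, the property names, and the per-property label counts are all assumptions made for the example.

    # Minimal sketch (not the authors' code): a CNN mapping a GelSight
    # tactile image to multiple clothing properties via shared features
    # and one classification head per property.
    import torch
    import torch.nn as nn
    from torchvision import models

    # Hypothetical property names and label counts; the paper's 11
    # properties and their discretizations would replace these.
    PROPERTIES = {
        "thickness": 5,
        "fuzziness": 4,
        "softness": 5,
        "endurance": 4,
    }

    class ClothingPropertyNet(nn.Module):
        """Shared CNN backbone with one classification head per property."""
        def __init__(self, properties):
            super().__init__()
            backbone = models.resnet18(weights=None)  # backbone is an assumption
            feat_dim = backbone.fc.in_features
            backbone.fc = nn.Identity()  # expose pooled features
            self.backbone = backbone
            self.heads = nn.ModuleDict({
                name: nn.Linear(feat_dim, n_classes)
                for name, n_classes in properties.items()
            })

        def forward(self, tactile_image):
            feats = self.backbone(tactile_image)
            # One set of logits per property, all from the shared features.
            return {name: head(feats) for name, head in self.heads.items()}

    model = ClothingPropertyNet(PROPERTIES)
    dummy = torch.randn(1, 3, 224, 224)  # a GelSight frame resized to 224x224
    logits = model(dummy)
    print({name: out.shape for name, out in logits.items()})

Per the abstract, an analogous CNN over Kinect depth images would score candidate exploration locations, with both networks trained on labels gathered from the 6616 exploration iterations.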

