Dexterity from Touch: Self-Supervised Pre-Training of Tactile Representations with Robotic Play

Irmak Guzey, Ben Evans, Soumith Chintala, Lerrel Pinto

Research output: Contribution to journal › Conference article › Peer-review

Abstract

Teaching dexterity to multi-fingered robots has been a longstanding challenge in robotics. Most prominent work in this area focuses on learning controllers or policies that either operate on visual observations or on state estimates derived from vision. However, such methods perform poorly on fine-grained manipulation tasks that require reasoning about contact forces or about objects occluded by the hand itself. In this work, we present T-DEX, a new approach for tactile-based dexterity that operates in two phases. In the first phase, we collect 2.5 hours of play data, which is used to train self-supervised tactile encoders. This step is necessary to map high-dimensional tactile readings to a lower-dimensional embedding. In the second phase, given a handful of demonstrations for a dexterous task, we learn non-parametric policies that combine the tactile observations with visual ones. Across five challenging dexterous tasks, we show that our tactile-based dexterity models outperform purely vision-based and torque-based models by an average of 1.7X. Finally, we provide a detailed analysis of factors critical to T-DEX, including the importance of play data, architectures, and representation learning.
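To make the second phase concrete, the sketch below shows one plausible form of a non-parametric policy that combines tactile and visual embeddings: retrieve the demonstration frame nearest to the current observation and replay its action. This is a minimal illustration only; the function names, array shapes, and the `alpha` weighting between modalities are assumptions for exposition, not the paper's actual interface.

```python
import numpy as np

# Minimal sketch of a non-parametric (nearest-neighbor) policy over
# concatenated tactile and visual embeddings, in the spirit of the
# abstract. All names and the distance weighting are illustrative
# assumptions, not the paper's API.

def nearest_neighbor_action(query_tactile, query_visual,
                            demo_tactile, demo_visual, demo_actions,
                            alpha=0.5):
    """Return the demonstration action closest to the query observation.

    query_tactile : (d_t,) tactile embedding of the current observation
    query_visual  : (d_v,) visual embedding of the current observation
    demo_tactile  : (N, d_t) tactile embeddings from demonstrations
    demo_visual   : (N, d_v) visual embeddings from demonstrations
    demo_actions  : (N, d_a) actions recorded in the demonstrations
    alpha         : assumed relative weight of tactile vs. visual distance
    """
    # Per-modality Euclidean distances to every demonstration frame.
    d_tac = np.linalg.norm(demo_tactile - query_tactile, axis=1)
    d_vis = np.linalg.norm(demo_visual - query_visual, axis=1)
    # Weighted combination; alpha trades off the two modalities.
    dist = alpha * d_tac + (1.0 - alpha) * d_vis
    return demo_actions[np.argmin(dist)]
```

A design note on this sketch: because the policy is non-parametric, it needs no task-specific training beyond the pre-trained encoders, which is consistent with the abstract's claim that only a handful of demonstrations are required per task.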

Original language: English (US)
Journal: Proceedings of Machine Learning Research
Volume: 229
State: Published - 2023
Event: 7th Conference on Robot Learning, CoRL 2023 - Atlanta, United States
Duration: Nov 6 2023 - Nov 9 2023

Keywords

  • Dexterity
  • Manipulation
  • Tactile

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
