Real-Time Soft Body 3D Proprioception via Deep Vision-Based Sensing

Ruoyu Wang, Shiheng Wang, Songyu Du, Erdong Xiao, Wenzhen Yuan, Chen Feng

Research output: Contribution to journal › Article › peer-review

Abstract

Soft bodies made from flexible and deformable materials are popular in many robotics applications, but their proprioceptive sensing has been a long-standing challenge. In other words, there has hardly been a method to measure and model the high-dimensional 3D shapes of soft bodies with internal sensors. We propose a framework to measure the high-resolution 3D shapes of soft bodies in real-time with embedded cameras. The cameras capture visual patterns inside a soft body, and a convolutional neural network (CNN) produces a latent code representing the deformation state, which can then be used to reconstruct the body's 3D shape using another neural network. We test the framework on various soft bodies, such as a Baymax-shaped toy, a latex balloon, and some soft robot fingers, and achieve real-time computation (≤2.5 ms/frame) for robust shape estimation with high precision (≤1% relative error) and high resolution. We believe the method could be applied to soft robotics and human-robot interaction for proprioceptive shape sensing. Our code is available at: https://ai4ce.github.io/DeepSoRo/.
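The pipeline described in the abstract (internal camera image → CNN encoder → latent code → decoder → 3D point cloud) can be sketched as follows. This is a minimal NumPy illustration of the data flow only: the single-convolution "CNN", the linear decoder, and all layer sizes are assumptions for illustration, not the paper's actual architecture.

```python
# Illustrative sketch of a DeepSoRo-style shape-sensing pipeline:
# internal camera frame -> encoder -> latent code z -> decoder -> N x 3 points.
# All shapes and layers here are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(img, kernel):
    """Naive single-channel 'valid' 2D convolution (illustrative, not fast)."""
    H, W = img.shape
    k = kernel.shape[0]
    out = np.empty((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + k, j:j + k] * kernel)
    return out

def encode(img, kernel, W_enc):
    """Tiny stand-in for the CNN encoder: conv -> ReLU -> flatten -> linear."""
    feat = np.maximum(conv2d_valid(img, kernel), 0.0)
    return feat.ravel() @ W_enc  # latent code z for the deformation state

def decode(z, W_dec):
    """Stand-in decoder: linear map from latent code to an N x 3 point cloud."""
    return (z @ W_dec).reshape(-1, 3)

# Toy dimensions: 16x16 grayscale internal-camera frame, 8-D latent, 64 points.
img = rng.random((16, 16))
kernel = rng.standard_normal((3, 3))
W_enc = rng.standard_normal((14 * 14, 8))   # 14 = 16 - 3 + 1 after 'valid' conv
W_dec = rng.standard_normal((8, 64 * 3))

z = encode(img, kernel, W_enc)
points = decode(z, W_dec)
print(z.shape, points.shape)  # → (8,) (64, 3)
```

In the actual system, both mappings would be learned networks trained on paired camera frames and ground-truth 3D shapes; the sketch only shows how a low-dimensional latent code mediates between the high-dimensional image and the reconstructed point cloud.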

Original language: English (US)
Article number: 9006921
Pages (from-to): 3382-3389
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 5
Issue number: 2
DOIs
State: Published - Apr 2020

Keywords

  • 3D deep learning
  • Modeling, control, and learning for soft robots
  • Deep learning in robotics and automation

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Biomedical Engineering
  • Human-Computer Interaction
  • Mechanical Engineering
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
  • Control and Optimization
  • Artificial Intelligence
