Cooperative localization and mapping of MAVs using RGB-D sensors

Giuseppe Loianno, Justin Thomas, Vijay Kumar

Research output: Contribution to journal › Conference article

Abstract

The fusion of IMU and RGB-D sensors presents an interesting combination of information for achieving autonomous localization and mapping with robotic platforms such as ground robots and flying vehicles. In this paper, we present a software framework for cooperative localization and mapping with multiple aerial platforms operating simultaneously. We employ a monocular visual odometry algorithm for localization, where the depth data associated with the RGB image is used to estimate the scale factor of the visual information. The framework enables autonomous onboard control of each vehicle together with cooperative localization and mapping. We present a methodology that provides both a sparse map generated by the monocular SLAM and a multi-resolution dense map generated by the associated depth data. The localization algorithm and both 3D mapping algorithms run in parallel, improving the real-time reliability of the system. We present experimental results with two quadrotor platforms to show the effectiveness of the proposed approach.
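The abstract does not spell out how the depth data resolves the scale of the monocular visual odometry, so the following is only a minimal sketch of one common approach, under the assumption that the scale is recovered as a robust ratio between metric depths read from the RGB-D sensor and the up-to-scale depths triangulated by the monocular pipeline at matched features. The function name `estimate_scale`, the depth-range limits, and the example numbers are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def estimate_scale(vo_depths, sensor_depths, min_depth=0.3, max_depth=5.0):
    """Sketch of metric-scale recovery for monocular visual odometry.

    vo_depths     : up-to-scale depths of tracked features from the monocular VO
    sensor_depths : metric depths of the same features from the RGB-D depth image
    Returns the median ratio between the two, which is robust to outliers such as
    invalid depth returns or mismatched features (assumed approach, not the paper's).
    """
    vo_depths = np.asarray(vo_depths, dtype=float)
    sensor_depths = np.asarray(sensor_depths, dtype=float)

    # Keep only features with a valid metric depth inside the sensor's usable range.
    valid = (sensor_depths > min_depth) & (sensor_depths < max_depth) & (vo_depths > 0)
    if not np.any(valid):
        return None  # no reliable depth this frame; a previous scale estimate could be reused

    ratios = sensor_depths[valid] / vo_depths[valid]
    return float(np.median(ratios))


# Example usage: rescale an up-to-scale VO translation into metric units.
scale = estimate_scale(vo_depths=[1.2, 0.8, 2.5], sensor_depths=[1.8, 1.2, 3.7])
if scale is not None:
    t_metric = scale * np.array([0.05, 0.00, 0.01])  # VO translation, now in meters
```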

Original language: English (US)
Article number: 7139761
Pages (from-to): 4021-4028
Number of pages: 8
Journal: Proceedings - IEEE International Conference on Robotics and Automation
Volume: 2015-June
Issue number: June
DOIs
State: Published - Jun 29 2015
Event: 2015 IEEE International Conference on Robotics and Automation, ICRA 2015 - Seattle, United States
Duration: May 26 2015 - May 30 2015

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Artificial Intelligence
  • Electrical and Electronic Engineering
