TY - JOUR
T1 - Tracking and Relative Localization of Drone Swarms with a Vision-Based Headset
AU - Pavliv, Maxim
AU - Schiano, Fabrizio
AU - Reardon, Christopher
AU - Floreano, Dario
AU - Loianno, Giuseppe
N1 - Funding Information:
Manuscript received October 15, 2020; accepted December 13, 2020. Date of publication January 14, 2021; date of current version February 24, 2021. This letter was recommended for publication by Associate Editor B. Duncan and Editor P. Pounds upon evaluation of the reviewers' comments. This work was supported in part by Qualcomm Research, in part by the ARL Grant DCIST CRA W911NF-17-2-0181, in part by the Technology Innovation Institute, Nokia, and NYU Wireless, and in part by the Swiss National Science Foundation (SNSF) under Grant 200021-155907. (Corresponding author: Giuseppe Loianno.) Maxim Pavliv is with the New York University Tandon School of Engineering, Brooklyn, NY 11201 USA, and also with the École Polytechnique Fédérale de Lausanne, Route Cantonale, 1015 Lausanne, Switzerland (e-mail: mp5516@nyu.edu).
Publisher Copyright:
© 2021 IEEE.
PY - 2021/4
Y1 - 2021/4
N2 - We address the detection, tracking, and relative localization of the agents of a drone swarm from a human perspective, using a headset equipped with a single camera and an Inertial Measurement Unit (IMU). We train and deploy a deep neural network detector on image data to detect the drones. A Joint Probabilistic Data Association Filter resolves ambiguities among the detections and couples this information with the headset IMU data to track the agents. To estimate the drones' relative 3D poses with respect to the human, we use an additional deep neural network that processes the image regions of the drones provided by the tracker. Finally, to speed up the training of the deep neural networks, we introduce an automated labeling process that relies on a motion capture system. Several experimental results validate the effectiveness of the proposed approach. The approach runs in real time, does not rely on any communication between the human and the drones, and scales to large numbers of agents, often called swarms. It can be used to spatially task a swarm of drones and can also be employed, without a headset, for formation control and coordination of terrestrial vehicles.
AB - We address the detection, tracking, and relative localization of the agents of a drone swarm from a human perspective, using a headset equipped with a single camera and an Inertial Measurement Unit (IMU). We train and deploy a deep neural network detector on image data to detect the drones. A Joint Probabilistic Data Association Filter resolves ambiguities among the detections and couples this information with the headset IMU data to track the agents. To estimate the drones' relative 3D poses with respect to the human, we use an additional deep neural network that processes the image regions of the drones provided by the tracker. Finally, to speed up the training of the deep neural networks, we introduce an automated labeling process that relies on a motion capture system. Several experimental results validate the effectiveness of the proposed approach. The approach runs in real time, does not rely on any communication between the human and the drones, and scales to large numbers of agents, often called swarms. It can be used to spatially task a swarm of drones and can also be employed, without a headset, for formation control and coordination of terrestrial vehicles.
KW - Aerial systems: applications
KW - human-centered robotics
KW - localization
UR - http://www.scopus.com/inward/record.url?scp=85099723615&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85099723615&partnerID=8YFLogxK
U2 - 10.1109/LRA.2021.3051565
DO - 10.1109/LRA.2021.3051565
M3 - Article
AN - SCOPUS:85099723615
VL - 6
SP - 1455
EP - 1462
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
SN - 2377-3766
IS - 2
M1 - 9324934
ER -