TY - GEN
T1 - Vision-based fast navigation of micro aerial vehicles
AU - Loianno, Giuseppe
AU - Kumar, Vijay
N1 - Publisher Copyright:
© 2016 SPIE.
PY - 2016
Y1 - 2016
N2 - We address the key challenges of autonomous fast flight for Micro Aerial Vehicles (MAVs) in 3-D, cluttered environments. For complete autonomy, the system must estimate the vehicle's state at high rates from absolute or relative asynchronous on-board sensor measurements, use these state estimates for feedback control, and plan trajectories to the destination. State estimation requires fusing information from multiple, possibly asynchronous sensors operating at different rates. In this work, we present techniques for planning, control, and visual-inertial state estimation for fast navigation of MAVs. We demonstrate how to solve the pose estimation, control, and planning problems on-board, on a small computational unit, using a minimal sensor suite for autonomous navigation composed of a single camera and an IMU. Additionally, we show that a consumer electronic device such as a smartphone can alternatively be employed for both sensing and computation. Experimental results validate the proposed techniques. Any consumer equipped with a smartphone can autonomously fly a quadrotor platform at high speed, without GPS, while concurrently building 3-D maps, using a suitably designed app.
AB - We address the key challenges of autonomous fast flight for Micro Aerial Vehicles (MAVs) in 3-D, cluttered environments. For complete autonomy, the system must estimate the vehicle's state at high rates from absolute or relative asynchronous on-board sensor measurements, use these state estimates for feedback control, and plan trajectories to the destination. State estimation requires fusing information from multiple, possibly asynchronous sensors operating at different rates. In this work, we present techniques for planning, control, and visual-inertial state estimation for fast navigation of MAVs. We demonstrate how to solve the pose estimation, control, and planning problems on-board, on a small computational unit, using a minimal sensor suite for autonomous navigation composed of a single camera and an IMU. Additionally, we show that a consumer electronic device such as a smartphone can alternatively be employed for both sensing and computation. Experimental results validate the proposed techniques. Any consumer equipped with a smartphone can autonomously fly a quadrotor platform at high speed, without GPS, while concurrently building 3-D maps, using a suitably designed app.
KW - Navigation
KW - UAVs
KW - Visual localization
UR - http://www.scopus.com/inward/record.url?scp=84991491453&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84991491453&partnerID=8YFLogxK
U2 - 10.1117/12.2224520
DO - 10.1117/12.2224520
M3 - Conference contribution
AN - SCOPUS:84991491453
T3 - Proceedings of SPIE - The International Society for Optical Engineering
BT - Micro- and Nanotechnology Sensors, Systems, and Applications VIII
A2 - George, Thomas
A2 - Dutta, Achyut K.
A2 - Islam, M. Saif
PB - SPIE
T2 - Micro- and Nanotechnology Sensors, Systems, and Applications VIII Conference
Y2 - 17 April 2016 through 21 April 2016
ER -