TY - GEN
T1 - Adaptive Adversarial Videos on Roadside Billboards
T2 - 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2019
AU - Patel, Naman
AU - Krishnamurthy, Prashanth
AU - Garg, Siddharth
AU - Khorrami, Farshad
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/11
Y1 - 2019/11
N2 - Deep neural networks (DNNs) are being incorporated into various autonomous systems like self-driving cars and robots. However, there is a rising concern about the robustness of these systems because of their susceptibility to adversarial attacks on DNNs. Past research has established that DNNs used for classification and object detection are prone to attacks causing targeted misclassification. In this paper, we show the effectiveness of a dynamic adversarial attack on an end-to-end trained DNN controlling an autonomous vehicle. We launch the attack by installing a billboard on the roadside and displaying videos to approaching vehicles so that the DNN controller in the vehicle generates steering commands that produce, for example, unintended lane changes or motion off the road, leading to accidents. The billboard has an integrated camera estimating the pose of the oncoming vehicle. The approach enables dynamic adversarial perturbation that adapts to the relative pose of the vehicle and uses the dynamics of the vehicle to steer it along adversary-chosen trajectories while being robust to variations in view, lighting, and weather. We demonstrate the effectiveness of the attack on a recently published off-the-shelf end-to-end learning-based autonomous navigation system in a high-fidelity simulator, CARLA (CAR Learning to Act). The proposed approach may also be applied to other systems driven by an end-to-end trained network.
UR - http://www.scopus.com/inward/record.url?scp=85081157283&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85081157283&partnerID=8YFLogxK
U2 - 10.1109/IROS40897.2019.8968267
DO - 10.1109/IROS40897.2019.8968267
M3 - Conference contribution
AN - SCOPUS:85081157283
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 5916
EP - 5921
BT - 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2019
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 3 November 2019 through 8 November 2019
ER -