TY - GEN
T1 - fakeWeather: Adversarial Attacks for Deep Neural Networks Emulating Weather Conditions on the Camera Lens of Autonomous Systems
T2 - 2022 International Joint Conference on Neural Networks, IJCNN 2022
AU - Marchisio, Alberto
AU - Caramia, Giovanni
AU - Martina, Maurizio
AU - Shafique, Muhammad
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Recently, Deep Neural Networks (DNNs) have achieved remarkable performance in many applications, while several studies have exposed their vulnerability to malicious attacks. In this paper, we emulate the effects of natural weather conditions to introduce plausible perturbations that mislead DNNs. By observing the effects of such atmospheric perturbations on the camera lenses, we model the patterns to create different masks that fake the effects of rain, snow, and hail. Even though the perturbations introduced by our attacks are visible, their presence remains unnoticed due to their association with natural events, which can be especially catastrophic for fully-autonomous and unmanned vehicles. We test our proposed fakeWeather attacks on multiple Convolutional Neural Network and Capsule Network models, and report noticeable accuracy drops in the presence of such adversarial perturbations. Our work introduces a new security threat for DNNs, which is especially severe for safety-critical applications and autonomous systems.
AB - Recently, Deep Neural Networks (DNNs) have achieved remarkable performance in many applications, while several studies have exposed their vulnerability to malicious attacks. In this paper, we emulate the effects of natural weather conditions to introduce plausible perturbations that mislead DNNs. By observing the effects of such atmospheric perturbations on the camera lenses, we model the patterns to create different masks that fake the effects of rain, snow, and hail. Even though the perturbations introduced by our attacks are visible, their presence remains unnoticed due to their association with natural events, which can be especially catastrophic for fully-autonomous and unmanned vehicles. We test our proposed fakeWeather attacks on multiple Convolutional Neural Network and Capsule Network models, and report noticeable accuracy drops in the presence of such adversarial perturbations. Our work introduces a new security threat for DNNs, which is especially severe for safety-critical applications and autonomous systems.
KW - Adversarial Attacks
KW - Deep Neural Networks
KW - Hail
KW - Rain
KW - Snow
KW - Weather
UR - http://www.scopus.com/inward/record.url?scp=85140789733&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85140789733&partnerID=8YFLogxK
U2 - 10.1109/IJCNN55064.2022.9892612
DO - 10.1109/IJCNN55064.2022.9892612
M3 - Conference contribution
AN - SCOPUS:85140789733
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2022 International Joint Conference on Neural Networks, IJCNN 2022 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 18 July 2022 through 23 July 2022
ER -