Physical Adversarial Attacks for Camera-Based Smart Systems: Current Trends, Categorization, Applications, Research Challenges, and Future Outlook

Amira Guesmi, Muhammad Abdullah Hanif, Bassem Ouni, Muhammad Shafique

Research output: Contribution to journal › Article › peer-review


Deep Neural Networks (DNNs) have shown impressive performance in computer vision tasks; however, their vulnerability to adversarial attacks raises concerns regarding their security and reliability. Extensive research has shown that DNNs can be compromised by carefully crafted perturbations, leading to significant performance degradation in both digital and physical domains. Therefore, ensuring the security of DNN-based systems is crucial, particularly in safety-critical domains such as autonomous driving, robotics, smart homes/cities, smart industries, video surveillance, and healthcare. In this paper, we present a comprehensive survey of current trends, focusing specifically on physical adversarial attacks. We aim to provide a thorough understanding of the concept of physical adversarial attacks, analyzing their key characteristics and distinguishing features. Furthermore, we explore the specific requirements and challenges associated with executing attacks in the physical world. Our article delves into various physical adversarial attack methods, categorized according to their target tasks in different applications, including classification, detection, face recognition, semantic segmentation, and depth estimation. We assess the performance of these attack methods in terms of their effectiveness, stealthiness, and robustness. We examine how each technique strives to ensure the successful manipulation of DNNs while mitigating the risk of detection and withstanding real-world distortions. Lastly, we discuss the current challenges and outline potential future research directions in the field of physical adversarial attacks. We highlight the need for enhanced defense mechanisms, the exploration of novel attack strategies, the evaluation of attacks in different application domains, and the establishment of standardized benchmarks and evaluation criteria for physical adversarial attacks.
Through this comprehensive survey, we aim to provide a valuable resource for researchers, practitioners, and policymakers to gain a holistic understanding of physical adversarial attacks in computer vision and facilitate the development of robust and secure DNN-based systems.
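The abstract notes that physical attacks must withstand real-world distortions, and the keyword list names Expectation over Transformation (EOT), a standard way to achieve this: instead of fooling the model on one fixed input, the perturbation is optimized against the *expected* model output over a distribution of random transformations (brightness changes, viewpoint shifts, noise). A minimal sketch of that idea follows; the linear scorer, the weights, and the brightness-scale transformation are all illustrative assumptions standing in for a real DNN and a real distortion model, not anything specified in the survey.

```python
import random

# Hypothetical linear scorer standing in for a DNN logit (assumption:
# a toy stand-in so the sketch stays self-contained and runnable).
WEIGHTS = [0.5, -0.2, 0.8]

def model_score(x):
    """Positive score = original class; the attack aims to flip the sign."""
    return sum(xi * wi for xi, wi in zip(x, WEIGHTS))

def eot_attack(x, steps=100, lr=0.05, samples=8, seed=0):
    """Expectation over Transformation (EOT): optimize a perturbation whose
    effect survives a distribution of physical distortions (here, a random
    brightness scale drawn from [0.8, 1.2])."""
    rng = random.Random(seed)
    delta = [0.0] * len(x)
    for _ in range(steps):
        # Average the score gradient over sampled transformations; for a
        # linear scorer under scaling, d(score)/d(delta_i) = scale * w_i.
        grad = [0.0] * len(x)
        for _ in range(samples):
            scale = rng.uniform(0.8, 1.2)
            for i, w in enumerate(WEIGHTS):
                grad[i] += scale * w / samples
        # Gradient descent drives the *expected* score negative, so the
        # attack works on average across the transformation distribution.
        delta = [d - lr * g for d, g in zip(delta, grad)]
    return delta
```

The key design point is that the gradient is averaged over sampled transformations before each update, so the resulting perturbation is not tuned to any single viewing condition; this is the same principle that makes printed patches and adversarial t-shirts effective across distances and lighting.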

Original language: English (US)
Pages (from-to): 109617-109668
Number of pages: 52
Journal: IEEE Access
State: Published - 2023

Keywords
  • Machine learning security
  • adversarial eyeglasses
  • adversarial makeup
  • adversarial masks
  • adversarial t-shirts
  • camera-based vision system robustness
  • camouflage techniques
  • computer vision
  • expectation over transformation
  • face recognition
  • image classification
  • imaging device manipulation
  • light manipulation-based attacks
  • monocular depth estimation
  • object detection
  • optical flow
  • patch-based attacks
  • person detection
  • person re-identification
  • physical adversarial attacks
  • physical adversarial prints
  • security
  • semantic segmentation
  • smart systems
  • stealthy physical attacks
  • sticker-based attacks
  • trustworthy AI
  • vehicle detection

ASJC Scopus subject areas

  • General Computer Science
  • General Materials Science
  • General Engineering

