TY - GEN
T1 - Robust Collaborative Perception without External Localization and Clock Devices
AU - Lei, Zixing
AU - Ni, Zhenyang
AU - Han, Ruize
AU - Tang, Shuo
AU - Feng, Chen
AU - Chen, Siheng
AU - Wang, Yanfeng
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Consistent spatial-temporal coordination across multiple agents is fundamental for collaborative perception, which seeks to improve perception abilities through information exchange among agents. To achieve this spatial-temporal alignment, traditional methods depend on external devices to provide localization and clock signals. However, hardware-generated signals can be vulnerable to noise and potentially malicious attacks, jeopardizing the precision of spatial-temporal alignment. Rather than relying on external hardware, this work proposes a novel approach: aligning by recognizing the inherent geometric patterns within the perceptual data of various agents. Following this spirit, we propose a robust collaborative perception system that operates independently of external localization and clock devices. The key module of our system, FreeAlign, constructs a salient object graph for each agent based on its detected boxes and uses a graph neural network to identify common subgraphs between agents, leading to accurate relative pose and time. We validate FreeAlign on both real-world and simulated datasets. The results show that the FreeAlign-empowered robust collaborative perception system performs comparably to systems relying on precise localization and clock devices. Code will be released.
AB - Consistent spatial-temporal coordination across multiple agents is fundamental for collaborative perception, which seeks to improve perception abilities through information exchange among agents. To achieve this spatial-temporal alignment, traditional methods depend on external devices to provide localization and clock signals. However, hardware-generated signals can be vulnerable to noise and potentially malicious attacks, jeopardizing the precision of spatial-temporal alignment. Rather than relying on external hardware, this work proposes a novel approach: aligning by recognizing the inherent geometric patterns within the perceptual data of various agents. Following this spirit, we propose a robust collaborative perception system that operates independently of external localization and clock devices. The key module of our system, FreeAlign, constructs a salient object graph for each agent based on its detected boxes and uses a graph neural network to identify common subgraphs between agents, leading to accurate relative pose and time. We validate FreeAlign on both real-world and simulated datasets. The results show that the FreeAlign-empowered robust collaborative perception system performs comparably to systems relying on precise localization and clock devices. Code will be released.
UR - http://www.scopus.com/inward/record.url?scp=85202446521&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85202446521&partnerID=8YFLogxK
U2 - 10.1109/ICRA57147.2024.10610635
DO - 10.1109/ICRA57147.2024.10610635
M3 - Conference contribution
AN - SCOPUS:85202446521
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 7280
EP - 7286
BT - 2024 IEEE International Conference on Robotics and Automation, ICRA 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 IEEE International Conference on Robotics and Automation, ICRA 2024
Y2 - 13 May 2024 through 17 May 2024
ER -