TY - GEN
T1 - A Game-Theoretic Analysis of Auditing Differentially Private Algorithms with Epistemically Disparate Herd
AU - Yang, Ya Ting
AU - Zhang, Tao
AU - Zhu, Quanyan
N1 - Publisher Copyright:
© 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2023
Y1 - 2023
AB - Privacy-preserving AI algorithms are widely adopted in various domains, but the lack of transparency might pose accountability issues. While auditing algorithms can address this issue, machine-based audit approaches are often costly and time-consuming. Herd audit, on the other hand, offers an alternative solution by harnessing collective intelligence. Nevertheless, the presence of epistemic disparity among auditors, resulting in varying levels of expertise and access to knowledge, may impact audit performance. An effective herd audit will establish a credible accountability threat for algorithm developers, incentivizing them to uphold their claims. In this study, our objective is to develop a systematic framework that examines the impact of herd audit on algorithm developers using the Stackelberg game approach. The optimal strategy for auditors emphasizes the importance of easy access to relevant information, as it increases the auditors’ confidence in the audit process. Similarly, the optimal choice for developers suggests that herd audit is viable when auditors face lower costs in acquiring knowledge. By enhancing transparency and accountability, herd audit contributes to the responsible development of privacy-preserving algorithms.
UR - http://www.scopus.com/inward/record.url?scp=85181977411&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85181977411&partnerID=8YFLogxK
DO - 10.1007/978-3-031-50670-3_18
M3 - Conference contribution
AN - SCOPUS:85181977411
SN - 9783031506697
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 349
EP - 368
BT - Decision and Game Theory for Security - 14th International Conference, GameSec 2023, Proceedings
A2 - Fu, Jie
A2 - Kroupa, Tomas
A2 - Hayel, Yezekael
PB - Springer Science and Business Media Deutschland GmbH
T2 - 14th International Conference on Decision and Game Theory for Security, GameSec 2023
Y2 - 18 October 2023 through 20 October 2023
ER -