TY - GEN
T1 - Exponential Separations in Symmetric Neural Networks
AU - Zweig, Aaron
AU - Bruna, Joan
N1 - Publisher Copyright:
© 2022 Neural Information Processing Systems Foundation. All rights reserved.
PY - 2022
Y1 - 2022
N2 - In this work we demonstrate a novel separation between symmetric neural network architectures. Specifically, we consider the Relational Network [21] architecture as a natural generalization of the DeepSets [32] architecture, and study their representational gap. Under the restriction to analytic activation functions, we construct a symmetric function acting on sets of size N with elements in dimension D, which can be efficiently approximated by the former architecture, but provably requires width exponential in N and D for the latter.
AB - In this work we demonstrate a novel separation between symmetric neural network architectures. Specifically, we consider the Relational Network [21] architecture as a natural generalization of the DeepSets [32] architecture, and study their representational gap. Under the restriction to analytic activation functions, we construct a symmetric function acting on sets of size N with elements in dimension D, which can be efficiently approximated by the former architecture, but provably requires width exponential in N and D for the latter.
UR - http://www.scopus.com/inward/record.url?scp=85163183825&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85163183825&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85163183825
T3 - Advances in Neural Information Processing Systems
BT - Advances in Neural Information Processing Systems 35 - 36th Conference on Neural Information Processing Systems, NeurIPS 2022
A2 - Koyejo, S.
A2 - Mohamed, S.
A2 - Agarwal, A.
A2 - Belgrave, D.
A2 - Cho, K.
A2 - Oh, A.
PB - Neural Information Processing Systems Foundation
T2 - 36th Conference on Neural Information Processing Systems, NeurIPS 2022
Y2 - 28 November 2022 through 9 December 2022
ER -