TY - JOUR
T1 - On the equivalence between graph isomorphism testing and function approximation with GNNs
AU - Chen, Zhengdao
AU - Villar, Soledad
AU - Chen, Lei
AU - Bruna, Joan
N1 - Funding Information:
Acknowledgements We would like to thank Haggai Maron and Thomas Kipf for fruitful discussions and for pointing us towards G-invariant networks as powerful models to study representational power in graphs. We thank Prof. Michael M. Bronstein for supporting this research with computing resources. This work was partially supported by NSF grant RI-IIS 1816753, NSF CAREER CIF 1845360, the Alfred P. Sloan Fellowship, Samsung GRP and Samsung Electronics. SV was partially funded by EOARD FA9550-18-1-7007 and the Simons Collaboration Algorithms and Geometry.
Publisher Copyright:
© 2019 Neural information processing systems foundation. All rights reserved.
PY - 2019
Y1 - 2019
N2 - Graph neural networks (GNNs) have achieved great success on graph-structured data. In light of this, there has been increasing interest in studying their representational power. One line of work focuses on the universal approximation of permutation-invariant functions by certain classes of GNNs, while another demonstrates the limitations of GNNs via graph isomorphism tests. Our work connects these two perspectives and proves their equivalence. We further develop a framework for the representational power of GNNs in the language of sigma-algebras, which incorporates both viewpoints. Using this framework, we compare the expressive power of different classes of GNNs as well as other methods on graphs. In particular, we prove that order-2 Graph G-invariant networks fail to distinguish non-isomorphic regular graphs with the same degree. We then extend them to a new architecture, Ring-GNN, which succeeds in distinguishing these graphs and also performs well on tasks with real-world datasets.
AB - Graph neural networks (GNNs) have achieved great success on graph-structured data. In light of this, there has been increasing interest in studying their representational power. One line of work focuses on the universal approximation of permutation-invariant functions by certain classes of GNNs, while another demonstrates the limitations of GNNs via graph isomorphism tests. Our work connects these two perspectives and proves their equivalence. We further develop a framework for the representational power of GNNs in the language of sigma-algebras, which incorporates both viewpoints. Using this framework, we compare the expressive power of different classes of GNNs as well as other methods on graphs. In particular, we prove that order-2 Graph G-invariant networks fail to distinguish non-isomorphic regular graphs with the same degree. We then extend them to a new architecture, Ring-GNN, which succeeds in distinguishing these graphs and also performs well on tasks with real-world datasets.
UR - http://www.scopus.com/inward/record.url?scp=85087202303&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85087202303&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85087202303
SN - 1049-5258
VL - 32
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019
Y2 - 8 December 2019 through 14 December 2019
ER -