TY - JOUR
T1 - Stability Properties of Graph Neural Networks
AU - Gama, Fernando
AU - Bruna, Joan
AU - Ribeiro, Alejandro
N1 - Funding Information:
Manuscript received September 17, 2019; revised April 22, 2020 and July 8, 2020; accepted September 17, 2020. Date of publication September 25, 2020; date of current version October 9, 2020. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Vincent Gripon. Fernando Gama and Alejandro Ribeiro are supported by NSF CCF 1717120, ARO W911NF-17-1-0438, ARL DCIST CRA W911NF-17-2-0181, ISTC-WAS, and Intel DevCloud. Joan Bruna is partially supported by the Alfred P. Sloan Foundation, NSF RI-1816753, NSF CAREER CIF 1845360, and Samsung Electronics. (Corresponding author: Fernando Gama.) Fernando Gama is with the Electrical Engineering and Computer Sciences Department, University of California, Berkeley, CA 94709, USA (e-mail: fgama@berkeley.edu).
Publisher Copyright:
© 2020 IEEE.
PY - 2020
Y1 - 2020
N2 - Graph neural networks (GNNs) have emerged as a powerful tool for the nonlinear processing of graph signals, with demonstrated success in recommender systems, power outage prediction, and motion planning, among other applications. GNNs consist of a cascade of layers, each of which applies a graph convolution followed by a pointwise nonlinearity. In this work, we study the impact that changes in the underlying topology have on the output of the GNN. First, we show that GNNs are permutation equivariant, which implies that they effectively exploit the internal symmetries of the underlying topology. Then, we prove that graph convolutions with integral Lipschitz filters, in combination with the frequency mixing effect of the corresponding nonlinearities, yield an architecture that is both stable to small changes in the underlying topology and discriminative of information located at high frequencies. These two properties cannot hold simultaneously when using only linear graph filters, which are either discriminative or stable but not both, thus explaining the superior performance of GNNs.
AB - Graph neural networks (GNNs) have emerged as a powerful tool for the nonlinear processing of graph signals, with demonstrated success in recommender systems, power outage prediction, and motion planning, among other applications. GNNs consist of a cascade of layers, each of which applies a graph convolution followed by a pointwise nonlinearity. In this work, we study the impact that changes in the underlying topology have on the output of the GNN. First, we show that GNNs are permutation equivariant, which implies that they effectively exploit the internal symmetries of the underlying topology. Then, we prove that graph convolutions with integral Lipschitz filters, in combination with the frequency mixing effect of the corresponding nonlinearities, yield an architecture that is both stable to small changes in the underlying topology and discriminative of information located at high frequencies. These two properties cannot hold simultaneously when using only linear graph filters, which are either discriminative or stable but not both, thus explaining the superior performance of GNNs.
KW - Graph convolutions
KW - graph filters
KW - graph neural networks
KW - graph signal processing
KW - network data
KW - stability
UR - http://www.scopus.com/inward/record.url?scp=85093650918&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85093650918&partnerID=8YFLogxK
U2 - 10.1109/TSP.2020.3026980
DO - 10.1109/TSP.2020.3026980
M3 - Article
AN - SCOPUS:85093650918
SN - 1053-587X
VL - 68
SP - 5680
EP - 5695
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
M1 - 9206091
ER -