Stability Properties of Graph Neural Networks

Fernando Gama, Joan Bruna, Alejandro Ribeiro

Research output: Contribution to journal › Article › peer-review

Abstract

Graph neural networks (GNNs) have emerged as a powerful tool for nonlinear processing of graph signals, exhibiting success in recommender systems, power outage prediction, and motion planning, among other applications. GNNs consist of a cascade of layers, each of which applies a graph convolution followed by a pointwise nonlinearity. In this work, we study the impact that changes in the underlying topology have on the output of the GNN. First, we show that GNNs are permutation equivariant, which implies that they effectively exploit internal symmetries of the underlying topology. Then, we prove that graph convolutions with integral Lipschitz filters, in combination with the frequency mixing effect of the corresponding nonlinearities, yield an architecture that is both stable to small changes in the underlying topology and discriminative of information located at high frequencies. These two properties cannot simultaneously hold when using only linear graph filters, which are either discriminative or stable, thus explaining the superior performance of GNNs.
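
The abstract describes a GNN layer as a graph convolution (a polynomial in the graph shift operator) followed by a pointwise nonlinearity, and states that the resulting architecture is permutation equivariant. The following is a minimal numerical sketch of that construction, not the authors' implementation; the toy graph, filter taps, and function names are illustrative assumptions.

    # Minimal sketch (illustrative, not the authors' code) of a GNN layer:
    # a graph convolution -- a polynomial in the graph shift operator S --
    # followed by a pointwise nonlinearity, plus a numerical check of
    # permutation equivariance.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy undirected graph: symmetric adjacency matrix used as the shift operator S.
    n = 6
    A = rng.random((n, n))
    S = np.triu(A, 1)
    S = S + S.T

    def graph_convolution(S, x, h):
        """Graph filter: y = sum_k h[k] * S^k x (a polynomial in S applied to x)."""
        y = np.zeros_like(x)
        Skx = x.copy()
        for hk in h:
            y += hk * Skx
            Skx = S @ Skx          # shift the signal one more hop
        return y

    def gnn_layer(S, x, h):
        """One GNN layer: graph convolution followed by a pointwise nonlinearity (ReLU)."""
        return np.maximum(graph_convolution(S, x, h), 0.0)

    # Random graph signal and filter taps (assumed values for the sketch).
    x = rng.standard_normal(n)
    h = rng.standard_normal(4)

    # Permutation equivariance: relabeling the nodes (P) and permuting the
    # signal accordingly permutes the output in the same way.
    perm = rng.permutation(n)
    P = np.eye(n)[perm]            # permutation matrix

    out_then_permute = P @ gnn_layer(S, x, h)
    permute_then_out = gnn_layer(P @ S @ P.T, P @ x, h)

    print(np.allclose(out_then_permute, permute_then_out))  # True

The final check mirrors the paper's equivariance claim: permuting the node labels of the graph and the input signal consistently permutes the GNN output in the same way, so the architecture does not depend on an arbitrary node ordering.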

Original language: English (US)
Article number: 9206091
Pages (from-to): 5680-5695
Number of pages: 16
Journal: IEEE Transactions on Signal Processing
Volume: 68
DOIs
State: Published - 2020

Keywords

  • Graph convolutions
  • graph filters
  • graph neural networks
  • graph signal processing
  • network data
  • stability

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
