Learning the relevant substructures for tasks on graph data

Lei Chen, Zhengdao Chen, Joan Bruna

Research output: Contribution to journal › Conference article › peer-review


Focusing on graph-structured prediction tasks, we demonstrate the ability of neural networks to provide both strong predictive performance and easy interpretability, two properties often at odds in modern deep architectures. We formulate the latter as the ability to extract the relevant substructures for a given task, inspired by applications in biology and chemistry. To do so, we utilize the Local Relational Pooling (LRP) model, which was recently introduced with motivations from substructure counting. In this work, we demonstrate that LRP models can be used on challenging graph classification tasks to provide both state-of-the-art performance and interpretability, through the detection of the relevant substructures used by the network to make its decisions. Besides their broad applications (biology, chemistry, fraud detection, etc.), these models also raise new theoretical questions related to compressed sensing and to computational thresholds on random graphs.
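As a concrete instance of the substructure-counting problem that motivates LRP, the sketch below counts triangles in an undirected graph with plain Python. This is an illustrative example only, not the authors' LRP model; the function name and edge-list representation are assumptions for the sketch.

```python
def count_triangles(edges):
    """Count triangles in an undirected graph given as an edge list.

    Triangle counting is one of the simplest substructure-counting
    tasks of the kind that motivates LRP-style models.
    """
    # Build an adjacency map: node -> set of neighbors
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    count = 0
    for u, v in edges:
        # Common neighbors of u and v close a triangle with edge (u, v)
        count += len(adj[u] & adj[v])
    # Each triangle is counted once per edge, i.e. three times in total
    return count // 3

# A 4-clique contains 4 triangles
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
print(count_triangles(edges))  # → 4
```

Learned models such as LRP aim to recover this kind of local pattern information directly from data rather than from a hand-coded routine.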

Original language: English (US)
Pages (from-to): 8528-8532
Number of pages: 5
Journal: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
State: Published - 2021
Event: 2021 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2021 - Virtual, Toronto, Canada
Duration: Jun 6, 2021 - Jun 11, 2021


Keywords

  • Graph
  • Pooling
  • Substructure

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering

