Community detection in hypergraphs, spiked tensor models, and Sum-of-Squares

Chiheon Kim, Afonso S. Bandeira, Michel X. Goemans

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We study the problem of community detection in hypergraphs under a stochastic block model. Just as the stochastic block model on graphs motivates the study of spiked random matrices, our model motivates investigating the statistical and computational limits of exact recovery in certain spiked tensor models. In contrast with the matrix case, the spiked model that arises naturally from community detection in hypergraphs differs from the one arising in the so-called tensor Principal Component Analysis model. We investigate the effectiveness of algorithms in the Sum-of-Squares hierarchy on these models. Interestingly, our results suggest that these two apparently similar models may exhibit very different computational-to-statistical gaps.
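For readers unfamiliar with the model, the following minimal Python sketch illustrates one standard formulation of a hypergraph stochastic block model: a symmetric two-community, d-uniform variant in which each d-subset of vertices becomes a hyperedge with probability p if all of its vertices lie in the same community and q < p otherwise. The parameter names p and q, the two-community symmetric setup, and the function itself are illustrative assumptions, not the paper's exact parameterization. (For comparison, the tensor PCA model referenced in the abstract observes a rank-one spike lambda * v ⊗ v ⊗ ... ⊗ v corrupted by Gaussian noise.)

```python
import itertools
import random

def sample_hypergraph_sbm(n, d, p, q, seed=0):
    """Sample a d-uniform hypergraph SBM with two equal communities.

    Illustrative sketch: vertices 0..n//2-1 form community 0, the rest
    community 1. Each d-subset of vertices becomes a hyperedge with
    probability p if all its vertices share a community, q otherwise.
    (The symmetric two-community setup and parameter names are
    assumptions; the paper's exact model may differ.)
    """
    rng = random.Random(seed)
    label = [0] * (n // 2) + [1] * (n - n // 2)
    edges = []
    for subset in itertools.combinations(range(n), d):
        same_community = len({label[v] for v in subset}) == 1
        if rng.random() < (p if same_community else q):
            edges.append(subset)
    return label, edges

# Example: 20 vertices, 3-uniform hyperedges, denser inside communities.
labels, hyperedges = sample_hypergraph_sbm(n=20, d=3, p=0.5, q=0.1)
print(len(hyperedges), "hyperedges sampled")
```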

Original language: English (US)
Title of host publication: 2017 12th International Conference on Sampling Theory and Applications, SampTA 2017
Editors: Gholamreza Anbarjafari, Andi Kivinukk, Gert Tamberg
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 124-128
Number of pages: 5
ISBN (Electronic): 9781538615652
State: Published - Sep 1 2017
Event: 12th International Conference on Sampling Theory and Applications, SampTA 2017 - Tallinn, Estonia
Duration: Jul 3 2017 - Jul 7 2017

Publication series

Name: 2017 12th International Conference on Sampling Theory and Applications, SampTA 2017

Other

Other: 12th International Conference on Sampling Theory and Applications, SampTA 2017
Country/Territory: Estonia
City: Tallinn
Period: 7/3/17 - 7/7/17

ASJC Scopus subject areas

  • Signal Processing
  • Statistics, Probability and Uncertainty
  • Analysis
  • Statistics and Probability
  • Applied Mathematics
