IPTV Channel Zapping Recommendation with Attention Mechanism

Guangyu Li, Lina Qiu, Chenguang Yu, Houwei Cao, Yong Liu, Can Yang

Research output: Contribution to journal › Article › peer-review

Abstract

Internet Protocol TV (IPTV) typically offers far more channels than traditional TV services, but this abundance also brings information overload: users often struggle to find channels matching their interests. In this paper, using a large IPTV dataset, we analyze the channel zapping behaviors of IPTV users and discover various patterns that can be used to generate more accurate channel zapping recommendations. Based on this user behavior analysis, we develop several base and fusion recommender systems that generate, in real time, a short list of channels for users to consider whenever they want to switch channels. We also develop a deep neural network model, consisting of a 'Recommender System Attention (RS Attention)' module and a 'Channel Attention' module that capture static and dynamic user switching behaviors, to further improve recommendation accuracy. Evaluation on the IPTV dataset demonstrates that our fusion recommender achieves a 41% hit ratio with only three candidate channels, and our attention neural network model further pushes this up to 45%. Our recommender systems take only user channel zapping sequences as input and can be easily adopted by IPTV systems with low data and computation overheads.
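To make the abstract's idea concrete, the sketch below shows one way an attention step over a user's recent zapping sequence could score candidate channels. It is a minimal illustration, not the paper's architecture: the channel embeddings, the mean-pooled query, and the `recommend` function are all hypothetical stand-ins for the learned RS Attention and Channel Attention modules described in the paper.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def recommend(zap_history, embeddings, k=3):
    """Return a short list of k channels for the next zap.

    zap_history: recent channel IDs, most recent last (hypothetical input).
    embeddings: channel ID -> vector; stands in for learned embeddings.
    """
    dim = len(next(iter(embeddings.values())))
    # Query = mean of recent zap embeddings (a simple stand-in for a learned query).
    query = [sum(embeddings[c][d] for c in zap_history) / len(zap_history)
             for d in range(dim)]
    # Attention weights over history positions (dot-product scores -> softmax).
    scores = [sum(q * e for q, e in zip(query, embeddings[c])) for c in zap_history]
    weights = softmax(scores)
    # Context vector = attention-weighted sum of history embeddings.
    context = [sum(w * embeddings[c][d] for w, c in zip(weights, zap_history))
               for d in range(dim)]
    # Rank candidates by similarity to the context, excluding the current channel.
    current = zap_history[-1]
    ranked = sorted((c for c in embeddings if c != current),
                    key=lambda c: -sum(x * y for x, y in zip(context, embeddings[c])))
    return ranked[:k]
```

Because the input is only the zapping sequence itself, a scorer of this shape matches the paper's claim of low data and computation overhead: no program metadata or user profile is required at inference time.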

Original language: English (US)
Article number: 9055049
Pages (from-to): 538-549
Number of pages: 12
Journal: IEEE Transactions on Multimedia
Volume: 23
DOIs
State: Published - 2021

Keywords

  • attention mechanism
  • fusion method
  • IPTV
  • neural networks
  • real-time recommendation
  • recommender system

ASJC Scopus subject areas

  • Signal Processing
  • Media Technology
  • Computer Science Applications
  • Electrical and Electronic Engineering

