Evaluating pretrained transformer models for citation recommendation

Rodrigo Nogueira, Zhiying Jiang, Kyunghyun Cho, Jimmy Lin

Research output: Contribution to journal › Conference article › peer-review

Abstract

Citation recommendation systems for the scientific literature, which help authors find papers that should be cited, have the potential to speed up discoveries and uncover new routes for scientific exploration. We treat this task as a ranking problem, which we tackle with a two-stage approach: candidate generation followed by re-ranking. Within this framework, we adapt to the scientific domain a proven combination based on “bag of words” retrieval followed by re-scoring with a BERT model. We experimentally show the effects of domain adaptation, both in terms of pretraining on in-domain data and exploiting in-domain vocabulary. In addition, we evaluate eleven pretrained transformer models and analyze some unexpected failure cases. On three different collections from different scientific disciplines, our models perform close to or at the state of the art in the citation recommendation task.
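As a rough illustration of the two-stage pipeline the abstract describes, the sketch below pairs bag-of-words candidate generation (BM25 via the rank_bm25 package) with re-scoring by a BERT-style cross-encoder from Hugging Face transformers. The toy corpus, the query, and the ms-marco-MiniLM-L-6-v2 relevance model are illustrative assumptions only; the paper itself evaluates BERT variants pretrained and fine-tuned on in-domain scientific data.

```python
# Minimal sketch of two-stage citation recommendation:
# stage 1 generates candidates with bag-of-words (BM25) retrieval,
# stage 2 re-scores the candidates with a BERT cross-encoder.
# Corpus, query, and model choice are illustrative assumptions,
# not the authors' exact setup.
import torch
from rank_bm25 import BM25Okapi
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Toy corpus of candidate papers (e.g., titles or title + abstract).
corpus = [
    "BERT: pre-training of deep bidirectional transformers for language understanding",
    "Passage re-ranking with BERT",
    "Content-based citation recommendation",
]

# Stage 1: bag-of-words candidate generation with BM25.
tokenized_corpus = [doc.lower().split() for doc in corpus]
bm25 = BM25Okapi(tokenized_corpus)

query = "re-ranking scientific papers with pretrained transformers"
bm25_scores = bm25.get_scores(query.lower().split())
top_k = sorted(range(len(corpus)), key=lambda i: bm25_scores[i], reverse=True)[:2]

# Stage 2: re-score (query, candidate) pairs with a cross-encoder.
# This public MS MARCO model is a stand-in for the in-domain BERT
# rankers studied in the paper.
model_name = "cross-encoder/ms-marco-MiniLM-L-6-v2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

pairs = [(query, corpus[i]) for i in top_k]
inputs = tokenizer(
    [q for q, _ in pairs],
    [d for _, d in pairs],
    padding=True,
    truncation=True,
    return_tensors="pt",
)
with torch.no_grad():
    scores = model(**inputs).logits.squeeze(-1)

# Final ranking: BM25 candidates reordered by the cross-encoder score.
reranked = [top_k[int(i)] for i in torch.argsort(scores, descending=True)]
for rank, idx in enumerate(reranked, 1):
    print(rank, corpus[idx])
```

The design point the sketch mirrors is that the cheap BM25 stage keeps the expensive transformer from scoring the whole collection: the cross-encoder only sees the short candidate list, which is what makes BERT re-ranking practical at corpus scale.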

Original language: English (US)
Pages (from-to): 89-100
Number of pages: 12
Journal: CEUR Workshop Proceedings
Volume: 2591
State: Published - 2020
Event: 10th International Workshop on Bibliometric-Enhanced Information Retrieval, BIR 2020 - Lisbon, Portugal
Duration: Apr 14 2020 → …

ASJC Scopus subject areas

  • General Computer Science
