Abstract
Citation recommendation systems, which help authors of scientific papers find relevant work to cite, have the potential to speed up discoveries and uncover new routes for scientific exploration. We treat this task as a ranking problem, which we tackle with a two-stage approach: candidate generation followed by re-ranking. Within this framework, we adapt a proven combination, “bag of words” retrieval followed by re-scoring with a BERT model, to the scientific domain. We experimentally show the effects of domain adaptation, both in terms of pretraining on in-domain data and exploiting in-domain vocabulary. In addition, we evaluate eleven pretrained transformer models and analyze some unexpected failure cases. On three collections from different scientific disciplines, our models perform close to or at the state of the art in the citation recommendation task.
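The two-stage pipeline described in the abstract can be sketched as follows. This is an illustrative sketch, not the authors' implementation: stage one uses a small pure-Python BM25 scorer standing in for the “bag of words” retrieval, and `rescore` is a trivial token-overlap stand-in for the BERT re-ranker. The corpus, the query, and all function names are hypothetical.

```python
import math
from collections import Counter

# Toy corpus standing in for a scientific-paper collection (hypothetical data).
CORPUS = [
    "citation recommendation with neural language models",
    "bag of words retrieval for scientific documents",
    "a survey of cooking recipes and kitchen herbs",
    "transformer models for document re-ranking",
]

def tokenize(text):
    return text.lower().split()

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Stage 1: classic BM25 'bag of words' scoring over the whole corpus."""
    n = len(docs)
    avgdl = sum(len(d) for d in docs) / n
    df = Counter(t for d in docs for t in set(d))  # document frequencies
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query:
            if tf[t] == 0:
                continue
            idf = math.log(1 + (n - df[t] + 0.5) / (df[t] + 0.5))
            s += idf * tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

def rescore(query, doc):
    """Stage 2 stand-in: a real system would score the (query, candidate)
    pair with a BERT cross-encoder; plain token overlap is used here only
    to keep the sketch self-contained."""
    q, d = set(query), set(doc)
    return len(q & d) / len(q | d)

def recommend(query_text, corpus, k=3):
    query = tokenize(query_text)
    docs = [tokenize(t) for t in corpus]
    # Candidate generation: keep the top-k BM25 hits.
    bm25 = bm25_scores(query, docs)
    candidates = sorted(range(len(docs)), key=lambda i: bm25[i], reverse=True)[:k]
    # Re-ranking: re-score only the candidates, then sort by the new score.
    return sorted(candidates, key=lambda i: rescore(query, docs[i]), reverse=True)

ranking = recommend("neural citation recommendation", CORPUS)
print(ranking[0])  # the citation-recommendation paper should rank first
```

Running the re-ranker only on the BM25 candidates, rather than the full collection, is what makes the expensive BERT scoring affordable in the two-stage design.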
| Original language | English (US) |
|---|---|
| Pages (from-to) | 89-100 |
| Number of pages | 12 |
| Journal | CEUR Workshop Proceedings |
| Volume | 2591 |
| State | Published - 2020 |
| Event | 10th International Workshop on Bibliometric-Enhanced Information Retrieval, BIR 2020 - Lisbon, Portugal. Duration: Apr 14 2020 → … |
ASJC Scopus subject areas
- Computer Science (all)