CiteBART: Learning to Generate Citations for Local Citation Recommendation

Ege Yiğit Çelik, Selma Tekir


Abstract
Local citation recommendation (LCR) suggests a set of papers for a citation placeholder within a given context. This paper introduces CiteBART, a citation-specific pre-training scheme within an encoder-decoder architecture, in which author-date citation tokens are masked and the model learns to reconstruct them to fulfill LCR. The global version (CiteBART-Global) extends the local context with the citing paper’s title and abstract to enrich the learning signal. CiteBART-Global achieves state-of-the-art performance on LCR benchmarks except for the FullTextPeerRead dataset, which is too small to benefit from generative pre-training. The effect is significant on the larger benchmarks, e.g., Refseer and ArXiv, with the Refseer pre-trained model emerging as the best-performing model. We perform comprehensive experiments, including an ablation study, a qualitative analysis, and a taxonomy of hallucinations with detailed statistics. Our analyses confirm that CiteBART-Global has cross-dataset generalization capability; the macro hallucination rate (MaHR) at the top-3 predictions is 4%, and when the ground truth is in the top-k prediction list, the hallucination tendency in the other predictions drops significantly. We publicly share our code, base datasets, global datasets, and pre-trained models to support reproducibility.
Anthology ID:
2025.emnlp-main.89
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1703–1719
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.89/
Cite (ACL):
Ege Yiğit Çelik and Selma Tekir. 2025. CiteBART: Learning to Generate Citations for Local Citation Recommendation. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 1703–1719, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
CiteBART: Learning to Generate Citations for Local Citation Recommendation (Çelik & Tekir, EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.89.pdf
Checklist:
2025.emnlp-main.89.checklist.pdf