QuoteR: A Benchmark of Quote Recommendation for Writing
Fanchao Qi, Yanhui Yang, Jing Yi, Zhili Cheng, Zhiyuan Liu, Maosong Sun
Abstract
It is common to use quotations (quotes) to make writing more elegant or convincing. To help people find appropriate quotes efficiently, the task of quote recommendation has been proposed, which aims to recommend quotes that fit the current writing context. Various quote recommendation approaches exist, but they have been evaluated on different, unpublished datasets. To facilitate research on this task, we build a large and fully open quote recommendation dataset called QuoteR, which comprises three parts: English, standard Chinese, and classical Chinese. Each part is larger than any previous unpublished counterpart. We conduct an extensive evaluation of existing quote recommendation methods on QuoteR. Furthermore, we propose a new quote recommendation model that significantly outperforms previous methods on all three parts of QuoteR. All the code and data of this paper can be obtained at https://github.com/thunlp/QuoteR.
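To make the task concrete, here is a minimal sketch of context-to-quote ranking by embedding similarity. This is illustrative only, not the paper's model or the QuoteR evaluation protocol: it assumes the sentence-transformers package, and the encoder name, candidate quotes, and context are stand-ins rather than dataset content.

```python
# Illustrative quote recommendation as nearest-neighbor search:
# rank candidate quotes by cosine similarity to the writing context.
# NOT the QuoteR paper's method; model name and data are placeholders.
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # any sentence encoder works

quotes = [
    "To be, or not to be, that is the question.",
    "The only thing we have to fear is fear itself.",
    "Brevity is the soul of wit.",
]
context = "He rambled on for an hour when a single sentence would have done."

# Embed the context and every candidate quote, L2-normalize, and score
# each quote by its dot product with the context (i.e., cosine similarity).
vecs = encoder.encode([context] + quotes)
vecs = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
scores = vecs[1:] @ vecs[0]

# Print candidates from best to worst fit for the context.
for idx in np.argsort(-scores):
    print(f"{scores[idx]:.3f}  {quotes[idx]}")
```

In a real benchmark setting, the candidate set would be the full quote inventory of the dataset and the ranking quality would be measured with retrieval metrics over held-out context-quote pairs.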
- Anthology ID:
- 2022.acl-long.27
- Volume:
- Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
- Month:
- May
- Year:
- 2022
- Address:
- Dublin, Ireland
- Editors:
- Smaranda Muresan, Preslav Nakov, Aline Villavicencio
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 336–348
- URL:
- https://aclanthology.org/2022.acl-long.27/
- DOI:
- 10.18653/v1/2022.acl-long.27
- Cite (ACL):
- Fanchao Qi, Yanhui Yang, Jing Yi, Zhili Cheng, Zhiyuan Liu, and Maosong Sun. 2022. QuoteR: A Benchmark of Quote Recommendation for Writing. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 336–348, Dublin, Ireland. Association for Computational Linguistics.
- Cite (Informal):
- QuoteR: A Benchmark of Quote Recommendation for Writing (Qi et al., ACL 2022)
- PDF:
- https://aclanthology.org/2022.acl-long.27.pdf
- Code
- thunlp/QuoteR
- Data
- BookCorpus