Abstract
News recommendation is one of the most widely commercialized applications of natural language processing research; it aims to recommend news according to user interests. News recall plays an important role in news recommendation: it recalls candidates from a very large news database. Recent studies of news recall mostly adopt the dual-encoder architecture, as it provides a much faster recall scheme, and they encode each word equally. However, these works face two challenges: irrelevant word distraction and weak dual-encoder interaction. Therefore, we propose TADI, a model with Topic-aware Attention and powerful Dual-encoder Interaction for Recall in news recommendation. To avoid irrelevant word distraction, TADI designs a Topic-aware Attention (TA) module which weights words according to news topics. To enhance dual-encoder interaction, TADI provides a cheap yet powerful interaction module, namely Dual-encoder Interaction (DI), which helps the dual encoders interact based on two auxiliary targets. We verify the effectiveness of TADI through performance comparisons against state-of-the-art models in a series of experiments.
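The abstract only names the components; as a rough illustration, the sketch below shows one plausible reading of topic-aware word weighting inside a dual-encoder recall model. It is a minimal sketch, not the paper's implementation: the class and variable names (TopicAwareAttention, DualEncoderRecall, news_topic, user_repr) are hypothetical, and the auxiliary-target interaction of DI is omitted.

```python
# Minimal sketch (PyTorch) of topic-aware attention feeding a dual-encoder scorer.
# Assumption: the topic embedding acts as the attention query over word embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopicAwareAttention(nn.Module):
    """Weights word vectors with attention scores conditioned on the news topic."""
    def __init__(self, dim):
        super().__init__()
        self.query = nn.Linear(dim, dim)  # projects the topic embedding into a query
        self.key = nn.Linear(dim, dim)    # projects word embeddings into keys

    def forward(self, words, topic):
        # words: (batch, seq_len, dim); topic: (batch, dim)
        q = self.query(topic).unsqueeze(1)               # (batch, 1, dim)
        k = self.key(words)                              # (batch, seq_len, dim)
        scores = torch.matmul(q, k.transpose(1, 2))      # (batch, 1, seq_len)
        alpha = F.softmax(scores / words.size(-1) ** 0.5, dim=-1)
        return torch.matmul(alpha, words).squeeze(1)     # topic-weighted news vector

class DualEncoderRecall(nn.Module):
    """Dual encoder: news and user towers produce vectors scored by dot product."""
    def __init__(self, dim):
        super().__init__()
        self.news_attn = TopicAwareAttention(dim)
        self.user_proj = nn.Linear(dim, dim)

    def forward(self, news_words, news_topic, user_repr):
        news_vec = self.news_attn(news_words, news_topic)  # (batch, dim)
        user_vec = self.user_proj(user_repr)               # (batch, dim)
        return (news_vec * user_vec).sum(-1)               # recall score per pair
```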
- Anthology ID: 2023.findings-emnlp.1047
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 15647–15658
- URL: https://aclanthology.org/2023.findings-emnlp.1047
- DOI: 10.18653/v1/2023.findings-emnlp.1047
- Cite (ACL): Junxiang Jiang. 2023. TADI: Topic-aware Attention and Powerful Dual-encoder Interaction for Recall in News Recommendation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 15647–15658, Singapore. Association for Computational Linguistics.
- Cite (Informal): TADI: Topic-aware Attention and Powerful Dual-encoder Interaction for Recall in News Recommendation (Jiang, Findings 2023)
- PDF: https://preview.aclanthology.org/ingest-2024-clasp/2023.findings-emnlp.1047.pdf