Two Birds with One Stone: Unified Model Learning for Both Recall and Ranking in News Recommendation

Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang


Abstract
Recall and ranking are two critical steps in personalized news recommendation. Most existing news recommender systems conduct personalized news recall and ranking separately with different models. However, maintaining multiple models leads to high computational cost and poses great challenges to meeting the online latency requirements of news recommender systems. To address this problem, in this paper we propose UniRec, a unified method for recall and ranking in news recommendation. In our method, we first infer a user embedding for ranking from a user's historical news click behaviors with a user encoder model. We then derive the user embedding for recall from the ranking embedding by using it as an attention query over a set of basis user embeddings, each encoding a different general user interest, and synthesizing them into the recall embedding. Extensive experiments on a benchmark dataset demonstrate that our method improves both the efficiency and the effectiveness of recall and ranking in news recommendation.
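
The abstract outlines the core mechanism: the ranking user embedding acts as an attention query over a set of learnable basis user embeddings, and their weighted combination becomes the recall embedding. The PyTorch snippet below is a minimal sketch of that idea under stated assumptions; the module name BasisRecallHead, the dot-product attention form, the initialization, and the number of bases are illustrative choices, not the paper's exact design.

import torch
import torch.nn as nn
import torch.nn.functional as F

class BasisRecallHead(nn.Module):
    # Derives a recall user embedding from a ranking user embedding by
    # attending over a small set of learnable basis user embeddings.
    # (Hypothetical module name and hyperparameters; a sketch, not the paper's code.)
    def __init__(self, embed_dim: int, num_bases: int = 16):
        super().__init__()
        # Each basis vector is meant to capture one general user interest.
        self.bases = nn.Parameter(torch.randn(num_bases, embed_dim) * 0.02)

    def forward(self, u_rank: torch.Tensor) -> torch.Tensor:
        # u_rank: [batch, embed_dim], the user embedding inferred for ranking.
        scores = u_rank @ self.bases.t()        # [batch, num_bases] attention logits
        weights = F.softmax(scores, dim=-1)     # attention weights over the bases
        u_recall = weights @ self.bases         # [batch, embed_dim] recall embedding
        return u_recall

# Example usage with dummy tensors:
head = BasisRecallHead(embed_dim=256, num_bases=16)
u_rank = torch.randn(8, 256)    # e.g., output of the user encoder over clicked news
u_recall = head(u_rank)         # candidates can then be retrieved by dot product with u_recall
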
Anthology ID: 2022.findings-acl.274
Volume: Findings of the Association for Computational Linguistics: ACL 2022
Month: May
Year: 2022
Address: Dublin, Ireland
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 3474–3480
URL: https://aclanthology.org/2022.findings-acl.274
DOI: 10.18653/v1/2022.findings-acl.274
Cite (ACL): Chuhan Wu, Fangzhao Wu, Tao Qi, and Yongfeng Huang. 2022. Two Birds with One Stone: Unified Model Learning for Both Recall and Ranking in News Recommendation. In Findings of the Association for Computational Linguistics: ACL 2022, pages 3474–3480, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal): Two Birds with One Stone: Unified Model Learning for Both Recall and Ranking in News Recommendation (Wu et al., Findings 2022)
PDF: https://preview.aclanthology.org/ingestion-script-update/2022.findings-acl.274.pdf
Data: MIND