PTUM: Pre-training User Model from Unlabeled User Behaviors via Self-supervision
Chuhan Wu, Fangzhao Wu, Tao Qi, Jianxun Lian, Yongfeng Huang, Xing Xie
Abstract
User modeling is critical for many personalized web services. Many existing methods model users based on their behaviors and the labeled data of target tasks. However, these methods cannot exploit useful information in unlabeled user behavior data, and their performance may not be optimal when labeled data is scarce. Motivated by pre-trained language models, which are pre-trained on large-scale unlabeled corpora to empower many downstream tasks, in this paper we propose to pre-train user models from large-scale unlabeled user behavior data. We propose two self-supervision tasks for user model pre-training. The first is masked behavior prediction, which models the relatedness among historical behaviors. The second is next K behavior prediction, which models the relatedness between past and future behaviors. The pre-trained user models are fine-tuned on downstream tasks to learn task-specific user representations. Experimental results on two real-world datasets validate the effectiveness of our proposed user model pre-training method.
- Anthology ID: 2020.findings-emnlp.174
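To make the two self-supervision tasks concrete, here is a minimal sketch of how their training targets could be built from an unlabeled behavior sequence. This is an illustrative example, not the authors' released implementation (see the wuch15/PTUM repository below); the integer behavior IDs, the reserved `MASK_ID`, and the `mask_prob` and `k` values are all assumptions made for the sketch.

```python
# Illustrative sketch (not the authors' code): constructing the two
# PTUM-style self-supervision targets from one unlabeled behavior
# sequence. Behaviors are assumed to be integer IDs; all names and
# hyperparameters here are hypothetical.
import random

MASK_ID = 0  # hypothetical ID reserved for the [MASK] behavior


def masked_behavior_prediction(behaviors, mask_prob=0.15):
    """Randomly mask behaviors; the model must recover each masked
    behavior from the surrounding unmasked ones, modeling the
    relatedness among historical behaviors."""
    inputs, targets = [], []
    for b in behaviors:
        if random.random() < mask_prob:
            inputs.append(MASK_ID)
            targets.append(b)     # predict the original behavior here
        else:
            inputs.append(b)
            targets.append(None)  # no loss at unmasked positions
    return inputs, targets


def next_k_behavior_prediction(behaviors, k=3):
    """Split the sequence into past behaviors and the next K future
    behaviors; the model predicts the future from the past, modeling
    the relatedness between past and future behaviors."""
    past, future = behaviors[:-k], behaviors[-k:]
    return past, future


# Usage on a toy sequence of behavior IDs:
seq = [5, 12, 7, 3, 9, 21, 4, 8]
print(masked_behavior_prediction(seq))
print(next_k_behavior_prediction(seq, k=3))
```

In both tasks the labels come from the behavior sequence itself, which is what lets the user model be pre-trained on large-scale unlabeled data before fine-tuning on a labeled downstream task.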
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
- Month: November
- Year: 2020
- Address: Online
- Editors: Trevor Cohn, Yulan He, Yang Liu
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 1939–1944
- URL: https://aclanthology.org/2020.findings-emnlp.174
- DOI: 10.18653/v1/2020.findings-emnlp.174
- Cite (ACL): Chuhan Wu, Fangzhao Wu, Tao Qi, Jianxun Lian, Yongfeng Huang, and Xing Xie. 2020. PTUM: Pre-training User Model from Unlabeled User Behaviors via Self-supervision. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 1939–1944, Online. Association for Computational Linguistics.
- Cite (Informal): PTUM: Pre-training User Model from Unlabeled User Behaviors via Self-supervision (Wu et al., Findings 2020)
- PDF: https://aclanthology.org/2020.findings-emnlp.174.pdf
- Code: wuch15/PTUM