Abstract
Generating informative and appropriate responses is challenging but important for building human-like dialogue systems. Although various knowledge-grounded conversation models have been proposed, these models have limitations in utilizing knowledge that occurs infrequently in the training data, not to mention integrating unseen knowledge into conversation generation. In this paper, we propose an Entity-Agnostic Representation Learning (EARL) method to introduce knowledge graphs into informative conversation generation. Unlike traditional approaches that parameterize a specific representation for each entity, EARL utilizes the context of conversations and the relational structure of knowledge graphs to learn category representations for entities, which generalizes to incorporating unseen entities in knowledge graphs into conversation generation. Automatic and manual evaluations demonstrate that our model can generate more informative, coherent, and natural responses than baseline models.
- Anthology ID: 2021.emnlp-main.184
- Volume: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
- Month: November
- Year: 2021
- Address: Online and Punta Cana, Dominican Republic
- Editors: Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 2383–2395
- URL: https://aclanthology.org/2021.emnlp-main.184
- DOI: 10.18653/v1/2021.emnlp-main.184
- Cite (ACL): Hao Zhou, Minlie Huang, Yong Liu, Wei Chen, and Xiaoyan Zhu. 2021. EARL: Informative Knowledge-Grounded Conversation Generation with Entity-Agnostic Representation Learning. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2383–2395, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal): EARL: Informative Knowledge-Grounded Conversation Generation with Entity-Agnostic Representation Learning (Zhou et al., EMNLP 2021)
- PDF: https://preview.aclanthology.org/nschneid-patch-2/2021.emnlp-main.184.pdf
- Code: thu-coai/earl
- Data: OpenDialKG