History Semantic Graph Enhanced Conversational KBQA with Temporal Information Modeling

Hao Sun, Yang Li, Liwei Deng, Bowen Li, Binyuan Hui, Binhua Li, Yunshi Lan, Yan Zhang, Yongbin Li


Abstract
Context information modeling is an important task in conversational KBQA. However, existing methods usually assume that utterances are independent and model them in isolation. In this paper, we propose a History Semantic Graph Enhanced KBQA model (HSGE) that effectively models long-range semantic dependencies in conversation history while maintaining low computational cost. The framework incorporates a context-aware encoder, which employs a dynamic memory decay mechanism and models context at different levels of granularity. We evaluate HSGE on a widely used benchmark dataset for complex sequential question answering. Experimental results demonstrate that HSGE outperforms existing baselines on average across all question types.
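As a rough illustration of the dynamic memory decay idea mentioned in the abstract, the sketch below attends over per-turn history representations while damping the attention logits by an exponential decay in turn distance. This is a minimal PyTorch sketch under our own assumptions; the module name, the decay formulation, and all parameters are illustrative and not taken from the authors' implementation.

```python
import torch
import torch.nn as nn


class DecayedHistoryAttention(nn.Module):
    """Illustrative sketch (not the paper's code): attention over per-turn
    history vectors whose logits decay with distance from the current turn."""

    def __init__(self, hidden_dim: int, init_decay: float = 0.5):
        super().__init__()
        self.query_proj = nn.Linear(hidden_dim, hidden_dim)
        self.key_proj = nn.Linear(hidden_dim, hidden_dim)
        # Learnable decay rate; "dynamic" here means it is trained end to end.
        self.log_decay = nn.Parameter(torch.tensor(float(init_decay)).log())

    def forward(self, current: torch.Tensor, history: torch.Tensor) -> torch.Tensor:
        # current: (batch, hidden)        -- encoding of the current utterance
        # history: (batch, turns, hidden) -- one vector per previous turn,
        #                                    ordered oldest -> newest
        q = self.query_proj(current).unsqueeze(1)            # (batch, 1, hidden)
        k = self.key_proj(history)                            # (batch, turns, hidden)
        scores = (q * k).sum(-1) / k.size(-1) ** 0.5          # (batch, turns)

        turns = history.size(1)
        # Distance 0 for the most recent turn, turns - 1 for the oldest.
        distance = torch.arange(turns - 1, -1, -1, device=history.device).float()
        # Older turns receive a larger penalty in logit space.
        bias = -self.log_decay.exp() * distance               # (turns,)

        weights = torch.softmax(scores + bias, dim=-1)        # (batch, turns)
        return (weights.unsqueeze(-1) * history).sum(1)       # (batch, hidden)
```

Applying the decay in logit space (before the softmax) keeps the weights normalized while still biasing the model toward recent turns; a multiplicative post-softmax decay would be an equally plausible reading of the abstract's description.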
Anthology ID:
2023.acl-long.195
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3521–3533
URL:
https://aclanthology.org/2023.acl-long.195
DOI:
10.18653/v1/2023.acl-long.195
Cite (ACL):
Hao Sun, Yang Li, Liwei Deng, Bowen Li, Binyuan Hui, Binhua Li, Yunshi Lan, Yan Zhang, and Yongbin Li. 2023. History Semantic Graph Enhanced Conversational KBQA with Temporal Information Modeling. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3521–3533, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
History Semantic Graph Enhanced Conversational KBQA with Temporal Information Modeling (Sun et al., ACL 2023)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2023.acl-long.195.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2023.acl-long.195.mp4