Abstract
When multiple conversations occur simultaneously, a listener must decide which conversation each utterance is part of in order to interpret and respond to it appropriately. This task is referred to as dialogue disentanglement. A significant drawback of previous studies on disentanglement is that they focus only on pair-wise relationships between utterances while neglecting the overall conversation structure, which is important for disentanglement. In this paper, we propose a hierarchical model, named Dialogue BERT (DIALBERT), which integrates local and global semantics within the context range by using BERT to encode each message pair and a BiLSTM to aggregate chronological context information into the output of BERT. To integrate conversation-structure information into the model, two types of loss are designed: a conversation-structure loss and a tree-structure loss. In this way, our model can implicitly learn and leverage conversation structures without requiring explicit access to such structures during inference. Experimental results on two large datasets show that our method outperforms previous methods by substantial margins, achieving strong performance on dialogue disentanglement.
- Anthology ID:
- 2022.dialdoc-1.6
- Volume:
- Proceedings of the Second DialDoc Workshop on Document-grounded Dialogue and Conversational Question Answering
- Month:
- May
- Year:
- 2022
- Address:
- Dublin, Ireland
- Editors:
- Song Feng, Hui Wan, Caixia Yuan, Han Yu
- Venue:
- dialdoc
- Publisher:
- Association for Computational Linguistics
- Pages:
- 54–64
- URL:
- https://aclanthology.org/2022.dialdoc-1.6
- DOI:
- 10.18653/v1/2022.dialdoc-1.6
- Cite (ACL):
- Tianda Li, Jia-Chen Gu, Zhen-Hua Ling, and Quan Liu. 2022. Conversation- and Tree-Structure Losses for Dialogue Disentanglement. In Proceedings of the Second DialDoc Workshop on Document-grounded Dialogue and Conversational Question Answering, pages 54–64, Dublin, Ireland. Association for Computational Linguistics.
- Cite (Informal):
- Conversation- and Tree-Structure Losses for Dialogue Disentanglement (Li et al., dialdoc 2022)
- PDF:
- https://preview.aclanthology.org/emnlp22-frontmatter/2022.dialdoc-1.6.pdf
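The pair-wise formulation criticized in the abstract can be illustrated with a minimal sketch. This is not the paper's model: it is only the common greedy reply-to decoding used in disentanglement work, where each utterance links to its highest-scoring earlier utterance (a self-link starts a new conversation), and the resulting parent links form the reply trees whose roots identify conversations. The score matrix below is hypothetical.

```python
def disentangle(scores):
    """Greedy pair-wise decoding for dialogue disentanglement.

    scores[i][j] (j <= i) is a model score that utterance i replies to
    utterance j; scores[i][i] is the score for starting a new conversation.
    Returns a conversation id (tree root index) for each utterance.
    """
    n = len(scores)
    parent = list(range(n))
    for i in range(1, n):
        # Link utterance i to its best antecedent among {0..i-1, i}.
        parent[i] = max(range(i + 1), key=lambda j: scores[i][j])

    def root(i):
        # Follow reply-to links up to the tree root (conversation start).
        while parent[i] != i:
            i = parent[i]
        return i

    return [root(i) for i in range(n)]


# Hypothetical scores for four chronological utterances.
scores = [
    [1.0, 0.0, 0.0, 0.0],
    [0.9, 0.1, 0.0, 0.0],  # utt 1 replies to utt 0
    [0.1, 0.2, 0.8, 0.0],  # utt 2 self-links: new conversation
    [0.2, 0.7, 0.3, 0.1],  # utt 3 replies to utt 1
]
print(disentangle(scores))  # → [0, 0, 2, 0]
```

Decoding each link independently like this is exactly why pair-wise methods can miss global structure; the paper's conversation- and tree-structure losses push the model to account for it during training instead.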