Abstract
The mechanisms underlying human communication have been investigated for decades, but our understanding of how mutual understanding between interlocutors emerges remains incomplete. Interaction theories suggest that speakers develop a structural alignment, allowing them to build a shared knowledge base (common ground). In this paper, we propose to apply metrics derived from information theory to quantify the amount of information exchanged between participants and the dynamics of these exchanges, providing an objective way to measure how common ground is instantiated. We focus on a corpus of free conversations augmented with prosodic segmentation and an expert annotation of thematic episodes. We show that during free conversations, the amount of information remains globally constant at the scale of the conversation but varies with the thematic structure, underlining the role of the speaker who introduces the theme. We propose an original methodology applicable to uncontrolled material.
- Anthology ID:
- 2022.conll-1.15
- Volume:
- Proceedings of the 26th Conference on Computational Natural Language Learning (CoNLL)
- Month:
- December
- Year:
- 2022
- Address:
- Abu Dhabi, United Arab Emirates (Hybrid)
- Venue:
- CoNLL
- SIG:
- SIGNLL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 213–227
- URL:
- https://aclanthology.org/2022.conll-1.15
- Cite (ACL):
- Eliot Maës, Philippe Blache, and Leonor Becerra. 2022. Shared knowledge in natural conversations: can entropy metrics shed light on information transfers?. In Proceedings of the 26th Conference on Computational Natural Language Learning (CoNLL), pages 213–227, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
- Cite (Informal):
- Shared knowledge in natural conversations: can entropy metrics shed light on information transfers? (Maës et al., CoNLL 2022)
- PDF:
- https://preview.aclanthology.org/nodalida-main-page/2022.conll-1.15.pdf
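The entropy metrics the abstract refers to can be illustrated with a minimal sketch. This is not the authors' estimator; the toy conversation and the unigram probability model are assumptions made purely for illustration of how per-turn information content could be quantified in bits per word:

```python
# Illustrative sketch: Shannon surprisal of conversation turns under a
# unigram model estimated from the conversation itself. The turns below
# are invented toy data, not drawn from the paper's corpus.
from collections import Counter
import math

def unigram_entropy(utterance, unigram_probs):
    """Average surprisal (bits/word) of an utterance under unigram probs."""
    words = utterance.lower().split()
    if not words:
        return 0.0
    return sum(-math.log2(unigram_probs[w]) for w in words) / len(words)

# Toy conversation standing in for a transcribed turn list.
turns = [
    "so what did you think of the movie",
    "i liked the movie a lot",
    "the ending surprised me",
]

# Estimate unigram probabilities from the whole conversation.
counts = Counter(w for t in turns for w in t.lower().split())
total = sum(counts.values())
probs = {w: c / total for w, c in counts.items()}

for t in turns:
    print(f"{unigram_entropy(t, probs):.2f} bits/word  |  {t}")
```

Frequent words (e.g. "the", seen three times) carry less surprisal than words seen once, so turns that repeat already-grounded vocabulary score lower, which is the intuition behind tracking information dynamics across thematic episodes.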