Entity-based Neural Local Coherence Modeling

Sungho Jeon, Michael Strube


Abstract
In this paper, we propose an entity-based neural local coherence model which is linguistically more sound than previously proposed neural coherence models. Recent neural coherence models encode the input document using large-scale pretrained language models. Hence their basis for computing local coherence is words and even sub-words. The analysis of their output shows that these models frequently compute coherence on the basis of connections between (sub-)words which, from a linguistic perspective, should not play a role. Still, these models achieve state-of-the-art performance in several end applications. In contrast to these models, we compute coherence on the basis of entities by constraining the input to noun phrases and proper names. This provides us with an explicit representation of the most important items in sentences leading to the notion of focus. This brings our model linguistically in line with pre-neural models of computing coherence. It also gives us better insight into the behaviour of the model, thus leading to better explainability. Our approach is also in accord with a recent study (O’Connor and Andreas, 2021), which shows that most usable information is captured by nouns and verbs in transformer-based language models. We evaluate our model on three downstream tasks showing that it is not only linguistically more sound than previous models but also that it outperforms them in end applications.
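To make the central design choice concrete, here is a minimal sketch of the kind of entity-constrained preprocessing the abstract describes: each sentence is reduced to its noun phrases and proper names before coherence is computed. The use of spaCy, the `en_core_web_sm` model, and the `entity_mentions` function are illustrative assumptions, not the authors' released implementation (see the Code link below).

```python
# Illustrative sketch, not the authors' code: restrict each sentence to
# noun phrases and proper names, mirroring the entity-based input
# constraint described in the abstract.
import spacy

# Assumed pipeline; any English spaCy model with a parser would work.
nlp = spacy.load("en_core_web_sm")

def entity_mentions(text: str) -> list[list[str]]:
    """For each sentence, keep only noun phrases and proper names."""
    doc = nlp(text)
    per_sentence = []
    for sent in doc.sents:
        chunks = list(sent.noun_chunks)  # base noun phrases in the sentence
        covered = {i for c in chunks for i in range(c.start, c.end)}
        # Proper names not already inside a noun chunk
        propn = [t.text for t in sent if t.pos_ == "PROPN" and t.i not in covered]
        per_sentence.append([c.text for c in chunks] + propn)
    return per_sentence

print(entity_mentions("John visited Dublin. The city impressed him."))
# Roughly: [['John', 'Dublin'], ['The city', 'him']]
```

The resulting per-sentence entity mentions would then serve as the model's input, in place of the full (sub-)word sequence used by prior neural coherence models.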
Anthology ID:
2022.acl-long.537
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7787–7805
URL:
https://aclanthology.org/2022.acl-long.537
DOI:
10.18653/v1/2022.acl-long.537
Cite (ACL):
Sungho Jeon and Michael Strube. 2022. Entity-based Neural Local Coherence Modeling. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 7787–7805, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Entity-based Neural Local Coherence Modeling (Jeon & Strube, ACL 2022)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2022.acl-long.537.pdf
Video:
https://preview.aclanthology.org/improve-issue-templates/2022.acl-long.537.mp4
Code:
sdeva14/acl22-entity-neural-local-cohe
Data:
GCDC