A Closer Look at Claim Decomposition

Miriam Wanner, Seth Ebner, Zhengping Jiang, Mark Dredze, Benjamin Van Durme


Abstract
As generated text becomes more commonplace, it is increasingly important to evaluate how well-supported such text is by external knowledge sources. Many approaches for evaluating textual support rely on some method for decomposing text into its individual subclaims which are scored against a trusted reference. We investigate how various methods of claim decomposition—especially LLM-based methods—affect the result of an evaluation approach such as the recently proposed FActScore, finding that it is sensitive to the decomposition method used. This sensitivity arises because such metrics attribute overall textual support to the model that generated the text even though error can also come from the metric’s decomposition step. To measure decomposition quality, we introduce an adaptation of FActScore, which we call DecompScore. We then propose an LLM-based approach to generating decompositions inspired by Bertrand Russell’s theory of logical atomism and neo-Davidsonian semantics and demonstrate its improved decomposition quality over previous methods.
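For readers unfamiliar with this family of metrics, the following is a minimal sketch of the decompose-then-verify pattern the abstract refers to: a generation is split into subclaims, each subclaim is checked against a trusted reference, and the metric is the fraction supported. The `decompose` and `is_supported` callables are hypothetical placeholders (e.g., an LLM prompt and an NLI- or retrieval-based verifier), not the authors' implementation; the paper's point is that the choice of `decompose` materially changes the resulting score.

```python
from typing import Callable, List

def decompose_then_verify(
    generation: str,
    reference: str,
    decompose: Callable[[str], List[str]],
    is_supported: Callable[[str, str], bool],
) -> float:
    """FActScore-style metric: fraction of decomposed subclaims
    that are supported by the trusted reference.

    `decompose` and `is_supported` are hypothetical stand-ins
    (e.g., an LLM decomposition prompt and an NLI verifier);
    this is a sketch of the pattern, not the paper's implementation.
    """
    subclaims = decompose(generation)
    if not subclaims:
        return 0.0
    supported = sum(is_supported(claim, reference) for claim in subclaims)
    return supported / len(subclaims)
```

In the same spirit, one could score subclaims against the original passage rather than an external reference to probe the decomposition itself, which is roughly the role DecompScore plays in the paper; see the paper for its exact definition.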
Anthology ID:
2024.starsem-1.13
Volume:
Proceedings of the 13th Joint Conference on Lexical and Computational Semantics (*SEM 2024)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Danushka Bollegala, Vered Shwartz
Venue:
*SEM
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
153–175
URL:
https://aclanthology.org/2024.starsem-1.13
Cite (ACL):
Miriam Wanner, Seth Ebner, Zhengping Jiang, Mark Dredze, and Benjamin Van Durme. 2024. A Closer Look at Claim Decomposition. In Proceedings of the 13th Joint Conference on Lexical and Computational Semantics (*SEM 2024), pages 153–175, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
A Closer Look at Claim Decomposition (Wanner et al., *SEM 2024)
PDF:
https://preview.aclanthology.org/jeptaln-2024-ingestion/2024.starsem-1.13.pdf