Linearization Order Matters for AMR-to-Text Generation Input

Justin DeBenedetto


Abstract
Abstract Meaning Representation (AMR) is a semantic formalism designed to capture the meaning of a sentence as a directed graph. Many systems treat AMR-to-text generation as a sequence-to-sequence problem, drawing upon existing models. The largest AMR dataset (AMR 3.0) provides a sequence format which is considered equivalent to the graph format. However, because sequence-to-sequence models are position-sensitive, the graph traversal order affects system performance. In this work we explore the effect that different, equally valid orderings have on the performance of sequence-to-sequence AMR-to-text systems and find that changing the traversal order can reduce the BLEU score of a state-of-the-art system by as much as 17.5 points.
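As a rough illustration (not taken from the paper), the sketch below linearizes one small AMR graph with two different, equally valid depth-first edge orders. The toy graph, variable names, and linearizer are hypothetical; the point is that a position-sensitive sequence-to-sequence model receives two different input strings for the same underlying graph.

def linearize(graph, node, visited=None, reverse=False):
    # Depth-first, PENMAN-style linearization; the order in which a node's
    # outgoing edges are visited is arbitrary, so several outputs are valid.
    if visited is None:
        visited = set()
    if node in visited:
        return node  # re-entrant node: emit only its variable
    visited.add(node)
    concept, edges = graph[node]
    if reverse:
        edges = list(reversed(edges))
    parts = [f"({node} / {concept}"]
    for role, child in edges:
        parts.append(f":{role} {linearize(graph, child, visited, reverse)}")
    return " ".join(parts) + ")"

# Hypothetical graph for "The boy wants to go.":
# node -> (concept, [(role, child), ...])
graph = {
    "w": ("want-01", [("ARG0", "b"), ("ARG1", "g")]),
    "b": ("boy", []),
    "g": ("go-02", [("ARG0", "b")]),
}

print(linearize(graph, "w"))
# (w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))
print(linearize(graph, "w", reverse=True))
# (w / want-01 :ARG1 (g / go-02 :ARG0 (b / boy)) :ARG0 b)

Both strings encode the same graph, yet they present the model with tokens in different positions, which is the source of the performance differences studied in the paper.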
Anthology ID:
2024.umrpw-1.1
Volume:
Proceedings of the 2024 UMR Parsing Workshop
Month:
June
Year:
2024
Address:
Boulder, Colorado
Editors:
Nianwen Xue, James Martin
Venues:
UMRPW | WS
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
1–7
URL:
https://preview.aclanthology.org/fix-sig-urls/2024.umrpw-1.1/
Cite (ACL):
Justin DeBenedetto. 2024. Linearization Order Matters for AMR-to-Text Generation Input. In Proceedings of the 2024 UMR Parsing Workshop, pages 1–7, Boulder, Colorado. Association for Computational Linguistics.
Cite (Informal):
Linearization Order Matters for AMR-to-Text Generation Input (DeBenedetto, UMRPW 2024)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2024.umrpw-1.1.pdf