AMR Parsing with Causal Hierarchical Attention and Pointers

Chao Lou, Kewei Tu


Abstract
Translation-based AMR parsers have recently gained popularity due to their simplicity and effectiveness. They predict linearized graphs as free text, avoiding explicit structure modeling. However, this simplicity neglects structural locality in AMR graphs and introduces unnecessary tokens to represent coreferences. In this paper, we introduce new target forms of AMR parsing and a novel model, CHAP, which is equipped with causal hierarchical attention and a pointer mechanism, enabling the integration of structures into the Transformer decoder. We empirically explore various alternative modeling options. Experiments show that our model outperforms baseline models on four out of five benchmarks in the setting without additional training data.
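
The abstract mentions a pointer mechanism that lets the decoder refer back to an already-generated node instead of spending extra tokens on coreferent (re-entrant) variables. The following minimal PyTorch sketch illustrates one common way such a pointer head can be realized over decoder states; the class name PointerHead and all shapes are illustrative assumptions, not the paper's actual implementation.

    import torch
    import torch.nn as nn

    class PointerHead(nn.Module):
        """Hypothetical sketch: each decoding step scores all strictly
        earlier steps, so a re-entrant node can be represented as a
        pointer back to the step where it was first generated."""

        def __init__(self, d_model: int):
            super().__init__()
            self.query = nn.Linear(d_model, d_model)
            self.key = nn.Linear(d_model, d_model)

        def forward(self, states: torch.Tensor) -> torch.Tensor:
            # states: (batch, steps, d_model) decoder hidden states
            q = self.query(states)
            k = self.key(states)
            scores = q @ k.transpose(-1, -2) / states.size(-1) ** 0.5
            # Causal constraint: step i may only point at steps j < i.
            steps = states.size(1)
            causal = torch.triu(torch.ones(steps, steps, dtype=torch.bool,
                                           device=states.device), diagonal=0)
            return scores.masked_fill(causal, float("-inf"))

    # Usage: step 0 has no earlier step, so normalize from step 1 onward.
    states = torch.randn(2, 6, 64)
    scores = PointerHead(64)(states)
    pointer_probs = scores[:, 1:].softmax(dim=-1)

At decode time, a distribution like this can be mixed with the ordinary vocabulary softmax, so the model either emits a new token or points back to an existing node.
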
Anthology ID: 2023.emnlp-main.553
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 8942–8955
URL: https://aclanthology.org/2023.emnlp-main.553
DOI: 10.18653/v1/2023.emnlp-main.553
Cite (ACL): Chao Lou and Kewei Tu. 2023. AMR Parsing with Causal Hierarchical Attention and Pointers. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 8942–8955, Singapore. Association for Computational Linguistics.
Cite (Informal): AMR Parsing with Causal Hierarchical Attention and Pointers (Lou & Tu, EMNLP 2023)
PDF: https://preview.aclanthology.org/dois-2013-emnlp/2023.emnlp-main.553.pdf
Video: https://preview.aclanthology.org/dois-2013-emnlp/2023.emnlp-main.553.mp4