Efficient AMR Parsing with CLAP: Compact Linearization with an Adaptable Parser

Abelardo Carlos Martinez Lorenzo, Roberto Navigli


Abstract
Sequence-to-sequence models have become the de facto standard for Abstract Meaning Representation (AMR) parsing due to their high-quality performance. However, these systems face efficiency challenges because of their large model size and computational time, which limit their accessibility within the research community. This paper aims to break down these barriers by introducing a novel linearization and system that significantly enhance the efficiency and accessibility of previous AMR parsers. First, we propose our novel Compact linearization, which simplifies the encoding and thereby reduces the number of tokens by 40% to 50%. Second, we present CLAP, an innovative modular system that maintains the model’s high performance while achieving a remarkable 80% reduction in training and inference times. Furthermore, CLAP is compatible with multiple autoregressive Language Models (LMs) and tokenizers, such as BART, T5, and others. These advancements underscore the importance of optimizing sequence-to-sequence models in AMR parsing, thus democratizing access to high-quality semantic analysis. Our code is publicly available at https://github.com/SapienzaNLP/clap/.
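
To make the token-reduction idea concrete, here is a minimal, hypothetical sketch (not the paper's actual linearization): it contrasts a standard PENMAN-style AMR string with a more compact variant that drops variable names and slashes and marks re-entrancies with an assumed pointer token `<p1>`, then compares rough token counts using a whitespace split in place of a real subword tokenizer.

```python
# Hypothetical illustration only; the exact Compact linearization is defined in the paper.

# Standard PENMAN-style linearization of "The boy wants the girl to believe him."
penman = "( z0 / want-01 :ARG0 ( z1 / boy ) :ARG1 ( z2 / believe-01 :ARG0 ( z3 / girl ) :ARG1 z1 ) )"

# Assumed compact variant: variables and '/' removed, re-entrancy kept via a pointer token.
compact = "( want-01 :ARG0 ( <p1> boy ) :ARG1 ( believe-01 :ARG0 girl :ARG1 <p1> ) )"

def count_tokens(s: str) -> int:
    """Crude token count; a real setup would use the LM's own subword tokenizer."""
    return len(s.split())

n_full, n_compact = count_tokens(penman), count_tokens(compact)
print(f"PENMAN tokens:  {n_full}")
print(f"Compact tokens: {n_compact}")
print(f"Reduction:      {1 - n_compact / n_full:.0%}")
```

Even this toy example shortens the target sequence noticeably; shorter sequences translate directly into fewer decoding steps, which is where the reported training and inference savings come from.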
Anthology ID:
2024.lrec-main.495
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Pages:
5578–5584
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2024.lrec-main.495/
Cite (ACL):
Abelardo Carlos Martinez Lorenzo and Roberto Navigli. 2024. Efficient AMR Parsing with CLAP: Compact Linearization with an Adaptable Parser. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 5578–5584, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Efficient AMR Parsing with CLAP: Compact Linearization with an Adaptable Parser (Martinez Lorenzo & Navigli, LREC-COLING 2024)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2024.lrec-main.495.pdf