Contrastive Learning on LLM Back Generation Treebank for Cross-domain Constituency Parsing

Peiming Guo, Meishan Zhang, Jianling Li, Min Zhang, Yue Zhang


Abstract
Cross-domain constituency parsing remains an open challenge in computational linguistics because multi-domain constituency treebanks are scarce. In this paper, we investigate automatic treebank generation with large language models (LLMs). Since LLMs perform poorly at constituency parsing directly, we propose a novel treebank generation method, LLM back generation, which resembles the reverse process of constituency parsing: it takes an incomplete cross-domain constituency tree containing only domain-keyword leaf nodes as input and fills in the missing words to generate a cross-domain constituency treebank. In addition, we introduce a span-level contrastive learning pre-training strategy to make full use of the LLM back generation treebank for cross-domain constituency parsing. We verify the effectiveness of the LLM back generation treebank coupled with contrastive pre-training on five target domains of MCTB. Experimental results show that our approach achieves state-of-the-art average performance compared with various baselines.
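The back-generation input described above can be sketched as follows: keep only the domain-keyword leaves of a Penn-style bracketed tree and mask every other leaf word, yielding the incomplete tree that an LLM would be prompted to complete. This is a minimal illustrative sketch, not the authors' implementation; the `mask_tree` function, the `[MASK]` placeholder, and the example tree are all assumptions for illustration.

```python
import re

def mask_tree(bracketed, keep_words):
    """Replace every leaf word not in keep_words with the placeholder [MASK].

    `bracketed` is a Penn-style bracketed tree, e.g.
    "(S (NP (NN stock)) (VP (VBD fell)))". A leaf has the form "(POS word)"
    with no nested parentheses inside, which the regex below relies on.
    """
    def repl(m):
        pos, word = m.group(1), m.group(2)
        # Domain keywords are kept as anchors; all other words are masked
        # so the LLM can fill them in during back generation.
        return f"({pos} {word})" if word in keep_words else f"({pos} [MASK])"
    return re.sub(r"\((\S+) ([^()\s]+)\)", repl, bracketed)

tree = "(S (NP (DT the) (NN market)) (VP (VBD fell) (ADVP (RB sharply))))"
masked = mask_tree(tree, keep_words={"market"})
print(masked)
# (S (NP (DT [MASK]) (NN market)) (VP (VBD [MASK]) (ADVP (RB [MASK]))))
```

The masked tree preserves the full constituency structure while leaving all non-keyword terminals open, so the generation task becomes constrained infilling rather than parsing.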
Anthology ID:
2025.acl-long.1331
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
27446–27458
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1331/
Cite (ACL):
Peiming Guo, Meishan Zhang, Jianling Li, Min Zhang, and Yue Zhang. 2025. Contrastive Learning on LLM Back Generation Treebank for Cross-domain Constituency Parsing. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 27446–27458, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Contrastive Learning on LLM Back Generation Treebank for Cross-domain Constituency Parsing (Guo et al., ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1331.pdf