Learning Semantic Role Labeling from Compatible Label Sequences
Tao Li, Ghazaleh Kazeminejad, Susan Brown, Vivek Srikumar, Martha Palmer
Abstract
Semantic role labeling (SRL) has multiple disjoint label sets, e.g., VerbNet and PropBank. Creating these datasets is challenging, so a natural question is how to use each one to help the other. Prior work has shown that cross-task interaction helps, but has only explored multi-task learning so far. A common issue with the multi-task setup is that argument sequences are still decoded separately, running the risk of generating structurally inconsistent label sequences (as per lexicons like Semlink). In this paper, we eliminate this issue with a framework that jointly models VerbNet and PropBank labels as one sequence. In this setup, we show that enforcing Semlink constraints during decoding consistently improves the overall F1. With special input constructions, our joint model infers VerbNet arguments from given PropBank arguments with over 99 F1. For learning, we propose a constrained marginal model that learns with the knowledge defined in Semlink to further benefit from the large amount of PropBank-only data. On the joint benchmark based on CoNLL05, our models achieve state-of-the-art F1 scores, outperforming the prior best in-domain model by 3.5 (VerbNet) and 0.8 (PropBank). For out-of-domain generalization, our models surpass the prior best by 3.4 (VerbNet) and 0.2 (PropBank).
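As a rough illustration of the two ideas summarized above (Semlink-constrained decoding over the joint VerbNet–PropBank label space, and a constrained marginal objective for PropBank-only data), here is a minimal token-factored sketch. The `SEMLINK_PAIRS` mapping, the greedy per-token decoding, and the per-token marginalization are simplifying assumptions for illustration only, not the authors' implementation (the paper decodes whole sequences under structural constraints).

```python
import math
from typing import Dict, List, Set, Tuple

# Hypothetical Semlink-style mapping from a predicate sense to the set of
# (VerbNet role, PropBank role) pairs it licenses. Real Semlink entries are
# richer; this toy dictionary is only for illustration.
SEMLINK_PAIRS: Dict[str, Set[Tuple[str, str]]] = {
    "give.01": {("Agent", "ARG0"), ("Theme", "ARG1"),
                ("Recipient", "ARG2"), ("O", "O")},
}


def constrained_decode(
    sense: str,
    scores: List[Dict[Tuple[str, str], float]],
) -> List[Tuple[str, str]]:
    """Pick, at each token, the best joint (VerbNet, PropBank) label pair
    among those Semlink licenses for this predicate sense.

    `scores[t]` maps joint label pairs to model scores at token t.
    """
    allowed = SEMLINK_PAIRS.get(sense)
    decoded = []
    for token_scores in scores:
        candidates = {p: s for p, s in token_scores.items()
                      if allowed is None or p in allowed}
        # Fall back to the unconstrained space if nothing is licensed.
        if not candidates:
            candidates = token_scores
        decoded.append(max(candidates, key=candidates.get))
    return decoded


def constrained_marginal_nll(
    sense: str,
    log_probs: List[Dict[Tuple[str, str], float]],
    propbank_gold: List[str],
) -> float:
    """Negative log-likelihood for PropBank-only supervision: at each token,
    marginalize over the VerbNet labels that Semlink allows to pair with the
    observed PropBank label (a token-factored simplification of the
    constrained marginal model described in the abstract).
    """
    allowed = SEMLINK_PAIRS.get(sense, set())
    nll = 0.0
    for token_log_probs, pb in zip(log_probs, propbank_gold):
        compatible = [lp for (vn, pb_label), lp in token_log_probs.items()
                      if pb_label == pb
                      and (not allowed or (vn, pb_label) in allowed)]
        m = max(compatible)  # log-sum-exp over compatible joint labels
        nll -= m + math.log(sum(math.exp(lp - m) for lp in compatible))
    return nll
```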
- Anthology ID:
- 2023.findings-emnlp.1041
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2023
- Month:
- December
- Year:
- 2023
- Address:
- Singapore
- Editors:
- Houda Bouamor, Juan Pino, Kalika Bali
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 15561–15572
- URL:
- https://aclanthology.org/2023.findings-emnlp.1041
- DOI:
- 10.18653/v1/2023.findings-emnlp.1041
- Cite (ACL):
- Tao Li, Ghazaleh Kazeminejad, Susan Brown, Vivek Srikumar, and Martha Palmer. 2023. Learning Semantic Role Labeling from Compatible Label Sequences. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 15561–15572, Singapore. Association for Computational Linguistics.
- Cite (Informal):
- Learning Semantic Role Labeling from Compatible Label Sequences (Li et al., Findings 2023)
- PDF:
- https://preview.aclanthology.org/ingest-2024-clasp/2023.findings-emnlp.1041.pdf