Abstract
Automating the extraction of argument structures faces two challenges: (1) encoding long-range context to support comprehensive understanding, and (2) improving data efficiency, since constructing high-quality argument structure annotations is time-consuming. In this work, we propose a novel context-aware Transformer-based argument structure prediction model which, across five different domains, significantly outperforms models that rely on features or encode only limited context. To tackle the difficulty of data annotation, we examine two complementary methods: (i) transfer learning, which leverages existing annotated data to boost model performance in a new target domain, and (ii) active learning, which strategically identifies a small number of samples for annotation. We further propose model-independent sample acquisition strategies that generalize to diverse domains. Extensive experiments show that our simple yet effective acquisition strategies yield results competitive with three strong baselines. Combined with transfer learning, substantial F1 gains (5–25 points) can further be achieved during the early iterations of active learning across domains.
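The abstract describes an iterative pool-based active-learning setup. Below is a minimal, self-contained sketch of such a loop, offered purely as illustration: predictive-entropy (uncertainty) sampling stands in for the paper's model-independent acquisition strategies, and a logistic regression on synthetic data stands in for the context-aware Transformer; all names and parameters here are assumptions, not the authors' code.

```python
"""Illustrative pool-based active learning with uncertainty sampling.

NOT the paper's method: entropy-based acquisition and a logistic
regression on synthetic data are stand-ins used only to show the loop.
"""
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic two-class toy data standing in for argument-link prediction.
X = rng.normal(size=(1000, 16))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# Small seed set containing both classes; the rest is the unlabeled pool.
labeled = [i for i in range(len(X)) if y[i] == 1][:10] \
        + [i for i in range(len(X)) if y[i] == 0][:10]
pool = [i for i in range(len(X)) if i not in labeled]

for it in range(5):  # active-learning iterations
    model = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
    probs = model.predict_proba(X[pool])
    # Predictive entropy: higher = more uncertain = presumed more informative.
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)
    picked = [pool[i] for i in np.argsort(-entropy)[:50]]  # acquire top 50
    labeled += picked                      # simulated human annotation
    pool = [i for i in pool if i not in picked]
    print(f"iter {it}: labeled={len(labeled)}, "
          f"pool acc={model.score(X[pool], y[pool]):.3f}")
```

Each iteration trains on the current labeled set, scores the unlabeled pool, acquires the most uncertain batch for (simulated) annotation, and retrains; transfer learning would correspond to initializing the model from a source-domain checkpoint rather than from scratch.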
- Anthology ID: 2022.findings-acl.36
- Volume: Findings of the Association for Computational Linguistics: ACL 2022
- Month: May
- Year: 2022
- Address: Dublin, Ireland
- Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 423–437
- URL: https://aclanthology.org/2022.findings-acl.36
- DOI: 10.18653/v1/2022.findings-acl.36
- Cite (ACL): Xinyu Hua and Lu Wang. 2022. Efficient Argument Structure Extraction with Transfer Learning and Active Learning. In Findings of the Association for Computational Linguistics: ACL 2022, pages 423–437, Dublin, Ireland. Association for Computational Linguistics.
- Cite (Informal): Efficient Argument Structure Extraction with Transfer Learning and Active Learning (Hua & Wang, Findings 2022)
- PDF: https://preview.aclanthology.org/dois-2013-emnlp/2022.findings-acl.36.pdf
- Data: CDCP