Abstract
Recently, pre-training contextualized encoders with language model (LM) objectives has been shown to be an effective semi-supervised method for structured prediction. In this work, we empirically explore an alternative pre-training method for contextualized encoders. Instead of predicting words as in LMs, we “mask out” and predict word order information, using a local ordering strategy and word-selecting objectives. With evaluations on three typical structured prediction tasks (dependency parsing, POS tagging, and NER) over four languages (English, Finnish, Czech, and Italian), we show that our method is consistently beneficial. We further conduct a detailed error analysis, including one that examines a specific type of parsing error in which the head is misidentified. The results show that pre-trained contextual encoders can bring improvements in a structured way, suggesting that they may be able to capture higher-order patterns and feature combinations from unlabeled data.
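The abstract describes the objective only at a high level. The sketch below illustrates one plausible way to construct local-ordering training instances: tokens are shuffled within small local windows, and the target asks the model to select, for each original position, which shuffled token belongs there. The function name, window size, and exact target encoding are illustrative assumptions, not the paper's implementation (see the authors' repository, zzsfornlp/zmsp, for that).

```python
import random

def make_local_ordering_example(tokens, window=3, seed=None):
    """Shuffle tokens within non-overlapping local windows.

    Returns (shuffled_tokens, targets), where targets[i] is the index in the
    shuffled sequence of the token that originally occupied position i.
    A word-selecting objective would train the encoder to pick, for each
    original slot, the shuffled position holding the right word.
    (Hypothetical sketch; details differ from the paper's actual setup.)
    """
    rng = random.Random(seed)
    order = list(range(len(tokens)))
    # Locally permute indices inside each window of `window` tokens.
    for start in range(0, len(tokens), window):
        chunk = order[start:start + window]
        rng.shuffle(chunk)
        order[start:start + window] = chunk
    shuffled = [tokens[j] for j in order]
    # targets[i] = position in `shuffled` where the token originally at i now sits.
    targets = [order.index(i) for i in range(len(tokens))]
    return shuffled, targets

if __name__ == "__main__":
    sent = "the quick brown fox jumps over the lazy dog".split()
    shuf, tgt = make_local_ordering_example(sent, window=3, seed=0)
    print(shuf)  # locally reordered sentence fed to the encoder
    print(tgt)   # selection targets recovering the original order
```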
- Anthology ID: 2020.findings-emnlp.160
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
- Month: November
- Year: 2020
- Address: Online
- Editors: Trevor Cohn, Yulan He, Yang Liu
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 1770–1783
- URL: https://aclanthology.org/2020.findings-emnlp.160
- DOI: 10.18653/v1/2020.findings-emnlp.160
- Cite (ACL): Zhisong Zhang, Xiang Kong, Lori Levin, and Eduard Hovy. 2020. An Empirical Exploration of Local Ordering Pre-training for Structured Prediction. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 1770–1783, Online. Association for Computational Linguistics.
- Cite (Informal): An Empirical Exploration of Local Ordering Pre-training for Structured Prediction (Zhang et al., Findings 2020)
- PDF: https://preview.aclanthology.org/add_acl24_videos/2020.findings-emnlp.160.pdf
- Code: zzsfornlp/zmsp