Abstract
We consider a multilingual weakly supervised learning scenario where knowledge from annotated corpora in a resource-rich language is transferred via bitext to guide learning in other languages. Past approaches project labels across bitext and use them as features or gold labels for training. We propose a new method that projects model expectations rather than labels, which facilitates the transfer of model uncertainty across language boundaries. We encode expectations as constraints and train a discriminative CRF model using Generalized Expectation Criteria (Mann and McCallum, 2010). Evaluated on standard Chinese-English and German-English NER datasets, our method achieves F1 scores of 64% and 60% when no labeled data is used. Attaining the same accuracy with supervised CRFs requires 12k and 1.5k labeled sentences. Furthermore, when combined with labeled examples, our method yields significant improvements over state-of-the-art supervised methods, achieving the best reported numbers to date on the Chinese OntoNotes and German CoNLL-03 datasets.
- Anthology ID:
- Q14-1005
- Volume:
- Transactions of the Association for Computational Linguistics, Volume 2
- Month:
- Year:
- 2014
- Address:
- Cambridge, MA
- Editors:
- Dekang Lin, Michael Collins, Lillian Lee
- Venue:
- TACL
- Publisher:
- MIT Press
- Pages:
- 55–66
- URL:
- https://aclanthology.org/Q14-1005
- DOI:
- 10.1162/tacl_a_00165
- Cite (ACL):
- Mengqiu Wang and Christopher D. Manning. 2014. Cross-lingual Projected Expectation Regularization for Weakly Supervised Learning. Transactions of the Association for Computational Linguistics, 2:55–66.
- Cite (Informal):
- Cross-lingual Projected Expectation Regularization for Weakly Supervised Learning (Wang & Manning, TACL 2014)
- PDF:
- https://aclanthology.org/Q14-1005.pdf
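The core idea in the abstract — regularizing a model so that its expectations match target expectations projected from a resource-rich language — can be illustrated with a minimal sketch. This is a hypothetical, simplified penalty over per-token label distributions using a squared-error form; the paper's actual objective is a Generalized Expectation criterion over CRF marginals, and the function names here are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax over label scores."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def ge_penalty(logits, target_expectation):
    """GE-style penalty (illustrative): squared distance between the
    model's mean predicted label distribution on unlabeled target-language
    tokens and the expectation projected across bitext from the source
    language. The real method uses CRF marginals, not token softmaxes.

    logits: (n_tokens, n_labels) label scores on unlabeled data
    target_expectation: (n_labels,) projected expectation vector
    """
    probs = softmax(logits)                      # per-token label distributions
    model_expectation = probs.mean(axis=0)       # model's expected label frequencies
    return float(np.sum((model_expectation - target_expectation) ** 2))
```

In training, this penalty would be added to the (possibly empty) supervised likelihood term, so that unlabeled target-language text is pulled toward the source-side expectations rather than toward hard projected labels.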