Abstract
We present a novel supervised word alignment method based on cross-language span prediction. We first formalize the word alignment problem as a collection of independent predictions from a token in the source sentence to a span in the target sentence. Since this step is equivalent to a SQuAD v2.0-style question answering task, we solve it with multilingual BERT fine-tuned on manually created gold word alignment data. Because it is nontrivial to obtain an accurate alignment from a set of independently predicted spans, we greatly improve word alignment accuracy by adding the source token's context to the question and by symmetrizing the two directional predictions. In experiments on five word alignment datasets covering Chinese, Japanese, German, Romanian, French, and English, our proposed method significantly outperformed previous supervised and unsupervised word alignment methods without using any bitext for pretraining. For example, it achieved an F1 score of 86.7 on the Chinese-English data, 13.3 points higher than the previous state-of-the-art supervised method.
- Anthology ID:
- 2020.emnlp-main.41
- Volume:
- Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
- Month:
- November
- Year:
- 2020
- Address:
- Online
- Editors:
- Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 555–565
- URL:
- https://aclanthology.org/2020.emnlp-main.41
- DOI:
- 10.18653/v1/2020.emnlp-main.41
- Cite (ACL):
- Masaaki Nagata, Katsuki Chousa, and Masaaki Nishino. 2020. A Supervised Word Alignment Method based on Cross-Language Span Prediction using Multilingual BERT. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 555–565, Online. Association for Computational Linguistics.
- Cite (Informal):
- A Supervised Word Alignment Method based on Cross-Language Span Prediction using Multilingual BERT (Nagata et al., EMNLP 2020)
- PDF:
- https://preview.aclanthology.org/dois-2013-emnlp/2020.emnlp-main.41.pdf
- Data
- SQuAD
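The abstract's core idea, predicting a target-language span for each source token (with a SQuAD v2.0-style null answer) and then symmetrizing the two directional predictions, can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: the function names are invented, and intersection of the two directional link sets is used as a stand-in symmetrization heuristic (the paper's actual procedure may differ).

```python
# Hypothetical sketch of turning per-token span predictions into word
# alignment links and symmetrizing the two directions. Names and the
# intersection heuristic are assumptions for illustration only.

def spans_to_links(span_predictions):
    """Expand per-token span predictions into alignment links.

    span_predictions maps a source token index to a predicted
    (start, end) target span (end exclusive), or to None for the
    SQuAD v2.0-style "no answer" (unaligned token).
    """
    links = set()
    for src, span in span_predictions.items():
        if span is None:  # null prediction: token stays unaligned
            continue
        start, end = span
        for tgt in range(start, end):
            links.add((src, tgt))
    return links


def symmetrize(src_to_tgt, tgt_to_src):
    """Keep only links predicted in both directions (intersection)."""
    forward = spans_to_links(src_to_tgt)
    # Flip the backward links into (source, target) order before intersecting.
    backward = {(s, t) for (t, s) in spans_to_links(tgt_to_src)}
    return forward & backward
```

For example, if the forward model predicts `{0: (0, 1), 1: (1, 3), 2: None}` and the backward model predicts `{0: (0, 1), 1: (1, 2), 2: (1, 2)}`, the intersection yields the links `{(0, 0), (1, 1), (1, 2)}`.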