Abstract
Global co-occurrence information is the primary source of structural information in multilingual corpora, and we find that analogical/parallel compound words across languages have similar normalized co-occurrence counts/frequencies, providing weak but stable self-supervision for cross-lingual transfer. Following this observation, we aim to associate contextualized representations with relevant (contextualized) representations across languages with the help of co-occurrence counts. The result is MLM-GC (MLM with Global Co-occurrence) pre-training, in which the model learns local bidirectional information from MLM and global co-occurrence information from a log-bilinear regression. Experiments show that MLM-GC pre-training substantially outperforms MLM pre-training on 4 downstream cross-lingual tasks and 1 additional monolingual task, demonstrating the advantages of forming isomorphic spaces across languages.
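The abstract describes pairing the standard MLM cross-entropy with a log-bilinear regression over global co-occurrence counts, in the spirit of GloVe. Below is a minimal PyTorch sketch of that combination, assuming representations for a batch of token pairs and their co-occurrence counts; all names (`log_bilinear_loss`, `mlm_gc_loss`, the weighting parameters `x_max` and `alpha`, the mixing weight `lam`) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of an MLM-GC-style joint objective (not the paper's code).
import torch
import torch.nn.functional as F

def log_bilinear_loss(h_i, h_j, b_i, b_j, counts, x_max=100.0, alpha=0.75):
    # GloVe-style objective: f(X_ij) * (h_i . h_j + b_i + b_j - log X_ij)^2,
    # where the weighting f down-weights rare, noisy co-occurrence pairs.
    weight = torch.clamp(counts / x_max, max=1.0) ** alpha
    dot = (h_i * h_j).sum(dim=-1)
    return (weight * (dot + b_i + b_j - torch.log(counts)) ** 2).mean()

def mlm_gc_loss(mlm_logits, mlm_labels, h_i, h_j, b_i, b_j, counts, lam=1.0):
    # Local bidirectional signal: standard masked-LM cross-entropy
    # (unmasked positions carry the conventional ignore label -100).
    mlm = F.cross_entropy(mlm_logits.view(-1, mlm_logits.size(-1)),
                          mlm_labels.view(-1), ignore_index=-100)
    # Global signal: log-bilinear regression against co-occurrence counts.
    gc = log_bilinear_loss(h_i, h_j, b_i, b_j, counts)
    return mlm + lam * gc

# Toy usage: 8 token pairs with 32-dim representations, vocab of 1000.
h_i, h_j = torch.randn(8, 32), torch.randn(8, 32)
b_i, b_j = torch.zeros(8), torch.zeros(8)
counts = torch.randint(1, 50, (8,)).float()
mlm_logits = torch.randn(8, 4, 1000)
mlm_labels = torch.randint(0, 1000, (8, 4))
loss = mlm_gc_loss(mlm_logits, mlm_labels, h_i, h_j, b_i, b_j, counts)
```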
- Anthology ID: 2023.findings-acl.475
- Volume: Findings of the Association for Computational Linguistics: ACL 2023
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 7526–7543
- URL: https://aclanthology.org/2023.findings-acl.475
- DOI: 10.18653/v1/2023.findings-acl.475
- Cite (ACL): Xi Ai and Bin Fang. 2023. Multilingual Pre-training with Self-supervision from Global Co-occurrence Information. In Findings of the Association for Computational Linguistics: ACL 2023, pages 7526–7543, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Multilingual Pre-training with Self-supervision from Global Co-occurrence Information (Ai & Fang, Findings 2023)
- PDF: https://preview.aclanthology.org/corrections-2024-04/2023.findings-acl.475.pdf