Improving Cross-lingual Transfer with Contrastive Negative Learning and Self-training

Guanlin Li, Xuechen Zhao, Amir Jafari, Wenhao Shao, Reza Farahbakhsh, Noel Crespi


Abstract
Recent studies improve cross-lingual transfer learning either by better aligning the internal representations within a multilingual model or by exploiting target-language information through self-training. However, alignment-based methods exhibit intrinsic limitations, such as non-transferable linguistic elements, while most self-training-based methods ignore the useful information hidden in low-confidence samples. To address these issues, we propose CoNLST (Contrastive Negative Learning and Self-Training), which leverages the information in low-confidence samples. Specifically, we extend negative learning to the metric space by selecting negative pairs based on complementary labels, and then employ self-training to iteratively train the model on the obtained clean pseudo-labels until convergence. We evaluate our approach on the widely adopted cross-lingual benchmark XNLI. The experimental results show that our method improves upon the baseline models and can serve as a beneficial complement to alignment-based methods.
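
To make the two ingredients named in the abstract concrete, below is a minimal PyTorch sketch of (a) a standard negative-learning loss on complementary labels and (b) one plausible way to extend it to the metric space via margin-based negative pairs. The function names, the pairing rule (matching one sample's complementary label against another sample's pseudo-label), and the margin are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def negative_learning_loss(logits, complementary_labels):
    # Negative learning (NL): minimize the probability assigned to a
    # complementary label, i.e. a class the sample is known NOT to belong to.
    probs = F.softmax(logits, dim=-1)
    p_comp = probs.gather(1, complementary_labels.unsqueeze(1)).squeeze(1)
    return -torch.log(1.0 - p_comp + 1e-8).mean()

def contrastive_negative_loss(embeddings, complementary_labels,
                              pseudo_labels, margin=1.0):
    # Assumed metric-space extension: treat (i, j) as a negative pair when
    # sample i's complementary label matches sample j's pseudo-label, and
    # push such pairs at least `margin` apart in embedding space.
    emb = F.normalize(embeddings, dim=-1)
    dist = torch.cdist(emb, emb)                  # pairwise L2 distances
    neg = complementary_labels.unsqueeze(1) == pseudo_labels.unsqueeze(0)
    neg.fill_diagonal_(False)                     # never pair a sample with itself
    hinge = F.relu(margin - dist)                 # penalize pairs closer than margin
    return (hinge * neg).sum() / neg.sum().clamp(min=1)
```

In the self-training loop the abstract describes, losses of this kind would presumably be computed on target-language batches using pseudo-labels produced by the current model, with high-confidence pseudo-labels retained for standard supervised updates and low-confidence ones contributing through the negative-learning terms; the exact scheduling and selection criteria are detailed in the paper itself.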
Anthology ID:
2024.lrec-main.769
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Pages:
8781–8791
URL:
https://aclanthology.org/2024.lrec-main.769
Cite (ACL):
Guanlin Li, Xuechen Zhao, Amir Jafari, Wenhao Shao, Reza Farahbakhsh, and Noel Crespi. 2024. Improving Cross-lingual Transfer with Contrastive Negative Learning and Self-training. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 8781–8791, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Improving Cross-lingual Transfer with Contrastive Negative Learning and Self-training (Li et al., LREC-COLING 2024)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2024.lrec-main.769.pdf