@inproceedings{gonen-etal-2020-greek,
    title = "It{'}s not {G}reek to m{BERT}: Inducing Word-Level Translations from Multilingual {BERT}",
    author = "Gonen, Hila  and
      Ravfogel, Shauli  and
      Elazar, Yanai  and
      Goldberg, Yoav",
    editor = "Alishahi, Afra  and
      Belinkov, Yonatan  and
      Chrupa{\l}a, Grzegorz  and
      Hupkes, Dieuwke  and
      Pinter, Yuval  and
      Sajjad, Hassan",
    booktitle = "Proceedings of the Third BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2020.blackboxnlp-1.5/",
    doi = "10.18653/v1/2020.blackboxnlp-1.5",
    pages = "45--56",
    abstract = "Recent works have demonstrated that multilingual BERT (mBERT) learns rich cross-lingual representations that allow for transfer across languages. We study the word-level translation information embedded in mBERT and present two simple methods that expose remarkable translation capabilities with no fine-tuning. The results suggest that most of this information is encoded in a non-linear way, while some of it can also be recovered with purely linear tools. As part of our analysis, we test the hypothesis that mBERT learns representations which contain both a language-encoding component and an abstract, cross-lingual component, and explicitly identify an empirical language-identity subspace within mBERT representations."
}

Markdown (Informal)
[It’s not Greek to mBERT: Inducing Word-Level Translations from Multilingual BERT](https://aclanthology.org/2020.blackboxnlp-1.5/) (Gonen et al., BlackboxNLP 2020)
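For a concrete picture of the kind of analysis the abstract describes, the following is a minimal, hypothetical sketch, not the authors' code: it embeds isolated words with off-the-shelf mBERT, subtracts each language's mean vector as a rough stand-in for the language-identity component, and translates by nearest-neighbor retrieval. The model name, toy word lists, and mean-pooling choice are all assumptions made for illustration.

```python
# Hypothetical sketch: word-level translation from mBERT representations
# via language-mean removal and nearest-neighbor retrieval.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")
model.eval()

def embed(words):
    """Mean-pool mBERT's last hidden layer over each word's subword tokens."""
    vecs = []
    for word in words:
        enc = tokenizer(word, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**enc).last_hidden_state[0]   # (seq_len, hidden_dim)
        vecs.append(hidden[1:-1].mean(dim=0))            # drop [CLS] and [SEP]
    return torch.stack(vecs)

# Toy English/Spanish word lists (illustrative only).
en_words = ["dog", "house", "water", "book"]
es_words = ["perro", "casa", "agua", "libro"]
en_vecs, es_vecs = embed(en_words), embed(es_words)

# Approximate each language's language-identity component by its mean vector
# and subtract it, keeping the (hopefully) cross-lingual residual.
en_vecs = en_vecs - en_vecs.mean(dim=0, keepdim=True)
es_vecs = es_vecs - es_vecs.mean(dim=0, keepdim=True)

# Translate each English word by cosine nearest neighbor among the Spanish words.
sims = (torch.nn.functional.normalize(en_vecs, dim=-1)
        @ torch.nn.functional.normalize(es_vecs, dim=-1).T)
for word, idx in zip(en_words, sims.argmax(dim=-1)):
    print(f"{word} -> {es_words[int(idx)]}")
```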