@inproceedings{tran-bisazza-2019-zero,
    title = "Zero-shot Dependency Parsing with Pre-trained Multilingual Sentence Representations",
    author = "Tran, Ke  and
      Bisazza, Arianna",
    editor = "Cherry, Colin  and
      Durrett, Greg  and
      Foster, George  and
      Haffari, Reza  and
      Khadivi, Shahram  and
      Peng, Nanyun  and
      Ren, Xiang  and
      Swayamdipta, Swabha",
    booktitle = "Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019)",
    month = nov,
    year = "2019",
    address = "Hong Kong, China",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/D19-6132/",
    doi = "10.18653/v1/D19-6132",
    pages = "281--288",
    abstract = "We investigate whether off-the-shelf deep bidirectional sentence representations (Devlin et al., 2019) trained on a massively multilingual corpus (multilingual BERT) enable the development of an unsupervised universal dependency parser. This approach only leverages a mix of monolingual corpora in many languages and does not require any translation data making it applicable to low-resource languages. In our experiments we outperform the best CoNLL 2018 language-specific systems in all of the shared task{'}s six truly low-resource languages while using a single system. However, we also find that (i) parsing accuracy still varies dramatically when changing the training languages and (ii) in some target languages zero-shot transfer fails under all tested conditions, raising concerns on the `universality' of the whole approach."
}