@inproceedings{li-etal-2019-semi-supervised-domain,
    title = "Semi-supervised Domain Adaptation for Dependency Parsing",
    author = "Li, Zhenghua  and
      Peng, Xue  and
      Zhang, Min  and
      Wang, Rui  and
      Si, Luo",
    editor = "Korhonen, Anna  and
      Traum, David  and
      M{\`a}rquez, Llu{\'i}s",
    booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
    month = jul,
    year = "2019",
    address = "Florence, Italy",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/iwcs-25-ingestion/P19-1229/",
    doi = "10.18653/v1/P19-1229",
    pages = "2386--2395",
    abstract = "During the past decades, due to the lack of sufficient labeled data, most studies on cross-domain parsing focus on unsupervised domain adaptation, assuming there is no target-domain training data. However, unsupervised approaches make limited progress so far due to the intrinsic difficulty of both domain adaptation and parsing. This paper tackles the semi-supervised domain adaptation problem for Chinese dependency parsing, based on two newly-annotated large-scale domain-aware datasets. We propose a simple domain embedding approach to merge the source- and target-domain training data, which is shown to be more effective than both direct corpus concatenation and multi-task learning. In order to utilize unlabeled target-domain data, we employ the recent contextualized word representations and show that a simple fine-tuning procedure can further boost cross-domain parsing accuracy by large margin."
}