@inproceedings{jo-cinarel-2019-delta,
    title = "Delta-training: Simple Semi-Supervised Text Classification using Pretrained Word Embeddings",
    author = "Jo, Hwiyeol  and
      Cinarel, Ceyda",
    editor = "Inui, Kentaro  and
      Jiang, Jing  and
      Ng, Vincent  and
      Wan, Xiaojun",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)",
    month = nov,
    year = "2019",
    address = "Hong Kong, China",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/D19-1347/",
    doi = "10.18653/v1/D19-1347",
    pages = "3458--3463",
    abstract = "We propose a novel and simple method for semi-supervised text classification. The method stems from the hypothesis that a classifier with pretrained word embeddings always outperforms the same classifier with randomly initialized word embeddings, as empirically observed in NLP tasks. Our method first builds two sets of classifiers as a form of model ensemble, and then initializes their word embeddings differently: one using random, the other using pretrained word embeddings. We focus on different predictions between the two classifiers on unlabeled data while following the self-training framework. We also use early-stopping in meta-epoch to improve the performance of our method. Our method, Delta-training, outperforms the self-training and the co-training framework in 4 different text classification datasets, showing robustness against error accumulation."
}

Markdown (Informal)
[Delta-training: Simple Semi-Supervised Text Classification using Pretrained Word Embeddings](https://preview.aclanthology.org/ingest-emnlp/D19-1347/) (Jo & Cinarel, EMNLP-IJCNLP 2019)
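Since the abstract outlines a concrete training loop (two classifiers that differ only in embedding initialization, disagreement-driven self-training on unlabeled data, and early stopping over meta-epochs), a minimal sketch of that loop is included below. It is illustrative only: it uses scikit-learn-style estimators and synthetic features in place of word-embedding models, and the rule of pseudo-labeling disagreements with the pretrained-embedding classifier's prediction, along with the names `delta_training`, `clf_rand`, and `clf_pre`, are assumptions made for illustration rather than the authors' exact procedure.

```python
# Hypothetical sketch of the Delta-training loop described in the abstract.
# clf_rand / clf_pre stand in for the paper's classifiers with random vs.
# pretrained word-embedding initialization; here they are plain scikit-learn
# estimators so the loop itself runs end to end.
import numpy as np
from sklearn.base import clone
from sklearn.linear_model import LogisticRegression


def delta_training(clf_rand, clf_pre, X_lab, y_lab, X_unlab, X_dev, y_dev,
                   max_meta_epochs=10, patience=2):
    """Disagreement-based self-training with early stopping over meta-epochs."""
    best_dev, best_model, waited = -np.inf, None, 0
    X_pool, y_pool = X_lab.copy(), y_lab.copy()

    for meta_epoch in range(max_meta_epochs):
        # (Re)train both classifiers on the current labeled pool.
        clf_rand = clone(clf_rand).fit(X_pool, y_pool)
        clf_pre = clone(clf_pre).fit(X_pool, y_pool)

        # Find unlabeled examples where the two classifiers disagree.
        pred_rand = clf_rand.predict(X_unlab)
        pred_pre = clf_pre.predict(X_unlab)
        disagree = pred_rand != pred_pre

        # Assumption: pseudo-label the disagreements with the
        # pretrained-embedding classifier's prediction and move them
        # into the labeled pool.
        X_pool = np.vstack([X_pool, X_unlab[disagree]])
        y_pool = np.concatenate([y_pool, pred_pre[disagree]])
        X_unlab = X_unlab[~disagree]

        # Early stopping at the meta-epoch level on a held-out dev set.
        dev_acc = (clf_pre.predict(X_dev) == y_dev).mean()
        if dev_acc > best_dev:
            best_dev, best_model, waited = dev_acc, clf_pre, 0
        else:
            waited += 1
            if waited >= patience:
                break
        if not len(X_unlab):
            break
    return best_model, best_dev


# Toy usage with synthetic features standing in for sentence representations.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))
y = (X[:, 0] > 0).astype(int)
model, dev_acc = delta_training(LogisticRegression(max_iter=200),
                                LogisticRegression(max_iter=200),
                                X[:40], y[:40], X[40:240], X[240:], y[240:])
```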