@inproceedings{desai-etal-2019-evaluating,
    title = "Evaluating Lottery Tickets Under Distributional Shifts",
    author = "Desai, Shrey  and
      Zhan, Hongyuan  and
      Aly, Ahmed",
    editor = "Cherry, Colin  and
      Durrett, Greg  and
      Foster, George  and
      Haffari, Reza  and
      Khadivi, Shahram  and
      Peng, Nanyun  and
      Ren, Xiang  and
      Swayamdipta, Swabha",
    booktitle = "Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019)",
    month = nov,
    year = "2019",
    address = "Hong Kong, China",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/iwcs-25-ingestion/D19-6117/",
    doi = "10.18653/v1/D19-6117",
    pages = "153--162",
    abstract = "The Lottery Ticket Hypothesis suggests large, over-parameterized neural networks consist of small, sparse subnetworks that can be trained in isolation to reach a similar (or better) test accuracy. However, the initialization and generalizability of the obtained sparse subnetworks have been recently called into question. Our work focuses on evaluating the initialization of sparse subnetworks under distributional shifts. Specifically, we investigate the extent to which a sparse subnetwork obtained in a source domain can be re-trained in isolation in a dissimilar, target domain. In addition, we examine the effects of different initialization strategies at transfer-time. Our experiments show that sparse subnetworks obtained through lottery ticket training do not simply overfit to particular domains, but rather reflect an inductive bias of deep neural networks that can be exploited in multiple domains."
}
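The abstract outlines the lottery-ticket transfer protocol the paper evaluates: find a sparse subnetwork by magnitude pruning in a source domain, rewind the surviving weights to their original initialization, and retrain the subnetwork in isolation in a shifted target domain. The sketch below illustrates that protocol on a toy linear model in NumPy; the synthetic "domains", the model, and all hyperparameters are illustrative assumptions, not the authors' experimental setup.

```python
# Minimal sketch of lottery-ticket training under distributional shift:
# 1) train densely on a source domain, 2) prune by weight magnitude,
# 3) rewind survivors to their initialization, 4) retrain the sparse
# subnetwork on a dissimilar target domain. Toy setup, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

def make_domain(true_w, noise, n=512, d=20):
    """Synthetic regression data; `noise` crudely stands in for domain shift."""
    X = rng.normal(size=(n, d))
    y = X @ true_w + noise * rng.normal(size=n)
    return X, y

def train(w, mask, X, y, lr=0.05, steps=300):
    """Plain gradient descent on squared error; `mask` freezes pruned weights."""
    w = w * mask
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = (w - lr * grad) * mask              # pruned entries stay zero
    return w

def magnitude_mask(w, sparsity):
    """Keep the largest-magnitude fraction of weights, prune the rest."""
    k = int(sparsity * w.size)                  # number of weights to drop
    threshold = np.sort(np.abs(w))[k]
    return (np.abs(w) >= threshold).astype(w.dtype)

d = 20
w_init = rng.normal(scale=0.1, size=d)          # saved "winning ticket" init
X_src, y_src = make_domain(rng.normal(size=d), noise=0.1)
X_tgt, y_tgt = make_domain(rng.normal(size=d), noise=0.5)   # shifted domain

# Train densely on the source domain, then prune 80% of weights by magnitude.
w_src = train(w_init, np.ones(d), X_src, y_src)
mask = magnitude_mask(w_src, sparsity=0.8)

# Rewind the surviving weights to initialization and retrain in isolation
# on the target domain -- the transfer setting the paper studies.
w_ticket = train(w_init, mask, X_tgt, y_tgt)
print("target-domain MSE (sparse ticket):",
      np.mean((X_tgt @ w_ticket - y_tgt) ** 2))
```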