@inproceedings{mishra-chakraborty-2021-local-pruning,
    title = "Does local pruning offer task-specific models to learn effectively ?",
    author = "Mishra, Abhishek Kumar  and
      Chakraborty, Mohna",
    editor = "Djabri, Souhila  and
      Gimadi, Dinara  and
      Mihaylova, Tsvetomila  and
      Nikolova-Koleva, Ivelina",
    booktitle = "Proceedings of the Student Research Workshop Associated with RANLP 2021",
    month = sep,
    year = "2021",
    address = "Online",
    publisher = "INCOMA Ltd.",
    url = "https://aclanthology.org/2021.ranlp-srw.17/",
    pages = "118--125",
    abstract = "The need to deploy large-scale pre-trained models on edge devices under limited computational resources has led to substantial research to compress these large models. However, less attention has been given to compress the task-specific models. In this work, we investigate the different methods of unstructured pruning on task-specific models for Aspect-based Sentiment Analysis (ABSA) tasks. Specifically, we analyze differences in the learning dynamics of pruned models by using the standard pruning techniques to achieve high-performing sparse networks. We develop a hypothesis to demonstrate the effectiveness of local pruning over global pruning considering a simple CNN model. Later, we utilize the hypothesis to demonstrate the efficacy of the pruned state-of-the-art model compared to the over-parameterized state-of-the-art model under two settings, the first considering the baselines for the same task used for generating the hypothesis, i.e., aspect extraction and the second considering a different task, i.e., sentiment analysis. We also provide discussion related to the generalization of the pruning hypothesis."
}