Khushbu Saxena
2020
TopicBERT for Energy Efficient Document Classification
Yatin Chaudhary | Pankaj Gupta | Khushbu Saxena | Vivek Kulkarni | Thomas Runkler | Hinrich Schütze
Findings of the Association for Computational Linguistics: EMNLP 2020
Prior research notes that BERT’s computational cost grows quadratically with sequence length, leading to longer training times, higher GPU memory requirements, and greater carbon emissions. While recent work seeks to address these scalability issues during pre-training, they are also prominent in fine-tuning, especially for long-sequence tasks like document classification. Our work therefore focuses on optimizing the computational cost of fine-tuning for document classification. We achieve this through complementary learning of both topic and language models in a unified framework, named TopicBERT, which significantly reduces the number of self-attention operations – the main performance bottleneck. Consequently, our model achieves a 1.4x (∼40%) speedup with a 40% reduction in CO2 emissions while retaining 99.9% performance over 5 datasets.
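The abstract describes combining a full-document topic representation with a language-model encoding of a shorter sequence, so the classifier sees document-level context without paying quadratic self-attention cost on the full text. A minimal toy sketch of that idea follows; all dimensions, the softmax stand-in for the topic model, and the random linear classifier are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

# Assumed toy dimensions; the actual TopicBERT sizes differ.
HIDDEN = 8    # contextual [CLS]-style embedding size (assumption)
TOPICS = 4    # topic-model latent dimension (assumption)
CLASSES = 3

rng = np.random.default_rng(0)

def topic_vector(doc_bow):
    """Toy stand-in for a neural topic model: project bag-of-words
    counts of the WHOLE document into a topic distribution."""
    W = rng.normal(size=(len(doc_bow), TOPICS))
    z = doc_bow @ W
    e = np.exp(z - z.max())
    return e / e.sum()

def classify(cls_embedding, doc_bow):
    """Concatenate the (short-sequence) language-model embedding with
    the full-document topic vector, then apply a linear classifier."""
    h = np.concatenate([cls_embedding, topic_vector(doc_bow)])
    W_out = rng.normal(size=(HIDDEN + TOPICS, CLASSES))
    return int(np.argmax(h @ W_out))

# A random stand-in for BERT's [CLS] embedding of a truncated document:
pred = classify(rng.normal(size=HIDDEN), np.array([2.0, 0.0, 1.0, 3.0, 1.0]))
print(pred in range(CLASSES))  # prints True
```

The self-attention savings come from encoding only a truncated or chunked sequence with the language model while the cheap bag-of-words topic vector carries document-level information.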
2019
Neural Architectures for Fine-Grained Propaganda Detection in News
Pankaj Gupta | Khushbu Saxena | Usama Yaseen | Thomas Runkler | Hinrich Schütze
Proceedings of the Second Workshop on Natural Language Processing for Internet Freedom: Censorship, Disinformation, and Propaganda
This paper describes the details and results of our system (MIC-CIS) in the fine-grained propaganda detection shared task 2019. To address the sentence-level (SLC) and fragment-level (FLC) propaganda detection tasks, we explore different neural architectures (e.g., CNN, LSTM-CRF and BERT) and extract linguistic (e.g., part-of-speech, named-entity, readability, sentiment, emotion), layout, and topical features. Specifically, we design multi-granularity and multi-tasking neural architectures to jointly perform sentence- and fragment-level propaganda detection. Additionally, we investigate different ensemble schemes, such as majority voting and relaxed voting, to boost overall system performance. Compared to the other participating systems, our submissions are ranked 3rd and 4th in the FLC and SLC tasks, respectively.
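The abstract mentions majority-voting and relax-voting ensemble schemes over the individual models' predictions. A minimal sketch of both follows; the label strings, the 0.3 threshold, and the exact tie-breaking/relaxation rules are illustrative assumptions, not the paper's specification.

```python
from collections import Counter

def majority_vote(predictions):
    """Majority voting over per-model labels for one sentence.
    Ties break toward the first-seen label (a simplification)."""
    return Counter(predictions).most_common(1)[0][0]

def relax_vote(predictions, positive="propaganda", threshold=0.3):
    """Relaxed voting (as sketched here): flag the positive class if at
    least `threshold` of the models predict it, trading precision for
    recall. The threshold value is an assumption."""
    share = sum(p == positive for p in predictions) / len(predictions)
    return positive if share >= threshold else "non-propaganda"

votes = ["propaganda", "non-propaganda", "non-propaganda"]
print(majority_vote(votes))  # prints "non-propaganda"
print(relax_vote(votes))     # prints "propaganda" (1/3 >= 0.3)
```

Relaxed voting is useful when the positive class is rare: a single confident model can still surface a candidate fragment for the ensemble.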
Co-authors
- Pankaj Gupta 2
- Thomas Runkler 2
- Hinrich Schütze 2
- Yatin Chaudhary 1
- Vivek Kulkarni 1