@inproceedings{dowlagar-mamidi-2020-multilingual,
    title = "Multilingual Pre-Trained Transformers and Convolutional {NN} Classification Models for Technical Domain Identification",
    author = "Dowlagar, Suman  and
      Mamidi, Radhika",
    editor = "Sharma, Dipti Misra  and
      Ekbal, Asif  and
      Arora, Karunesh  and
      Naskar, Sudip Kumar  and
      Ganguly, Dipankar  and
      L, Sobha  and
      Mamidi, Radhika  and
      Arora, Sunita  and
      Mishra, Pruthwik  and
      Mujadia, Vandan",
    booktitle = "Proceedings of the 17th International Conference on Natural Language Processing (ICON): TechDOfication 2020 Shared Task",
    month = dec,
    year = "2020",
    address = "Patna, India",
    publisher = "NLP Association of India (NLPAI)",
    url = "https://preview.aclanthology.org/ingest-emnlp/2020.icon-techdofication.4/",
    pages = "16--20",
    abstract = "In this paper, we present a transfer learning system to perform technical domain identification on multilingual text data. We have submitted two runs, one uses the transformer model BERT, and the other uses XLM-ROBERTa with the CNN model for text classification. These models allowed us to identify the domain of the given sentences for the ICON 2020 shared Task, TechDOfication: Technical Domain Identification. Our system ranked the best for the subtasks 1d, 1g for the given TechDOfication dataset."
}
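
The abstract describes feeding multilingual transformer representations (XLM-RoBERTa) into a CNN text classifier for domain identification. Below is a minimal sketch of that general architecture, assuming Hugging Face `transformers` and PyTorch; the filter sizes, filter count, and number of domain classes are illustrative assumptions, not values taken from the paper.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class TransformerCNNClassifier(nn.Module):
    """Contextual token embeddings from XLM-RoBERTa passed through parallel
    1-D convolutions, max-pooled over the sequence, then classified.
    Hyperparameters here are assumptions for illustration."""

    def __init__(self, num_classes, filter_sizes=(2, 3, 4), num_filters=128):
        super().__init__()
        self.encoder = AutoModel.from_pretrained("xlm-roberta-base")
        hidden = self.encoder.config.hidden_size  # 768 for the base model
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, num_filters, kernel_size=k) for k in filter_sizes
        )
        self.classifier = nn.Linear(num_filters * len(filter_sizes), num_classes)

    def forward(self, input_ids, attention_mask):
        # (batch, seq_len, hidden) contextual embeddings from the transformer
        embeddings = self.encoder(
            input_ids, attention_mask=attention_mask
        ).last_hidden_state
        x = embeddings.transpose(1, 2)  # Conv1d expects (batch, hidden, seq_len)
        # Convolve, apply ReLU, then max-pool over the sequence dimension
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.classifier(torch.cat(pooled, dim=1))


tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = TransformerCNNClassifier(num_classes=7)  # e.g. one label per technical domain
batch = tokenizer(
    ["The compiler emits intermediate bytecode.", "Dosage depends on body mass."],
    padding=True,
    return_tensors="pt",
)
logits = model(batch["input_ids"], batch["attention_mask"])  # (2, num_classes)
```

The parallel convolutions with different kernel sizes act as n-gram feature detectors over the transformer's contextual embeddings, a common pairing for sentence-level classification.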