@inproceedings{kodali-manukonda-2025-bytesizedllm,
    title = "byte{S}ized{LLM}@{D}ravidian{L}ang{T}ech 2025: Fake News Detection in {D}ravidian Languages Using Transliteration-Aware {XLM}-{R}o{BERT}a and Attention-{B}i{LSTM}",
    author = "Kodali, Rohith Gowtham  and
      Manukonda, Durga Prasad",
    editor = "Chakravarthi, Bharathi Raja  and
      Priyadharshini, Ruba  and
      Madasamy, Anand Kumar  and
      Thavareesan, Sajeetha  and
      Sherly, Elizabeth  and
      Rajiakodi, Saranya  and
      Palani, Balasubramanian  and
      Subramanian, Malliga  and
      Cn, Subalalitha  and
      Chinnappa, Dhivya",
    booktitle = "Proceedings of the Fifth Workshop on Speech, Vision, and Language Technologies for Dravidian Languages",
    month = may,
    year = "2025",
    address = "Acoma, The Albuquerque Convention Center, Albuquerque, New Mexico",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2025.dravidianlangtech-1.11/",
    doi = "10.18653/v1/2025.dravidianlangtech-1.11",
    pages = "62--67",
    ISBN = "979-8-89176-228-2",
    abstract = "This research introduces an innovative Attention BiLSTM-XLM-RoBERTa model for tackling the challenge of fake news detection in Malayalam datasets. By fine-tuning XLM-RoBERTa with Masked Language Modeling (MLM) on transliteration-aware data, the model effectively bridges linguistic and script diversity, seamlessly integrating native, Romanized, and mixed-script text. Although most of the training data is monolingual, the proposed approach demonstrates robust performance in handling diverse script variations. Achieving a macro F1-score of 0.5775 and securing top rankings in the shared task, this work highlights the potential of multilingual models in addressing resource-scarce language challenges and sets a foundation for future advancements in fake news detection."
}