GAttention: Gated Attention for the Detection of Abusive Language
Horacio Jarquín Vásquez, Hugo Jair Escalante, Manuel Montes, Mario Ezra Aragon
Abstract
Abusive language online creates toxic environments and exacerbates social tensions, underscoring the need for robust NLP models that can interpret nuanced linguistic cues. This paper introduces GAttention, a novel Gated Attention mechanism that combines the strengths of Contextual attention and Self-attention to address the limitations of existing attention models in text classification. GAttention capitalizes on local and global query vectors by integrating the internal relationships within a sequence (Self-attention) and the global relationships among distinct sequences (Contextual attention). This combination allows for a more nuanced understanding and processing of sequence elements, which is particularly beneficial in context-sensitive text classification tasks such as abusive language detection. By applying this mechanism to transformer-based encoder models, we show how it enhances the model’s ability to discern the subtle nuances and contextual clues essential for identifying abusive language, a challenging and increasingly relevant NLP task.
- Anthology ID:
- 2025.findings-emnlp.1105
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2025
- Month:
- November
- Year:
- 2025
- Address:
- Suzhou, China
- Editors:
- Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 20314–20329
- URL:
- https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1105/
- DOI:
- 10.18653/v1/2025.findings-emnlp.1105
- Cite (ACL):
- Horacio Jarquín Vásquez, Hugo Jair Escalante, Manuel Montes, and Mario Ezra Aragon. 2025. GAttention: Gated Attention for the Detection of Abusive Language. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 20314–20329, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal):
- GAttention: Gated Attention for the Detection of Abusive Language (Vásquez et al., Findings 2025)
- PDF:
- https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1105.pdf
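The paper's exact formulation is given in the PDF above; as a rough illustration of the idea described in the abstract, blending per-token self-attention (local queries) with a contextual attention driven by a global query vector through a learned gate, here is a minimal NumPy sketch. All names here (`gated_attention`, the single global vector `c`, the sigmoid gate) are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_attention(X, c, Wq, Wk, Wv, wg, bg):
    """Hypothetical sketch of a gated attention layer.

    X:  (seq_len, d) token representations (one sequence)
    c:  (d,)         global/contextual query vector, shared across sequences
    Wq, Wk, Wv: (d, d) projection matrices
    wg: (d,), bg: scalar  gate parameters
    """
    d = X.shape[1]
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Self-attention: each token queries the whole sequence (local view).
    self_out = softmax(Q @ K.T / np.sqrt(d)) @ V          # (seq_len, d)
    # Contextual attention: one global query scores every token (global view).
    ctx_weights = softmax((c @ Wq) @ K.T / np.sqrt(d))    # (seq_len,)
    ctx_out = ctx_weights @ V                             # (d,)
    # Learned sigmoid gate blends the local and global views per token.
    g = 1.0 / (1.0 + np.exp(-(self_out @ wg + bg)))       # (seq_len,)
    return g[:, None] * self_out + (1 - g)[:, None] * ctx_out
```

The gate lets each token decide how much to rely on sequence-internal structure versus the corpus-level context vector, which is the intuition the abstract gives for context-sensitive tasks like abusive language detection.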