Quantum-Enhanced Gated Recurrent Units for Part-of-Speech Tagging

Ashutosh Rai, Shyambabu Pandey, Partha Pakray


Abstract
Deep learning models for Natural Language Processing (NLP) tasks, such as Part-of-Speech (POS) tagging, usually have significant parameter counts that make them costly to train and deploy. Quantum Machine Learning (QML) offers a potential approach for building more parameter-efficient models. This paper proposes a hybrid quantum-classical gated recurrent unit model for POS tagging in code-mixed social media text. By integrating a quantum layer into the recurrent framework, our model achieves accuracy comparable to that of the classical baseline while requiring fewer parameters. Although the reduction in parameter count is modest in our setup, the approach is promising when scaled to deeper architectures. These results suggest that hybrid models can offer a resource-efficient alternative for NLP tasks.
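To make the idea of "integrating a quantum layer into the recurrent framework" concrete, the following is a minimal illustrative sketch, not the paper's implementation. It places a simulated variational quantum layer on the GRU's candidate activation; all names, the gate placement, and the quantum circuit (single-qubit angle encoding via RX followed by a trainable RY, no entanglement, so the Pauli-Z expectation has the closed form cos(theta) * cos(x)) are assumptions made for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def quantum_layer(x, theta):
    # Simulated one-qubit-per-feature variational layer (illustrative
    # assumption, no entanglement): each feature x_i is angle-encoded
    # with RX(x_i), then a trainable RY(theta_i) is applied. The
    # Pauli-Z expectation then has the closed form cos(theta_i)*cos(x_i).
    return np.cos(theta) * np.cos(x)

def hybrid_gru_step(x, h, params):
    # One step of a hypothetical hybrid GRU cell: classical update and
    # reset gates, with the candidate pre-activation routed through the
    # simulated quantum layer before the tanh nonlinearity.
    Wz, Wr, Wh, theta = params
    xh = np.concatenate([x, h])
    z = sigmoid(Wz @ xh)                      # update gate
    r = sigmoid(Wr @ xh)                      # reset gate
    pre = Wh @ np.concatenate([x, r * h])     # candidate pre-activation
    h_tilde = np.tanh(quantum_layer(pre, theta))
    return (1.0 - z) * h + z * h_tilde

# Example usage with small random weights (hypothetical dimensions).
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
params = (
    rng.normal(scale=0.1, size=(d_h, d_in + d_h)),  # Wz
    rng.normal(scale=0.1, size=(d_h, d_in + d_h)),  # Wr
    rng.normal(scale=0.1, size=(d_h, d_in + d_h)),  # Wh
    rng.normal(size=d_h),                           # theta (quantum params)
)
h1 = hybrid_gru_step(rng.normal(size=d_in), np.zeros(d_h), params)
```

The quantum layer here contributes only d_h trainable angles in place of a dense nonlinearity, which is the kind of parameter saving the abstract alludes to; a real hybrid model would execute a parameterized circuit (e.g. with entangling layers) on a simulator or quantum device rather than use this closed-form shortcut.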
Anthology ID:
2025.quantumnlp-1.5
Volume:
Proceedings of the QuantumNLP: Integrating Quantum Computing with Natural Language Processing
Month:
November
Year:
2025
Address:
Mumbai, India (Hybrid)
Editors:
Santanu Pal, Partha Pakray, Priyanka Jain, Asif Ekbal, Sivaji Bandyopadhyay
Venues:
QuantumNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
26–32
URL:
https://preview.aclanthology.org/ingest-ijcnlp-aacl/2025.quantumnlp-1.5/
Cite (ACL):
Ashutosh Rai, Shyambabu Pandey, and Partha Pakray. 2025. Quantum-Enhanced Gated Recurrent Units for Part-of-Speech Tagging. In Proceedings of the QuantumNLP: Integrating Quantum Computing with Natural Language Processing, pages 26–32, Mumbai, India (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Quantum-Enhanced Gated Recurrent Units for Part-of-Speech Tagging (Rai et al., QuantumNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-ijcnlp-aacl/2025.quantumnlp-1.5.pdf