Unlocking Large Audio-Language Models for Interactive Language Learning

Hongfu Liu, Zhouying Cui, Xiangming Gu, Ye Wang


Abstract
Achieving pronunciation proficiency in a second language (L2) remains a challenge, despite the development of Computer-Assisted Pronunciation Training (CAPT) systems. Traditional CAPT systems often provide unintuitive feedback that lacks actionable guidance, limiting their effectiveness. Recent advancements in audio-language models (ALMs) offer the potential to enhance these systems by providing more user-friendly feedback. In this work, we investigate ALMs for chat-based pronunciation training by introducing L2-Arctic-plus, an English dataset with detailed error explanations and actionable suggestions for improvement. We benchmark cascaded ASR+LLM pipelines and existing ALMs on this dataset, specifically on detecting mispronunciations and generating actionable feedback. To improve performance, we further propose to instruction-tune ALMs on L2-Arctic-plus. Experimental results demonstrate that our instruction-tuned models significantly outperform existing baselines on mispronunciation detection and suggestion generation under both objective and human evaluation, highlighting the value of the proposed dataset.
Anthology ID:
2026.findings-eacl.190
Volume:
Findings of the Association for Computational Linguistics: EACL 2026
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3667–3690
URL:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.190/
Cite (ACL):
Hongfu Liu, Zhouying Cui, Xiangming Gu, and Ye Wang. 2026. Unlocking Large Audio-Language Models for Interactive Language Learning. In Findings of the Association for Computational Linguistics: EACL 2026, pages 3667–3690, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Unlocking Large Audio-Language Models for Interactive Language Learning (Liu et al., Findings 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.190.pdf
Checklist:
2026.findings-eacl.190.checklist.pdf