Semantic Role Labeling from Chinese Speech via End-to-End Learning

Huiyao Chen, Xinxin Li, Meishan Zhang, Min Zhang


Abstract
Semantic Role Labeling (SRL), crucial for understanding semantic relationships in sentences, has traditionally focused on text-based input. However, the increasing use of voice assistants and the need for hands-free interaction have highlighted the importance of SRL from speech. SRL from speech can be accomplished via a two-step pipeline: transcribing speech to text with Automatic Speech Recognition (ASR) and then applying text-based SRL; however, this pipeline suffers from error propagation and discards useful acoustic features. Addressing these challenges, we present the first end-to-end approach for SRL from speech, integrating ASR and SRL in a joint-learning framework with a focus on the Chinese language. By employing a Straight-Through Gumbel-Softmax module to connect the ASR and SRL models, the framework enables gradient back-propagation and joint optimization, enhancing robustness and effectiveness. Experiments on the Chinese Proposition Bank 1.0 (CPB1.0) and a newly annotated dataset, AS-SRL, based on AISHELL-1 demonstrate the superiority of the end-to-end model over traditional pipelines, with significantly improved performance.
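The Straight-Through Gumbel-Softmax trick mentioned in the abstract can be sketched as follows. This is a minimal NumPy illustration of the general technique (sampling a hard one-hot token while keeping a soft distribution for gradient flow), not the paper's actual implementation; the function name and simplified single-step setup are our own assumptions.

```python
import numpy as np

def gumbel_softmax_st(logits, tau=1.0, rng=None):
    """Straight-Through Gumbel-Softmax sample (NumPy sketch, no autograd).

    Returns (y_hard, y_soft): y_hard is the discrete one-hot sample that a
    downstream model (e.g. an SRL module) would consume in the forward pass;
    in an autograd framework the gradient would flow through y_soft via
    y_hard + y_soft - stop_gradient(y_soft).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Sample Gumbel(0, 1) noise: g = -log(-log(U)), U ~ Uniform(0, 1).
    g = -np.log(-np.log(rng.uniform(1e-10, 1.0, size=logits.shape)))
    # Temperature-scaled softmax over perturbed logits (numerically stable).
    y = (logits + g) / tau
    y_soft = np.exp(y - y.max(axis=-1, keepdims=True))
    y_soft /= y_soft.sum(axis=-1, keepdims=True)
    # Hard one-hot of the argmax: the discrete token passed downstream.
    y_hard = np.zeros_like(y_soft)
    y_hard[np.arange(y_soft.shape[0]), y_soft.argmax(axis=-1)] = 1.0
    return y_hard, y_soft
```

Lowering `tau` makes `y_soft` approach the one-hot `y_hard`, which is why the straight-through estimator's bias shrinks as the temperature is annealed during joint training.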
Anthology ID:
2024.findings-acl.527
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8898–8911
URL:
https://aclanthology.org/2024.findings-acl.527
DOI:
10.18653/v1/2024.findings-acl.527
Cite (ACL):
Huiyao Chen, Xinxin Li, Meishan Zhang, and Min Zhang. 2024. Semantic Role Labeling from Chinese Speech via End-to-End Learning. In Findings of the Association for Computational Linguistics: ACL 2024, pages 8898–8911, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Semantic Role Labeling from Chinese Speech via End-to-End Learning (Chen et al., Findings 2024)
PDF:
https://preview.aclanthology.org/autopr/2024.findings-acl.527.pdf