基于BERT的意图分类与槽填充联合方法(Joint Method of Intention Classification and Slot Filling Based on BERT)

Jun Qin (覃俊), Tianyu Ma (马天宇), Jing Liu (刘晶), Jun Tie (帖军), Qi Hou (后琦)


Abstract
Spoken language understanding is an important part of natural language processing, and intent classification and slot filling are its two basic subtasks. Recent research shows that learning the two tasks jointly allows them to reinforce each other. This paper proposes a BERT-based joint model for intent classification and slot filling, in which an association network establishes a direct connection between the two tasks so that they can share information, thereby improving performance on both. The model introduces BERT to enrich the semantic representation of word vectors, effectively addressing the poor generalization ability of current joint models caused by small-scale training data. Experimental results show that the model effectively improves the performance of both intent classification and slot filling.
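The abstract describes a shared BERT encoder feeding two heads, with an association network passing information between the intent and slot tasks. As a rough numpy sketch of that idea (all dimensions, the random stand-in for BERT's token embeddings, and the simple concatenation used here in place of the paper's actual association network are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes; the paper does not specify these here.
seq_len, hidden, n_intents, n_slots = 8, 16, 4, 6

# Stand-in for BERT's contextual token representations: (seq_len, hidden).
# A real model would obtain these from a pretrained BERT encoder.
H = rng.standard_normal((seq_len, hidden))

# Intent head: classify from a [CLS]-style pooled (first-token) vector.
W_intent = rng.standard_normal((hidden, n_intents))
intent_logits = H[0] @ W_intent                       # (n_intents,)

# "Association" (simplified): broadcast the intent distribution to every
# token so the slot head can condition on the predicted intent.
intent_probs = np.exp(intent_logits - intent_logits.max())
intent_probs /= intent_probs.sum()
H_joint = np.concatenate(
    [H, np.tile(intent_probs, (seq_len, 1))], axis=1  # (seq_len, hidden + n_intents)
)

# Slot head: per-token classification over slot labels.
W_slot = rng.standard_normal((hidden + n_intents, n_slots))
slot_logits = H_joint @ W_slot                        # (seq_len, n_slots)

print(intent_logits.shape, slot_logits.shape)
```

In a trained model both heads would share the encoder's loss during joint training, which is where the mutual reinforcement the abstract mentions comes from; this sketch only shows the forward-pass wiring.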
Anthology ID:
2021.ccl-1.12
Volume:
Proceedings of the 20th Chinese National Conference on Computational Linguistics
Month:
August
Year:
2021
Address:
Huhhot, China
Editors:
Sheng Li (李生), Maosong Sun (孙茂松), Yang Liu (刘洋), Hua Wu (吴华), Kang Liu (刘康), Wanxiang Che (车万翔), Shizhu He (何世柱), Gaoqi Rao (饶高琦)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
121–129
Language:
Chinese
URL:
https://aclanthology.org/2021.ccl-1.12
Cite (ACL):
Jun Qin, Tianyu Ma, Jing Liu, Jun Tie, and Qi Hou. 2021. 基于BERT的意图分类与槽填充联合方法(Joint Method of Intention Classification and Slot Filling Based on BERT). In Proceedings of the 20th Chinese National Conference on Computational Linguistics, pages 121–129, Huhhot, China. Chinese Information Processing Society of China.
Cite (Informal):
基于BERT的意图分类与槽填充联合方法(Joint Method of Intention Classification and Slot Filling Based on BERT) (Qin et al., CCL 2021)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2021.ccl-1.12.pdf
Data
SNIPS