Enhancing Transformer with Sememe Knowledge

Yuhui Zhang, Chenghao Yang, Zhengping Zhou, Zhiyuan Liu


Abstract
While large-scale pretraining has achieved great success in many NLP tasks, it has not been fully studied whether external linguistic knowledge can improve data-driven models. In this work, we introduce sememe knowledge into the Transformer and propose three sememe-enhanced Transformer models. Sememes are, by linguistic definition, the minimum semantic units of language and can effectively represent the implicit semantic meanings behind words. Our experiments demonstrate that introducing sememe knowledge into the Transformer consistently improves language modeling and downstream tasks. The adversarial test further demonstrates that sememe knowledge can substantially improve model robustness.
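As a rough illustration of how sememe knowledge might be injected at the input layer, the sketch below (PyTorch) averages the sememe embeddings associated with each token, as listed in a sememe knowledge base such as HowNet, and adds the result to the word embedding before the Transformer encoder. The module name, the word-to-sememe lookup table, and the simple sum fusion are illustrative assumptions only; the paper proposes three sememe-enhanced variants whose exact architectures are not reproduced here.

```python
# Minimal sketch of input-level sememe fusion; not the paper's exact method.
import torch
import torch.nn as nn


class SememeEnhancedEmbedding(nn.Module):
    def __init__(self, vocab_size, sememe_vocab_size, d_model, word2sememes):
        """
        word2sememes: LongTensor [vocab_size, max_sememes], padded with 0;
                      sememe id 0 is reserved for padding.
        """
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d_model)
        self.sememe_emb = nn.Embedding(sememe_vocab_size, d_model, padding_idx=0)
        # Fixed lookup table mapping each word id to its sememe ids.
        self.register_buffer("word2sememes", word2sememes)

    def forward(self, token_ids):
        # token_ids: [batch, seq_len]
        word_vec = self.word_emb(token_ids)                    # [B, T, D]
        sememe_ids = self.word2sememes[token_ids]              # [B, T, S]
        sememe_vec = self.sememe_emb(sememe_ids)               # [B, T, S, D]
        # Average only the non-padding sememe embeddings of each token.
        mask = (sememe_ids != 0).unsqueeze(-1).float()         # [B, T, S, 1]
        denom = mask.sum(dim=2).clamp(min=1.0)                 # [B, T, 1]
        sememe_avg = (sememe_vec * mask).sum(dim=2) / denom    # [B, T, D]
        # Sum fusion; the enhanced embedding is then fed to the Transformer.
        return word_vec + sememe_avg
```

The output of this module could be passed to a standard encoder (e.g., `nn.TransformerEncoder`) in place of the plain word embeddings; sum fusion is used here only because it keeps the embedding dimension unchanged.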
Anthology ID:
2020.repl4nlp-1.21
Volume:
Proceedings of the 5th Workshop on Representation Learning for NLP
Month:
July
Year:
2020
Address:
Online
Editors:
Spandana Gella, Johannes Welbl, Marek Rei, Fabio Petroni, Patrick Lewis, Emma Strubell, Minjoon Seo, Hannaneh Hajishirzi
Venue:
RepL4NLP
SIG:
SIGREP
Publisher:
Association for Computational Linguistics
Pages:
177–184
URL:
https://aclanthology.org/2020.repl4nlp-1.21
DOI:
10.18653/v1/2020.repl4nlp-1.21
Cite (ACL):
Yuhui Zhang, Chenghao Yang, Zhengping Zhou, and Zhiyuan Liu. 2020. Enhancing Transformer with Sememe Knowledge. In Proceedings of the 5th Workshop on Representation Learning for NLP, pages 177–184, Online. Association for Computational Linguistics.
Cite (Informal):
Enhancing Transformer with Sememe Knowledge (Zhang et al., RepL4NLP 2020)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2020.repl4nlp-1.21.pdf
Video:
http://slideslive.com/38929787