Going “Deeper”: Structured Sememe Prediction via Transformer with Tree Attention

Yining Ye, Fanchao Qi, Zhiyuan Liu, Maosong Sun


Abstract
Sememe knowledge bases (SKBs), which annotate words with the smallest semantic units (i.e., sememes), have proven beneficial to many NLP tasks. However, building an SKB is very time-consuming and labor-intensive, so some studies have tried to automate the building process by predicting sememes for unannotated words. All existing sememe prediction studies, though, ignore the hierarchical structures of sememes, which are important in the sememe-based semantic description system. In this work, we tackle the structured sememe prediction problem for the first time, aiming to predict a sememe tree with hierarchical structure rather than a flat set of sememes. We design a sememe tree generation model based on a Transformer with an adjusted attention mechanism, which outperforms the baselines in our experiments. We also conduct a series of quantitative and qualitative analyses of our model's effectiveness. All the code and data of this paper are available at https://github.com/thunlp/STG.
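
The abstract does not spell out how the attention mechanism is adjusted; the authors' actual implementation is in the thunlp/STG repository linked above. One plausible reading of "tree attention" is to restrict each node of the partially generated tree to attend only to the nodes on its ancestor path. The following is a minimal, hypothetical sketch under that assumption, not the released STG code:

    import torch

    def ancestor_attention_mask(parents):
        """Boolean attention mask for a partially generated sememe tree.

        parents[i] is the index of node i's parent (-1 for the root).
        mask[i, j] is True iff node j lies on the path from the root to
        node i (including i itself), so each node attends only to its
        ancestors. Hypothetical sketch, not the authors' implementation.
        """
        n = len(parents)
        mask = torch.zeros(n, n, dtype=torch.bool)
        for i in range(n):
            j = i
            while j != -1:       # walk up the ancestor chain to the root
                mask[i, j] = True
                j = parents[j]
        return mask

    # Example tree: root 0 with children 1 and 2; node 3 is a child of 2.
    parents = [-1, 0, 0, 2]
    mask = ancestor_attention_mask(parents)
    # Node 3 attends to itself, its parent 2, and the root, but not to 1:
    assert mask[3].tolist() == [True, False, True, True]

Such a mask would typically be applied to the raw attention scores before the softmax, e.g. scores.masked_fill(~mask, float("-inf")), so that masked positions receive zero attention weight.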
Anthology ID:
2022.findings-acl.12
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
128–138
URL:
https://aclanthology.org/2022.findings-acl.12
DOI:
10.18653/v1/2022.findings-acl.12
Cite (ACL):
Yining Ye, Fanchao Qi, Zhiyuan Liu, and Maosong Sun. 2022. Going “Deeper”: Structured Sememe Prediction via Transformer with Tree Attention. In Findings of the Association for Computational Linguistics: ACL 2022, pages 128–138, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Going “Deeper”: Structured Sememe Prediction via Transformer with Tree Attention (Ye et al., Findings 2022)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2022.findings-acl.12.pdf
Software:
2022.findings-acl.12.software.zip
Code:
thunlp/stg