Recognition Method of Important Words in Korean Text based on Reinforcement Learning

Yang Feiyang, Zhao Yahui, Cui Rongyi


Abstract
Constructing a Korean corpus requires manual labeling that is time-consuming and labor-intensive, and as a low-resource minority language Korean has difficulty integrating existing resources, so progress in Korean information processing has been slow. From the perspective of representation learning, this work combines reinforcement learning with traditional deep learning methods, using Korean text classification performance as a benchmark to study how to extract the important words in a sentence. A structured model, Information Distilled of Korean (IDK), is proposed. The model inspects the words of a Korean sentence, retaining important words and deleting unimportant ones, thereby casting sentence reconstruction as a sequential decision problem that can be solved with the policy gradient method from reinforcement learning. Experimental results show that the model can identify important Korean words for representation learning without manual annotation, and that it also improves Korean text classification over traditional methods.
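The abstract frames word retention as a sequential keep/delete decision trained by policy gradient. The following is a minimal sketch of that idea, not the paper's IDK model: a per-word Bernoulli policy over toy embeddings is trained with REINFORCE, where the reward stands in for the downstream classifier's score (here, agreement with a hidden "importance" label defined by the first embedding feature). All names, dimensions, and the reward definition are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, SENT_LEN = 4, 8


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def make_sentence():
    """Toy word embeddings; a word is 'important' iff its first feature is positive."""
    emb = rng.normal(size=(SENT_LEN, DIM))
    target = (emb[:, 0] > 0).astype(int)
    return emb, target


class KeepDeletePolicy:
    """Per-word Bernoulli keep/delete policy trained with REINFORCE."""

    def __init__(self, dim, lr=0.5):
        self.w = np.zeros(dim)
        self.lr = lr

    def probs(self, emb):
        # Probability of keeping each word in the sentence.
        return sigmoid(emb @ self.w)

    def update(self, emb, actions, probs, reward):
        # Score function of the Bernoulli policy: grad log pi = (a - p) * x.
        grad = ((actions - probs)[:, None] * emb).sum(axis=0)
        self.w += self.lr * reward * grad


def accuracy(policy, n=200):
    """Fraction of words whose keep/delete choice matches the hidden label."""
    correct, total = 0, 0
    for _ in range(n):
        emb, target = make_sentence()
        pred = (policy.probs(emb) > 0.5).astype(int)
        correct += (pred == target).sum()
        total += SENT_LEN
    return correct / total


policy = KeepDeletePolicy(DIM)
for _ in range(2000):
    emb, target = make_sentence()
    p = policy.probs(emb)
    actions = (rng.random(SENT_LEN) < p).astype(int)
    # Centered reward: proxy for how well the distilled sentence serves
    # the downstream task (here, agreement with the hidden labels).
    reward = (actions == target).mean() - 0.5
    policy.update(emb, actions, p, reward)
```

In the paper's setting the reward would instead come from the classification loss of the reconstructed sentence, so the policy learns which words matter without any word-level annotation; this toy version only shows the policy-gradient mechanics.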
Anthology ID:
2020.ccl-1.94
Volume:
Proceedings of the 19th Chinese National Conference on Computational Linguistics
Month:
October
Year:
2020
Address:
Haikou, China
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
1017–1025
Language:
English
URL:
https://aclanthology.org/2020.ccl-1.94
Cite (ACL):
Yang Feiyang, Zhao Yahui, and Cui Rongyi. 2020. Recognition Method of Important Words in Korean Text based on Reinforcement Learning. In Proceedings of the 19th Chinese National Conference on Computational Linguistics, pages 1017–1025, Haikou, China. Chinese Information Processing Society of China.
Cite (Informal):
Recognition Method of Important Words in Korean Text based on Reinforcement Learning (Feiyang et al., CCL 2020)
PDF:
https://preview.aclanthology.org/update-css-js/2020.ccl-1.94.pdf