Abstract
We propose a simple yet effective approach for improving Korean word representations using additional linguistic annotation (i.e. Hanja). We employ cross-lingual transfer learning in training word representations by leveraging the fact that Hanja is closely related to Chinese. We evaluate the intrinsic quality of representations learned through our approach using the word analogy and similarity tests. In addition, we demonstrate their effectiveness on several downstream tasks, including a novel Korean news headline generation task.

- Anthology ID: D19-1358
- Volume: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
- Month: November
- Year: 2019
- Address: Hong Kong, China
- Editors: Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
- Venues: EMNLP | IJCNLP
- SIG: SIGDAT
- Publisher: Association for Computational Linguistics
- Pages: 3528–3533
- URL: https://aclanthology.org/D19-1358
- DOI: 10.18653/v1/D19-1358
- Cite (ACL): Kang Min Yoo, Taeuk Kim, and Sang-goo Lee. 2019. Don’t Just Scratch the Surface: Enhancing Word Representations for Korean with Hanja. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 3528–3533, Hong Kong, China. Association for Computational Linguistics.
- Cite (Informal): Don’t Just Scratch the Surface: Enhancing Word Representations for Korean with Hanja (Yoo et al., EMNLP-IJCNLP 2019)
- PDF: https://preview.aclanthology.org/landing_page/D19-1358.pdf
- Code: shin285/KOMORAN + additional community code