Knowledge-Augmented Methods for Natural Language Processing

Chenguang Zhu, Yichong Xu, Xiang Ren, Bill Yuchen Lin, Meng Jiang, Wenhao Yu


Abstract
Knowledge-augmented methods have been a rising trend in natural language processing (NLP), especially since the advent of large-scale pre-trained models. NLP models that attend to knowledge can i) access an unlimited amount of external information; ii) delegate the task of storing knowledge from their parameter space to external knowledge sources; iii) obtain up-to-date information; and iv) make their predictions more explainable via the selected knowledge. In this tutorial, we will introduce the key steps in integrating knowledge into NLP, including knowledge grounding from text, knowledge representation, and knowledge fusion. In addition, we will introduce recent state-of-the-art applications that fuse knowledge into language understanding, language generation, and commonsense reasoning.
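As a minimal illustration of the three steps named above (grounding, representation, fusion), the sketch below retrieves ConceptNet-style triples from a toy in-memory knowledge base, verbalizes them as text, and prepends them to the model input. The knowledge base, the string-match grounding, and the [SEP]-style fusion are illustrative assumptions for this sketch, not the tutorial's own implementation.

```python
# Minimal sketch of a knowledge-augmentation pipeline:
# (i) ground the input text to entries in an external knowledge source,
# (ii) represent the retrieved knowledge as text, and
# (iii) fuse it with the original model input.
from typing import List, Tuple

# Toy stand-in for a knowledge source: (head, relation, tail) triples.
KNOWLEDGE_BASE: List[Tuple[str, str, str]] = [
    ("guitar", "IsA", "musical instrument"),
    ("guitar", "UsedFor", "playing music"),
    ("piano", "IsA", "musical instrument"),
]

def ground(text: str) -> List[Tuple[str, str, str]]:
    """Knowledge grounding: keep triples whose head entity appears in the text."""
    tokens = set(text.lower().replace("?", "").split())
    return [t for t in KNOWLEDGE_BASE if t[0] in tokens]

def represent(triples: List[Tuple[str, str, str]]) -> List[str]:
    """Knowledge representation: verbalize each triple as a short sentence."""
    return [f"{head} {rel} {tail}." for head, rel, tail in triples]

def fuse(text: str, facts: List[str]) -> str:
    """Knowledge fusion: prepend verbalized facts to the original input."""
    return " ".join(facts) + " [SEP] " + text if facts else text

question = "What do you strum when you play a guitar?"
print(fuse(question, represent(ground(question))))
# -> "guitar IsA musical instrument. guitar UsedFor playing music. [SEP] What do ..."
```

In practice the retrieval step would query a real knowledge graph or text corpus and the fused input would be fed to a pre-trained model, but the three-stage structure is the same.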
Anthology ID:
2022.acl-tutorials.3
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Luciana Benotti, Naoaki Okazaki, Yves Scherrer, Marcos Zampieri
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
12–20
URL:
https://aclanthology.org/2022.acl-tutorials.3
DOI:
10.18653/v1/2022.acl-tutorials.3
Cite (ACL):
Chenguang Zhu, Yichong Xu, Xiang Ren, Bill Yuchen Lin, Meng Jiang, and Wenhao Yu. 2022. Knowledge-Augmented Methods for Natural Language Processing. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts, pages 12–20, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Knowledge-Augmented Methods for Natural Language Processing (Zhu et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-tutorials.3.pdf
Data
CommonGen, CommonsenseQA, ConceptNet, RiddleSense