Zhongqiang Zhang


2024

NER-guided Comprehensive Hierarchy-aware Prompt Tuning for Hierarchical Text Classification
Fuhan Cai | Duo Liu | Zhongqiang Zhang | Ge Liu | Xiaozhe Yang | Xiangzhong Fang
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)

Hierarchical text classification (HTC) is a significant but challenging task in natural language processing (NLP) due to its complex taxonomic label hierarchy. Recently, a number of approaches have applied prompt learning to HTC and demonstrated impressive efficacy. The majority of prompt-based studies emphasize global hierarchical features by employing graph networks to represent the hierarchical structure as a whole, with limited research on maintaining path consistency within the internal hierarchy of the structure. In this paper, we formulate prompt-based HTC as a named entity recognition (NER) task and introduce conditional random fields (CRF) and Global Pointer to establish hierarchical dependencies. Specifically, we approach single- and multi-path HTC as flat and nested entity recognition tasks and model them using span- and token-based methods. By narrowing the gap between HTC and NER, we maintain the consistency of internal paths within the hierarchical structure in a simple and effective way. Extensive experiments on three public datasets show that our method achieves state-of-the-art (SoTA) performance.
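To illustrate the general idea of enforcing path consistency with CRF-style transition constraints (this is a minimal sketch, not the authors' implementation; the names emissions, transition_mask, and the toy taxonomy are illustrative assumptions), the snippet below runs Viterbi decoding over hierarchy levels so that only parent-to-child transitions allowed by the taxonomy can appear in the predicted label path.

import numpy as np

def viterbi_path(emissions, transition_mask):
    """emissions: (L, K) scores for K labels at each of L hierarchy levels.
    transition_mask: (K, K) matrix with 0 where a parent->child transition is
    allowed by the taxonomy and -inf where it is not, so inconsistent paths
    can never be selected."""
    L, K = emissions.shape
    score = emissions[0].copy()
    backptr = np.zeros((L, K), dtype=int)
    for t in range(1, L):
        # score of reaching label j at level t from label i at level t-1
        cand = score[:, None] + transition_mask + emissions[t][None, :]
        backptr[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    # backtrack the best taxonomy-consistent label path
    path = [int(score.argmax())]
    for t in range(L - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]

# Toy taxonomy with 4 labels: 0 -> {1, 2}, 1 -> {3}
K = 4
mask = np.full((K, K), -np.inf)
for parent, child in [(0, 1), (0, 2), (1, 3)]:
    mask[parent, child] = 0.0

emissions = np.random.randn(3, K)     # e.g. verbalizer scores from a prompt template
print(viterbi_path(emissions, mask))  # always a valid root-to-leaf path, here [0, 1, 3]

In the same spirit, a span-based scorer such as Global Pointer would score candidate label spans jointly, which is how the multi-path (nested) case can be handled; the sketch above only covers the single-path (flat) case.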