TEAM: A multitask learning based Taxonomy Expansion approach for Attach and Merge

Bornali Phukon, Anasua Mitra, Ranbir Sanasam, Priyankoo Sarmah


Abstract
Taxonomy expansion is a crucial task. Automatic taxonomy expansion approaches typically involve two types of operations: attach and merge. In a taxonomy like WordNet, both merge and attach are integral parts of the expansion process, but the majority of studies consider them separately. This paper proposes a novel multitask learning-based deep learning method, Taxonomy Expansion with Attach and Merge (TEAM), that performs both the merge and attach operations. To the best of our knowledge, this is the first study that integrates both merge and attach operations in a single model. The proposed models have been evaluated on three separate WordNet taxonomies, viz., Assamese, Bangla, and Hindi. Across the various experimental setups, TEAM outperforms its state-of-the-art counterparts for the attach operation and also provides highly encouraging performance for the merge operation.
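The abstract does not detail TEAM's architecture, but the general multitask framing it describes (one shared representation serving separate attach and merge objectives) can be illustrated with a minimal, hypothetical sketch. The shared encoder, input concatenation, dimensions, and binary labels below are illustrative assumptions and do not represent the paper's actual model.

import torch
import torch.nn as nn

class MultiTaskTaxonomyExpander(nn.Module):
    """Hypothetical two-head multitask scorer: a shared encoder feeds an
    'attach' head (score a candidate parent for a new concept) and a
    'merge' head (score whether the new concept duplicates an existing node)."""

    def __init__(self, emb_dim: int = 300, hidden_dim: int = 128):
        super().__init__()
        # Shared layer over the concatenated (query, candidate) embeddings.
        self.shared = nn.Sequential(
            nn.Linear(2 * emb_dim, hidden_dim),
            nn.ReLU(),
        )
        # Task-specific heads each emit one logit.
        self.attach_head = nn.Linear(hidden_dim, 1)
        self.merge_head = nn.Linear(hidden_dim, 1)

    def forward(self, query_emb, candidate_emb):
        h = self.shared(torch.cat([query_emb, candidate_emb], dim=-1))
        return self.attach_head(h).squeeze(-1), self.merge_head(h).squeeze(-1)

# Joint training signal: sum of per-task binary cross-entropy losses.
model = MultiTaskTaxonomyExpander()
bce = nn.BCEWithLogitsLoss()
query, candidate = torch.randn(4, 300), torch.randn(4, 300)   # toy embeddings
attach_labels, merge_labels = torch.ones(4), torch.zeros(4)   # toy labels
attach_logits, merge_logits = model(query, candidate)
loss = bce(attach_logits, attach_labels) + bce(merge_logits, merge_labels)
loss.backward()

Sharing the encoder lets supervision from one operation regularize the other, which is the usual motivation for training attach and merge jointly rather than as two independent models.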
Anthology ID:
2022.findings-naacl.28
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
366–378
URL:
https://aclanthology.org/2022.findings-naacl.28
DOI:
10.18653/v1/2022.findings-naacl.28
Cite (ACL):
Bornali Phukon, Anasua Mitra, Ranbir Sanasam, and Priyankoo Sarmah. 2022. TEAM: A multitask learning based Taxonomy Expansion approach for Attach and Merge. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 366–378, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
TEAM: A multitask learning based Taxonomy Expansion approach for Attach and Merge (Phukon et al., Findings 2022)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2022.findings-naacl.28.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2022.findings-naacl.28.mp4