Is Graph Structure Necessary for Multi-hop Question Answering?

Nan Shao, Yiming Cui, Ting Liu, Shijin Wang, Guoping Hu

Abstract
Recently, modeling texts as graphs and introducing graph neural networks to process them has become a trend in many NLP research areas. In this paper, we investigate whether graph structure is necessary for textual multi-hop reasoning. Our analysis centers on HotpotQA. We construct a strong baseline model to establish that, with proper use of pre-trained models, graph structure may not be necessary for textual multi-hop reasoning. We point out that both the graph structure and the adjacency matrix are task-related prior knowledge, and that graph-attention can be considered a special case of self-attention. Experiments demonstrate that graph-attention, or even the entire graph structure, can be replaced by self-attention or Transformers.
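
The abstract's claim that graph-attention is a special case of self-attention follows from treating the adjacency matrix as an attention mask: masking out attention between unconnected nodes yields graph-attention, while an all-ones mask recovers plain self-attention. Below is a minimal numpy sketch of this equivalence; it is illustrative only, not the paper's implementation, and the function name, dimensions, and toy adjacency matrix are assumptions made for the example.

import numpy as np

def masked_self_attention(X, Wq, Wk, Wv, mask=None):
    # Single-head scaled dot-product self-attention. With mask=None this is
    # ordinary self-attention; passing a graph adjacency matrix as the mask
    # restricts each node to attending over its neighbours, i.e. graph-attention.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    if mask is not None:
        # Non-edges get -inf, so they receive zero weight after the softmax.
        scores = np.where(mask > 0, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
n, d = 4, 8                                   # 4 nodes (e.g. entity mentions), hidden size 8
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
adj = np.array([[1, 1, 0, 0],                 # toy adjacency matrix with self-loops
                [1, 1, 1, 0],
                [0, 1, 1, 1],
                [0, 0, 1, 1]])

graph_out = masked_self_attention(X, Wq, Wk, Wv, mask=adj)              # graph-attention
self_out  = masked_self_attention(X, Wq, Wk, Wv, mask=np.ones((n, n)))  # plain self-attention

With an all-ones mask the two calls coincide, which is the sense in which a Transformer's self-attention can subsume a hand-built graph: the adjacency pattern becomes prior knowledge encoded in the mask rather than a structural necessity.
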
Anthology ID:
2020.emnlp-main.583
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7187–7192
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2020.emnlp-main.583/
DOI:
10.18653/v1/2020.emnlp-main.583
Cite (ACL):
Nan Shao, Yiming Cui, Ting Liu, Shijin Wang, and Guoping Hu. 2020. Is Graph Structure Necessary for Multi-hop Question Answering?. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 7187–7192, Online. Association for Computational Linguistics.
Cite (Informal):
Is Graph Structure Necessary for Multi-hop Question Answering? (Shao et al., EMNLP 2020)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2020.emnlp-main.583.pdf
Video:
https://slideslive.com/38938772
Data
HotpotQA