Jeff Da at COIN - Shared Task: BIG MOOD: Relating Transformers to Explicit Commonsense Knowledge

Jeff Da


Abstract
We introduce a simple yet effective method of integrating contextual embeddings with commonsense graph embeddings, dubbed BERT Infused Graphs: Matching Over Other embeDdings (BIG MOOD). First, we introduce a preprocessing method that improves the speed of querying knowledge bases. Then, we develop a method of creating knowledge embeddings from each knowledge base. We introduce a method of aligning tokens between two misaligned tokenization schemes. Finally, we contribute a method of contextualizing BERT after combining it with knowledge base embeddings. We also show BERT's tendency to correct lower-accuracy question types. Our model achieves higher accuracy than BERT, places fifth on the official leaderboard of the shared task, and achieves the highest score without any additional language model pretraining.
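The paper's code is not shown here, but the token-alignment step the abstract mentions is a common pattern when pairing BERT with externally tokenized resources. Below is a minimal sketch, assuming BERT-style WordPiece tokenization with "##" continuation markers; the function name and example inputs are hypothetical illustrations, not taken from the paper.

# Minimal sketch (not the paper's code): map each whitespace token
# to the half-open span of WordPiece subtoken indices that cover it.
# Assumes WordPiece marks word continuations with a "##" prefix, as in BERT.

def align_wordpieces(whitespace_tokens, wordpieces):
    """Return a list of (start, end) WordPiece index spans, one per token."""
    alignment = []
    wp_idx = 0
    for tok in whitespace_tokens:
        start = wp_idx
        rebuilt = ""
        # Consume subtokens until the original token's surface form is rebuilt.
        while wp_idx < len(wordpieces) and len(rebuilt) < len(tok):
            piece = wordpieces[wp_idx]
            rebuilt += piece[2:] if piece.startswith("##") else piece
            wp_idx += 1
        alignment.append((start, wp_idx))
    return alignment

tokens = ["commonsense", "knowledge"]
pieces = ["commons", "##ense", "knowledge"]
print(align_wordpieces(tokens, pieces))  # [(0, 2), (2, 3)]

Given such spans, one plausible way to combine the two embedding spaces is to pool the WordPiece vectors within each span (e.g., by averaging) before concatenating with a per-token knowledge embedding; the pooling choice here is an assumption, not a detail confirmed by the abstract.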
Anthology ID:
D19-6010
Volume:
Proceedings of the First Workshop on Commonsense Inference in Natural Language Processing
Month:
November
Year:
2019
Address:
Hong Kong, China
Venue:
WS
Publisher:
Association for Computational Linguistics
Pages:
85–92
URL:
https://aclanthology.org/D19-6010
DOI:
10.18653/v1/D19-6010
Cite (ACL):
Jeff Da. 2019. Jeff Da at COIN - Shared Task: BIG MOOD: Relating Transformers to Explicit Commonsense Knowledge. In Proceedings of the First Workshop on Commonsense Inference in Natural Language Processing, pages 85–92, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Jeff Da at COIN - Shared Task: BIG MOOD: Relating Transformers to Explicit Commonsense Knowledge (Da, 2019)
PDF:
https://preview.aclanthology.org/ingestion-script-update/D19-6010.pdf