LED down the rabbit hole: exploring the potential of global attention for biomedical multi-document summarisation
Yulia Otmakhova, Thinh Hung Truong, Timothy Baldwin, Trevor Cohn, Karin Verspoor, Jey Han Lau
Abstract
In this paper we report on the experiments performed for our submission to the Multidocument Summarisation for Literature Review (MSLR) Shared Task. In particular, we adapt the PRIMERA model to the biomedical domain by placing global attention on important biomedical entities in several ways. We analyse the outputs of the 23 resulting models, and report patterns related to the presence of additional global attention, the number of training steps, and the input configuration.
- Anthology ID:
- 2022.sdp-1.21
- Volume:
- Proceedings of the Third Workshop on Scholarly Document Processing
- Month:
- October
- Year:
- 2022
- Address:
- Gyeongju, Republic of Korea
- Editors:
- Arman Cohan, Guy Feigenblat, Dayne Freitag, Tirthankar Ghosal, Drahomira Herrmannova, Petr Knoth, Kyle Lo, Philipp Mayr, Michal Shmueli-Scheuer, Anita de Waard, Lucy Lu Wang
- Venue:
- sdp
- Publisher:
- Association for Computational Linguistics
- Pages:
- 181–187
- URL:
- https://aclanthology.org/2022.sdp-1.21
- Cite (ACL):
- Yulia Otmakhova, Thinh Hung Truong, Timothy Baldwin, Trevor Cohn, Karin Verspoor, and Jey Han Lau. 2022. LED down the rabbit hole: exploring the potential of global attention for biomedical multi-document summarisation. In Proceedings of the Third Workshop on Scholarly Document Processing, pages 181–187, Gyeongju, Republic of Korea. Association for Computational Linguistics.
- Cite (Informal):
- LED down the rabbit hole: exploring the potential of global attention for biomedical multi-document summarisation (Otmakhova et al., sdp 2022)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-4/2022.sdp-1.21.pdf
- Code
- allenai/mslr-shared-task + additional community code
- Data
- EBM-NLP
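For context on the technique named in the abstract: in Longformer-style models such as the LED backbone of PRIMERA, global attention is controlled by a per-token binary mask (1 = the token attends to, and is attended by, the whole sequence; 0 = local windowed attention only). A minimal sketch, not the authors' code, of how such a mask could be built from entity token spans (the function name and span format are illustrative assumptions):

```python
def build_global_attention_mask(num_tokens, entity_spans, special_positions=(0,)):
    """Build a Longformer/LED-style global attention mask.

    1 marks tokens with global attention, 0 marks local-only tokens.
    entity_spans: list of (start, end) token index pairs, end exclusive,
        e.g. spans covering biomedical entity mentions.
    special_positions: token indices that always receive global attention
        (e.g. the leading BOS token, as in the standard LED setup).
    """
    mask = [0] * num_tokens
    for pos in special_positions:
        mask[pos] = 1
    for start, end in entity_spans:
        for i in range(start, min(end, num_tokens)):
            mask[i] = 1
    return mask

# Example: a 10-token input with one entity mention at token positions 3-5
mask = build_global_attention_mask(10, [(3, 6)])
```

In the Hugging Face `transformers` API, a mask of this shape would be passed to the model as the `global_attention_mask` argument alongside `input_ids`.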