LTRC @MuP 2022: Multi-Perspective Scientific Document Summarization Using Pre-trained Generation Models

Ashok Urlana, Nirmal Surange, Manish Shrivastava


Abstract
The MuP-2022 shared task focuses on multi-perspective scientific document summarization. Given a scientific document with multiple reference summaries, our goal was to develop a model that can produce a generic summary covering as many aspects of the document as are covered by all of its reference summaries. This paper describes our best official model, a fine-tuned BART-large, along with a discussion of the challenges of this task and some of our unofficial models, including SOTA generation models. Our submitted model outperformed the MuP 2022 shared task baselines on ROUGE-2, ROUGE-L, and average ROUGE F1-scores. Code of our submission can be accessed here.
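The abstract reports gains on ROUGE-2, ROUGE-L, and average ROUGE F1. As an illustrative sketch only (not the official shared-task scorer, which uses the standard ROUGE package), ROUGE-L F1 can be computed from the longest common subsequence (LCS) between a candidate and a reference summary:

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of token lists a and b."""
    # Classic dynamic-programming table: dp[i][j] holds the LCS length
    # of a[:i] and b[:j].
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            if x == y:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]


def rouge_l_f1(candidate, reference):
    """ROUGE-L F1: LCS-based precision/recall over whitespace tokens.

    A simplified sketch; real scorers add stemming and stopword options.
    """
    cand = candidate.lower().split()
    ref = reference.lower().split()
    lcs = lcs_length(cand, ref)
    if lcs == 0:
        return 0.0
    precision = lcs / len(cand)
    recall = lcs / len(ref)
    return 2 * precision * recall / (precision + recall)
```

With multiple reference summaries per document, as in MuP, a per-example score is typically the maximum (or average) of `rouge_l_f1` over all references.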
Anthology ID:
2022.sdp-1.35
Volume:
Proceedings of the Third Workshop on Scholarly Document Processing
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Arman Cohan, Guy Feigenblat, Dayne Freitag, Tirthankar Ghosal, Drahomira Herrmannova, Petr Knoth, Kyle Lo, Philipp Mayr, Michal Shmueli-Scheuer, Anita de Waard, Lucy Lu Wang
Venue:
sdp
Publisher:
Association for Computational Linguistics
Pages:
279–284
URL:
https://aclanthology.org/2022.sdp-1.35
Cite (ACL):
Ashok Urlana, Nirmal Surange, and Manish Shrivastava. 2022. LTRC @MuP 2022: Multi-Perspective Scientific Document Summarization Using Pre-trained Generation Models. In Proceedings of the Third Workshop on Scholarly Document Processing, pages 279–284, Gyeongju, Republic of Korea. Association for Computational Linguistics.
Cite (Informal):
LTRC @MuP 2022: Multi-Perspective Scientific Document Summarization Using Pre-trained Generation Models (Urlana et al., sdp 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2022.sdp-1.35.pdf
Code
ashokurlana/ltrc-mup-coling-2022
Data
SciTLDR