Unsupervised document summarization using pre-trained sentence embeddings and graph centrality

Juan Ramirez-Orta, Evangelos Milios


Abstract
This paper describes our submission for the LongSumm task at SDP 2021. We propose an unsupervised method for incorporating sentence embeddings produced by deep language models into extractive summarization techniques based on graph centrality. The proposed method is simple and fast, can summarize documents of any kind and size, and can satisfy any length constraint on the summaries produced. It offers performance competitive with more sophisticated supervised methods and can serve as a proxy for abstractive summarization techniques.
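The pipeline the abstract describes can be sketched as follows: embed every sentence, build a similarity graph over the sentences, score each sentence by its centrality in that graph, and extract the top-ranked sentences. This is a minimal illustrative sketch, not the authors' implementation; the paper uses pre-trained deep-language-model sentence embeddings, for which the toy bag-of-words encoder `bow_embed` below is only a self-contained stand-in, and degree centrality is one simple choice of graph-centrality measure.

```python
import numpy as np

def summarize(sentences, embed, k=3):
    """Extract the k most central sentences (hedged sketch of the approach)."""
    # Embed each sentence; `embed` stands in for any sentence encoder,
    # e.g. a pre-trained deep language model as in the paper.
    E = np.array([embed(s) for s in sentences], dtype=float)
    # Cosine-similarity adjacency matrix of the sentence graph.
    norms = np.linalg.norm(E, axis=1, keepdims=True)
    norms[norms == 0] = 1.0  # guard against zero vectors
    U = E / norms
    S = U @ U.T
    np.fill_diagonal(S, 0.0)  # ignore self-similarity
    # Degree centrality: sentences similar to many others score highest.
    scores = S.sum(axis=1)
    top = sorted(np.argsort(scores)[-k:])  # keep original document order
    return [sentences[i] for i in top]

# Toy bag-of-words "embedding", used here only so the sketch runs
# without downloading a pre-trained model.
VOCAB = {}
def bow_embed(sentence, dim=64):
    v = np.zeros(dim)
    for w in sentence.lower().split():
        idx = VOCAB.setdefault(w, len(VOCAB) % dim)
        v[idx] += 1.0
    return v
```

Because extraction stops after `k` sentences (or, equivalently, after a token budget is reached), the same procedure satisfies arbitrary length constraints, which is the property the abstract highlights.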
Anthology ID:
2021.sdp-1.14
Volume:
Proceedings of the Second Workshop on Scholarly Document Processing
Month:
June
Year:
2021
Address:
Online
Editors:
Iz Beltagy, Arman Cohan, Guy Feigenblat, Dayne Freitag, Tirthankar Ghosal, Keith Hall, Drahomira Herrmannova, Petr Knoth, Kyle Lo, Philipp Mayr, Robert M. Patton, Michal Shmueli-Scheuer, Anita de Waard, Kuansan Wang, Lucy Lu Wang
Venue:
sdp
Publisher:
Association for Computational Linguistics
Pages:
110–115
URL:
https://aclanthology.org/2021.sdp-1.14
DOI:
10.18653/v1/2021.sdp-1.14
Cite (ACL):
Juan Ramirez-Orta and Evangelos Milios. 2021. Unsupervised document summarization using pre-trained sentence embeddings and graph centrality. In Proceedings of the Second Workshop on Scholarly Document Processing, pages 110–115, Online. Association for Computational Linguistics.
Cite (Informal):
Unsupervised document summarization using pre-trained sentence embeddings and graph centrality (Ramirez-Orta & Milios, SDP 2021)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2021.sdp-1.14.pdf