Abstract
This report describes the development of our system for automatic minuting, created for AutoMin 2023 Task A. As a baseline, we utilize a system based on the BART encoder-decoder model paired with a preprocessing pipeline similar to the one introduced by the winning solutions at AutoMin 2021. We then further explore the possibilities of iterative summarization by constructing an iterative minuting dataset from the provided data, finetuning on it, and feeding the model previously generated minutes. We also experiment with adding more context by utilizing the Longformer encoder-decoder model and finetuning it on the SAMSum dataset. Our submitted solution uses the baseline approach, since we were unable to match its performance with our iterative variants. With the baseline, we achieve a ROUGE-1 score of 0.368 on the ELITR minuting corpus development set. Finally, we explore the performance of the quantized Vicuna 13B language model for summarization.
- Anthology ID: 2023.inlg-genchal.16
- Volume: Proceedings of the 16th International Natural Language Generation Conference: Generation Challenges
- Month: September
- Year: 2023
- Address: Prague, Czechia
- Editor: Simon Mille
- Venues: INLG | SIGDIAL
- SIG: SIGGEN
- Publisher: Association for Computational Linguistics
- Pages: 114–120
- URL: https://aclanthology.org/2023.inlg-genchal.16
- Cite (ACL): František Kmječ and Ondřej Bojar. 2023. Team Iterate @ AutoMin 2023 - Experiments with Iterative Minuting. In Proceedings of the 16th International Natural Language Generation Conference: Generation Challenges, pages 114–120, Prague, Czechia. Association for Computational Linguistics.
- Cite (Informal): Team Iterate @ AutoMin 2023 - Experiments with Iterative Minuting (Kmječ & Bojar, INLG-SIGDIAL 2023)
- PDF: https://preview.aclanthology.org/fix-dup-bibkey/2023.inlg-genchal.16.pdf