TMU Feedback Comment Generation System Using Pretrained Sequence-to-Sequence Language Models

Naoya Ueda, Mamoru Komachi

Abstract
In this paper, we introduce our Tokyo Metropolitan University Feedback Comment Generation system submitted to the feedback comment generation task of the INLG 2023 Generation Challenge. In this task, a source sentence and the offset range of a preposition use are given as input, and the system generates hints or explanatory notes about the preposition use as output. To tackle this generation task, we fine-tuned pretrained sequence-to-sequence language models. The models based on BART and T5 showed significant improvements in BLEU score, demonstrating the effectiveness of pretrained sequence-to-sequence language models for this task. We found that using part-of-speech tag information as an auxiliary input improves the generation quality of feedback comments. Furthermore, we adopted a simple postprocessing method that enhances the reliability of the generation. As a result, our system achieved an F1 score of 47.4 points in the BLEU-based evaluation and 60.9 points in the manual evaluation, ranking second and third on the leaderboard, respectively.
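As a minimal sketch of the task setup the abstract describes: the model input is a source sentence together with the character offset range of the target preposition use, optionally augmented with part-of-speech tags as auxiliary information. The span markers (`<p>`, `</p>`) and the POS-tag suffix format below are hypothetical illustrations; the paper's exact input serialization may differ.

```python
def build_model_input(sentence, start, end, pos_tags=None):
    """Serialize one example for a seq2seq model.

    The target preposition span (given by character offsets) is wrapped
    in hypothetical <p> ... </p> markers, and POS tags, if provided, are
    appended as an auxiliary field.
    """
    marked = sentence[:start] + "<p> " + sentence[start:end] + " </p>" + sentence[end:]
    if pos_tags:
        marked += " | POS: " + " ".join(pos_tags)
    return marked


# Example: the offset range 16-21 covers the preposition "about".
src = "I am interested about science."
print(build_model_input(src, 16, 21, ["PRP", "VBP", "JJ", "IN", "NN"]))
# → I am interested <p> about </p> science. | POS: PRP VBP JJ IN NN
```

A fine-tuned BART or T5 model would then map such inputs to feedback comments explaining the preposition use (here, that "interested" takes "in" rather than "about").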
Anthology ID:
2023.inlg-genchal.10
Volume:
Proceedings of the 16th International Natural Language Generation Conference: Generation Challenges
Month:
September
Year:
2023
Address:
Prague, Czechia
Editor:
Simon Mille
Venues:
INLG | SIGDIAL
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
68–73
URL:
https://aclanthology.org/2023.inlg-genchal.10
Cite (ACL):
Naoya Ueda and Mamoru Komachi. 2023. TMU Feedback Comment Generation System Using Pretrained Sequence-to-Sequence Language Models. In Proceedings of the 16th International Natural Language Generation Conference: Generation Challenges, pages 68–73, Prague, Czechia. Association for Computational Linguistics.
Cite (Informal):
TMU Feedback Comment Generation System Using Pretrained Sequence-to-Sequence Language Models (Ueda & Komachi, INLG-SIGDIAL 2023)
PDF:
https://preview.aclanthology.org/teach-a-man-to-fish/2023.inlg-genchal.10.pdf