The AFRL IWSLT 2020 Systems: Work-From-Home Edition

Brian Ore, Eric Hansen, Tim Anderson, Jeremy Gwinnup


Abstract
This report summarizes the Air Force Research Laboratory (AFRL) submission to the offline spoken language translation (SLT) task as part of the IWSLT 2020 evaluation campaign. As in previous years, we adopted a cascade approach, using separate systems to perform speech activity detection, automatic speech recognition, sentence segmentation, and machine translation. All systems were neural-based, including a fully-connected neural network for speech activity detection, a Kaldi factorized time delay neural network with recurrent neural network (RNN) language model rescoring for speech recognition, a bidirectional RNN with an attention mechanism for sentence segmentation, and transformer networks trained with OpenNMT and Marian for machine translation. Our primary submission yielded BLEU scores of 21.28 on tst2019 and 23.33 on tst2020.
Anthology ID:
2020.iwslt-1.11
Volume:
Proceedings of the 17th International Conference on Spoken Language Translation
Month:
July
Year:
2020
Address:
Online
Editors:
Marcello Federico, Alex Waibel, Kevin Knight, Satoshi Nakamura, Hermann Ney, Jan Niehues, Sebastian Stüker, Dekai Wu, Joseph Mariani, Francois Yvon
Venue:
IWSLT
SIG:
SIGSLT
Publisher:
Association for Computational Linguistics
Pages:
103–108
URL:
https://aclanthology.org/2020.iwslt-1.11
DOI:
10.18653/v1/2020.iwslt-1.11
Cite (ACL):
Brian Ore, Eric Hansen, Tim Anderson, and Jeremy Gwinnup. 2020. The AFRL IWSLT 2020 Systems: Work-From-Home Edition. In Proceedings of the 17th International Conference on Spoken Language Translation, pages 103–108, Online. Association for Computational Linguistics.
Cite (Informal):
The AFRL IWSLT 2020 Systems: Work-From-Home Edition (Ore et al., IWSLT 2020)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2020.iwslt-1.11.pdf
Video:
http://slideslive.com/38929615