Atsuhito Kita



2021

Improving Simultaneous Translation by Incorporating Pseudo-References with Fewer Reorderings
Junkun Chen | Renjie Zheng | Atsuhito Kita | Mingbo Ma | Liang Huang
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing

Simultaneous translation is vastly different from full-sentence translation, in that it starts translating before the source sentence ends, with a delay of only a few words. However, due to the lack of large-scale, high-quality simultaneous translation datasets, most such systems are still trained on conventional full-sentence bitexts. This is far from ideal for the simultaneous scenario because those bitexts contain an abundance of unnecessary long-distance reorderings. We propose a novel method that rewrites the target side of existing full-sentence corpora into simultaneous-style translations. Experiments on Zh→En and Ja→En simultaneous translation show substantial improvements (up to +2.7 BLEU) with the addition of these generated pseudo-references.
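
The toy sketch below illustrates the kind of long-distance reordering the abstract refers to; it is not the paper's rewriting method. It assumes a word alignment given as hypothetical (source index, target index) pairs and counts crossing links, a simple proxy for how non-monotone a reference translation is, and hence how hard it is to produce incrementally.

    # Illustrative sketch only, not the authors' method: count crossing
    # alignment links as a rough proxy for long-distance reordering.

    def count_crossings(alignment):
        """Two links (i, j) and (i2, j2) cross when i < i2 but j > j2,
        i.e. the target order inverts the source order."""
        links = sorted(alignment)  # sort by source position
        crossings = 0
        for a in range(len(links)):
            for b in range(a + 1, len(links)):
                if links[a][1] > links[b][1]:
                    crossings += 1
        return crossings

    # Hypothetical alignments for a 3-word sentence pair:
    sov_to_svo = [(0, 0), (1, 2), (2, 1)]  # verb-final source, verb-medial target
    monotone   = [(0, 0), (1, 1), (2, 2)]  # a "simultaneous-style" monotone reference

    print(count_crossings(sov_to_svo))  # 1 crossing: the verb moves leftward
    print(count_crossings(monotone))    # 0 crossings: translatable word by word

A monotone pseudo-reference scores zero under such a measure, which is why rewriting the target side of a bitext can make it better suited to training low-latency simultaneous systems.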