@inproceedings{sekizawa-etal-2017-improving,
    title = "Improving {J}apanese-to-{E}nglish Neural Machine Translation by Paraphrasing the Target Language",
    author = "Sekizawa, Yuuki  and
      Kajiwara, Tomoyuki  and
      Komachi, Mamoru",
    editor = "Nakazawa, Toshiaki  and
      Goto, Isao",
    booktitle = "Proceedings of the 4th Workshop on {A}sian Translation ({WAT}2017)",
    month = nov,
    year = "2017",
    address = "Taipei, Taiwan",
    publisher = "Asian Federation of Natural Language Processing",
    url = "https://preview.aclanthology.org/iwcs-25-ingestion/W17-5703/",
    pages = "64--69",
    abstract = "Neural machine translation (NMT) produces sentences that are more fluent than those produced by statistical machine translation (SMT). However, NMT has a very high computational cost because of the high dimensionality of the output layer. Generally, NMT restricts the size of vocabulary, which results in infrequent words being treated as out-of-vocabulary (OOV) and degrades the performance of the translation. In evaluation, we achieved a statistically significant BLEU score improvement of 0.55-0.77 over the baselines including the state-of-the-art method."
}