Yanliang Zhang




2024

SimCLNMT: A Simple Contrastive Learning Method for Enhancing Neural Machine Translation Quality
Menglong Xu | Yanliang Zhang
Proceedings of the 23rd Chinese National Conference on Computational Linguistics (Volume 1: Main Conference)

“Neural Machine Translation (NMT) models are typically trained using Maximum Likelihood Estimation (MLE). However, this approach has a limitation: while it might select the best word for the immediate context, it does not generally optimize for the entire sentence. To mitigate this issue, we propose a simple yet effective training method called SimCLNMT. This method is designed to select words that fit well in the immediate context and also enhance the overall translation quality over time. During training, SimCLNMT scores multiple system-generated (candidate) translations using the logarithm of conditional probabilities. It then employs a ranking loss function to learn and adjust these probabilities to align with the corresponding quality scores. Our experimental results demonstrate that SimCLNMT consistently outperforms traditional MLE training on both the NIST English-Chinese and WMT’14 English-German datasets. Further analysis also indicates that the translations generated by our model are more closely aligned with the corresponding quality scores. We release our code at https://github.com/chaos130/fairseq_SimCLNMT.”
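The abstract describes ranking candidate translations by a quality score and training the model so that its log conditional probabilities follow the same ordering. As an illustration only, the sketch below shows one common way such a contrastive ranking loss is implemented (a pairwise margin loss over quality-sorted candidates, in the style of SimCLS/BRIO); the function name, margin scheme, and length normalization are assumptions, not the paper's exact formulation:

```python
import torch

def ranking_loss(log_probs: torch.Tensor, margin: float = 0.01) -> torch.Tensor:
    """Pairwise margin ranking loss over candidate translations.

    `log_probs` holds the model's (typically length-normalized) log
    conditional probabilities of candidate translations, already sorted
    from highest to lowest quality score. The loss penalizes any pair
    where a lower-quality candidate is not scored at least `margin`
    (scaled by rank distance) below a higher-quality one.
    """
    loss = log_probs.new_zeros(())
    n = log_probs.size(0)
    for i in range(n):
        for j in range(i + 1, n):
            # Candidate i outranks candidate j by quality, so we want
            # log_probs[i] to exceed log_probs[j] by a rank-scaled margin.
            rank_margin = margin * (j - i)
            loss = loss + torch.clamp(
                rank_margin - (log_probs[i] - log_probs[j]), min=0.0
            )
    return loss
```

In practice a loss like this is combined with the standard MLE objective during fine-tuning, so the model keeps generating fluent output while learning to rank its own candidates by quality.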