Qianqian Zhu
2021
Findings of the WMT 2021 Shared Task on Efficient Translation
Kenneth Heafield | Qianqian Zhu | Roman Grundkiewicz
Proceedings of the Sixth Conference on Machine Translation
The machine translation efficiency task challenges participants to make their systems faster and smaller with minimal impact on translation quality. How much quality to sacrifice for efficiency depends on the application, so participants were encouraged to make multiple submissions covering the space of trade-offs. In total, there were 53 submissions by 4 teams. There were GPU, single-core CPU, and multi-core CPU hardware tracks, as well as batched-throughput and single-sentence-latency conditions. Submissions showed that hundreds of millions of words can be translated for a dollar, that average latency ranges from 5 to 17 ms, and that models fit in 7.5–150 MB.
Efficient Machine Translation with Model Pruning and Quantization
Maximiliana Behnke | Nikolay Bogoychev | Alham Fikri Aji | Kenneth Heafield | Graeme Nail | Qianqian Zhu | Svetlana Tchistiakova | Jelmer van der Linde | Pinzhen Chen | Sidharth Kashyap | Roman Grundkiewicz
Proceedings of the Sixth Conference on Machine Translation
We participated in all tracks of the WMT 2021 efficient machine translation task: single-core CPU, multi-core CPU, and GPU hardware with throughput and latency conditions. Our submissions combine several efficiency strategies: knowledge distillation, a simpler simple recurrent unit (SSRU) decoder with one or two layers, lexical shortlists, smaller numerical formats, and pruning. For the CPU tracks, we used quantized 8-bit models. For the GPU track, we experimented with FP16 and 8-bit integers in tensor cores. Some of our submissions optimize for size via 4-bit log quantization and by omitting the lexical shortlist. We extended pruning to more parts of the network, emphasizing component- and block-level pruning, which actually improves speed, unlike coefficient-wise pruning.
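To illustrate the 8-bit quantization idea mentioned in the abstract, here is a minimal NumPy sketch assuming simple per-tensor absmax scaling; it is only a conceptual example and does not reproduce the optimized integer kernels (e.g. Marian's intgemm) actually used by the submissions.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float32 weights to int8 using a single absmax scale (illustrative only)."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def int8_matmul(x: np.ndarray, q: np.ndarray, scale: float) -> np.ndarray:
    """Multiply float activations by quantized weights, then rescale the result."""
    # Cast to float32 before the product to avoid int8 overflow in accumulation.
    return (x @ q.astype(np.float32)) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)  # hypothetical weight matrix
x = rng.normal(size=(4, 256)).astype(np.float32)    # hypothetical activations

q, scale = quantize_int8(w)
# The quantized product stays close to the float32 reference.
print(np.max(np.abs(x @ w - int8_matmul(x, q, scale))))
```

The same idea extends to coarser formats such as the 4-bit log quantization mentioned above, trading additional quality for a smaller model.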