Yuji Yamamoto


2023

Absolute Position Embedding Learns Sinusoid-like Waves for Attention Based on Relative Position
Yuji Yamamoto | Takuya Matsuzaki
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing

Attention weights are a clue to interpreting how a Transformer-based model makes an inference. In some attention heads, the attention focuses on the neighbors of each token. This allows the output vector of each token to depend on the surrounding tokens and contributes to making the inference context-dependent. We analyze the mechanism behind the concentration of attention on nearby tokens. We show that the phenomenon emerges as follows: (1) the learned position embedding has sinusoid-like components, (2) such components are transmitted to the query and the key in the self-attention, and (3) the attention head shifts the phases of the sinusoid-like components so that the attention concentrates on nearby tokens at specific relative positions. In other words, a certain type of Transformer-based model acquires the sinusoidal positional encoding to some extent on its own through Masked Language Modeling.
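The mechanism the abstract describes can be illustrated with a minimal sketch. Note the caveats: the paper analyzes *learned* sinusoid-like components, whereas this sketch substitutes the classic fixed sinusoidal encoding of Vaswani et al. (2017) as a stand-in, and it models the head's phase shift simply by evaluating the encoding at shifted positions. The function name and parameters below are illustrative, not from the paper.

```python
import numpy as np

def sinusoidal_embeddings(positions, d_model):
    """Fixed sinusoidal positional encoding evaluated at `positions`
    (stand-in for the learned sinusoid-like components)."""
    i = np.arange(d_model // 2)
    freq = 1.0 / (10000.0 ** (2 * i / d_model))
    angles = positions[:, None] * freq[None, :]
    emb = np.empty((len(positions), d_model))
    emb[:, 0::2] = np.sin(angles)
    emb[:, 1::2] = np.cos(angles)
    return emb

n_pos, d_model, shift = 32, 64, -2  # attend two tokens to the left
pos = np.arange(n_pos)

keys = sinusoidal_embeddings(pos, d_model)
# Shifting every sinusoid's phase by `shift` positions is equivalent to
# evaluating the encoding at pos + shift.
queries = sinusoidal_embeddings(pos + shift, d_model)

# scores[i, j] = sum_k cos(((i + shift) - j) * w_k): a function of the
# relative position alone, maximized where j = i + shift.
scores = queries @ keys.T

# Away from the sequence boundary, each position attends most strongly
# to the token at relative offset `shift`.
for i in range(4, n_pos - 4):
    assert np.argmax(scores[i]) == i + shift
```

The key point is that the dot product of two phase-shifted sinusoid banks depends only on the relative position, which is why such components let an attention head concentrate on a fixed neighborhood offset.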

2017

Rule-based MT and UTX Glossary Management – Honda’s Case Dealing with Thousands of Technical Terms
Saemi Hirayama | Yuji Yamamoto
Proceedings of Machine Translation Summit XVI: Commercial MT Users and Translators Track

2011

UTX 1.11, a Simple and Open User Dictionary/Terminology Standard, and its Effectiveness with Multiple MT Systems
Seiji Okura | Yuji Yamamoto | Hajime Ito | Michael Kato | Miwako Shimazu
Proceedings of Machine Translation Summit XIII: Papers

2008

Sharing User Dictionaries Across Multiple Systems with UTX-S
Francis Bond | Seiji Okura | Yuji Yamamoto | Toshiki Murata | Kiyotaka Uchimoto | Michael Kato | Miwako Shimazu | Tsugiyoshi Suzuki
Proceedings of the 8th Conference of the Association for Machine Translation in the Americas: Government and Commercial Uses of MT

Careful tuning of user-created dictionaries is indispensable when using a machine translation system for computer-aided translation. However, there is no widely used standard for user dictionaries in the Japanese/English machine translation market. To address this issue, AAMT (the Asia-Pacific Association for Machine Translation) has established a specification of sharable dictionaries (UTX-S: Universal Terminology eXchange -- Simple), which can be used across different machine translation systems, thus increasing the interoperability of language resources. UTX-S is simpler than existing specifications such as UPF and OLIF. It was explicitly designed to make it easy to (a) add new user dictionaries and (b) share existing user dictionaries. This facilitates rapid user dictionary production and avoids vendor tie-in. In this study we describe the UTX-Simple (UTX-S) format and show that it can be converted to the user dictionary formats of five commercial English-Japanese MT systems. We then present a case study where we (a) convert an on-line glossary to UTX-S, and (b) produce user dictionaries for five different systems and exchange them. The results show that the simplified format of UTX-S can be used to rapidly build dictionaries. Further, we confirm that customized user dictionaries are effective across systems, although with a slight loss in quality: on average, user dictionaries improved 44.8% of translations with the systems they were built for and 37.3% of translations with different systems. In ongoing work, AAMT is using UTX-S as the format in building up a user community for producing, sharing, and accumulating user dictionaries in a sustainable way.
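The glossary-to-dictionary workflow described above can be sketched as a small serializer. This is a hypothetical illustration only: the header line and column layout below are simplified assumptions in the spirit of UTX-S, not the official AAMT specification, and the function name is invented for this sketch.

```python
# Hypothetical sketch: serialize glossary entries into a UTX-S-style
# tab-separated file. Header and columns are illustrative assumptions,
# not the official UTX specification.
def to_utx_like(entries, src_lang="en", tgt_lang="ja"):
    lines = [
        f"#UTX-S; {src_lang}/{tgt_lang}",   # assumed header line
        "#src\ttgt\tsrc:pos",               # assumed column header
    ]
    for src, tgt, pos in entries:
        lines.append(f"{src}\t{tgt}\t{pos}")
    return "\n".join(lines)

# Toy glossary entries (invented for illustration).
glossary = [
    ("fuel injector", "燃料噴射装置", "noun"),
    ("torque", "トルク", "noun"),
]
print(to_utx_like(glossary))
```

A single tab-separated intermediate like this is what makes per-system conversion tractable: each MT system's importer only needs a mapping from these columns to its own dictionary fields, rather than a pairwise converter between every two systems.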