21 Apr 2020

In our lab meeting tomorrow, Nishant will review techniques for improving zero-shot NMT. A Zoom link will be posted to Twist on the morning of the meeting.

A Review of Representational Constraints to Improve Zero-Shot NMT

Abstract: Multilingual Neural Machine Translation (NMT) models are capable of translating between multiple source and target languages. Despite various approaches to training such models, they struggle with zero-shot translation: translating between language pairs that were not seen together during training. We first diagnose why state-of-the-art multilingual NMT models that rely purely on parameter sharing (language-specific methods) fail to generalize to unseen language pairs. We then review auxiliary losses (language-independent constraints) on the NMT encoder and decoder that impose representational invariance across languages.
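
As a flavor of what such a language-independent constraint can look like, here is a minimal sketch (not taken from the talk) of one common auxiliary loss: a cosine-distance penalty that pulls the encoder representations of a sentence and its translation toward each other. It assumes a PyTorch encoder that returns per-token hidden states; all function and variable names are illustrative.

```python
# Sketch of an encoder-alignment auxiliary loss (hypothetical names, PyTorch assumed).
import torch
import torch.nn.functional as F

def mean_pool(hidden_states: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Average the encoder's token states, ignoring padding.

    hidden_states: (batch, seq_len, dim)
    mask: (batch, seq_len), 1 for real tokens, 0 for padding.
    """
    mask = mask.unsqueeze(-1).float()
    return (hidden_states * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1.0)

def alignment_loss(src_states, src_mask, tgt_states, tgt_mask):
    """Cosine-distance penalty encouraging the encoder to map a sentence and
    its translation to nearby points, i.e. representational invariance."""
    src_vec = mean_pool(src_states, src_mask)
    tgt_vec = mean_pool(tgt_states, tgt_mask)
    return (1.0 - F.cosine_similarity(src_vec, tgt_vec, dim=-1)).mean()

# Training would then optimize something like:
#   total_loss = nmt_cross_entropy + lambda_align * alignment_loss(...)
# where lambda_align weights the constraint against the translation objective.
```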

Tuesday, Apr 21st, 09:30 a.m.