20 Oct 2020

In our lab meeting tomorrow, Nishant will introduce his work on multilingual neural machine translation (NMT).

A Zoom link will be posted to Twist on the morning of the meeting.

Improving Supervised Massively Multilingual NMT

Abstract: The dominant approach in multilingual neural machine translation (NMT) uses a single model with parameters shared across multiple languages. We’ll look at a novel approach that simultaneously trains two NMT models: a forward model trained to translate from multiple languages into a single language, and a backward model trained to translate from that single language back into the multiple languages. The approach is purely supervised, using no monolingual data, but exploits the available parallel training data by learning a shared multilingual representation space.
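
The abstract describes the setup only at a high level. For intuition, here is a minimal, hypothetical sketch of one joint training step in PyTorch: the Seq2Seq class, the vocabulary sizes, the pivot-language framing, and the tied multilingual embedding are illustrative assumptions, not the architecture Nishant will present.

```python
import torch
import torch.nn as nn

PAD = 0
V_MULTI, V_PIVOT, D = 1000, 800, 256   # hypothetical vocab sizes and model dim

class Seq2Seq(nn.Module):
    """Tiny GRU encoder-decoder standing in for a real NMT model."""
    def __init__(self, src_vocab, tgt_vocab, src_emb=None, tgt_emb=None):
        super().__init__()
        self.src_emb = src_emb or nn.Embedding(src_vocab, D, padding_idx=PAD)
        self.tgt_emb = tgt_emb or nn.Embedding(tgt_vocab, D, padding_idx=PAD)
        self.enc = nn.GRU(D, D, batch_first=True)
        self.dec = nn.GRU(D, D, batch_first=True)
        self.out = nn.Linear(D, tgt_vocab)

    def forward(self, src, tgt_in):
        _, h = self.enc(self.src_emb(src))           # encode source sentence
        dec_out, _ = self.dec(self.tgt_emb(tgt_in), h)
        return self.out(dec_out)                     # logits over target vocab

# One embedding table for the multilingual side, shared between the forward
# model's encoder and the backward model's decoder -- one plausible way to
# realize a "shared multilingual representation space" (an assumption here).
multi_emb = nn.Embedding(V_MULTI, D, padding_idx=PAD)
fwd = Seq2Seq(V_MULTI, V_PIVOT, src_emb=multi_emb)   # many languages -> pivot
bwd = Seq2Seq(V_PIVOT, V_MULTI, tgt_emb=multi_emb)   # pivot -> many languages

models = nn.ModuleList([fwd, bwd])                   # dedupes the shared table
opt = torch.optim.Adam(models.parameters())
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)

# Toy parallel batch: each pair has a multilingual side and a pivot side.
multi = torch.randint(1, V_MULTI, (4, 7))
pivot = torch.randint(1, V_PIVOT, (4, 6))

# One joint step: the same parallel pair supervises both directions
# (teacher forcing; BOS/EOS handling omitted for brevity).
logits_f = fwd(multi, pivot[:, :-1])
logits_b = bwd(pivot, multi[:, :-1])
loss = (loss_fn(logits_f.reshape(-1, V_PIVOT), pivot[:, 1:].reshape(-1))
        + loss_fn(logits_b.reshape(-1, V_MULTI), multi[:, 1:].reshape(-1)))
opt.zero_grad(); loss.backward(); opt.step()
```

Tying the multilingual embedding between the forward encoder and the backward decoder is just one way such a shared space might arise; in practice it could equally come from a shared encoder or from joint training alone.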

Tuesday, October 20th, 09:30 a.m.