This week, Nishant will give us a survey of massively multilingual NMT. A Zoom link will be sent tomorrow morning.
The Current State of Massively Multilingual NMT
Abstract: Massively multilingual NMT (MMNMT) models are capable of handling over 100 languages and thousands of translation directions with a single trained model. Beyond scalability, the zero-shot translation between languages that results from the inherent transfer learning makes such models desirable. In this presentation, we will look at the preconditions and assumptions that are crucial to building an MMNMT model, the widely accepted approaches and their results, and an in-depth analysis of various aspects that are critical to achieving a practical MMNMT model, followed by some recent proposals for improving such models. To conclude, we will discuss some shortcomings and open problems in this direction.
Tuesday, July 6th, 09:30 a.m.