In our lab meeting tomorrow, Nishant will give us a review of recent papers from EMNLP.
Notes from EMNLP 2020
Abstract: EMNLP has just concluded. Of the many interesting papers published this year, we will take a quick look at a few. Multilingual models, while recording ever-increasing performance scores, suffer severely when subjected to controlled test cases, especially ones that probe model behaviour with respect to particular languages. In this presentation, we look at how improving vocabulary generation can lead to better model generalization. We will also look at the factors essential for multilinguality in the popular mBERT model. Finally, we will examine the adequacy of the reference translations used to evaluate our favourite NMT models.
Tuesday, Dec 7th, 09:30 a.m.