22 Jun 2021

This week, Vincent will discuss a paper about federated learning in the NLP field. A Zoom link will be sent tomorrow morning.

FedNLP: A Research Platform for Federated Learning in Natural Language Processing

Abstract: Increasing concerns and regulations about data privacy necessitate the study of privacy-preserving methods for natural language processing (NLP) applications. Federated learning (FL) provides promising methods for a large number of clients (i.e., personal devices or organizations) to collaboratively learn a shared global model that benefits all clients, while allowing users to keep their data locally. To facilitate FL research in NLP, we present FedNLP, a research platform for federated learning in NLP. FedNLP supports various popular NLP task formulations, such as text classification, sequence tagging, question answering, seq2seq generation, and language modeling. We also implement an interface between Transformer language models (e.g., BERT) and FL methods (e.g., FedAvg, FedOpt) for distributed training. The evaluation protocol of this interface supports a comprehensive collection of non-IID partitioning strategies. Our preliminary experiments with FedNLP reveal a large performance gap between learning on decentralized and centralized datasets, opening intriguing and exciting future research directions aimed at developing FL methods suited to NLP tasks.

https://arxiv.org/abs/2104.08815
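
For context, the aggregation step of FedAvg (McMahan et al., 2017), the baseline FL method named in the abstract, can be sketched in a few lines of Python. This is a minimal illustration, not FedNLP's actual API; all function and variable names here are hypothetical.

from typing import Dict, List

def fedavg(client_weights: List[Dict[str, float]],
           client_sizes: List[int]) -> Dict[str, float]:
    # Average each model parameter across clients, weighted by the
    # size of each client's local dataset.
    total = sum(client_sizes)
    return {
        name: sum(w[name] * n / total
                  for w, n in zip(client_weights, client_sizes))
        for name in client_weights[0]
    }

# Example: two clients with unequal amounts of local data.
# (1.0 * 10 + 3.0 * 30) / 40 = 2.5
print(fedavg([{"w": 1.0}, {"w": 3.0}], [10, 30]))  # {'w': 2.5}

In the full protocol, a server broadcasts the global model, each client runs a few epochs of local training on its private data, and the server replaces the global weights with this weighted average. The non-IID client partitions studied in the paper are what make this averaging step lossy relative to centralized training.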

Tuesday, June 22nd, 09:30 a.m.