Abstract
Federated learning (FL) is a solution to the privacy challenge: it allows multiple parties to train a shared model without violating privacy-protection regulations. Many excellent FL works have been proposed in recent years. To help researchers verify their ideas in FL, we designed and developed FedLab, a flexible and modular FL framework based on PyTorch. In this paper, we introduce the architecture and features of FedLab. For currently popular research topics such as optimization and communication compression, FedLab provides functional interfaces and a series of baseline implementations, enabling researchers to implement their ideas quickly. In addition, FedLab is scalable in both client simulation and distributed communication.
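For orientation, the multiparty training scheme the abstract describes is typically federated averaging (FedAvg): clients train locally on private data and a server aggregates only their model parameters. The sketch below is illustrative and does not use FedLab's API; the function names and the toy 1-D least-squares objective are hypothetical choices for this example.

```python
# Minimal FedAvg sketch (not FedLab's API). Each client holds private
# (x, y) pairs and fits a scalar weight w for the model y = w * x.

def local_update(w, data, lr=0.1):
    # One gradient-descent step on the client's private least-squares
    # loss: 0.5 * sum((w * x - y)^2). Raw data never leaves the client.
    grad = sum((w * x - y) * x for x, y in data)
    return w - lr * grad

def fedavg_round(global_w, client_datasets):
    # Server broadcasts global_w; clients return updated parameters,
    # which the server averages weighted by local dataset size.
    updates = [local_update(global_w, data) for data in client_datasets]
    sizes = [len(data) for data in client_datasets]
    total = sum(sizes)
    return sum(w * n for w, n in zip(updates, sizes)) / total

# Two clients whose data follow y = 2x; rounds converge toward w = 2.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(50):
    w = fedavg_round(w, clients)
```

In this toy run only the scalar parameter `w` crosses the client-server boundary each round, which is the core privacy argument behind FL.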
URL
https://arxiv.org/abs/2107.11621