
Publication

FedML: A Research Library and Benchmark for Federated Machine Learning

Chaoyang He, Songze Li, Jinhyun So, Mi Zhang, Xiao Zeng, Hongyi Wang, Xiaoyang Wang, Praneeth Vepakomma, Abhishek Singh, Hang Qiu, Xinghua Zhu, Jianzong Wang, Li Shen, Peilin Zhao, Yan Kang, Yang Liu, Ramesh Raskar, Qiang Yang, Murali Annavaram and Salman Avestimehr. "FedML: A Research Library and Benchmark for Federated Machine Learning." NeurIPS-SpicyFL 2020. (Baidu Best Paper Award)

Abstract

Federated learning (FL) is a rapidly growing research field in machine learning. However, existing FL libraries cannot adequately support the diverse algorithmic development needs of FL, such as customizing network topologies and supporting varied aggregation schemes. In this work, we introduce FedML, an open research library and benchmark to facilitate FL algorithm development and fair performance comparison across computing design choices. FedML supports three computing paradigms: on-device training using a federation of edge devices, distributed training in the cloud that supports the exchange of auxiliary information beyond gradients, and single-machine simulation of a federated learning algorithm. FedML also promotes diverse algorithmic research with a flexible and generic API design and comprehensive reference baseline implementations (optimizers, models, and datasets). We believe that FedML provides an efficient and reproducible means for developing and evaluating FL algorithms that will benefit the FL community. We maintain the source code, documents, and user community at https://fedml.ai.
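To give a rough sense of the single-machine simulation paradigm mentioned in the abstract, the sketch below runs a minimal federated averaging (FedAvg) loop on a toy linear model in plain Python. This is not FedML's API; all names here (simulate_fedavg, local_sgd_step, the toy client data) are hypothetical, chosen only to illustrate the client-sampling, local-training, and aggregation steps that such a simulation performs.

```python
# Minimal sketch of a single-machine FedAvg simulation.
# NOT the FedML API: names and model are hypothetical illustrations only.
import random

def local_sgd_step(w, data, lr=0.01, epochs=1):
    """Run a few epochs of SGD on one client's (x, y) pairs for y ~ w * x."""
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x   # d/dw of squared error
            w -= lr * grad
    return w

def simulate_fedavg(client_datasets, rounds=20, clients_per_round=2):
    """Server loop: sample clients, train locally, average the local weights."""
    w_global = 0.0
    for _ in range(rounds):
        sampled = random.sample(client_datasets, clients_per_round)
        local_weights = [local_sgd_step(w_global, data) for data in sampled]
        # FedAvg aggregation: mean of client weights (uniform weights here,
        # since the toy clients hold equally sized datasets).
        w_global = sum(local_weights) / len(local_weights)
    return w_global

if __name__ == "__main__":
    random.seed(0)
    # Each client holds noisy samples of y = 3x.
    clients = [[(x, 3 * x + random.gauss(0, 0.1)) for x in range(1, 6)]
               for _ in range(4)]
    print("learned weight:", simulate_fedavg(clients))  # approaches 3.0
```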
