NeurIPS 2020

FedSplit: an algorithmic framework for fast federated optimization

Meta Review

The paper first gives examples showing that the fixed points of FedAvg/FedProx need not correspond to zeros of the sum of gradients of the consensus problem, even in the deterministic case. Motivated by these incorrect fixed points, the paper proposes FedSplit, an operator-splitting method for federated optimization, together with a convergence analysis. The reviewers reached consensus that the contribution is valuable and above the bar for NeurIPS. However, we urge the authors to incorporate a discussion of closely related work such as Scaffold, which solves largely the same problem, including a comparison of convergence rates, as well as more discussion of AGD and of the deterministic versus stochastic settings. We hope the detailed feedback and improvement suggestions from the four reviews will be incorporated in the camera-ready version.