As public awareness of data privacy has grown over the years, federated learning - a specific setting of distributed machine learning - has gained popularity among researchers and practitioners. While offering unique features such as privacy protection, federated learning also faces many new challenges, one of which is how to maximize communication efficiency during model training. This paper focuses on two major directions for improving communication efficiency in federated learning. For each direction, the relevant papers and their proposed methods are categorized, summarized, analyzed, and compared, with commentary on their designs and effectiveness.
The survey paper is available here.