-
Authors
Zhu, Hongbin; Yang, Miao; Kuang, Junqian; Qian, Hua; Zhou, Yong
-
Publication
2022 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS (ICC WORKSHOPS)
-
Year, Volume, Document Number
2022, , 2164-7038
-
Abstract
Federated learning (FL), as a nascent distributed learning framework, trains a machine learning model in a collaborative manner. Synchronous model aggregation is widely adopted but suffers from the straggler issue due to system heterogeneity. To overcome the straggler issue, we employ the asynchronous FL framework. The goal of this paper is to minimize the training latency through client selection while taking into account both client availability and long-term fairness. We consider a practical scenario in which the channel conditions and the local computing power of the clients are unknown to the parameter server. This makes the client selection problem difficult to tackle, because the training latency consists of the time-varying round-trip transmission latency and the local training latency. By transforming the latency minimization problem into a multi-armed bandit problem and leveraging the upper confidence bound (UCB) policy together with the virtual queue technique, we solve the asynchronous client selection problem. Numerical results validate that our proposed algorithm outperforms the baseline algorithms in terms of convergence performance.
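The combination of a UCB bandit policy with a virtual-queue fairness term described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the score `V * ucb + queue`, the trade-off weight `V`, and the function name `select_client` are all assumptions made for the example; the UCB term rewards clients with low estimated latency or few past selections, while the virtual queue lifts the score of clients that have been under-selected relative to their fairness target.

```python
import math

def select_client(counts, mean_latency, queues, t, V=1.0):
    """Pick the client index maximizing a UCB score on (negative) observed
    latency plus a fairness bonus from the client's virtual queue.

    counts[k]       -- number of times client k has been selected so far
    mean_latency[k] -- empirical mean training latency of client k
    queues[k]       -- virtual queue length tracking client k's fairness debt
    t               -- current round index (t >= 1)
    V               -- hypothetical weight trading off latency vs. fairness
    """
    best, best_score = None, -math.inf
    for k in range(len(counts)):
        if counts[k] == 0:
            return k  # explore every client at least once first
        # UCB on negative latency: optimistic estimate of how fast k is
        ucb = -mean_latency[k] + math.sqrt(2.0 * math.log(t) / counts[k])
        score = V * ucb + queues[k]  # fairness debt raises the score
        if score > best_score:
            best, best_score = k, score
    return best
```

With equal virtual queues the faster client wins; a large queue (accumulated fairness debt) overrides the latency advantage, which is the mechanism that enforces long-term fairness.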