TY - GEN
T1 - A Communication-efficient Approach of Bayesian Distributed Federated Learning
AU - Wang, Sihua
AU - Guo, Huayan
AU - Zhu, Xu
AU - Yin, Changchuan
AU - Lau, Vincent K.N.
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - This paper investigates a fully distributed federated learning (FL) problem in which each device may utilize only its local dataset and the information received from its adjacent devices, as defined by a communication graph, to update its local model weights for minimizing the global loss function. To incorporate the communication graph constraint into the joint posterior distribution, we exploit the fact that the model weights on each device are a function of its local likelihood and local prior, and we model the connectivity between adjacent devices by a Dirichlet distribution. In this way, the joint distribution factorizes naturally over a factor graph. Based on this Dirichlet-based factor graph, we propose a novel distributed approximate Bayesian inference algorithm for distributed FL that combines loopy belief propagation (LBP) and variational Bayesian inference (VBI). Specifically, VBI approximates the non-Gaussian marginal posterior as a Gaussian distribution in the local training process, and the global training process then resembles Gaussian LBP, in which only the mean and variance are passed among adjacent devices. Furthermore, we propose a new damping factor design based on the communication graph topology to mitigate potential divergence and achieve consensus convergence. Simulation results verify that the proposed solution achieves faster convergence and better performance than the baselines.
AB - This paper investigates a fully distributed federated learning (FL) problem in which each device may utilize only its local dataset and the information received from its adjacent devices, as defined by a communication graph, to update its local model weights for minimizing the global loss function. To incorporate the communication graph constraint into the joint posterior distribution, we exploit the fact that the model weights on each device are a function of its local likelihood and local prior, and we model the connectivity between adjacent devices by a Dirichlet distribution. In this way, the joint distribution factorizes naturally over a factor graph. Based on this Dirichlet-based factor graph, we propose a novel distributed approximate Bayesian inference algorithm for distributed FL that combines loopy belief propagation (LBP) and variational Bayesian inference (VBI). Specifically, VBI approximates the non-Gaussian marginal posterior as a Gaussian distribution in the local training process, and the global training process then resembles Gaussian LBP, in which only the mean and variance are passed among adjacent devices. Furthermore, we propose a new damping factor design based on the communication graph topology to mitigate potential divergence and achieve consensus convergence. Simulation results verify that the proposed solution achieves faster convergence and better performance than the baselines.
UR - https://www.scopus.com/pages/publications/105000822046
U2 - 10.1109/GLOBECOM52923.2024.10901723
DO - 10.1109/GLOBECOM52923.2024.10901723
M3 - Conference contribution
AN - SCOPUS:105000822046
T3 - Proceedings - IEEE Global Communications Conference, GLOBECOM
SP - 4666
EP - 4671
BT - GLOBECOM 2024 - 2024 IEEE Global Communications Conference
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 IEEE Global Communications Conference, GLOBECOM 2024
Y2 - 8 December 2024 through 12 December 2024
ER -