Open Access
Communication Efficient Decentralized Learning Over Bipartite Graphs
Author(s) -
Chaouki Ben Issaid,
Anis Elgabli,
Jihong Park,
Mehdi Bennis,
Merouane Debbah
Publication year - 2021
Publication title -
IEEE Transactions on Wireless Communications
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.01
H-Index - 223
eISSN - 1558-2248
pISSN - 1536-1276
DOI - 10.1109/twc.2021.3126859
Subject(s) - communication, networking and broadcast technologies , computing and processing , signal processing and analysis
In this paper, we propose a communication-efficient decentralized machine learning framework that solves a consensus optimization problem defined over a network of interconnected workers. The proposed algorithm, Censored and Quantized Generalized GADMM (CQ-GGADMM), leverages the worker grouping and decentralized learning ideas of the Group Alternating Direction Method of Multipliers (GADMM), and pushes the frontier in communication efficiency by extending its applicability to generalized network topologies, while incorporating link censoring for negligible updates after quantization. We theoretically prove that CQ-GGADMM achieves a linear convergence rate when the local objective functions are strongly convex, under some mild assumptions. Numerical simulations corroborate that CQ-GGADMM exhibits higher communication efficiency in terms of the number of communication rounds and transmit energy consumption, without compromising accuracy or convergence speed, compared to censored decentralized ADMM and the worker grouping method of GADMM.
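The two communication-saving mechanisms named in the abstract, quantizing each model update and censoring (skipping) a link when the quantized update is negligible, can be illustrated with the following minimal Python sketch. This is not the paper's exact CQ-GGADMM scheme; the quantizer, the censoring threshold, and the function names are illustrative assumptions.

```python
import numpy as np

def quantize(v, num_bits=4):
    """Uniform stochastic quantization of a vector to 2**num_bits - 1 levels.
    Illustrative only; the paper's quantizer may differ."""
    levels = 2 ** num_bits - 1
    vmin, vmax = v.min(), v.max()
    if vmax == vmin:
        return v.copy()
    scale = (vmax - vmin) / levels
    # Stochastic rounding: adding uniform noise before flooring makes the
    # quantized value unbiased in expectation.
    q = np.floor((v - vmin) / scale + np.random.rand(*v.shape))
    return vmin + q * scale

def censored_update(new_model, last_sent, threshold):
    """Quantize the local model and transmit it only if it differs enough
    from the last transmitted value; otherwise censor the link this round."""
    q = quantize(new_model)
    if np.linalg.norm(q - last_sent) <= threshold:
        return None  # negligible update after quantization: no transmission
    return q
```

Censoring saves a communication round whenever a worker's model has barely changed, which is exactly the regime (near convergence) where decentralized ADMM iterations would otherwise keep exchanging nearly identical vectors.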
