Low-Similarity Client Sampling for Decentralized Federated Learning
Author(s) -
Yunseok Kang,
Jaeyoung Song
Publication year - 2025
Publication title -
IEEE Access
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 0.587
H-Index - 127
eISSN - 2169-3536
DOI - 10.1109/access.2025.3615459
Subject(s) - aerospace , bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , engineered materials, dielectrics and plasmas , engineering profession , fields, waves and electromagnetics , general topics for engineers , geoscience , nuclear engineering , photonics and electrooptics , power, energy and industry applications , robotics and control systems , signal processing and analysis , transportation
Federated Learning (FL) often encounters challenges from statistical heterogeneity caused by non-identical client data distributions. A common approach to mitigate this issue is to cluster clients with similar data and train within each cluster, which reduces inter-client heterogeneity but also introduces communication overhead and redundant updates. While decentralized federated learning has seen recent advances, decentralized client sampling for clustered environments remains unexplored, leaving an important gap between clustering strategies and communication-efficient training. To bridge this gap, we propose Low-Similarity Decentralized Client Sampling (LS-DCS), a framework that integrates decentralized decision-making with similarity-based filtering in clustered settings. Instead of relying on a central server, LS-DCS allows each client to independently decide whether to transmit an update by measuring the cosine similarity between its update and the previous cluster-level update, so that only non-redundant, high-value updates are communicated. In addition, LS-DCS introduces a reset mechanism that preserves fairness and stability: if the number of active clients in a round falls below a threshold, all clients in the cluster are re-engaged, preventing long-term exclusion and maintaining cluster completeness. Experimental results show that LS-DCS achieves faster convergence than prior decentralized sampling methods, even with fewer participating clients per round, while maintaining a strong balance between communication efficiency and training stability. These findings underline the importance of low-similarity sampling and adaptive reset strategies as key enablers of scalable, cluster-aware decentralized federated learning.
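The abstract's sampling rule (keep a client's update only when its cosine similarity to the previous cluster-level update falls below a threshold, and re-engage the whole cluster when too few clients survive) can be illustrated with a minimal sketch. The function names, the threshold `tau`, and the floor `min_active` are illustrative assumptions, not the paper's actual interface or hyperparameters:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two flat parameter-update vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    if norm_u == 0.0 or norm_v == 0.0:
        return 0.0
    return dot / (norm_u * norm_v)

def select_active_clients(client_updates, prev_cluster_update,
                          tau=0.9, min_active=2):
    """Sketch of LS-DCS-style filtering (names/thresholds are assumed).

    Each client independently keeps its update only if it is sufficiently
    dissimilar from the previous cluster-level update (similarity < tau).
    If fewer than min_active clients survive, the reset mechanism
    re-engages every client in the cluster.
    """
    active = [
        cid for cid, update in client_updates.items()
        if cosine_similarity(update, prev_cluster_update) < tau
    ]
    if len(active) < min_active:  # reset: re-engage the whole cluster
        active = list(client_updates.keys())
    return active
```

For example, with `prev_cluster_update = [1.0, 0.0]`, a client whose update is `[1.0, 0.1]` has similarity near 1 and is filtered out as redundant, while an orthogonal update `[0.0, 1.0]` survives; if filtering leaves fewer than `min_active` clients, all clients participate in the next round.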