DR-DSGD : a distributionally robust decentralized learning algorithm over graphs
Ben Issaid, Chaouki; Elgabli, Anis; Bennis, Mehdi (2022-08-01)
Ben Issaid, C., Elgabli, A., & Bennis, M. (2022). DR-DSGD: A distributionally robust decentralized learning algorithm over graphs. Transactions on Machine Learning Research, 2022(8), 1-25.
TMLR makes all published content immediately available to the public free of charge.
https://rightsstatements.org/vocab/InC/1.0/
https://urn.fi/URN:NBN:fi-fe2023061454708
Abstract
In this paper, we propose to solve a regularized distributionally robust learning problem in the decentralized setting, taking into account the data distribution shift. By adding a Kullback-Leibler regularization function to the robust min-max optimization problem, the learning problem can be reduced to a modified robust minimization problem and solved efficiently. Leveraging the newly formulated optimization problem, we propose a robust version of Decentralized Stochastic Gradient Descent (DSGD), coined Distributionally Robust Decentralized Stochastic Gradient Descent (DR-DSGD). Under some mild assumptions, and provided that the regularization parameter is larger than one, we theoretically prove that DR-DSGD achieves a convergence rate of O(1/√KT + K/T), where K is the number of devices and T is the number of iterations. Simulation results show that our proposed algorithm can improve the worst-distribution test accuracy by up to 10%. Moreover, DR-DSGD is more communication-efficient than DSGD, since it requires up to 20 times fewer communication rounds to reach the same worst-distribution test accuracy target. Furthermore, the conducted experiments reveal that DR-DSGD yields fairer performance across devices in terms of test accuracy.
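The reduction described in the abstract relies on a standard duality fact: the worst-case expected loss over distributions q, penalized by a KL divergence term with weight λ, admits a closed-form log-sum-exp expression, λ log E[exp(ℓ/λ)], so the min-max problem collapses to an ordinary minimization. The sketch below (an illustrative assumption, not the paper's implementation; the function name `kl_robust_loss` is hypothetical) computes this KL-regularized robust loss from per-sample losses in a numerically stable way:

```python
import numpy as np

def kl_robust_loss(losses, lam):
    """KL-regularized distributionally robust loss.

    For per-sample losses l_i, the worst-case expected loss over
    reweightings q with a KL(q || uniform) penalty of weight lam
    has the closed form  lam * log(mean(exp(l / lam)))  — a
    log-sum-exp ("tilted") loss. Small lam emphasizes the worst
    samples; large lam recovers the ordinary average loss.
    """
    l = np.asarray(losses, dtype=float)
    m = l.max()  # subtract the max before exponentiating for stability
    return m + lam * np.log(np.mean(np.exp((l - m) / lam)))
```

The value always lies between the mean and the maximum of the per-sample losses, which is what makes it "distributionally robust": gradients of this surrogate upweight the hardest examples, without the instability of optimizing the max directly.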
Collections
- Open access [37744]