
Convergence property of the quantized decentralized gradient descent with constant stepsizes and an effective strategy for stepsize selection
  • Myeong-Su Lee (Sungkyunkwan University, Institute of Basic Science)
    Corresponding Author: [email protected]
  • Woocheol Choi (Sungkyunkwan University, Department of Mathematics)

Abstract

Distributed algorithms involving quantization have recently received considerable interest, as quantized communication arises naturally in real applications. For such algorithms, selecting appropriate stepsizes for high performance is non-trivial due to the noise induced by quantization. In this paper, we establish new convergence results for the quantized decentralized gradient descent, and we propose a novel strategy for choosing the stepsizes to achieve high performance. Precisely, under a strong convexity assumption on the aggregate cost function and a smoothness assumption on each local cost function, we prove that the algorithm converges exponentially fast to a small neighborhood of the optimizer whose radius depends on the stepsizes. Based on this convergence result, we then suggest an effective stepsize selection algorithm that repeatedly diminishes the stepsizes, according to a certain rule, after a specified number of iterations. Both the convergence results and the effectiveness of the suggested stepsize selection are verified by numerical experiments.
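To make the scheme concrete, the following is a minimal Python sketch of a quantized decentralized gradient descent update combined with a stage-wise stepsize schedule of the kind described above. The uniform quantizer, the averaging matrix W, the shrink factor, the stage length, and the quadratic local costs in the usage example are all illustrative assumptions, not the paper's exact quantizer, rule, or constants.

```python
import numpy as np

def quantize(x, delta=0.05):
    # Illustrative uniform quantizer with resolution delta
    # (the paper's quantization scheme may differ).
    return delta * np.round(x / delta)

def qdgd(grads, W, x0, alpha, n_iters):
    """Quantized decentralized gradient descent with a constant stepsize.

    grads : list of local gradient functions, one per agent
    W     : (n, n) doubly stochastic mixing matrix
    x0    : (n, d) array of iterates, one row per agent
    alpha : constant stepsize
    """
    x = x0.copy()
    for _ in range(n_iters):
        q = quantize(x)                       # agents exchange quantized states
        g = np.stack([grads[i](x[i]) for i in range(len(grads))])
        x = W @ q - alpha * g                 # consensus on quantized values + local gradient step
    return x

def qdgd_with_schedule(grads, W, x0, alpha0, stage_len, n_stages, shrink=0.5):
    # Stage-wise stepsize selection: run with a constant stepsize until the
    # iterates settle near the stepsize-dependent neighborhood, then shrink
    # the stepsize by a fixed factor and repeat. The factor and stage length
    # here are placeholders, not the paper's specific rule.
    x, alpha = x0.copy(), alpha0
    for _ in range(n_stages):
        x = qdgd(grads, W, x, alpha, stage_len)
        alpha *= shrink
    return x

# Usage on illustrative quadratic local costs f_i(x) = 0.5 * ||A_i x - b_i||^2.
if __name__ == "__main__":
    n, d = 4, 2
    rng = np.random.default_rng(0)
    A = [rng.standard_normal((d, d)) for _ in range(n)]
    b = [rng.standard_normal(d) for _ in range(n)]
    grads = [lambda x, A=A[i], b=b[i]: A.T @ (A @ x - b) for i in range(n)]
    W = np.full((n, n), 1.0 / n)              # complete-graph uniform averaging
    x = qdgd_with_schedule(grads, W, np.zeros((n, d)),
                           alpha0=0.1, stage_len=200, n_stages=5)
    print(x.mean(axis=0))                     # agents' average, near the optimizer
```

With a fixed stepsize the iterates stall at a quantization-noise-dominated neighborhood, so each shrink of the stepsize tightens that neighborhood; the staged schedule trades off fast early progress against final accuracy, which is the rationale behind the selection strategy sketched here.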