
Communication-efficient ADMM using Quantization-aware Gaussian Process Regression
  • Aldo Duarte Vera Tudela,
  • Shuangqing Wei,
  • Truong Nghiem

Abstract

In networks of agents that communicate with a central coordinator to solve a global optimization problem in a distributed manner, the agents are often required to solve private proximal minimization subproblems. Such a setting typically requires a decomposition method for the global distributed problem, which incurs extensive communication overhead. In networks where communication is expensive, it is crucial to reduce the communication overhead of the distributed optimization scheme. Gaussian processes (GPs) are effective at learning the agents’ local proximal operators, thereby reducing the communication between the agents and the coordinator. We propose combining this learning method with adaptive uniform quantization in a hybrid approach that achieves further communication reduction. In our approach, the GP algorithm is modified to account for the noise statistics introduced by data quantization. We further improve the approach by applying an orthogonalization step to the quantizer’s input to address the inherent correlation among its components, and by using dithering to ensure that the noise introduced by the quantizer is uncorrelated with its input. We propose several measures to quantify the trade-off between the communication cost reduction and the accuracy/optimality of the optimization solution. Under these metrics, our proposed algorithms achieve significant communication reduction for distributed optimization with acceptable accuracy, even at low quantization resolutions. This result is demonstrated by simulations of a distributed sharing problem with quadratic cost functions for the agents.
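To make the quantization-aware idea concrete, the following Python sketch illustrates two ingredients named in the abstract: a subtractively dithered uniform quantizer (dither makes the quantization error uncorrelated with the input) and a GP regression whose observation-noise variance is inflated by the standard uniform-quantization noise variance Δ²/12. All function names, parameters, and the toy data are illustrative assumptions, not the paper's implementation.

```
import numpy as np

def dithered_uniform_quantize(x, delta, rng):
    """Subtractively dithered uniform quantizer with step size `delta`.

    Uniform dither in [-delta/2, delta/2) added before rounding (and
    subtracted after) makes the quantization error uncorrelated with the
    input, a standard property of subtractive dithering.
    """
    dither = rng.uniform(-delta / 2, delta / 2, size=x.shape)
    return delta * np.round((x + dither) / delta) - dither

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between 1-D input arrays `a` and `b`.
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior_mean(x_train, y_train, x_test, noise_var, delta):
    """GP regression mean with a quantization-aware noise model: the
    observation-noise variance is augmented by delta**2 / 12, the variance
    of uniform quantization noise (illustrative assumption)."""
    total_var = noise_var + delta**2 / 12.0
    K = rbf_kernel(x_train, x_train) + total_var * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

# Toy usage: learn a smooth map (standing in for a proximal operator)
# from quantized observations.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 40)
y = np.sin(x) + 0.05 * rng.standard_normal(x.shape)
delta = 0.25                                    # quantizer step size
y_q = dithered_uniform_quantize(y, delta, rng)  # quantized transmissions
x_test = np.linspace(-3, 3, 100)
mean = gp_posterior_mean(x, y_q, x_test, noise_var=0.05**2, delta=delta)
```

In this sketch, the only change relative to standard GP regression is the extra delta**2 / 12 term in the noise variance, which is the sense in which the regression is "quantization-aware"; the paper's actual modification of the GP algorithm may differ in detail.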