Abstract
In this article, we consider a strongly convex finite-sum minimization problem over a decentralized network and propose a communication-efficient decentralized Newton's method for solving it. The main challenges in designing such an algorithm come from three aspects: (i) the mismatch between local gradients/Hessians and the global ones; (ii) the cost of sharing second-order information; (iii) the trade-off between computation and communication. To handle these challenges, we first apply dynamic average consensus (DAC) so that each node can use a local gradient approximation and a local Hessian approximation to track the global gradient and Hessian, respectively. Second, since exchanging Hessian approximations is far from communication-efficient, we require the nodes to exchange compressed ones instead and then apply an error compensation mechanism to correct for the compression noise. Third, we introduce multi-step consensus for exchanging local variables and local gradient approximations to balance computation and communication. With a novel analysis, we establish a globally linear (resp., asymptotically super-linear) convergence rate for the proposed algorithm when m is constant (resp., tends to infinity), where m ≥ 1 is the number of consensus inner steps. To the best of our knowledge, this is the first super-linear convergence result for a communication-efficient decentralized Newton's method. Moreover, the rate we establish is provably faster than those of first-order methods. Our numerical results on various applications corroborate the theoretical findings.
Affiliation: Sun Yat-sen University
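To make the three ingredients in the abstract concrete, the following is a minimal Python/NumPy sketch of one iteration as it might look at the network level: DAC-style tracking of the global gradient, exchange of compressed, error-compensated Hessian increments, and multi-step consensus on the local variables before and after a Newton-type step. Everything here is illustrative rather than the paper's exact algorithm: the function names, the top-k compressor, the mixing matrix `W`, the step size `alpha`, the regularization `reg`, and the precise update recursions are assumptions made for the sketch.

```python
import numpy as np

def topk_compress(M, k):
    """Illustrative compressor: keep the k largest-magnitude entries of M, zero the rest."""
    flat = M.flatten()
    idx = np.argsort(np.abs(flat))[-k:]
    out = np.zeros_like(flat)
    out[idx] = flat[idx]
    return out.reshape(M.shape)

def multi_step_consensus(X, W, m):
    """Apply the doubly stochastic mixing matrix W to the node-stacked array X for m inner steps."""
    for _ in range(m):
        X = np.einsum('ij,j...->i...', W, X)
    return X

def newton_tracking_step(x, g, H, e, grad_new, grad_old, hess_new, hess_old,
                         W, m=1, k=10, alpha=1.0, reg=1e-6):
    """One illustrative iteration, with all n nodes stacked along axis 0.
    x: (n, d) local iterates; g: (n, d) gradient trackers;
    H: (n, d, d) Hessian trackers; e: (n, d, d) compression-error memory."""
    n, d = x.shape

    # (i) dynamic average consensus on gradients: mix, then add the local gradient increments
    g = multi_step_consensus(g, W, m) + grad_new - grad_old

    # (ii) Hessian tracking via compressed messages with error compensation:
    # compress the error-compensated increment and remember what was dropped
    delta = H + e + hess_new - hess_old
    q = np.stack([topk_compress(delta[i], k) for i in range(n)])
    e = delta - q                                  # compression error fed back next round
    H = np.einsum('ij,jab->iab', W, q)             # neighbors only see compressed messages

    # Newton-type step with the tracked curvature (regularized so the solve is well posed)
    step = np.stack([np.linalg.solve(H[i] + reg * np.eye(d), g[i]) for i in range(n)])

    # (iii) multi-step consensus on the updated local variables
    x = multi_step_consensus(x - alpha * step, W, m)
    return x, g, H, e
```

The sketch reflects the design choice described in the abstract: only compressed Hessian information is communicated, which keeps the per-round message size close to that of a first-order method, while the error-compensation memory `e` re-injects what the compressor dropped so the compression noise does not accumulate; the parameter `m` controls how many consensus rounds are spent per Newton-type update.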