AIDE: Fast and Communication Efficient Distributed Optimization.
In this paper, we present two new communication-efficient methods for distributed minimization of an average of functions. The first algorithm is an inexact variant of the DANE algorithm that allows any local algorithm to return an approximate solution to a local subproblem. We show that such a strategy does not significantly affect the theoretical guarantees of DANE. In fact, our approach can be viewed as a robustification strategy, since the method is substantially better behaved than DANE on data partitions arising in practice. It is well known that the DANE algorithm does not match the communication complexity lower bounds. To bridge this gap, we propose an accelerated variant of the first method, called AIDE, that not only matches the communication lower bounds but can also be implemented using a purely first-order oracle. Our empirical results show that AIDE is superior to other communication-efficient algorithms in settings that naturally arise in machine learning applications.
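To make the inexact-DANE idea concrete, the following is a minimal sketch of one communication round, assuming each machine exposes only a first-order oracle (its local gradient). The local subproblem is solved approximately with a few plain gradient-descent steps; the function names `local_grads`, the regularization weight `mu`, and the inner-solver parameters are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def inexact_dane_round(w, local_grads, mu=1.0, eta=1.0,
                       inner_steps=20, inner_lr=0.1):
    """One DANE-style round: each machine approximately solves its local
    subproblem, then the approximate solutions are averaged.
    local_grads: list of callables, machine i returns grad f_i(.)."""
    # First communication: average the local gradients at the current iterate.
    full_grad = np.mean([g(w) for g in local_grads], axis=0)
    new_iterates = []
    for g_i in local_grads:
        # Local subproblem (sketch):
        #   min_v f_i(v) - <grad f_i(w) - eta * full_grad, v> + (mu/2)||v - w||^2
        shift = g_i(w) - eta * full_grad
        v = w.copy()
        for _ in range(inner_steps):
            # Inexact local solve: a few gradient steps on the subproblem.
            v -= inner_lr * (g_i(v) - shift + mu * (v - w))
        new_iterates.append(v)
    # Second communication: average the approximate local solutions.
    return np.mean(new_iterates, axis=0)
```

In this sketch, AIDE would wrap such rounds in an outer acceleration scheme (adding a proximal term and extrapolating between iterates), which is how the method retains a purely first-order oracle while improving the communication complexity.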
References in zbMATH (referenced in 5 articles )
- Kylasa, Sudhir; Fang, Chih-Hao; Roosta, Fred; Grama, Ananth: Parallel optimization techniques for machine learning (2020)
- Li, Boyue; Cen, Shicong; Chen, Yuxin; Chi, Yuejie: Communication-efficient distributed optimization in networks with gradient tracking and variance reduction (2020)
- Yuan, Xiao-Tong; Li, Ping: On convergence of distributed approximate Newton methods: globalization, sharper bounds and beyond (2020)
- Xiao, Lin; Yu, Adams Wei; Lin, Qihang; Chen, Weizhu: DSCOVR: randomized primal-dual block coordinate algorithms for asynchronous distributed optimization (2019)
- Lee, Jason D.; Lin, Qihang; Ma, Tengyu; Yang, Tianbao: Distributed stochastic variance reduced gradient methods by sampling extra data with replacement (2017)