Bregman Alternating Direction Method of Multipliers. The mirror descent algorithm (MDA) generalizes gradient descent by replacing the squared Euclidean distance with a Bregman divergence. In this paper, we similarly generalize the alternating direction method of multipliers (ADMM) to Bregman ADMM (BADMM), which allows different Bregman divergences to be chosen so as to exploit the structure of a problem. BADMM provides a unified framework for ADMM and its variants, including generalized ADMM, inexact ADMM, and Bethe ADMM. We establish global convergence and an O(1/T) iteration complexity for BADMM. In some cases, BADMM can be faster than ADMM by a factor of O(n/log(n)). In solving the linear program arising from the mass transportation problem, BADMM admits massive parallelism and runs easily on GPUs, where it is several times faster than the highly optimized commercial solver Gurobi.
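To illustrate the idea the abstract builds on, the following is a minimal sketch (not the paper's BADMM) of mirror descent on the probability simplex: with the Bregman divergence generated by negative entropy, i.e. the KL divergence, the Euclidean projected-gradient step turns into a multiplicative (exponentiated-gradient) update. The objective, step size, and iteration count below are illustrative choices, not from the paper.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, eta=0.1, iters=500):
    """Mirror descent with the negative-entropy mirror map.

    The update x <- x * exp(-eta * grad f(x)), followed by renormalization,
    is the Bregman (KL) analogue of a projected gradient step on the simplex.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x * np.exp(-eta * grad(x))  # mirror step in entropic geometry
        x /= x.sum()                    # Bregman projection back to the simplex
    return x

# Illustrative example: minimize f(x) = ||x - c||^2 over the simplex,
# with c itself a point in the simplex, so the minimizer is c.
c = np.array([0.7, 0.2, 0.1])
grad_f = lambda x: 2.0 * (x - c)
x = mirror_descent_simplex(grad_f, np.ones(3) / 3)
```

BADMM applies the same substitution inside the ADMM subproblems, so that each update respects the geometry of its constraint set rather than defaulting to Euclidean proximal terms.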

References in zbMATH (referenced in 36 articles)

Showing results 1 to 20 of 36.
Sorted by year (citations)


  1. Lin, Qihang; Ma, Runchao; Xu, Yangyang: Complexity of an inexact proximal-point penalty method for constrained smooth non-convex optimization (2022)
  2. Arridge, Simon R. (ed.); Maaß, Peter (ed.); Schönlieb, Carola-Bibiane (ed.): Deep learning for inverse problems. Abstracts from the workshop held March 7--13, 2021 (hybrid meeting) (2021)
  3. Benning, Martin; Betcke, Marta M.; Ehrhardt, Matthias J.; Schönlieb, Carola-Bibiane: Choose your path wisely: gradient descent in a Bregman distance framework (2021)
  4. Bian, Fengmiao; Liang, Jingwei; Zhang, Xiaoqun: A stochastic alternating direction method of multipliers for non-smooth and non-convex optimization (2021)
  5. Jian, Jinbao; Liu, Pengjie; Yin, Jianghua; Zhang, Chen; Chao, Miantao: A QCQP-based splitting SQP algorithm for two-block nonconvex constrained optimization problems with application (2021)
  6. Jia, Zehui; Gao, Xue; Cai, Xingju; Han, Deren: Local linear convergence of the alternating direction method of multipliers for nonconvex separable optimization problems (2021)
  7. Kiefer, Lukas; Petra, Stefania; Storath, Martin; Weinmann, Andreas: Multi-channel Potts-based reconstruction for multi-spectral computed tomography (2021)
  8. Qian, Yitian; Pan, Shaohua: An inexact PAM method for computing Wasserstein barycenter with unknown supports (2021)
  9. Wu, Tingting; Ng, Michael K.; Zhao, Xi-Le: Sparsity reconstruction using nonconvex TGpV-shearlet regularization and constrained projection (2021)
  10. Yang, Lei; Li, Jia; Sun, Defeng; Toh, Kim-Chuan: A fast globally linearly convergent algorithm for the computation of Wasserstein barycenters (2021)
  11. Boţ, Radu Ioan; Nguyen, Dang-Khoa: The proximal alternating direction method of multipliers in the nonconvex setting: convergence analysis and rates (2020)
  12. Jian, Jinbao; Zhang, Chen; Yin, Jianghua; Yang, Linfeng; Ma, Guodong: Monotone splitting sequential quadratic optimization algorithm with applications in electric power systems (2020)
  13. Tu, Kai; Zhang, Haibin; Gao, Huan; Feng, Junkai: A hybrid Bregman alternating direction method of multipliers for the linearly constrained difference-of-convex problems (2020)
  14. Wang, Jianjun; Huang, Jianwen; Zhang, Feng; Wang, Wendong: Group sparse recovery in impulsive noise via alternating direction method of multipliers (2020)
  15. Yu, Yue; Açıkmeşe, Behçet; Mesbahi, Mehran: Mass-spring-damper networks for distributed optimization in non-Euclidean spaces (2020)
  16. Dai, Ben; Wang, Junhui: Query-dependent ranking and its asymptotic properties (2019)
  17. Jian, Jin Bao; Zhang, Ye; Chao, Mian Tao: A regularized alternating direction method of multipliers for a class of nonconvex problems (2019)
  18. Lu, Kaihong; Jing, Gangshan; Wang, Long: A distributed algorithm for solving mixed equilibrium problems (2019)
  19. Rahimi, Yaghoub; Wang, Chao; Dong, Hongbo; Lou, Yifei: A scale-invariant approach for sparse signal recovery (2019)
  20. Wang, Yu; Yin, Wotao; Zeng, Jinshan: Global convergence of ADMM in nonconvex nonsmooth optimization (2019)
