SDPNAL+
SDPNAL+: a majorized semismooth Newton-CG augmented Lagrangian method for semidefinite programming with nonnegative constraints.

In this paper, we present a majorized semismooth Newton-CG augmented Lagrangian method, called SDPNAL+, for semidefinite programming (SDP) with partial or full nonnegative constraints on the matrix variable. SDPNAL+ is a much enhanced version of SDPNAL introduced by X.-Y. Zhao et al. [SIAM J. Optim. 20, No. 4, 1737–1765 (2010; Zbl 1213.90175)] for solving generic SDPs. SDPNAL works very efficiently for nondegenerate SDPs but may encounter numerical difficulties for degenerate ones. Here we tackle this difficulty by employing a majorized semismooth Newton-CG augmented Lagrangian method coupled with a convergent 3-block alternating direction method of multipliers (ADMM) introduced recently by D. Sun et al. [SIAM J. Optim. 25, No. 2, 882–915 (2015; Zbl 06444987)]. Numerical results for various large-scale SDPs with or without nonnegative constraints show that the proposed method is not only fast but also robust in obtaining accurate solutions. It outperforms, by a significant margin, two other competitive publicly available codes based on first-order methods: (1) the ADMM-based solver SDPAD by Z. Wen et al. [Math. Program. Comput. 2, No. 3–4, 203–230 (2010; Zbl 1206.90088)] and (2) the two-easy-block-decomposition hybrid proximal extragradient method 2EBD-HPE by R. Monteiro et al. ["A first-order block-decomposition method for solving two-easy-block structured semidefinite programs", Math. Program. Comput. 6, No. 2, 103–150 (2014; doi:10.1007/s12532-013-0062-7)]. In contrast to these two codes, we are able to solve all 95 of the difficult SDP problems arising from relaxations of quadratic assignment problems tested in SDPNAL to an accuracy of 10^{-6} efficiently, whereas SDPAD and 2EBD-HPE successfully solve only 30 and 16 of these problems, respectively.
In addition, SDPNAL+ appears to be the only viable method currently available for solving large-scale SDPs arising from rank-1 tensor approximation problems constructed by J. Nie and L. Wang [SIAM J. Matrix Anal. Appl. 35, No. 3, 1155–1179 (2014; Zbl 1305.65134)]. The largest rank-1 tensor approximation problem we solved (in about 14.5 h) is nonsym(21,4), whose resulting SDP problem has matrix dimension n = 9261 and m = 12,326,390 equality constraints.
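The SDPs treated here have a matrix variable that must be both positive semidefinite and (partially or fully) entrywise nonnegative, so the core computational kernels of any augmented-Lagrangian or ADMM-type solver are the metric projections onto the PSD cone and onto the nonnegative orthant. The following NumPy sketch shows these two projections and combines them with Dykstra's alternating-projection scheme as a deliberately simple stand-in; SDPNAL+ itself uses the far more sophisticated majorized semismooth Newton-CG augmented Lagrangian method described above, and all function names below are hypothetical illustrations, not the package's API.

```python
import numpy as np

def proj_psd(X):
    """Project a symmetric matrix onto the positive semidefinite cone
    by zeroing out the negative eigenvalues."""
    w, V = np.linalg.eigh((X + X.T) / 2)      # symmetrize defensively
    return (V * np.maximum(w, 0.0)) @ V.T     # scale columns of V by max(w, 0)

def proj_doubly_nonneg(X, n_iters=500, tol=1e-12):
    """Dykstra's alternating projections onto the intersection of the
    PSD cone and the nonnegative orthant (the doubly nonnegative cone).
    A toy illustration only -- not how SDPNAL+ computes its iterates."""
    P = np.zeros_like(X)   # correction term for the PSD projection
    Q = np.zeros_like(X)   # correction term for the orthant projection
    Y = X.copy()
    for _ in range(n_iters):
        Z = proj_psd(Y + P)
        P = Y + P - Z
        Y_new = np.maximum(Z + Q, 0.0)
        Q = Z + Q - Y_new
        if np.linalg.norm(Y_new - Y) < tol:
            return Y_new
        Y = Y_new
    return Y
```

For example, projecting [[2, -1], [-1, 2]] (PSD but with negative entries) yields a matrix that is both PSD and entrywise nonnegative.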
References in zbMATH (referenced in 40 articles, 2 standard articles)
Showing results 1 to 20 of 40, sorted by year.
- Chen, Liang; Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: On the equivalence of inexact proximal ALM and ADMM for a class of convex composite programming (2021)
- Chen, Liang; Chang, Xiaokai; Liu, Sanyang: A three-operator splitting perspective of a three-block ADMM for convex quadratic semidefinite programming and beyond (2020)
- Chen, Shixiang; Ma, Shiqian; Man-Cho So, Anthony; Zhang, Tong: Proximal gradient method for nonsmooth optimization over the Stiefel manifold (2020)
- Ding, Chao; Sun, Defeng; Sun, Jie; Toh, Kim-Chuan: Spectral operators of matrices: semismoothness and characterizations of the generalized Jacobian (2020)
- Gaar, Elisabeth; Rendl, Franz: A computational study of exact subgraph based SDP bounds for max-cut, stable set and coloring (2020)
- Goulart, Paul J.; Nakatsukasa, Yuji; Rontsis, Nikitas: Accuracy of approximate projection to the semidefinite cone (2020)
- Li, Xiaodong; Li, Yang; Ling, Shuyang; Strohmer, Thomas; Wei, Ke: When do birds of a feather flock together? (k)-means, proximity, and conic programming (2020)
- Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: An asymptotically superlinearly convergent semismooth Newton augmented Lagrangian method for linear programming (2020)
- Sun, Defeng; Toh, Kim-Chuan; Yuan, Yancheng; Zhao, Xin-Yuan: SDPNAL+: A Matlab software for semidefinite programming with bound constraints (version 1.0) (2020)
- Yang, Qingzhi; Li, Yiyong; Huang, Pengfei: A novel formulation of the max-cut problem and related algorithm (2020)
- Yan, Yinqiao; Li, Qingna: An efficient augmented Lagrangian method for support vector machine (2020)
- Zhao, Xin-Yuan; Chen, Liang: The linear and asymptotically superlinear convergence rates of the augmented Lagrangian method with a practical relative error criterion (2020)
- Ahmadi, Amir Ali; Majumdar, Anirudha: DSOS and SDSOS optimization: more tractable alternatives to sum of squares and semidefinite optimization (2019)
- Campos, Juan S.; Misener, Ruth; Parpas, Panos: A multilevel analysis of the Lasserre hierarchy (2019)
- Cui, Ying; Sun, Defeng; Toh, Kim-Chuan: On the R-superlinear convergence of the KKT residuals generated by the augmented Lagrangian method for convex composite conic programming (2019)
- Cui, Ying; Sun, Defeng; Toh, Kim-Chuan: Computing the best approximation over the intersection of a polyhedral set and the doubly nonnegative cone (2019)
- Eisenach, Carson; Liu, Han: Efficient, certifiably optimal clustering with applications to latent variable graphical models (2019)
- Hu, Shenglong; Sun, Defeng; Toh, Kim-Chuan: Best nonnegative rank-one approximations of tensors (2019)
- Ito, Naoki; Kim, Sunyoung; Kojima, Masakazu; Takeda, Akiko; Toh, Kim-Chuan: Algorithm 996: BBCPOP: a sparse doubly nonnegative relaxation of polynomial optimization problems with binary, box, and complementarity constraints (2019)
- Khoo, Yuehaw; Ying, Lexing: Convex relaxation approaches for strictly correlated density functional theory (2019)