ProxSARAH: an efficient algorithmic framework for stochastic composite nonconvex optimization.
We propose a new stochastic first-order algorithmic framework for solving stochastic composite nonconvex optimization problems that covers both the finite-sum and expectation settings. Our algorithms rely on the SARAH estimator and consist of two steps: a proximal-gradient step and an averaging step, which distinguishes them from existing nonconvex proximal-type algorithms. The algorithms require only an average smoothness assumption on the nonconvex objective term and, when applied to expectation problems, an additional bounded-variance assumption. They work with both constant and dynamic step-sizes and allow single-sample as well as mini-batch updates. In all these cases, we prove that our algorithms achieve the best-known complexity bounds in terms of the stochastic first-order oracle. One key ingredient of our methods is the new constant and dynamic step-sizes, which yield the desired complexity bounds while improving practical performance. Our constant step-size is much larger than that of existing methods, including the proximal SVRG scheme, in the single-sample case. We also specialize our framework to the non-composite case, where it matches existing state-of-the-art methods in terms of oracle complexity bounds. Our update further allows one to trade off step-sizes against mini-batch sizes to improve performance. We test the proposed algorithms on two composite nonconvex problems and on neural networks using several well-known data sets.
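A minimal single-sample sketch of the scheme the abstract describes, for a finite-sum problem min_x (1/n) sum_i f_i(x) + psi(x) with psi = lam*||.||_1: a SARAH recursion for the gradient estimator, followed by a proximal-gradient step and an averaging step. The function names, the choice of the L1 proximal operator, and all parameter values are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def prox_sarah(grad_i, n, x0, eta, gamma, lam, epochs=2, rng=None):
    """Single-sample ProxSARAH-style sketch.

    grad_i(i, x) returns the gradient of the i-th component f_i at x;
    eta is the proximal step-size, gamma the averaging weight.
    """
    rng = np.random.default_rng(rng)
    x = x0.copy()
    for _ in range(epochs):
        # Outer loop: initialize the estimator with a full gradient.
        v = np.mean([grad_i(i, x) for i in range(n)], axis=0)
        x_hat = soft_threshold(x - eta * v, eta * lam)   # proximal-gradient step
        x_prev, x = x, (1.0 - gamma) * x + gamma * x_hat  # averaging step
        for _ in range(n):
            i = rng.integers(n)
            # SARAH recursion: biased, recursively updated estimator.
            v = grad_i(i, x) - grad_i(i, x_prev) + v
            x_hat = soft_threshold(x - eta * v, eta * lam)
            x_prev, x = x, (1.0 - gamma) * x + gamma * x_hat
    return x
```

The averaging step is what separates this template from a plain proximal SARAH update: the iterate moves only a gamma-fraction toward the proximal point, which is what permits the larger constant step-sizes the abstract highlights.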
References in zbMATH (referenced in 9 articles, 1 standard article)
- Driggs, Derek; Ehrhardt, Matthias J.; Schönlieb, Carola-Bibiane: Accelerating variance-reduced stochastic gradient methods (2022)
- Tran-Dinh, Quoc; Pham, Nhan H.; Phan, Dzung T.; Nguyen, Lam M.: A hybrid stochastic optimization framework for composite nonconvex optimization (2022)
- Xin, Ran; Khan, Usman A.; Kar, Soummya: Fast decentralized nonconvex finite-sum optimization with recursive variance reduction (2022)
- Driggs, Derek; Tang, Junqi; Liang, Jingwei; Davies, Mike; Schönlieb, Carola-Bibiane: A stochastic proximal alternating minimization for nonsmooth and nonconvex optimization (2021)
- Metel, Michael R.; Takeda, Akiko: Stochastic proximal methods for non-smooth non-convex constrained sparse optimization (2021)
- Nguyen, Lam M.; Scheinberg, Katya; Takáč, Martin: Inexact SARAH algorithm for stochastic optimization (2021)
- Zhang, Junyu; Xiao, Lin: Multilevel composite stochastic optimization via nested variance reduction (2021)
- Pham, Nhan H.; Nguyen, Lam M.; Phan, Dzung T.; Tran-Dinh, Quoc: ProxSARAH: an efficient algorithmic framework for stochastic composite nonconvex optimization (2020)
- Wang, Xiao; Zhang, Hongchao: Inexact proximal stochastic second-order methods for nonconvex composite optimization (2020)