ve08
On the unconstrained optimization of partially separable functions

We consider the problem of minimizing a smooth objective function f of n real variables. For n > 200 we can only hope to locate a local minimum of f within the usual limitations on storage and computing time by using a minimization algorithm that exploits some special structure of f. One such possibility is that the Hessian G(x) of f(x) has clustered eigenvalues at a minimizer x*, in which case conjugate gradient and limited memory variable metric methods were found to work quite well. However, in general, the performance of these methods is rather unpredictable since, except for certain test functions, the eigenvalue structure of G at or near x* is usually not known. Therefore we pursue the traditional approach of approximating f by local quadratic models, which is computationally feasible even for large n if f has a certain separability structure. This structure is always implied by sparsity of G, and depends only on the way in which the components of x enter into f, and not on the numerical values of f or its derivatives.
(Source: http://plato.asu.edu)
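The separability structure described in the abstract can be illustrated with a small sketch. This is not the ve08 code itself; it is a minimal, hypothetical example in which f is a sum of element functions, each depending on only two of the n variables, so the full Hessian can be assembled from small dense element Hessians and is automatically sparse (banded, in this case).

```python
import numpy as np

# Hypothetical partially separable objective on n variables:
#   f(x) = sum_i f_i(x_{S_i}),
# where each element function f_i depends only on a small index set S_i.
# The element functions below are Rosenbrock-like terms chosen purely
# for illustration.

n = 6
# Each element couples an adjacent pair of variables.
element_index_sets = [(i, i + 1) for i in range(n - 1)]

def element_value(v):
    # f_i(u, w) = (u - 1)^2 + 10 (w - u^2)^2
    u, w = v
    return (u - 1.0) ** 2 + 10.0 * (w - u ** 2) ** 2

def element_hessian(v):
    # Analytic 2x2 Hessian of the element function above.
    u, w = v
    return np.array([
        [2.0 - 40.0 * (w - u ** 2) + 80.0 * u ** 2, -40.0 * u],
        [-40.0 * u, 20.0],
    ])

def total_value(x):
    return sum(element_value(x[list(s)]) for s in element_index_sets)

def assemble_hessian(x):
    # The n x n Hessian is the sum of the small element Hessians
    # scattered into their rows and columns; because each element
    # couples only adjacent variables, G comes out tridiagonal.
    G = np.zeros((n, n))
    for s in element_index_sets:
        idx = list(s)
        G[np.ix_(idx, idx)] += element_hessian(x[idx])
    return G

x = np.ones(n)          # the minimizer of this particular f
G = assemble_hessian(x)
```

A partitioned quasi-Newton method in the spirit of the paper would maintain one small approximation per element Hessian rather than one large dense matrix, which is what makes the quadratic-model approach feasible for large n.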
References in zbMATH (referenced in 127 articles, 3 standard articles)
Showing results 1 to 20 of 127.
Sorted by year:
- Gao, Wenbo; Goldfarb, Donald: Block BFGS methods (2018)
- Kronqvist, Jan; Lundell, Andreas; Westerlund, Tapio: Reformulations for utilizing separability when solving convex MINLP problems (2018)
- Yuan, Gonglin; Sheng, Zhou; Wang, Bopeng; Hu, Wujie; Li, Chunnian: The global convergence of a modified BFGS method for nonconvex functions (2018)
- Cao, Hui-Ping; Li, Dong-Hui: Partitioned quasi-Newton methods for sparse nonlinear equations (2017)
- Gaur, Daya R.; Hossain, Shahadat; Saha, Anik: Determining sparse Jacobian matrices using two-sided compression: an algorithm and lower bounds (2016)
- Huang, Wen; Gallivan, K. A.; Absil, P.-A.: A Broyden class of quasi-Newton methods for Riemannian optimization (2015)
- Bidabadi, Narges; Mahdavi-Amiri, Nezam: Superlinearly convergent exact penalty methods with projected structured secant updates for constrained nonlinear least squares (2014)
- Dai, Yu-Hong; Yamashita, Nobuo: Analysis of sparse quasi-Newton updates with positive definite matrix completion (2014)
- Gower, Robert Mansel; Mello, Margarida Pinheiro: Computing the sparsity pattern of Hessians using automatic differentiation (2014)
- Kchouk, Bilel; Dussault, Jean-Pierre: On per-iteration complexity of high order Chebyshev methods for sparse functions with banded Hessians (2014)
- Xue, Dan; Sun, Wenyu; Qi, Liqun: An alternating structured trust region algorithm for separable optimization problems with nonconvex constraints (2014)
- Zhang, Yong; Zhu, Detong: A family of the local convergence of the improved secant methods for nonlinear equality constrained optimization subject to bounds on variables (2014)
- Bidabadi, Narges; Mahdavi-Amiri, Nezam: A two-step superlinearly convergent projected structured BFGS method for constrained nonlinear least squares (2013)
- Cartis, Coralia; Gould, Nicholas I. M.; Toint, Philippe L.: On the oracle complexity of first-order and derivative-free algorithms for smooth nonconvex minimization (2012)
- Gundersen, Geir; Steihaug, Trond: On diagonally structured problems in unconstrained optimization using an inexact super Halley method (2012)
- Gundersen, Geir; Steihaug, Trond: Sparsity in higher order methods for unconstrained optimization (2012)
- Vlček, J.; Lukšan, L.: A conjugate directions approach to improve the limited-memory BFGS method (2012)
- Malmedy, Vincent; Toint, Philippe L.: Approximating Hessians in unconstrained optimization arising from discretized problems (2011)
- Toivanen, Jukka I.; Mäkinen, Raino A. E.: Implementation of sparse forward mode automatic differentiation with application to electromagnetic shape optimization (2011)
- Andrei, Neculai: Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization (2010)