VE08

On the unconstrained optimization of partially separable functions

We consider the problem of minimizing a smooth objective function f of n real variables. For n > 200 we can only hope to locate a local minimum of f, within the usual limitations on storage and computing time, by using a minimization algorithm that exploits some special structure of f. One such possibility is that the Hessian G(x) of f(x) has clustered eigenvalues at a minimizer x*, in which case conjugate gradient and limited-memory variable metric methods have been found to work quite well. In general, however, the performance of these methods is rather unpredictable since, except for certain test functions, the eigenvalue structure of G at or near x* is usually not known. We therefore pursue the traditional approach of approximating f by local quadratic models, which is computationally feasible even for large n if f has a certain separability structure. This structure is always implied by sparsity of G; it depends only on the way in which the components of x enter into f, not on the numerical values of f or its derivatives. (Source: http://plato.asu.edu)
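
To make the separability structure concrete: f is called partially separable when it decomposes as a sum of element functions, each depending on only a few components of x, so that the full Hessian can be assembled from small dense element Hessians. The sketch below is illustrative only (it is not the VE08 code) and uses the chained Rosenbrock function, a standard partially separable test problem whose elements each involve two variables and whose Hessian is therefore tridiagonal.

import numpy as np

# Chained Rosenbrock:
#   f(x) = sum_{i=0}^{n-2} 100*(x[i+1] - x[i]**2)**2 + (1 - x[i])**2
# Each element function involves only the pair (x[i], x[i+1]),
# so f is partially separable and its Hessian is sparse (tridiagonal).

def element(y):
    """One element function of two variables y = (a, b)."""
    a, b = y
    return 100.0 * (b - a**2)**2 + (1.0 - a)**2

def element_hessian(y):
    """Exact 2x2 Hessian of one element function."""
    a, b = y
    return np.array([[1200.0 * a**2 - 400.0 * b + 2.0, -400.0 * a],
                     [-400.0 * a,                        200.0]])

def f(x):
    """Objective value: sum of element functions over overlapping pairs."""
    return sum(element(x[i:i + 2]) for i in range(len(x) - 1))

def hessian(x):
    """Assemble the full Hessian from the small element Hessians.
    (Stored densely here for clarity; a real large-scale implementation
    would keep it sparse.)"""
    n = len(x)
    G = np.zeros((n, n))
    for i in range(n - 1):
        G[i:i + 2, i:i + 2] += element_hessian(x[i:i + 2])
    return G

x = np.full(10, 0.5)
print(f(x))                           # objective value at x
print(np.count_nonzero(hessian(x)))   # tridiagonal: only O(n) nonzeros

A quadratic-model method only ever updates and factorizes the small element Hessians, which is what makes the approach affordable for large n.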


References in zbMATH (referenced in 148 articles, 3 standard articles)

Showing results 1 to 20 of 148, sorted by year (citations).


  1. Chen, X.; Toint, Ph. L.: High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms (2021)
  2. Rodomanov, Anton; Nesterov, Yurii: New results on superlinear convergence of classical quasi-Newton methods (2021)
  3. Rodomanov, Anton; Nesterov, Yurii: Greedy quasi-Newton methods with explicit superlinear convergence (2021)
  4. Zhang, Richard Y.; Lavaei, Javad: Sparse semidefinite programs with guaranteed near-linear time complexity via dualized clique tree conversion (2021)
  5. Galli, Leonardo; Galligari, Alessandro; Sciandrone, Marco: A unified convergence framework for nonmonotone inexact decomposition methods (2020)
  6. Gratton, S.; Toint, Ph. L.: A note on solving nonlinear optimization problems in variable precision (2020)
  7. Hosseini Dehmiry, Alireza: The global convergence of the BFGS method under a modified Yuan-Wei-Lu line search technique (2020)
  8. Hübner, Jens; Schmidt, Martin; Steinbach, Marc C.: Optimization techniques for tree-structured nonlinear problems (2020)
  9. Kuřátko, Jan; Ratschan, Stefan: Solving reachability problems by a scalable constrained optimization method (2020)
  10. Yuan, Gonglin; Wang, Xiaoliang; Sheng, Zhou: The projection technique for two open problems of unconstrained optimization problems (2020)
  11. Yuan, Gonglin; Wang, Zhan; Li, Pengyuan: A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems (2020)
  12. Chen, Xiaojun; Toint, Ph. L.; Wang, H.: Complexity of partially separable convexly constrained optimization with non-Lipschitzian singularities (2019)
  13. Gao, Wenbo; Goldfarb, Donald: Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions (2019)
  14. García, Oscar: Estimating reducible stochastic differential equations by conversion to a least-squares problem (2019)
  15. Petra, Cosmin G.; Chiang, Naiyuan; Anitescu, Mihai: A structured quasi-Newton algorithm for optimizing with incomplete Hessian information (2019)
  16. Tyagi, Hemant; Vybiral, Jan: Learning general sparse additive models from point queries in high dimensions (2019)
  17. Gao, Wenbo; Goldfarb, Donald: Block BFGS methods (2018)
  18. Kronqvist, Jan; Lundell, Andreas; Westerlund, Tapio: Reformulations for utilizing separability when solving convex MINLP problems (2018)
  19. Petra, C. G.; Qiang, F.; Lubin, M.; Huchette, J.: On efficient Hessian computation using the edge pushing algorithm in Julia (2018)
  20. Yuan, Gonglin; Sheng, Zhou; Wang, Bopeng; Hu, Wujie; Li, Chunnian: The global convergence of a modified BFGS method for nonconvex functions (2018)
