TN

Newton-type minimization via the Lanczos method

This paper discusses the use of the linear conjugate-gradient method (developed via the Lanczos method) in the solution of large-scale unconstrained minimization problems. It is shown how the equivalent Lanczos characterization of the linear conjugate-gradient method may be exploited to define a modified Newton method that can be applied to problems which do not necessarily have positive-definite Hessian matrices. This derivation also makes it possible to compute a direction of negative curvature at a stationary point. The modified Lanczos algorithm described above requires up to n iterations to compute the search direction, where n denotes the number of variables in the problem. The idea of a truncated Newton method is to terminate these iterations earlier. A preconditioned truncated Newton method is described that defines a search direction interpolating between the direction given by a nonlinear conjugate-gradient-type method and a modified Newton direction. Numerical results are given which show the promising performance of truncated Newton methods.
(Source: http://plato.asu.edu)
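The abstract above describes the two key ingredients of a truncated Newton method: an inner conjugate-gradient iteration on the Newton system that is terminated early, and a test that detects (and exploits) negative curvature when the Hessian is not positive definite. The following is a minimal illustrative sketch of this scheme, not the TN package itself; the function names (`truncated_newton`, `hessvec`) and the Dembo–Steihaug-style forcing term are assumptions chosen for clarity.

```python
import numpy as np

def truncated_newton(f, grad, hessvec, x0, max_outer=50, tol=1e-6):
    """Sketch of a truncated Newton iteration.

    The inner CG loop approximately solves H p = -g, stopping early
    once the residual falls below a forcing tolerance, or immediately
    if a direction of negative curvature is encountered.
    `hessvec(x, v)` returns the Hessian-vector product H(x) v.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Inner CG on H p = -g, truncated by a forcing sequence.
        p = np.zeros_like(x)
        r = -g                      # residual of the Newton system
        d = r.copy()                # CG search direction
        eta = min(0.5, np.sqrt(gnorm))  # forcing term (assumed choice)
        for _ in range(len(x)):     # at most n inner iterations
            Hd = hessvec(x, d)
            dHd = d @ Hd
            if dHd <= 0:            # negative curvature detected
                if np.allclose(p, 0):
                    p = -g          # fall back to steepest descent
                break
            alpha = (r @ r) / dHd
            p = p + alpha * d
            r_new = r - alpha * Hd
            if np.linalg.norm(r_new) <= eta * gnorm:
                break               # truncate: inexact Newton step
            beta = (r_new @ r_new) / (r @ r)
            r = r_new
            d = r + beta * d
        # Backtracking (Armijo) line search along the truncated step.
        t, fx = 1.0, f(x)
        while f(x + t * p) > fx + 1e-4 * t * (g @ p) and t > 1e-12:
            t *= 0.5
        x = x + t * p
    return x
```

Because the inner loop needs only Hessian-vector products, the method is matrix-free and suited to large-scale problems, which is the setting the paper targets.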
References in zbMATH (referenced in 99 articles)
Showing results 1 to 20 of 99.
Sorted by year.
- Grote, Marcus J.; Kray, Marie; Nahum, Uri: Adaptive eigenspace method for inverse scattering problems in the frequency domain (2017)
- Métivier, L.; Brossier, R.; Operto, S.; Virieux, J.: Full waveform inversion and the truncated Newton method (2017)
- Di, Zichao (Wendy); Leyffer, Sven; Wild, Stefan M.: Optimization-based approach for joint X-ray fluorescence and transmission tomographic inversion (2016)
- Xu, Wei; Zheng, Ning; Hayami, Ken: Jacobian-free implicit inner-iteration preconditioner for nonlinear least squares problems (2016)
- Fasano, Giovanni: A framework of conjugate direction methods for symmetric linear systems in optimization (2015)
- De Simone, V.; di Serafino, D.: A matrix-free approach to build band preconditioners for large-scale bound-constrained optimization (2014)
- Lukšan, Ladislav; Vlček, Jan: Efficient tridiagonal preconditioner for the matrix-free truncated Newton method (2014)
- O’Malley, D.; Vesselinov, V.V.; Cushman, J.H.: A method for identifying diffusive trajectories with stochastic models (2014)
- Fasano, Giovanni; Roma, Massimo: Preconditioning Newton-Krylov methods in nonconvex large scale optimization (2013)
- Kojima, Masakazu; Yamashita, Makoto: Enclosing ellipsoids and elliptic cylinders of semialgebraic sets and their application to error bounds in polynomial optimization (2013)
- Métivier, L.; Brossier, R.; Virieux, J.; Operto, S.: Full waveform inversion and the truncated Newton method (2013)
- Armand, Paul; Benoist, Joël; Dussault, Jean-Pierre: Local path-following property of inexact interior methods in nonlinear programming (2012)
- Chouzenoux, E.; Moussaoui, S.; Idier, J.: Majorize-minimize linesearch for inversion methods involving barrier function optimization (2012)
- Kiiveri, Harri; de Hoog, Frank: Fitting very large sparse Gaussian graphical models (2012)
- Papadimitriou, D.I.; Giannakoglou, K.C.: Aerodynamic design using the truncated Newton algorithm and the continuous adjoint approach (2012)
- Byrd, Richard H.; Chin, Gillian M.; Neveitt, Will; Nocedal, Jorge: On the use of stochastic Hessian information in optimization methods for machine learning (2011)
- De Hoog, F.R.; Anderssen, R.S.; Lukas, M.A.: Differentiation of matrix functionals using triangular factorization (2011)
- Malmedy, Vincent; Toint, Philippe L.: Approximating Hessians in unconstrained optimization arising from discretized problems (2011)
- Yuan, Gonglin; Wei, Zengxin; Lu, Sha: Limited memory BFGS method with backtracking for symmetric nonlinear equations (2011)
- Andrei, Neculai: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization (2010)