LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
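To make the SVM prediction step concrete, the sketch below evaluates an SVM decision function of the standard form f(x) = Σ_i α_i y_i K(x_i, x) + b with an RBF kernel, the default kernel in LIBSVM. This is a minimal illustrative sketch in pure Python, not LIBSVM's actual code; the toy support vectors and coefficients are made up for the example.

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    # RBF kernel: K(x, z) = exp(-gamma * ||x - z||^2)
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def decision_value(x, support_vectors, dual_coefs, b, gamma=1.0):
    # f(x) = sum_i (alpha_i * y_i) * K(sv_i, x) + b; the predicted label is sign(f(x))
    return sum(c * rbf_kernel(sv, x, gamma)
               for sv, c in zip(support_vectors, dual_coefs)) + b

# Toy model: one positive and one negative support vector (hypothetical values)
svs = [(0.0,), (2.0,)]
dual_coefs = [1.0, -1.0]  # each entry is alpha_i * y_i
b = 0.0

print(decision_value((0.0,), svs, dual_coefs, b) > 0)  # point near the positive SV
print(decision_value((2.0,), svs, dual_coefs, b) < 0)  # point near the negative SV
```

In LIBSVM itself the dual coefficients α_i y_i, the support vectors, and the bias b are what training produces and what the model file stores; prediction then reduces to exactly this kernel sum.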

References in zbMATH (referenced in 1124 articles)

Showing results 1 to 20 of 1124, sorted by year (citations).


  1. Blanchard, Gilles; Deshmukh, Aniket Anand; Dogan, Urun; Lee, Gyemin; Scott, Clayton: Domain generalization by marginal transfer learning (2021)
  2. Burkina, M.; Nazarov, I.; Panov, M.; Fedonin, G.; Shirokikh, B.: Inductive matrix completion with feature selection (2021)
  3. Galvan, Giulio; Lapucci, Matteo; Lin, Chih-Jen; Sciandrone, Marco: A two-level decomposition framework exploiting first and second order information for SVM training problems (2021)
  4. Gower, Robert M.; Richtárik, Peter; Bach, Francis: Stochastic quasi-gradient methods: variance reduction via Jacobian sketching (2021)
  5. Han, Biao; Shang, Chao; Huang, Dexian: Multiple kernel learning-aided robust optimization: learning algorithm, computational tractability, and usage in multi-stage decision-making (2021)
  6. Jiang, Gaoxia; Wang, Wenjian; Qian, Yuhua; Liang, Jiye: A unified sample selection framework for output noise filtering: an error-bound perspective (2021)
  7. Lei, Yunwen; Ying, Yiming: Stochastic proximal AUC maximization (2021)
  8. Li, Zhu; Ton, Jean-Francois; Oglic, Dino; Sejdinovic, Dino: Towards a unified analysis of random Fourier features (2021)
  9. Lu, Haihao; Freund, Robert M.: Generalized stochastic Frank-Wolfe algorithm with stochastic “substitute” gradient for structured convex optimization (2021)
  10. Mudunuru, M. K.; Karra, S.: Physics-informed machine learning models for predicting the progress of reactive-mixing (2021)
  11. Nakayama, Shummin; Narushima, Yasushi; Yabe, Hiroshi: Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions (2021)
  12. Rodomanov, Anton; Nesterov, Yurii: Greedy quasi-Newton methods with explicit superlinear convergence (2021)
  13. Sun, Ruoyu; Ye, Yinyu: Worst-case complexity of cyclic coordinate descent: O(n^2) gap with randomized version (2021)
  14. Uribe, César A.; Lee, Soomin; Gasnikov, Alexander; Nedić, Angelia: A dual approach for optimal algorithms in distributed optimization over networks (2021)
  15. Zhang, Ning: A dual based semismooth Newton-type algorithm for solving large-scale sparse Tikhonov regularization problems (2021)
  16. Aggarwal, Charu C.: Linear algebra and optimization for machine learning. A textbook (2020)
  17. Bauermeister, Christoph; Keren, Hanna; Braun, Jochen: Unstructured network topology begets order-based representation by privileged neurons (2020)
  18. Bellavia, Stefania; Krejić, Nataša; Morini, Benedetta: Inexact restoration with subsampled trust-region methods for finite-sum minimization (2020)
  19. Blanco, Victor; Puerto, Justo; Rodriguez-Chia, Antonio M.: On ℓ_p-support vector machines and multidimensional kernels (2020)
  20. Chan, Raymond H.; Kan, Kelvin K.; Nikolova, Mila; Plemmons, Robert J.: A two-stage method for spectral-spatial classification of hyperspectral images (2020)
