LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
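The optimization problem mentioned above can be illustrated with a toy solver. Note that this is only a sketch: LIBSVM itself solves the dual problem with an SMO-type decomposition method, whereas the code below uses a simple Pegasos-style stochastic subgradient method on the primal soft-margin objective, on hypothetical toy data, purely to show what is being minimized.

```python
# Sketch only: LIBSVM uses SMO-type decomposition on the dual problem;
# this Pegasos-style subgradient solver merely illustrates the primal
# soft-margin SVM objective:  min_w  lam/2 ||w||^2 + mean hinge loss.
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    w = [0.0] * d
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(n), n):   # random pass over the data
            t += 1
            eta = 1.0 / (lam * t)           # standard Pegasos step size
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            # shrink w (regularizer), then add the hinge term if violated
            w = [(1 - eta * lam) * wj for wj in w]
            if margin < 1:
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

# hypothetical linearly separable toy data
X = [[2.0, 2.0], [3.0, 3.0], [2.5, 1.5],
     [-2.0, -2.0], [-3.0, -1.0], [-1.5, -2.5]]
y = [1, 1, 1, -1, -1, -1]
w = train_linear_svm(X, y)
acc = sum(predict(w, xi) == yi for xi, yi in zip(X, y)) / len(y)
```

On separable data like this, the learned hyperplane classifies every training point correctly; LIBSVM additionally handles kernels, shrinking, and caching, which this sketch omits.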

References in zbMATH (referenced in 976 articles)

Showing results 1 to 20 of 976.
Sorted by year (citations)


  1. Chang, Chuan-Yu; Srinivasan, Kathiravan; Hu, Hui-Ya; Tsai, Yuh-Shyan; Sharma, Vishal; Agarwal, Punjal: SFFS-SVM based prostate carcinoma diagnosis in DCE-MRI via ACM segmentation (2020)
  2. Fercoq, Olivier; Qu, Zheng: Restarting the accelerated coordinate descent method with a rough strong convexity estimate (2020)
  3. de Rosa, Gustavo Henrique; Papa, João Paulo; Falcão, Alexandre Xavier: OPFython: A Python-Inspired Optimum-Path Forest Classifier (2020) arXiv
  4. Heider, Yousef; Wang, Kun; Sun, WaiChing: SO(3)-invariance of informed-graph-based deep neural network for anisotropic elastoplastic materials (2020)
  5. Jiang, Wei; Siddiqui, Sauleh: Hyper-parameter optimization for support vector machines using stochastic gradient descent and dual coordinate descent (2020)
  6. Jiang, Zehua; Liu, Keyu; Yang, Xibei; Yu, Hualong; Fujita, Hamido; Qian, Yuhua: Accelerator for supervised neighborhood based attribute reduction (2020)
  7. Lindeberg, Tony: Provably scale-covariant continuous hierarchical networks based on scale-normalized differential expressions coupled in cascade (2020)
  8. Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: On the efficient computation of a generalized Jacobian of the projector over the Birkhoff polytope (2020)
  9. Lv, Didi; Zhou, Qingping; Choi, Jae Kyu; Li, Jinglai; Zhang, Xiaoqun: Nonlocal TV-Gaussian prior for Bayesian inverse problems with applications to limited CT reconstruction (2020)
  10. Mishchenko, Konstantin; Iutzeler, Franck; Malick, Jérôme: A distributed flexible delay-tolerant proximal gradient algorithm (2020)
  11. Oishi, Atsuya; Yagawa, Genki: A surface-to-surface contact search method enhanced by deep learning (2020)
  12. Park, Seonho; Jung, Seung Hyun; Pardalos, Panos M.: Combining stochastic adaptive cubic regularization with negative curvature for nonconvex optimization (2020)
  13. Wang, Guoqiang; Wei, Xinyuan; Yu, Bo; Xu, Lijun: An efficient proximal block coordinate homotopy method for large-scale sparse least squares problems (2020)
  14. Zhang, Yangjing; Zhang, Ning; Sun, Defeng; Toh, Kim-Chuan: An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems (2020)
  15. Abdulhussain, Sadiq H.; Ramli, Abd Rahman; Mahmmod, Basheera M.; Saripan, M. Iqbal; Al-Haddad, S. A. R.; Jassim, Wissam A.: A new hybrid form of Krawtchouk and Tchebichef polynomials: design and application (2019)
  16. Ahookhosh, Masoud; Neumaier, Arnold: An optimal subgradient algorithm with subspace search for costly convex optimization problems (2019)
  17. Alves, M. Marques; Geremia, Marina: Iteration complexity of an inexact Douglas-Rachford method and of a Douglas-Rachford-Tseng’s F-B four-operator splitting method for solving monotone inclusions (2019)
  18. Mir, Amir M.; Nasiri, Jalal A.: LightTwinSVM: A Simple and Fast Implementation of Standard Twin Support Vector Machine Classifier (2019) not zbMATH
  19. Baumann, P.; Hochbaum, D. S.; Yang, Y. T.: A comparative study of the leading machine learning techniques and two new optimization algorithms (2019)
  20. Chen, Mingshuai; Wang, Jian; An, Jie; Zhan, Bohua; Kapur, Deepak; Zhan, Naijun: NIL: learning nonlinear interpolants (2019)
