Description (homepage): SVMlight is an implementation of Vapnik's Support Vector Machine [Vapnik, 1995] for the problems of pattern recognition, regression, and learning a ranking function. The optimization algorithms used in SVMlight are described in [Joachims, 2002a] and [Joachims, 1999a]. The algorithm has scalable memory requirements and can handle problems with many thousands of support vectors efficiently.

The software also provides efficient methods for assessing generalization performance, including two estimation methods for both error rate and precision/recall. XiAlpha-estimates [Joachims, 2002a, Joachims, 2000b] can be computed at essentially no computational expense, but they are conservatively biased. Leave-one-out testing provides almost unbiased estimates; SVMlight exploits the fact that the results of most leave-one-out rounds (often more than 99%) are predetermined and need not be computed [Joachims, 2002a].

New in this version is an algorithm for learning ranking functions [Joachims, 2002c]. The goal is to learn a function from preference examples, so that it orders a new set of objects as accurately as possible. Such ranking problems occur naturally in applications like search engines and recommender systems.

Furthermore, this version includes an algorithm for training large-scale transductive SVMs. The algorithm proceeds by solving a sequence of optimization problems that lower-bound the solution, using a form of local search; a detailed description of the algorithm can be found in [Joachims, 1999c]. A similar transductive learner, which can be thought of as a transductive version of k-Nearest Neighbor, is the Spectral Graph Transducer.

SVMlight can also train SVMs with cost models (see [Morik et al., 1999]). The code has been used on a wide range of problems, including text classification [Joachims, 1999c; Joachims, 1998a], image recognition tasks, bioinformatics, and medical applications.
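The reduction behind learning from preference examples can be sketched as follows. A preference "object i should rank above object j" under a linear scoring function w requires w·x_i > w·x_j, i.e. w·(x_i - x_j) > 0, so each preference pair becomes a binary classification example on the difference vector. This is a minimal illustrative sketch of that pairwise reduction, not SVMlight's actual code; the function name is hypothetical.

```python
def preference_pairs_to_classification(examples, preferences):
    """Turn preference pairs into a binary classification set.

    A preference (i, j), meaning "example i should rank above
    example j", yields the difference vector x_i - x_j with label +1,
    plus the mirrored pair with label -1 for symmetry.
    """
    X, y = [], []
    for i, j in preferences:
        diff = [a - b for a, b in zip(examples[i], examples[j])]
        X.append(diff)
        y.append(+1)
        X.append([-d for d in diff])  # mirrored pair
        y.append(-1)
    return X, y

# Two documents where doc 0 is preferred over doc 1:
X, y = preference_pairs_to_classification(
    [[1.0, 0.0], [0.0, 1.0]], [(0, 1)])
# X == [[1.0, -1.0], [-1.0, 1.0]], y == [1, -1]
```

Any linear binary classifier trained on (X, y) then yields a weight vector whose scores induce the desired ordering.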
Many tasks have the property of sparse instance vectors. This implementation makes use of this property, which leads to a very compact and efficient representation.
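The compactness gain comes from storing only the non-zero components of each instance as index:value pairs, which is the sparse representation SVMlight's input files use (one example per line, zero-valued features omitted). A minimal sketch of the idea; the helper function is hypothetical, not part of SVMlight:

```python
def to_svmlight_line(target, features):
    """Format one example in SVMlight's sparse input style:
    <target> <index>:<value> ..., with zero entries omitted
    and feature indices starting at 1."""
    pairs = [f"{i}:{v:g}" for i, v in enumerate(features, start=1) if v != 0]
    return " ".join([str(target)] + pairs)

# A dense vector with mostly zeros collapses to a short line:
print(to_svmlight_line(1, [0.0, 2.5, 0.0, 0.0, 1.0]))
# -> "1 2:2.5 5:1"
```

For high-dimensional tasks such as text classification, where most feature values are zero, this representation shrinks both storage and the cost of kernel evaluations, since dot products only touch the non-zero entries.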

References in zbMATH (referenced in 220 articles)

Showing results 1 to 20 of 220.
Sorted by year (citations)


  1. Bai, Yan-Qin; Shen, Kai-Ji: Alternating direction method of multipliers for $\ell_1$-$\ell_2$-regularized logistic regression model (2016)
  2. Doğan, Ürün; Glasmachers, Tobias; Igel, Christian: A unified view on multi-class support vector classification (2016)
  3. Souillard-Mandar, William; Davis, Randall; Rudin, Cynthia; Au, Rhoda; Libon, David J.; Swenson, Rodney; Price, Catherine C.; Lamar, Melissa; Penney, Dana L.: Learning classification models of cognitive conditions from subtle behaviors in the digital clock drawing test (2016)
  4. Niu, Lingfeng; Zhou, Ruizhi; Zhao, Xi; Shi, Yong: Two new decomposition algorithms for training bound-constrained support vector machines (2015)
  5. Steidl, Gabriele: Supervised learning by support vector machines (2015)
  6. Veelaert, Peter: Combinatorial properties of support vectors of separating hyperplanes (2015)
  7. Beck, Amir: The 2-coordinate descent method for solving double-sided simplex constrained minimization problems (2014)
  8. Bridge, James P.; Holden, Sean B.; Paulson, Lawrence C.: Machine learning for first-order theorem proving (2014)
  9. Chen, Xiaobo; Yang, Jian; Chen, Long: An improved robust and sparse twin support vector regression via linear programming (2014)
  10. Lee, Jung-Tae; Yang, Min-Chul; Rim, Hae-Chang: Discovering high-quality threaded discussions in online forums (2014)
  11. Pan, Binbin; Lai, Jianhuang; Shen, Lixin: Ideal regularization for learning kernels from labels (2014)
  12. Sadri, Javad; Jalili, Mohammad J.; Akbari, Younes; Foroozandeh, Atefeh: Designing a new standard structure for improving automatic processing of Persian handwritten bank cheques (2014)
  13. Shao, Yuan-Hai; Chen, Wei-Jie; Zhang, Jing-Jing; Wang, Zhen; Deng, Nai-Yang: An efficient weighted Lagrangian twin support vector machine for imbalanced data classification (2014)
  14. Toh, Kar-Ann; Tan, Geok-Choo: Exploiting the relationships among several binary classifiers via data transformation (2014)
  15. Zheng, Songfeng: A generalized Newton algorithm for quantile regression models (2014)
  16. Cocianu, Catalina-Lucia; State, Luminita; Mircea, Marinela; Vlamos, Panayiotis: A faster gradient ascent learning algorithm for nonlinear SVM (2013)
  17. Fabrizio, Jonathan; Marcotegui, Beatriz; Cord, Matthieu: Text detection in street level images (2013)
  18. Hensinger, Elena; Flaounas, Ilias; Cristianini, Nello: Modelling and predicting news popularity (2013)
  19. Jirina, Marcel; Jirina, Marcel jun.: Utilization of singularity exponent in nearest neighbor based classifier (2013)
  20. Zhou, Shuisheng; Cui, Jiangtao; Ye, Feng; Liu, Hongwei; Zhu, Qiang: New smoothing SVM algorithm with tight error bound and efficient reduced techniques (2013)
