SSVM: A smooth support vector machine for classification.
Smoothing methods, extensively used for solving important mathematical programming problems and applications, are applied here to generate and solve an unconstrained smooth reformulation of the support vector machine for pattern classification using a completely arbitrary kernel. We term such a reformulation a Smooth Support Vector Machine (SSVM). A fast Newton-Armijo algorithm for solving the SSVM converges globally and quadratically. Numerical results and comparisons are given to demonstrate the effectiveness and speed of the algorithm. On six publicly available datasets, the tenfold cross-validation correctness of SSVM was the highest among the five methods compared, and SSVM was also the fastest. On larger problems, SSVM was comparable to or faster than SVM light [T. Joachims, in: Advances in kernel methods – support vector learning, MIT Press: Cambridge, MA (1999)], SOR [O. L. Mangasarian and D. R. Musicant, IEEE Trans. Neural Networks 10, 1032-1037 (1999)] and SMO [J. Platt, in: Advances in kernel methods – support vector learning, MIT Press: Cambridge, MA (1999)]. SSVM can also generate a highly nonlinear separating surface, such as a checkerboard.
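The key idea is to replace the non-smooth plus function arising from the SVM slack constraints with a smooth approximation, which turns the training problem into an unconstrained, twice-differentiable minimization. The sketch below illustrates this for a linear kernel, assuming the commonly used smoothing p(x, alpha) = x + (1/alpha) log(1 + exp(-alpha x)). The variable names (A, d, nu, alpha) are illustrative, and a plain gradient descent with Armijo backtracking is substituted for the paper's Newton-Armijo iteration to keep the example short; it is a sketch of the smoothing idea, not the authors' implementation.

```python
import numpy as np

def smooth_plus(x, alpha=5.0):
    """Smooth approximation of max(x, 0); approaches it as alpha grows."""
    # log(1 + exp(-alpha*x)) evaluated stably via logaddexp
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def ssvm_objective(w, gamma, A, d, nu=1.0, alpha=5.0):
    """Unconstrained smooth SVM objective for a linear separating surface.

    A: (m, n) data matrix, d: (m,) labels in {-1, +1},
    nu: error weight, alpha: smoothing parameter.
    """
    margins = 1.0 - d * (A @ w - gamma)      # slack terms before thresholding
    p = smooth_plus(margins, alpha)          # smoothed slacks
    return 0.5 * nu * (p @ p) + 0.5 * (w @ w + gamma ** 2)

def fit_ssvm(A, d, nu=1.0, alpha=5.0, tol=1e-6, max_iter=200):
    """Minimize the smooth objective by gradient descent with Armijo backtracking."""
    m, n = A.shape
    z = np.zeros(n + 1)                      # z = (w, gamma)

    def obj(z):
        return ssvm_objective(z[:n], z[n], A, d, nu, alpha)

    def grad(z):
        w, gamma = z[:n], z[n]
        margins = 1.0 - d * (A @ w - gamma)
        p = smooth_plus(margins, alpha)
        # derivative of smooth_plus is the sigmoid of alpha*margins
        s = np.exp(-np.logaddexp(0.0, -alpha * margins))
        coeff = nu * p * s
        gw = -A.T @ (coeff * d) + w
        gg = coeff @ d + gamma
        return np.concatenate([gw, [gg]])

    for _ in range(max_iter):
        g = grad(z)
        if np.linalg.norm(g) < tol:
            break
        step, f0 = 1.0, obj(z)
        # Armijo rule: shrink the step until sufficient decrease is achieved
        while obj(z - step * g) > f0 - 1e-4 * step * (g @ g) and step > 1e-12:
            step *= 0.5
        z = z - step * g
    return z[:n], z[n]                       # weight vector w and threshold gamma
```

On a toy two-class problem, fit_ssvm(A, d) returns a weight vector w and threshold gamma defining the separating plane w·x = gamma; increasing alpha tightens the approximation to the original non-smooth SVM objective.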

References in zbMATH (referenced in 44 articles, 1 standard article)

Showing results 1 to 20 of 44, sorted by year (citations).

  1. Balasundaram, S.; Gupta, Deepak; Kapil: Lagrangian support vector regression via unconstrained convex minimization (2014)
  2. Cassioli, A.; Chiavaioli, A.; Manes, C.; Sciandrone, M.: An incremental least squares algorithm for large scale linear classification (2013)
  3. Che, Haitao; Li, Meixia: A smoothing and regularization Broyden-like method for nonlinear inequalities (2013)
  4. Cocianu, Catalina-Lucia; State, Luminita; Mircea, Marinela; Vlamos, Panayiotis: A faster gradient ascent learning algorithm for nonlinear SVM (2013)
  5. Ketabchi, Saeed; Behboodi-Kahoo, Malihe: Smoothing techniques and augmented Lagrangian method for recourse problem of two-stage stochastic linear programming (2013)
  6. Liang, Jinjin; Wu, De: Smooth diagonal weighted Newton support vector machine (2013)
  7. Peng, Jian-Xun; Ferguson, Stuart; Rafferty, Karen; Stewart, Victoria: A sequential algorithm for sparse support vector classifiers (2013)
  8. Rothkopf, Constantin A.; Ballard, Dana H.: Modular inverse reinforcement learning for visuomotor behavior (2013)
  9. Wang, Zhen; Shao, Yuan-Hai; Wu, Tie-Ru: A GA-based model selection for smooth twin parametric-margin support vector machine (2013)
  10. Wu, Qing; Wang, Wenqing: Piecewise-smooth support vector machine for classification (2013)
  11. Yuan, Yubo: Forecasting the movement direction of exchange rate with polynomial smooth support vector machine (2013)
  12. Zhou, Shuisheng; Cui, Jiangtao; Ye, Feng; Liu, Hongwei; Zhu, Qiang: New smoothing SVM algorithm with tight error bound and efficient reduced techniques (2013)
  13. Che, Haitao: A smoothing and regularization predictor-corrector method for nonlinear inequalities (2012)
  14. Yuan, Yubo: Canonical duality solution for alternating support vector machine (2012)
  15. Cao, Feilong; Yuan, Yubo: Learning errors of linear programming support vector regression (2011)
  16. Chang, Chih-Cheng; Chien, Li-Jen; Lee, Yuh-Jye: A novel framework for multi-class classification via ternary smooth support vector machine (2011)
  17. Mao, Ching-Hao; Pao, Hsing-Kuo; Faloutsos, Christos; Lee, Hahn-Ming: SBAD: sequence based attack detection via sequence comparison (2011)
  18. Niu, Lingfeng: Parallel algorithm for training multiclass proximal support vector machines (2011)
  19. Peng, Xinjun: Building sparse twin support vector machine classifiers in primal space (2011)
  20. Peng, Xinjun: TPMSVM: A novel twin parametric-margin support vector machine for pattern recognition (2011)
