RSVM: Reduced Support Vector Machines. An algorithm is proposed which generates a nonlinear kernel-based separating surface that requires as little as 1% of a large dataset for its explicit evaluation. To generate this nonlinear surface, the entire dataset is used as a constraint in an optimization problem with very few variables, corresponding to the 1% of the data kept. The remainder of the data can be discarded after the optimization problem is solved. This is achieved by making use of a rectangular m × m̄ kernel K(A, Ā′) that greatly reduces the size of the quadratic program to be solved and simplifies the characterization of the nonlinear separating surface. Here, the m rows of A represent the original m data points, while the m̄ rows of Ā represent a greatly reduced m̄ data points. Computational results indicate that test set correctness for the reduced support vector machine (RSVM), with a nonlinear separating surface that depends on a small randomly selected portion of the dataset, is better than that of a conventional support vector machine (SVM) with a nonlinear surface that explicitly depends on the entire dataset, and much better than that of a conventional SVM trained on a small random sample of the data. Computational times, as well as memory usage, are much smaller for RSVM than for a conventional SVM using the entire dataset.
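The core idea of the abstract — keep a small random subset Ā of the m data points, form the rectangular m × m̄ kernel K(A, Ā′), and solve a small optimization problem whose variables correspond only to the kept rows — can be sketched as follows. This is only an illustrative sketch, not the paper's actual smooth SVM formulation: the Gaussian kernel width `gamma`, the fraction `frac`, and the regularized least-squares solver used in place of the RSVM quadratic program are all assumptions made here for brevity.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    # Gaussian kernel K(A, B')_ij = exp(-gamma * ||A_i - B_j||^2)
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def rsvm_fit(A, d, frac=0.1, gamma=0.1, reg=1.0, seed=0):
    """A, shape (m, n): data; d in {-1, +1}^m: labels; frac: fraction kept."""
    rng = np.random.default_rng(seed)
    m = A.shape[0]
    mbar = max(1, int(frac * m))
    # Abar: the small randomly selected portion of the data that is kept.
    Abar = A[rng.choice(m, size=mbar, replace=False)]
    # Rectangular m x mbar kernel: all m points constrain the problem,
    # but only mbar + 1 variables (u, b) parameterize the surface.
    K = rbf_kernel(A, Abar, gamma)
    # Stand-in solver (assumption): regularized least squares on the labels,
    # whereas the paper solves a smooth SVM quadratic program.
    Z = np.hstack([K, np.ones((m, 1))])
    w = np.linalg.solve(Z.T @ Z + reg * np.eye(mbar + 1), Z.T @ d)
    return Abar, w[:-1], w[-1]

def rsvm_predict(X, Abar, u, b, gamma=0.1):
    # Nonlinear separating surface: K(x, Abar') u + b = 0.
    return np.sign(rbf_kernel(np.atleast_2d(X), Abar, gamma) @ u + b)
```

Note that only `Abar`, `u`, and `b` are needed at prediction time; the remaining ~99% of the rows of A enter the fit as constraints and can then be thrown away, which is exactly the memory saving the abstract describes.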

References in zbMATH (referenced in 52 articles)

  1. Huang, Chia-Hui: A reduced support vector machine approach for interval regression analysis (2012)
  2. Shao, Yuan-Hai; Deng, Nai-Yang: A coordinate descent margin based-twin support vector machine for classification (2012)
  3. Ye, Qiaolin; Zhao, Chunxia; Ye, Ning; Zheng, Hao; Chen, Xiaobo: A feature selection method for nonparallel plane support vector machine classification (2012)
  4. Yu, Hwanjo; Kim, Jinha; Kim, Youngdae; Hwang, Seungwon; Lee, Young Ho: An efficient method for learning nonlinear ranking SVM functions (2012)
  5. Zhao, Yong-Ping; Sun, Jian-Guo; Du, Zhong-Hua; Zhang, Zhi-An; Li, Ye-Bo: Online independent reduced least squares support vector regression (2012)
  6. Chang, Chih-Cheng; Chien, Li-Jen; Lee, Yuh-Jye: A novel framework for multi-class classification via ternary smooth support vector machine (2011)
  7. Peng, Xinjun: TPMSVM: A novel twin parametric-margin support vector machine for pattern recognition (2011)
  8. Pham, Huy Nguyen Anh; Triantaphyllou, Evangelos: A meta-heuristic approach for improving the accuracy in some classification algorithms (2011)
  9. Woodsend, Kristian; Gondzio, Jacek: Exploiting separability in large-scale linear support vector machine training (2011)
  10. Ghorai, Santanu; Hossain, Shaikh Jahangir; Mukherjee, Anirban; Dutta, Pranab K.: Newton’s method for nonparallel plane proximal classifier with unity norm hyperplanes (2010)
  11. Orabona, Francesco; Castellini, Claudio; Caputo, Barbara; Jie, Luo; Sandini, Giulio: On-line independent support vector machines (2010)
  12. Wang, Zhuang; Vucetic, Slobodan: Online training on a budget of support vector machines using twin prototypes (2010)
  13. Ghorai, Santanu; Mukherjee, Anirban; Dutta, Pranab K.: Nonparallel plane proximal classifier (2009)
  14. Huang, Su-Yun; Lee, Mei-Hsien; Hsiao, Chuhsing Kate: Nonlinear measures of association with kernel canonical correlation analysis and applications (2009)
  15. Mangasarian, O. L.; Wild, E. W.; Fung, G. M.: Proximal knowledge-based classification (2009)
  16. Zhao, Yongping; Sun, Jianguo: Recursive reduced least squares support vector regression (2009)
  17. Balcázar, José L.; Dai, Yang; Tanaka, Junichi; Watanabe, Osamu: Provably fast training algorithms for support vector machines (2008)
  18. Mangasarian, O. L.; Thompson, Michael E.: Chunking for massive nonlinear kernel classification (2008)
  19. Mangasarian, O. L.; Wild, E. W.: Multiple instance classification via successive linear programming (2008)
  20. Valyon, József; Horváth, Gábor: Selection methods for extended least squares support vector machines (2008)