RSVM

RSVM: Reduced Support Vector Machines. An algorithm is proposed which generates a nonlinear kernel-based separating surface that requires as little as 1% of a large dataset for its explicit evaluation. To generate this nonlinear surface, the entire dataset is used as a constraint in an optimization problem with very few variables corresponding to the 1% of the data kept. The remainder of the data can be thrown away after solving the optimization problem. This is achieved by making use of a rectangular m × m̄ kernel K(A, Ā′) that greatly reduces the size of the quadratic program to be solved and simplifies the characterization of the nonlinear separating surface. Here, the m rows of A represent the original m data points, while the rows of Ā represent a greatly reduced set of m̄ data points. Computational results indicate that test set correctness for the reduced support vector machine (RSVM), with a nonlinear separating surface that depends on a small randomly selected portion of the dataset, is better than that of a conventional support vector machine (SVM) with a nonlinear surface that explicitly depends on the entire dataset, and much better than that of a conventional SVM using a small random sample of the data. Computational times, as well as memory usage, are much smaller for RSVM than for a conventional SVM using the entire dataset.
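
A minimal Python sketch of the reduced-kernel construction described above may help make the idea concrete: a small random subset Ā of the rows of A is kept, the rectangular m × m̄ kernel K(A, Ā′) is built over the full dataset, and a regularized squared-slack objective is minimized over only the m̄ + 1 variables (u, γ) attached to the kept rows. The Gaussian kernel, the generic L-BFGS-B solver (used here in place of the smoothing/Newton approach of the original formulation), the synthetic data, and all parameter values are illustrative assumptions rather than the authors' implementation.

import numpy as np
from scipy.optimize import minimize

def gaussian_kernel(X, Y, sigma=1.0):
    """Rectangular kernel K(X, Y'): entry (i, j) = exp(-||x_i - y_j||^2 / (2 sigma^2))."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def rsvm_fit(A, d, frac=0.1, nu=10.0, sigma=1.0, seed=0):
    """Sketch of an RSVM-style fit: keep a random fraction 'frac' of the rows of A
    as Abar, but let all m points of A enter the objective as (soft) constraints."""
    rng = np.random.default_rng(seed)
    m = A.shape[0]
    mbar = max(1, int(frac * m))
    Abar = A[rng.choice(m, size=mbar, replace=False)]   # reduced data actually kept
    K = gaussian_kernel(A, Abar, sigma)                 # m x mbar rectangular kernel

    def objective(w):
        u, gamma = w[:-1], w[-1]
        # squared slack over the *full* dataset plus regularization on (u, gamma)
        slack = np.maximum(0.0, 1.0 - d * (K @ u - gamma))
        return 0.5 * nu * (slack @ slack) + 0.5 * (u @ u + gamma ** 2)

    res = minimize(objective, np.zeros(mbar + 1), method="L-BFGS-B")
    u, gamma = res.x[:-1], res.x[-1]
    return Abar, u, gamma    # only these are needed to evaluate the surface

def rsvm_predict(X, Abar, u, gamma, sigma=1.0):
    # separating surface: K(x', Abar') u - gamma = 0
    return np.sign(gaussian_kernel(X, Abar, sigma) @ u - gamma)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.normal(size=(500, 2))
    d = np.where(A[:, 0] ** 2 + A[:, 1] ** 2 < 1.0, 1.0, -1.0)   # nonlinear boundary
    Abar, u, gamma = rsvm_fit(A, d, frac=0.05)
    acc = (rsvm_predict(A, Abar, u, gamma) == d).mean()
    print(f"kept {Abar.shape[0]} of {A.shape[0]} points, training accuracy {acc:.3f}")

After training, only Ā, u and γ need to be stored, so the memory footprint of the resulting classifier scales with m̄ rather than m, which is the source of the savings described above.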


References in zbMATH (referenced in 52 articles)

  1. Huang, Chien-Ming; Lee, Yuh-Jye; Lin, Dennis K. J.; Huang, Su-Yun: Model selection for support vector machines via uniform design (2007)
  2. Penna Resende de Carvalho, Bernardo; Soares Lacerda, Wilian; de Pádua Braga, Antônio: RRS+LS-SVM: a new strategy for “a priori” sample selection (2007)
  3. Zhong, Ping; Fukushima, Masao: Regularized nonsmooth Newton method for multi-class support vector machines (2007)
  4. Ince, Huseyin: Non-parametric regression methods (2006)
  5. Xu, Yong; Zhang, David; Jin, Zhong; Li, Miao; Yang, Jing-Yu: A fast kernel-based nonlinear discriminant analysis for multi-class problems (2006)
  6. Zhan, Yiqiang; Shen, Dinggang: An adaptive error penalization method for training an efficient and generalized SVM (2006)
  7. Fung, Glenn M.; Mangasarian, O. L.: Multicategory proximal support vector machine classifiers (2005)
  8. Yu, Hwanjo; Yang, Jiong; Han, Jiawei; Li, Xiaolei: Making SVMs scalable to large data sets using hierarchical cluster indexing (2005)
  9. Cawley, Gavin C.; Talbot, Nicola L. C.: Fast exact leave-one-out cross-validation of sparse least-squares support vector machines (2004)
  10. Orsenigo, Carlotta; Vercellis, Carlo: Discrete support vector decision trees via tabu search (2004)