RSVM
RSVM: Reduced Support Vector Machines. An algorithm is proposed which generates a nonlinear kernel-based separating surface that requires as little as 1% of a large dataset for its explicit evaluation. To generate this nonlinear surface, the entire dataset is used as a constraint in an optimization problem with very few variables, corresponding to the 1% of the data kept. The remainder of the data can be thrown away after solving the optimization problem. This is achieved by making use of a rectangular m × m̄ kernel K(A, Ā′) that greatly reduces the size of the quadratic program to be solved and simplifies the characterization of the nonlinear separating surface. Here, the m rows of A represent the original m data points, while the m̄ rows of Ā represent the greatly reduced set of m̄ retained data points. Computational results indicate that test set correctness for the reduced support vector machine (RSVM), with a nonlinear separating surface that depends on a small randomly selected portion of the dataset, is better than that of a conventional support vector machine (SVM) with a nonlinear surface that explicitly depends on the entire dataset, and much better than a conventional SVM using a small random sample of the data. Computational times, as well as memory usage, are much smaller for RSVM than those of a conventional SVM using the entire dataset.
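The reduction described above can be sketched in a few lines: keep a random subset Ā of the rows of A, form the rectangular kernel K(A, Ā′), and fit a classifier whose separating surface depends only on Ā. The sketch below is a simplified least-squares (proximal-SVM-style) stand-in for the paper's smooth SVM solver; the function names, the Gaussian kernel choice, and the parameters `frac`, `nu`, and `gamma` are illustrative assumptions, not the authors' code.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian kernel: entry (i, j) is exp(-gamma * ||a_i - b_j||^2).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def rsvm_fit(A, y, frac=0.1, nu=1.0, gamma=1.0, rng=None):
    # RSVM sketch: keep a random fraction of rows as Abar, build the
    # rectangular m x mbar kernel K(A, Abar'), then solve a small
    # regularized least-squares problem over mbar + 1 variables.
    # (A simplified stand-in for the smooth SVM formulation.)
    rng = np.random.default_rng(rng)
    m = A.shape[0]
    idx = rng.choice(m, size=max(1, int(frac * m)), replace=False)
    Abar = A[idx]                            # the retained mbar data points
    K = rbf_kernel(A, Abar, gamma)           # rectangular m x mbar kernel
    Z = np.hstack([K, -np.ones((m, 1))])     # append bias column
    # Regularized normal equations: (Z'Z + I/nu) w = Z'y
    w = np.linalg.solve(Z.T @ Z + np.eye(Z.shape[1]) / nu, Z.T @ y)
    return Abar, w[:-1], w[-1]

def rsvm_predict(A_new, Abar, u, b, gamma=1.0):
    # The nonlinear surface K(x, Abar') u - b = 0 depends only on Abar;
    # the rest of the training data is no longer needed.
    return np.sign(rbf_kernel(A_new, Abar, gamma) @ u - b)
```

Note that only `Abar`, `u`, and `b` are kept after training, which is the point of the method: prediction cost and memory scale with m̄, not m.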
References in zbMATH (referenced in 49 articles )
Showing results 1 to 20 of 49.
Sorted by year.
- Pang, Xinying; Xu, Yitian: A safe screening rule for accelerating weighted twin support vector machine (2019)
- Gu, Weizhe; Chen, Wei-Po; Ko, Chun-Hsu; Lee, Yuh-Jye; Chen, Jein-Shan: Two smooth support vector machines for (\varepsilon)-insensitive regression (2018)
- Khemchandani, Reshma; Saigal, Pooja; Chandra, Suresh: Angle-based twin support vector machine (2018)
- Manh Cuong, Nguyen; Van Thien, Nguyen: A method for reducing the number of support vectors in fuzzy support vector machine (2016)
- Wang, Di; Zhang, Xiaoqin; Fan, Mingyu; Ye, Xiuzi: Hierarchical mixing linear support vector machines for nonlinear classification (2016)
- Zhao, Yong-Ping: Parsimonious kernel extreme learning machine in primal via Cholesky factorization (2016)
- Lee, G. E.; Zaknich, A.: A mixed-integer programming approach to GRNN parameter estimation (2015)
- Zhao, Yong-Ping; Wang, Kang-Kang; Li, Fu: A pruning method of refining recursive reduced least squares support vector regression (2015)
- Bai, Yan-Qin; Shen, Yan-Jun; Shen, Kai-Ji: Consensus proximal support vector machine for classification problems with sparse solutions (2014)
- Couellan, Nicolas; Jan, Sophie: Incremental accelerated gradient methods for SVM classification: study of the constrained approach (2014)
- Shabanzadeh, Parvaneh; Yusof, Rubiyah: A new method for solving supervised data classification problems (2014)
- Shao, Yuan-Hai; Chen, Wei-Jie; Deng, Nai-Yang: Nonparallel hyperplane support vector machine for binary classification problems (2014)
- Chang, Lo-Bin; Bai, Zhidong; Huang, Su-Yun; Hwang, Chii-Ruey: Asymptotic error bounds for kernel-based Nyström low-rank approximation matrices (2013)
- Fung, Glenn M.; Mangasarian, Olvi L.: Privacy-preserving linear and nonlinear approximation via linear programming (2013)
- Ma, Jiayi; Zhao, Ji; Tian, Jinwen; Bai, Xiang; Tu, Zhuowen: Regularized vector field learning with sparse approximation for mismatch removal (2013)
- Wang, Zhen; Shao, Yuan-Hai; Wu, Tie-Ru: A GA-based model selection for smooth twin parametric-margin support vector machine (2013)
- Xia, Xiao-Lei; Qian, Suxiang; Liu, Xueqin; Xing, Huanlai: Efficient model selection for sparse least-square SVMs (2013)
- Zhou, Shuisheng; Cui, Jiangtao; Ye, Feng; Liu, Hongwei; Zhu, Qiang: New smoothing SVM algorithm with tight error bound and efficient reduced techniques (2013)
- Chang, Chien-Chung; Pao, Hsing-Kuo; Lee, Yuh-Jye: An RSVM based two-teachers-one-student semi-supervised learning algorithm (2012) ioport
- Huang, Chia-Hui: A reduced support vector machine approach for interval regression analysis (2012)