COFFIN: Computational Framework for Linear SVMs

In a variety of applications, kernel machines such as Support Vector Machines (SVMs) have been used with great success, often delivering state-of-the-art results. Using the kernel trick, they apply across many domains and even enable heterogeneous data fusion by concatenating feature spaces or by multiple kernel learning. Unfortunately, they are not suited for truly large-scale applications, since they suffer from the curse of support vectors: the cost of applying an SVM grows linearly with the number of support vectors. In this paper we develop COFFIN, a new training strategy for linear SVMs that effectively allows the use of on-demand computed kernel feature spaces and virtual examples in the primal. With linear training and prediction effort, this framework scales SVM applications to truly large-scale problems: as an example, we train SVMs for human splice site recognition involving 50 million examples and sophisticated string kernels. Additionally, we learn an SVM-based gender detector on 5 million examples on low-tech hardware and achieve beyond state-of-the-art accuracy on both tasks.
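The abstract's core idea, training a linear SVM in the primal on explicitly computed kernel features rather than via a kernel matrix, can be illustrated with a toy sketch. This is not COFFIN's implementation: the spectrum (k-mer count) feature map, the motif data, and the plain hinge-loss subgradient solver below are all illustrative assumptions.

```python
import numpy as np

ALPHABET = {"A": 0, "C": 1, "G": 2, "T": 3}

def spectrum_features(seq, k=3):
    """Explicit spectrum-kernel (k-mer count) feature map over DNA,
    computed per example on demand instead of via a kernel matrix."""
    x = np.zeros(4 ** k)
    for i in range(len(seq) - k + 1):
        idx = 0
        for c in seq[i:i + k]:
            idx = idx * 4 + ALPHABET[c]  # base-4 encoding of the k-mer
        x[idx] += 1.0
    return x

def train_linear_svm(X, y, C=1.0, lr=0.01, epochs=200):
    """Primal linear SVM via subgradient descent on the hinge loss;
    each update touches a single example, so effort stays linear in n."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in range(n):
            # L2 regularization gradient, plus hinge subgradient on violation
            grad_w, grad_b = w / (C * n), 0.0
            if y[i] * (X[i] @ w + b) < 1:
                grad_w = grad_w - y[i] * X[i]
                grad_b -= y[i]
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b

# Toy splice-like task (hypothetical data): positives contain "GAT"
pos = ["ACGATTA", "TTGATCA", "CGGATAC", "AGATTTT"]
neg = ["ACCCTTA", "TTCACCA", "CGGCCAC", "ACACTTT"]
X = np.array([spectrum_features(s) for s in pos + neg])
y = np.array([1] * 4 + [-1] * 4)

w, b = train_linear_svm(X, y)
acc = float((np.sign(X @ w + b) == y).mean())
```

Because the feature map is explicit, both training and prediction cost scale with the number of examples rather than with a set of support vectors, which is the property the abstract emphasizes.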
References in zbMATH (referenced in 2 articles)
- Antoniuk, Kostiantyn; Franc, Vojtěch; Hlaváč, Václav: V-shaped interval insensitive loss for ordinal classification (2016)
- Martínez, Ana M.; Webb, Geoffrey I.; Chen, Shenglei; Zaidi, Nayyar A.: Scalable learning of Bayesian network classifiers (2016)
Further publications can be found at: http://sonnenburgs.de/soeren/item/SonFra10/abstract/#SonFra10