The SHOGUN machine learning toolbox. We have developed a machine learning toolbox, called SHOGUN, designed for unified large-scale learning across a broad range of feature types and learning settings. It offers a considerable number of machine learning methods, such as support vector machines, hidden Markov models, multiple kernel learning, and linear discriminant analysis. Most of the specific algorithms can handle several different data classes. We have used this toolbox in several applications from computational biology, some with no fewer than 50 million training examples and others with 7 billion test examples. With more than a thousand installations worldwide, SHOGUN is already widely adopted in the machine learning community and beyond. SHOGUN is implemented in C++ and interfaces to MATLAB™, R, Octave, and Python, and it also has a stand-alone command-line interface. The source code is freely available under the GNU General Public License, Version 3.

References in zbMATH (referenced in 95 articles, 2 standard articles)

Showing results 1 to 20 of 95, sorted by year (citations).


  1. Wang, Peiyan; Cai, Dongfeng: Multiple kernel learning by empirical target kernel (2020)
  2. Dong, Fangli; Wang, Xiaozhou: A classifier for multi-dimensional datasets based on Bayesian multiple kernel grouping learning (2019)
  3. Szymański, Piotr; Kajdanowicz, Tomasz: scikit-multilearn: a scikit-based Python environment for performing multi-label classification (2019)
  4. Gao, Hongjuan; Geng, Guohua; Yang, Wen: Sex determination of 3D skull based on a novel unsupervised learning method (2018)
  5. Goberna, M. A.; López, M. A.: Recent contributions to linear semi-infinite optimization: an update (2018)
  6. Patrascu, Andrei; Necoara, Ion: Nonasymptotic convergence of stochastic proximal point methods for constrained convex optimization (2018)
  7. Shikhar Bhardwaj, Ryan R. Curtin, Marcus Edel, Yannis Mentekidis, Conrad Sanderson: ensmallen: a flexible C++ library for efficient function optimization (2018) arXiv
  8. Tang, Jingjing; Tian, Yingjie; Liu, Xiaohui; Li, Dewei; Lv, Jia; Kou, Gang: Improved multi-view privileged support vector machine (2018)
  9. Andrea Esuli, Tiziano Fagni, Alejandro Moreo Fernandez: JaTeCS an open-source JAva TExt Categorization System (2017) arXiv
  10. Chang, Yan-Shuo; Nie, Feiping; Wang, Ming-Yu: Multiview feature analysis via structured sparsity and shared subspace discovery (2017)
  11. Goberna, M. A.; López, M. A.: Recent contributions to linear semi-infinite optimization (2017)
  12. Liu, Weiwei; Tsang, Ivor W.: Making decision trees feasible in ultrahigh feature and label dimensions (2017)
  13. Ryan R. Curtin, Shikhar Bhardwaj, Marcus Edel, Yannis Mentekidis: A generic and fast C++ optimization framework (2017) arXiv
  14. Wang, Xiaoming; Huang, Zengxi; Du, Yajun: Improving localized multiple kernel learning via radius-margin bound (2017)
  15. Antoniuk, Kostiantyn; Franc, Vojtěch; Hlaváč, Václav: V-shaped interval insensitive loss for ordinal classification (2016)
  16. Christmann, Andreas; Dumpert, Florian; Xiang, Dao-Hong: On extension theorems and their connection to universal consistency in machine learning (2016)
  17. Gondzio, Jacek; González-Brevis, Pablo; Munari, Pedro: Large-scale optimization with the primal-dual column generation method (2016)
  18. Mohsenzadeh, Yalda; Sheikhzadeh, Hamid; Nazari, Sobhan: Incremental relevance sample-feature machine: a fast marginal likelihood maximization approach for joint feature selection and classification (2016)
  19. Qi, Chengming; Wang, Yuping; Tian, Wenjie; Wang, Qun: Multiple kernel boosting framework based on information measure for classification (2016)
  20. Rieck, Konrad; Wressnegger, Christian: Harry: a tool for measuring string similarity (2016)
