UCI-ml

UC Irvine Machine Learning Repository. We currently maintain 251 data sets as a service to the machine learning community. You may view all data sets through our searchable interface. Our old web site is still available for those who prefer the old format. For a general overview of the Repository, please visit our About page. For information about citing data sets in publications, please read our citation policy. If you wish to donate a data set, please consult our donation policy. For any other questions, feel free to contact the Repository librarians. We have also set up a mirror site for the Repository.

The UCI Machine Learning Repository is a collection of databases, domain theories, and data generators that are used by the machine learning community for the empirical analysis of machine learning algorithms. The archive was created as an FTP archive in 1987 by David Aha and fellow graduate students at UC Irvine. Since that time, it has been widely used by students, educators, and researchers all over the world as a primary source of machine learning data sets. As an indication of the impact of the archive, it has been cited over 1000 times, making it one of the top 100 most cited "papers" in all of computer science. The current version of the web site was designed in 2007 by Arthur Asuncion and David Newman, in collaboration with Rexa.info at the University of Massachusetts Amherst. Funding support from the National Science Foundation is gratefully acknowledged.

Many people deserve thanks for making the repository a success. Foremost among them are the donors and creators of the databases and data generators. Special thanks should also go to the past librarians of the repository: David Aha, Patrick Murphy, Christopher Merz, Eamonn Keogh, Cathy Blake, Seth Hettich, and David Newman.
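Because the data sets are served as plain files from the archive, a typical empirical workflow is to pull one directly into an analysis environment. The Python sketch below loads the classic Iris data set; the URL and column names are assumptions based on that data set's usual layout in the archive, not part of this entry, so consult the data set's own page for the authoritative description.

```python
# Minimal sketch: loading one UCI data set (Iris) straight from the archive.
# The URL and column names are assumptions; verify them on the data set page.
import pandas as pd

IRIS_URL = "https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data"
COLUMNS = ["sepal_length", "sepal_width", "petal_length", "petal_width", "class"]

def load_iris(url: str = IRIS_URL) -> pd.DataFrame:
    """Download the raw CSV and attach column names (the file has no header row)."""
    return pd.read_csv(url, header=None, names=COLUMNS)

if __name__ == "__main__":
    df = load_iris()
    print(df.shape)                     # expected: (150, 5)
    print(df["class"].value_counts())   # three classes, 50 instances each
```

The same pattern applies to most tabular data sets in the archive: fetch the raw file, then supply the attribute names and types documented in the accompanying .names file.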


References in zbMATH (referenced in 2286 articles)

Showing results 1 to 20 of 2286.
Sorted by year (citations)


  1. Aduenko, Alexander A.; Motrenko, Anastasia P.; Strijov, Vadim V.: Object selection in credit scoring using covariance matrix of parameters estimations (2018)
  2. Amin, Talha; Moshkov, Mikhail: Totally optimal decision rules (2018)
  3. Chen, Eunice Yuh-Jie; Darwiche, Adnan; Choi, Arthur: On pruning with the MDL score (2018)
  4. Cortijo, Santiago; Gonzales, Christophe: On conditional truncated densities Bayesian networks (2018)
  5. Burns, David M.; Whyne, Cari M.: Seglearn: a Python package for learning sequences and time series (2018) arXiv
  6. Du, Wen Sheng; Hu, Bao Qing: A fast heuristic attribute reduction approach to ordered decision systems (2018)
  7. Fasiolo, Matteo; de Melo, Flávio Eler; Maskell, Simon: Langevin incremental mixture importance sampling (2018)
  8. Franc, Vojtech; Fikar, Ondrej; Bartos, Karel; Sofka, Michal: Learning data discretization via convex optimization (2018)
  9. Ghaddar, Bissan; Naoum-Sawaya, Joe: High dimensional data classification and feature selection using support vector machines (2018)
  10. Hooker, Giles; Mentch, Lucas: Bootstrap bias corrections for ensemble methods (2018)
  11. Kordík, Pavel; Černý, Jan; Frýda, Tomáš: Discovering predictive ensembles for transfer learning and meta-learning (2018)
  12. Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems (2018)
  13. Lorena, Ana C.; Maciel, Aron I.; de Miranda, Péricles B. C.; Costa, Ivan G.; Prudêncio, Ricardo B. C.: Data complexity meta-features for regression problems (2018)
  14. Ma, Anna; Needell, Deanna; Ramdas, Aaditya: Iterative methods for solving factorized linear systems (2018)
  15. Malone, Brandon; Kangas, Kustaa; Järvisalo, Matti; Koivisto, Mikko; Myllymäki, Petri: Empirical hardness of finding optimal Bayesian network structures: algorithm selection and runtime prediction (2018)
  16. Muñoz, Mario A.; Villanova, Laura; Baatar, Davaatseren; Smith-Miles, Kate: Instance spaces for machine learning classification (2018)
  17. Panday, Deepak; Cordeiro de Amorim, Renato; Lane, Peter: Feature weighting as a tool for unsupervised feature selection (2018)
  18. Poon, Leonard K. M.; Liu, April H.; Zhang, Nevin L.: UC-LTM: unidimensional clustering using latent tree models for discrete data (2018)
  19. Raza, Muhammad Summair; Qamar, Usman: Feature selection using rough set-based direct dependency calculation by avoiding the positive region (2018)
  20. Abaszade, Maryam; Effati, Sohrab: Support vector regression with random output variable and probabilistic constraints (2017)
