PMTK

PMTK is a collection of Matlab/Octave functions, written by Matt Dunham, Kevin Murphy, and various other people. The toolkit is primarily designed to accompany Kevin Murphy's textbook Machine Learning: A Probabilistic Perspective, but it can also be used independently of the book. The goal is to provide a unified conceptual and software framework encompassing machine learning, graphical models, and Bayesian statistics (hence the logo). (Some methods from frequentist statistics, such as cross validation, are also supported.) Since December 2011, the toolkit has been in maintenance mode, meaning that bugs will be fixed but no new features will be added (at least not by Kevin or Matt).


References in zbMATH (referenced in 168 articles)

Showing results 1 to 20 of 168, sorted by year (citations).


  1. Keller, Rachael T.; Du, Qiang: Discovery of dynamics using linear multistep methods (2021)
  2. Oliehoek, Frans A.; Witwicki, Stefan; Kaelbling, Leslie P.: A sufficient statistic for influence in structured multiagent environments (2021)
  3. Chen, Nan; Majda, Andrew J.: Predicting observed and hidden extreme events in complex nonlinear dynamical systems with partial observations and short training time series (2020)
  4. Duan, Bojia; Yuan, Jiabin; Yu, Chao-Hua; Huang, Jianbang; Hsieh, Chang-Yu: A survey on HHL algorithm: from theory to application in quantum machine learning (2020)
  5. Dunlop, Matthew M.; Helin, Tapio; Stuart, Andrew M.: Hyperparameter estimation in Bayesian MAP estimation: parameterizations and consistency (2020)
  6. Gurevich, Pavel; Stuke, Hannes: Gradient conjugate priors and multi-layer neural networks (2020)
  7. He, Qizhi; Chen, Jiun-Shyan: A physics-constrained data-driven approach based on locally convex reconstruction for noisy database (2020)
  8. Holm-Jensen, Tue; Hansen, Thomas Mejer: Linear waveform tomography inversion using machine learning algorithms (2020)
  9. Hosseini, Reshad; Sra, Suvrit: An alternative to EM for Gaussian mixture models: batch and stochastic Riemannian optimization (2020)
  10. Hung, Ying-Chao; Michailidis, George; PakHai Lok, Horace: Locating infinite discontinuities in computer experiments (2020)
  11. Inatsu, Yu; Karasuyama, Masayuki; Inoue, Keiichi; Kandori, Hideki; Takeuchi, Ichiro: Active learning of Bayesian linear models with high-dimensional binary features by parameter confidence-region estimation (2020)
  12. Kaandorp, Mikael L. A.; Dwight, Richard P.: Data-driven modelling of the Reynolds stress tensor using random forests with invariance (2020)
  13. Keshavarzzadeh, Vahid; Kirby, Robert M.; Narayan, Akil: Stress-based topology optimization under uncertainty via simulation-based Gaussian process (2020)
  14. Kuwajima, Hiroshi; Yasuoka, Hirotoshi; Nakae, Toshihiro: Engineering problems in machine learning systems (2020)
  15. Lau, John W.; Cripps, Edward; Hui, Wendy: Variational inference for multiplicative intensity models (2020)
  16. Li, Hang; Del Castillo, Enrique; Runger, George: Rejoinder on: “On active learning methods for manifold data” (2020)
  17. Lin, Peng; Neil, Martin; Fenton, Norman: Improved high dimensional discrete Bayesian network inference using triplet region construction (2020)
  18. Liu, Jiapeng; Kadziński, Miłosz; Liao, Xiuwu; Mao, Xiaoxin; Wang, Yao: A preference learning framework for multiple criteria sorting with diverse additive value models and valued assignment examples (2020)
  19. López-Lopera, Andrés F.; Bachoc, François; Durrande, Nicolas; Rohmer, Jérémy; Idier, Déborah; Roustant, Olivier: Approximating Gaussian process emulators with linear inequality constraints and noisy observations via MC and MCMC (2020)
  20. Mahajan, Pravar Dilip; Maurya, Abhinav; Megahed, Aly; Elwany, Alaa; Strong, Ray; Blomberg, Jeanette: Optimizing predictive precision in imbalanced datasets for actionable revenue change prediction (2020)