Gaussian processes for machine learning (GPML) toolbox. The GPML toolbox provides a wide range of functionality for Gaussian process (GP) inference and prediction. GPs are specified by mean and covariance functions; we offer a library of simple mean and covariance functions together with mechanisms to compose more complex ones. Several likelihood functions are supported, including Gaussian and heavy-tailed likelihoods for regression as well as others suitable for classification. Finally, a range of inference methods is provided, including exact and variational inference, Expectation Propagation, and Laplace's method for dealing with non-Gaussian likelihoods, as well as FITC for large regression tasks.
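The exact-inference path the abstract mentions (zero mean, squared-exponential covariance, Gaussian likelihood) can be sketched in a few lines of NumPy. This is an illustrative sketch of the underlying computation, not GPML's actual MATLAB API; the function names and default hyperparameters here are assumptions chosen for the example:

```python
import numpy as np

def sq_exp_kernel(X1, X2, ell=1.0, sf=1.0):
    """Squared-exponential covariance with lengthscale ell and signal std sf
    (one of the simple covariance functions a GP library typically provides)."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

def gp_predict(X, y, Xs, sn=0.1):
    """Exact GP regression: zero mean, Gaussian likelihood with noise std sn.
    Returns the predictive mean and variance at test inputs Xs."""
    K = sq_exp_kernel(X, X) + sn**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)                       # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # (K + sn^2 I)^{-1} y
    Ks = sq_exp_kernel(X, Xs)
    mu = Ks.T @ alpha                               # predictive mean
    v = np.linalg.solve(L, Ks)
    var = np.diag(sq_exp_kernel(Xs, Xs)) - np.sum(v**2, axis=0)  # predictive variance
    return mu, var
```

With a non-Gaussian likelihood this closed form is unavailable, which is where the approximate inference methods listed above (Laplace, EP, variational) come in.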

References in zbMATH (referenced in 38 articles, 1 standard article)

Showing results 1 to 20 of 38, sorted by year (citations).


  1. Bartels, Simon; Hennig, Philipp: Conjugate gradients for kernel machines (2020)
  2. Binois, Mickael; Picheny, Victor; Taillandier, Patrick; Habbal, Abderrahmane: The Kalai-Smorodinsky solution for many-objective Bayesian optimization (2020)
  3. Burkhart, Michael C.; Brandman, David M.; Franco, Brian; Hochberg, Leigh R.; Harrison, Matthew T.: The discriminative Kalman filter for Bayesian filtering with nonlinear and non-Gaussian observation models (2020)
  4. Chen, Chen; Liao, Qifeng: ANOVA Gaussian process modeling for high-dimensional stochastic computational models (2020)
  5. Chen, Hanshu; Meng, Zeng; Zhou, Huanlin: A hybrid framework of efficient multi-objective optimization of stiffened shells with imperfection (2020)
  6. Hartmann, Marcelo; Vanhatalo, Jarno: Laplace approximation and natural gradient for Gaussian process regression with heteroscedastic Student-(t) model (2019)
  7. Herlands, William; Neill, Daniel B.; Nickisch, Hannes; Wilson, Andrew Gordon: Change surfaces for expressive multidimensional changepoints and counterfactual prediction (2019)
  8. Li, Yongqiang; Yang, Chengzan; Hou, Zhongsheng; Feng, Yuanjing; Yin, Chenkun: Data-driven approximate Q-learning stabilization with optimality error bound analysis (2019)
  9. Mao, Zhiping; Li, Zhen; Karniadakis, George Em: Nonlocal flocking dynamics: learning the fractional order of PDEs from particle simulations (2019)
  10. Pang, Guofei; Yang, Liu; Karniadakis, George Em: Neural-net-induced Gaussian process regression for function approximation and PDE solution (2019)
  11. Price, Ilan; Fowkes, Jaroslav; Hopman, Daniel: Gaussian processes for unconstraining demand (2019)
  12. Jo, Seongil; Choi, Taeryon; Park, Beomjo; Lenk, Peter: bsamGP: an R package for Bayesian spectral analysis models using Gaussian process priors (2019) not zbMATH
  13. Seshadri, Pranay; Yuchi, Shaowu; Parks, Geoffrey T.: Dimension reduction via Gaussian ridge functions (2019)
  14. Bradford, Eric; Schweidtmann, Artur M.; Lapkin, Alexei: Efficient multiobjective optimization employing Gaussian processes, spectral sampling and a genetic algorithm (2018)
  15. Schulz, Eric; Speekenbrink, Maarten; Krause, Andreas: A tutorial on Gaussian process regression: modelling, exploring, and exploiting functions (2018)
  16. Van Steenkiste, Tom; van der Herten, Joachim; Couckuyt, Ivo; Dhaene, Tom: Sequential sensitivity analysis of expensive black-box simulators with metamodelling (2018)
  17. Bussas, Matthias; Sawade, Christoph; Kühn, Nicolas; Scheffer, Tobias; Landwehr, Niels: Varying-coefficient models for geospatial transfer learning (2017)
  18. Ghosh, Sanmitra; Dasmahapatra, Srinandan; Maharatna, Koushik: Fast approximate Bayesian computation for estimating parameters in differential equations (2017)
  19. Li, Yongqiang; Hou, Zhongsheng; Feng, Yuanjing; Chi, Ronghu: Data-driven approximate value iteration with optimality error bound analysis (2017)
  20. Matthews, Alexander G. de G.; van der Wilk, Mark; Nickson, Tom; Fujii, Keisuke; Boukouvalas, Alexis; León-Villagrá, Pablo; Ghahramani, Zoubin; Hensman, James: GPflow: a Gaussian process library using TensorFlow (2017)
