References in zbMATH (referenced in 29 articles)

Showing results 1 to 20 of 29, sorted by year (citations).


  1. Demo, Nicola; Ortali, Giulio; Gustin, Gianluca; Rozza, Gianluigi; Lavini, Gianpiero: An efficient computational framework for naval shape design and optimization problems by means of data-driven reduced order modeling techniques (2021)
  2. Benavoli, Alessio; Azzimonti, Dario; Piga, Dario: Skew Gaussian processes for classification (2020)
  3. Burt, David R.; Rasmussen, Carl Edward; van der Wilk, Mark: Convergence of sparse variational inference in Gaussian processes regression (2020)
  4. Ganti, Himakar; Khare, Prashant: Data-driven surrogate modeling of multiphase flows using machine learning techniques (2020)
  5. Jackson, Samuel E.; Vernon, Ian; Liu, Junli; Lindsey, Keith: Understanding hormonal crosstalk in Arabidopsis root development via emulation and history matching (2020)
  6. Kast, Mariella; Guo, Mengwu; Hesthaven, Jan S.: A non-intrusive multifidelity method for the reduced order modeling of nonlinear problems (2020)
  7. Lee, Taeksang; Bilionis, Ilias; Buganza Tepole, Adrian: Propagation of uncertainty in the mechanical and biological response of growing tissues using multi-fidelity Gaussian process regression (2020)
  8. Lu, Xuefei; Rudi, Alessandro; Borgonovo, Emanuele; Rosasco, Lorenzo: Faster Kriging: facing high-dimensional simulators (2020)
  9. Razaaly, Nassim; Persico, Giacomo; Gori, Giulio; Congedo, Pietro Marco: Quantile-based robust optimization of a supersonic nozzle for organic Rankine cycle turbines (2020)
  10. Schürch, Manuel; Azzimonti, Dario; Benavoli, Alessio; Zaffalon, Marco: Recursive estimation for sparse Gaussian process regression (2020)
  11. Dias, Mafalda; Frazer, Jonathan; Westphal, Alexander: Inflation as an information bottleneck: a strategy for identifying universality classes and making robust predictions (2019)
  12. Lomelí, M.; Rowland, M.; Gretton, A.; Ghahramani, Z.: Antithetic and Monte Carlo kernel estimators for partial rankings (2019)
  13. Novak, Roman; Xiao, Lechao; Hron, Jiri; Lee, Jaehoon; Alemi, Alexander A.; Sohl-Dickstein, Jascha; Schoenholz, Samuel S.: Neural Tangents: fast and easy infinite neural networks in Python (2019) arXiv
  14. Vernon, Ian; Jackson, Samuel E.; Cumming, Jonathan A.: Known boundary emulation of complex computer models (2019)
  15. Zhang, Michael Minyi; Williamson, Sinead A.: Embarrassingly parallel inference for Gaussian processes (2019)
  16. Alaa, Ahmed M.; van der Schaar, Mihaela: A hidden absorbing semi-Markov model for informatively censored temporal data: learning and inference (2018)
  17. Cully, Antoine; Chatzilygeroudis, Konstantinos; Allocati, Federico; Mouret, Jean-Baptiste: Limbo: a flexible high-performance library for Gaussian processes modeling and data-efficient optimization (2018) not zbMATH
  18. Erickson, Collin B.; Ankenman, Bruce E.; Sanchez, Susan M.: Comparison of Gaussian process modeling software (2018)
  19. Nguyen, Thi Nhat Anh; Bouzerdoum, Abdesselam; Phung, Son Lam: Stochastic variational hierarchical mixture of sparse Gaussian processes for regression (2018)
  20. Razaaly, Nassim; Congedo, Pietro Marco: Novel algorithm using active metamodel learning and importance sampling: application to multiple failure regions of low probability (2018)