BoTorch

BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization. Bayesian optimization provides sample-efficient global optimization for a broad range of applications, including automatic machine learning, engineering, physics, and experimental design. We introduce BoTorch, a modern programming framework for Bayesian optimization that combines Monte-Carlo (MC) acquisition functions, a novel sample average approximation optimization approach, auto-differentiation, and variance reduction techniques. BoTorch's modular design facilitates flexible specification and optimization of probabilistic models written in PyTorch, simplifying implementation of new acquisition functions. Our approach is backed by novel theoretical convergence results and made practical by a distinctive algorithmic foundation that leverages fast predictive distributions, hardware acceleration, and deterministic optimization. We also propose a novel "one-shot" formulation of the Knowledge Gradient, enabled by a combination of our theoretical and software contributions. In experiments, we demonstrate the improved sample efficiency of BoTorch relative to other popular libraries.
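The core idea behind the MC acquisition functions and sample average approximation mentioned above can be illustrated without the BoTorch API itself. The following library-free sketch (plain NumPy, not BoTorch code) estimates Expected Improvement at a candidate point by reusing fixed standard-normal base samples, so the estimator becomes a deterministic function of the posterior parameters and can be optimized with deterministic methods:

```python
import numpy as np

def mc_expected_improvement(mean, std, best_f, n_samples=1024, seed=0):
    """Monte-Carlo estimate of Expected Improvement at one candidate.

    Sample average approximation: the same standard-normal base samples
    are reused for every candidate, making the MC estimate deterministic
    given (mean, std) and hence amenable to gradient-based optimization.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_samples)   # fixed base samples
    posterior_samples = mean + std * z   # reparameterized posterior draws
    improvement = np.maximum(posterior_samples - best_f, 0.0)
    return improvement.mean()

# For a Gaussian posterior this should approach the closed-form EI,
# e.g. mean=0, std=1, best_f=0 gives EI = phi(0) ~= 0.3989.
ei = mc_expected_improvement(mean=0.0, std=1.0, best_f=0.0)
```

In BoTorch itself, the same reparameterization trick is applied to samples from a GP posterior and differentiated end-to-end with PyTorch's autograd; this toy version only shows the estimator's structure.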


References in zbMATH (referenced in 10 articles)


  1. Ishii, Akimitsu; Kamijyo, Ryunosuke; Yamanaka, Akinori; Yamamoto, Akiyasu: BOXVIA: Bayesian optimization executable and visualizable application (2022) not zbMATH
  2. Guo, Hailong; Yang, Xu: Deep unfitted Nitsche method for elliptic interface problems (2022)
  3. Joy, Hayden; Mattheakis, Marios; Protopapas, Pavlos: RcTorch: a PyTorch reservoir computing package with automated hyper-parameter optimization (2022) arXiv
  4. Hertel, Lars; Baldi, Pierre; Gillen, Daniel L.: Reproducible hyperparameter optimization (2022)
  5. Winter, J. M.; Kaiser, J. W. J.; Adami, S.; Akhatov, I. S.; Adams, N. A.: Stochastic multi-fidelity surrogate modeling of dendritic crystal growth (2022)
  6. Chatigny, Philippe; Patenaude, Jean-Marc; Wang, Shengrui: Spatiotemporal adaptive neural network for long-term forecasting of financial time series (2021)
  7. Grosnit, Antoine; Cowen-Rivers, Alexander I.; Tutunov, Rasul; Griffiths, Ryan-Rhys; Wang, Jun; Bou-Ammar, Haitham: Are we forgetting about compositional optimisers in Bayesian optimisation? (2021)
  8. Oune, Nicholas; Bostanabad, Ramin: Latent map Gaussian processes for mixed variable metamodeling (2021)
  9. Owen, Art B.; Rudolf, Daniel: A strong law of large numbers for scrambled net integration (2021)
  10. Shi, Junjie; Bian, Jiang; Richter, Jakob; Chen, Kuan-Hsun; Rahnenführer, Jörg; Xiong, Haoyi; Chen, Jian-Jia: MODES: model-based optimization on distributed embedded systems (2021)