MOE
MOE: Metric Optimization Engine; an open-source machine-learning service for optimal experiment design.
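MOE performs optimal experiment design via Bayesian global optimization. As an illustration of that class of method (not MOE's actual API), the following is a minimal pure-Python sketch: a Gaussian-process surrogate with an RBF kernel, and an expected-improvement acquisition function maximized over a grid. All names, kernel settings, and the test function are illustrative assumptions.

```python
import math
import random

def rbf(x1, x2, ls=0.3):
    # Squared-exponential (RBF) kernel; ls is an assumed length scale.
    return math.exp(-((x1 - x2) ** 2) / (2 * ls ** 2))

def solve(A, b):
    # Solve A x = b by Gaussian elimination with partial pivoting.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

def gp_posterior(X, y, xq, noise=1e-6):
    # GP posterior mean and std at query point xq given data (X, y).
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(X)] for i, a in enumerate(X)]
    k = [rbf(x, xq) for x in X]
    alpha = solve(K, y)
    mu = sum(ki * ai for ki, ai in zip(k, alpha))
    v = solve(K, k)
    var = max(rbf(xq, xq) - sum(ki * vi for ki, vi in zip(k, v)), 1e-12)
    return mu, math.sqrt(var)

def expected_improvement(mu, sigma, best):
    # Expected improvement for minimization over current best value.
    z = (best - mu) / sigma
    pdf = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return (best - mu) * cdf + sigma * pdf

def bayes_opt(f, lo, hi, n_init=3, n_iter=10, seed=0):
    # Start from random samples, then repeatedly evaluate the grid
    # point that maximizes expected improvement.
    rng = random.Random(seed)
    X = [lo + (hi - lo) * rng.random() for _ in range(n_init)]
    y = [f(x) for x in X]
    grid = [lo + (hi - lo) * i / 200 for i in range(201)]
    for _ in range(n_iter):
        best = min(y)
        xq = max(grid, key=lambda g: expected_improvement(
            *gp_posterior(X, y, g), best))
        X.append(xq)
        y.append(f(xq))
    i = min(range(len(y)), key=lambda j: y[j])
    return X[i], y[i]

# Toy objective: quadratic with minimum at x = 0.6.
x_best, y_best = bayes_opt(lambda x: (x - 0.6) ** 2, 0.0, 1.0)
```

The expensive-function setting (each `f` evaluation costly, as in the Wang et al. reference below) is exactly where such surrogate-based search pays off; a production service like MOE additionally handles noise, parallel suggestions, and hyperparameter fitting.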
References in zbMATH (referenced in 3 articles)
- Lakhmiri, Dounia; Le Digabel, Sébastien; Tribes, Christophe: HyperNOMAD: hyperparameter optimization of deep neural networks using mesh adaptive direct search (2021)
- Wang, Jialei; Clark, Scott C.; Liu, Eric; Frazier, Peter I.: Parallel Bayesian global optimization of expensive functions (2020)
- Jiménez, J.; Ginebra, J.: pyGPGO: Bayesian optimization for Python (2017) (not indexed in zbMATH)