MOE

MOE (Metric Optimization Engine): a new open-source, machine-learning service for optimal experiment design.
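Background on the method: a Bayesian-optimization service of this kind fits a Gaussian-process surrogate to previously evaluated points and proposes the next experiment by maximizing an acquisition function over that surrogate. With expected improvement (the acquisition named in the keywords below), posterior mean \(\mu(x)\), posterior standard deviation \(\sigma(x)\), and best observed value \(f^{*}\), the standard closed form for minimization is

\[
\mathrm{EI}(x) \;=\; \mathbb{E}\bigl[\max(f^{*} - f(x),\, 0)\bigr]
\;=\; \sigma(x)\,\bigl(z\,\Phi(z) + \varphi(z)\bigr),
\qquad z = \frac{f^{*} - \mu(x)}{\sigma(x)},
\]

where \(\Phi\) and \(\varphi\) are the standard normal distribution and density functions.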

Keywords for this software


  • Bayesian optimization
  • deep neural networks
  • hyperparameter optimization
  • mesh adaptive direct search
  • PyGPGO
  • infinitesimal perturbation analysis
  • derivative-free optimization
  • parallel expected improvement
  • acquisition function
  • Python
  • categorical variables
  • parallel optimization
  • blackbox optimization
  • JOSS
  • surrogate models
  • neural architecture search
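Several of these keywords (surrogate models, acquisition function, blackbox optimization) name the pieces of the Bayesian-optimization loop that MOE automates. Below is a minimal sketch of that loop, not MOE's API: it uses scikit-learn's Gaussian process as the surrogate and the expected-improvement formula above, with a toy objective standing in for the expensive black box.

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def objective(x):
        """Toy stand-in for the expensive black-box function to minimize."""
        return np.sin(3 * x) + 0.1 * x ** 2

    def expected_improvement(mu, sigma, f_best):
        """Closed-form EI for minimization (see the formula above)."""
        sigma = np.maximum(sigma, 1e-12)            # guard against zero variance
        z = (f_best - mu) / sigma
        return sigma * (z * norm.cdf(z) + norm.pdf(z))

    rng = np.random.default_rng(0)
    X = rng.uniform(-2.0, 2.0, size=(3, 1))         # small initial design
    y = objective(X).ravel()

    for _ in range(15):                             # evaluation budget
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)                                # refit the surrogate
        cand = np.linspace(-2.0, 2.0, 500).reshape(-1, 1)
        mu, sigma = gp.predict(cand, return_std=True)
        x_next = cand[[np.argmax(expected_improvement(mu, sigma, y.min()))]]
        X = np.vstack([X, x_next])                  # run the "experiment" ...
        y = np.append(y, objective(x_next).ravel()) # ... and record its result

    print("best x:", X[np.argmin(y)].item(), "best f:", y.min())

MOE's documentation describes the same loop exposed as a service, with gradient-based maximization of (parallel) expected improvement in place of the dense candidate grid used here for simplicity.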

  • URL: github.com/Yelp/MOE
  • Authors: Clark SC, Liu E, Frazier PI, Wang J, Oktay D, Vesdapunt N



  • Related software:
  • Spearmint
  • Python
  • SciPy
  • L-BFGS
  • HyperNOMAD
  • DeepHyper
  • Hyperband
  • BOBYQA
  • EGO
  • scikit-optimize
  • Caffe
  • MNIST
  • RMSprop
  • Adam
  • OPAL
  • BARON
  • QSIMVN
  • PyTorch
  • Fashion-MNIST
  • BFO

References in zbMATH (referenced in 3 articles)


  1. Lakhmiri, Dounia; Le Digabel, Sébastien; Tribes, Christophe: HyperNOMAD: hyperparameter optimization of deep neural networks using mesh adaptive direct search (2021)
  2. Wang, Jialei; Clark, Scott C.; Liu, Eric; Frazier, Peter I.: Parallel Bayesian global optimization of expensive functions (2020)
  3. Jiménez, J.; Ginebra, J.: pyGPGO: Bayesian optimization for Python (2017) [not indexed in zbMATH]
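Reference 2 above, by MOE's authors, treats the parallel (q-point) case behind the keyword "parallel expected improvement": q experiments are proposed at once by maximizing the joint expected improvement of the whole batch. A minimal Monte-Carlo estimator of that q-EI, sketched here from the standard definition rather than taken from MOE's code:

    import numpy as np

    def q_expected_improvement(mean, cov, f_best, n_samples=20000, seed=0):
        """Monte-Carlo estimate of q-EI for minimization.

        mean, cov: joint GP posterior mean vector and covariance matrix
        at the q candidate points; f_best: best objective value seen so far.
        """
        rng = np.random.default_rng(seed)
        y = rng.multivariate_normal(mean, cov, size=n_samples)  # (n_samples, q)
        improvement = np.maximum(f_best - y.min(axis=1), 0.0)   # best of the batch
        return improvement.mean()

    # Example: two correlated candidates near the incumbent f_best = 0.5
    mean = np.array([0.4, 0.6])
    cov = np.array([[0.04, 0.01],
                    [0.01, 0.09]])
    print(q_expected_improvement(mean, cov, f_best=0.5))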

  • Top MSC classes: 65 Numerical analysis; 90 Optimization