SMAC: Sequential Model-based Algorithm Configuration. SMAC is a versatile tool for optimizing algorithm parameters (or, more generally, the parameters of any process we can run automatically, or any function we can evaluate, such as a simulation). SMAC has helped us speed up both local search and tree search algorithms by orders of magnitude on certain instance distributions. Recently, we have also found it to be very effective for the hyperparameter optimization of machine learning algorithms, scaling better to high-dimensional and discrete input spaces than competing approaches. Finally, the predictive models SMAC is based on can also capture and exploit important information about the problem domain, such as which input variables are most important. We hope you find SMAC similarly useful. Ultimately, we hope that it helps algorithm designers focus on tasks that are more scientifically valuable than parameter tuning.
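The core idea behind SMAC is sequential model-based optimization (SMBO): evaluate a few configurations, fit a surrogate model to the observed performance, use the model to pick the next promising configuration, and repeat. The sketch below is a deliberately simplified toy illustration of that loop, not SMAC itself: real SMAC uses a random forest surrogate and an expected-improvement acquisition criterion, whereas this sketch uses a hypothetical inverse-distance-weighted surrogate on a one-dimensional problem.

```python
# Toy sketch of the sequential model-based optimization (SMBO) loop.
# NOT the SMAC implementation: the surrogate and candidate search here
# are simplified stand-ins chosen only to keep the example self-contained.
import random

def target(x):
    # The expensive "algorithm run" we want to tune; global minimum 1.0 at x = 2.
    return (x - 2.0) ** 2 + 1.0

def surrogate(history, x):
    # Inverse-distance-weighted prediction from observed (x, y) pairs
    # (a stand-in for SMAC's random forest model).
    num = den = 0.0
    for xi, yi in history:
        w = 1.0 / (abs(x - xi) + 1e-9)
        num += w * yi
        den += w
    return num / den

def smbo(n_init=5, n_iter=30, lo=-5.0, hi=5.0, seed=0):
    rng = random.Random(seed)
    # 1) Initial design: evaluate a few random configurations.
    history = [(x, target(x)) for x in (rng.uniform(lo, hi) for _ in range(n_init))]
    for _ in range(n_iter):
        # 2) Candidate search: score random candidates with the surrogate,
        #    occasionally taking a pure-random pick for exploration.
        if rng.random() < 0.2:
            x_next = rng.uniform(lo, hi)
        else:
            cands = [rng.uniform(lo, hi) for _ in range(200)]
            x_next = min(cands, key=lambda x: surrogate(history, x))
        # 3) Evaluate the true (expensive) target and grow the model's data.
        history.append((x_next, target(x_next)))
    # Return the best configuration seen so far (the "incumbent").
    return min(history, key=lambda p: p[1])

best_x, best_y = smbo()
```

The design point worth noting is the division of labor: the expensive `target` is called only once per iteration, while the cheap surrogate absorbs the cost of searching over many candidate configurations.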

References in zbMATH (referenced in 59 articles)

Showing results 1 to 20 of 59, sorted by year (citations).


  1. Ahmed, Mohamed Osama; Vaswani, Sharan; Schmidt, Mark: Combining Bayesian optimization and Lipschitz optimization (2020)
  2. Baioletti, Marco; Di Bari, Gabriele; Milani, Alfredo; Santucci, Valentino: An experimental comparison of algebraic crossover operators for permutation problems (2020)
  3. Bayless, Sam; Kodirov, Nodir; Iqbal, Syed M.; Beschastnikh, Ivan; Hoos, Holger H.; Hu, Alan J.: Scalable constraint-based virtual data center allocation (2020)
  4. Binois, Mickaël; Ginsbourger, David; Roustant, Olivier: On the choice of the low-dimensional domain for global optimization via random embeddings (2020)
  5. Kandasamy, Kirthevasan; Vysyaraju, Karun Raju; Neiswanger, Willie; Paria, Biswajit; Collins, Christopher R.; Schneider, Jeff; Poczos, Barnabas; Xing, Eric P.: Tuning hyperparameters without grad students: scalable and robust Bayesian optimisation with Dragonfly (2020)
  6. Kletzander, Lucas; Musliu, Nysret: Solving the general employee scheduling problem (2020)
  7. Moriconi, Riccardo; Deisenroth, Marc Peter; Sesh Kumar, K. S.: High-dimensional Bayesian optimization using low-dimensional feature spaces (2020)
  8. Moriconi, Riccardo; Kumar, K. S. Sesh; Deisenroth, Marc Peter: High-dimensional Bayesian optimization with projections using quantile Gaussian processes (2020)
  9. Ribeiro, Rita P.; Moniz, Nuno: Imbalanced regression and extreme value prediction (2020)
  10. Toutouh, Jamal; Rossit, Diego; Nesmachnow, Sergio: Soft computing methods for multiobjective location of garbage accumulation points in smart cities (2020)
  11. Banbara, Mutsunori; Inoue, Katsumi; Kaufmann, Benjamin; Okimoto, Tenda; Schaub, Torsten; Soh, Takehide; Tamura, Naoyuki; Wanko, Philipp: teaspoon: solving the curriculum-based course timetabling problems with answer set programming (2019)
  12. Oh, ChangYong; Gavves, Efstratios; Welling, Max: BOCK: Bayesian optimization with cylindrical kernels (2019) arXiv
  13. Franzin, Alberto; Stützle, Thomas: Revisiting simulated annealing: a component-based analysis (2019)
  14. Lindauer, Marius; van Rijn, Jan N.; Kotthoff, Lars: The algorithm selection competitions 2015 and 2017 (2019)
  15. Liu, Jianfeng; Ploskas, Nikolaos; Sahinidis, Nikolaos V.: Tuning BARON using derivative-free optimization algorithms (2019)
  16. Nikolić, Mladen; Marinković, Vesna; Kovács, Zoltán; Janičić, Predrag: Portfolio theorem proving and prover runtime prediction for geometry (2019)
  17. Pagnozzi, Federico; Stützle, Thomas: Automatic design of hybrid stochastic local search algorithms for permutation flowshop problems (2019)
  18. Salem, Malek Ben; Bachoc, François; Roustant, Olivier; Gamboa, Fabrice; Tomaso, Lionel: Gaussian process-based dimension reduction for goal-oriented sequential design (2019)
  19. Wang, Yuepeng; Hu, Kun; Ren, Lanlan; Lin, Guang: Optimal observations-based retrieval of topography in 2D shallow water equations using PC-EnKF (2019)
  20. Aggarwal, Charu C.: Neural networks and deep learning. A textbook (2018)


Further publications can be found at: