Spearmint
Spearmint is a software package for Bayesian optimization. The software is designed to run experiments automatically (hence the code name "spearmint"), iteratively adjusting a number of parameters so as to minimize some objective in as few runs as possible.
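The loop Spearmint automates can be illustrated with a minimal sketch of Bayesian optimization: fit a Gaussian-process surrogate to the evaluations so far, then pick the next point by maximizing expected improvement. This is a generic illustration using NumPy, not Spearmint's actual API; the kernel length-scale, grid, and test objective are assumptions chosen for the example.

```python
import numpy as np
from math import erf

def rbf(a, b, ell=0.2):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def gp_predict(xs, ys, xq, noise=1e-6):
    """Zero-mean GP posterior mean and std at query points xq."""
    K = rbf(xs, xs) + noise * np.eye(len(xs))
    Ks = rbf(xs, xq)
    mu = Ks.T @ np.linalg.solve(K, ys)
    # Prior variance is 1 (rbf(x, x) = 1); subtract the explained part.
    var = np.clip(1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, y_best):
    """Closed-form EI for minimization under a Gaussian posterior."""
    z = (y_best - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    phi = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (y_best - mu) * Phi + sigma * phi

def bayes_opt(f, n_iter=12, seed=0):
    """Minimize f on [0, 1] with a GP surrogate and EI over a grid."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, 201)
    xs = rng.uniform(0.0, 1.0, 3)          # a few random initial runs
    ys = np.array([f(x) for x in xs])
    for _ in range(n_iter):
        mu, sd = gp_predict(xs, ys, grid)
        x_next = grid[np.argmax(expected_improvement(mu, sd, ys.min()))]
        xs = np.append(xs, x_next)
        ys = np.append(ys, f(x_next))
    return xs[np.argmin(ys)], ys.min()

# Hypothetical expensive objective with minimum at x = 0.7.
x_best, y_best = bayes_opt(lambda x: (x - 0.7) ** 2)
```

In practice Spearmint wraps a loop of this shape around a user-supplied experiment script, choosing each new parameter setting from the surrogate model rather than from a grid or random search.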
References in zbMATH (referenced in 98 articles)
Showing results 1 to 20 of 98.
Sorted by year.
- Ayensa-Jiménez, Jacobo; Doweidar, Mohamed H.; Sanz-Herrera, Jose A.; Doblaré, Manuel: Prediction and identification of physical systems by means of physically-guided neural networks with meaningful internal layers (2021)
- Binder, Martin; Pfisterer, Florian; Lang, Michel; Schneider, Lennart; Kotthoff, Lars; Bischl, Bernd: mlr3pipelines -- flexible machine learning pipelines in R (2021)
- Bliek, Laurens; Verwer, Sicco; de Weerdt, Mathijs: Black-box combinatorial optimization using models with integer-valued minima (2021)
- Chen, Xiaoli; Duan, Jinqiao; Karniadakis, George Em: Learning and meta-learning of stochastic advection-diffusion-reaction systems from sparse measurements (2021)
- Corazza, Marco; di Tollo, Giacomo; Fasano, Giovanni; Pesenti, Raffaele: A novel hybrid PSO-based metaheuristic for costly portfolio selection problems (2021)
- Ellenbach, Nicole; Boulesteix, Anne-Laure; Bischl, Bernd; Unger, Kristian; Hornung, Roman: Improved outcome prediction across data sources through robust parameter tuning (2021)
- Grosnit, Antoine; Cowen-Rivers, Alexander I.; Tutunov, Rasul; Griffiths, Ryan-Rhys; Wang, Jun; Bou-Ammar, Haitham: Are we forgetting about compositional optimisers in Bayesian optimisation? (2021)
- Huang, Junhao; Sun, Weize; Huang, Lei: Joint structure and parameter optimization of multiobjective sparse neural network (2021)
- Jakubik, Johannes; Binding, Adrian; Feuerriegel, Stefan: Directed particle swarm optimization with Gaussian-process-based function forecasting (2021)
- Järvenpää, Marko; Gutmann, Michael U.; Vehtari, Aki; Marttinen, Pekka: Parallel Gaussian process surrogate Bayesian inference with noisy likelihood evaluations (2021)
- Kafka, Dominic; Wilke, Daniel N.: Resolving learning rates adaptively by locating stochastic non-negative associated gradient projection points using line searches (2021)
- Kim, Jungtaek; McCourt, Michael; You, Tackgeun; Kim, Saehoon; Choi, Seungjin: Bayesian optimization with approximate set kernels (2021)
- Masti, Daniele; Bemporad, Alberto: Learning nonlinear state-space models using autoencoders (2021)
- Müller, Juliane; Park, Jangho; Sahu, Reetik; Varadharajan, Charuleka; Arora, Bhavna; Faybishenko, Boris; Agarwal, Deborah: Surrogate optimization of deep neural networks for groundwater predictions (2021)
- Nam, Jaehyun; Yong, Hwanmoo; Hwang, Jungho; Choi, Jongeun: Training an artificial neural network for recognizing electron collision patterns (2021)
- Qian, Zhaozhi; Alaa, Ahmed M.; van der Schaar, Mihaela: CPAS: the UK’s national machine learning-based hospital capacity planning system for COVID-19 (2021)
- Shi, Junjie; Bian, Jiang; Richter, Jakob; Chen, Kuan-Hsun; Rahnenführer, Jörg; Xiong, Haoyi; Chen, Jian-Jia: MODES: model-based optimization on distributed embedded systems (2021)
- Škrlj, Blaž; Martinc, Matej; Lavrač, Nada; Pollak, Senja: autoBOT: evolving neuro-symbolic representations for explainable low resource text classification (2021)
- Sudermann-Merx, Nathan; Rebennack, Steffen: Leveraged least trimmed absolute deviations (2021)
- Wang, Qihan; Wu, Di; Li, Guoyin; Gao, Wei: A virtual model architecture for engineering structures with twin extended support vector regression (T-X-SVR) method (2021)