A stochastic radial basis function method for the global optimization of expensive functions

We introduce a new framework for the global optimization of computationally expensive multimodal functions when derivatives are unavailable. The proposed Stochastic Response Surface (SRS) Method iteratively uses a response surface model to approximate the expensive function and identifies a promising point for function evaluation from a set of randomly generated points, called candidate points. Under some mild technical conditions, SRS converges to the global minimum in a probabilistic sense. We also propose Metric SRS (MSRS), a special case of SRS in which the function evaluation point in each iteration is chosen to be the best candidate point according to two criteria: the estimated function value obtained from the response surface model, and the minimum distance from previously evaluated points. We develop a global optimization version and a multistart local optimization version of MSRS. In the numerical experiments, we used a radial basis function (RBF) model for MSRS, and the resulting algorithms, Global MSRBF and Multistart Local MSRBF, were compared with six alternative global optimization methods, including a multistart derivative-based local optimization method. Multiple trials of all algorithms were compared on 17 multimodal test problems and on a 12-dimensional groundwater bioremediation application involving partial differential equations. The results indicate that Multistart Local MSRBF is the best on most of the higher dimensional problems, including the groundwater problem. It is also at least as good as the other algorithms on most of the lower dimensional problems. Global MSRBF is competitive with the other alternatives on most of the lower dimensional test problems and also on the groundwater problem. These results suggest that MSRBF is a promising approach for the global optimization of expensive functions.
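The two-criterion candidate selection at the heart of MSRS can be sketched as follows. This is a minimal illustration, not the authors' implementation: the candidate count, the fixed weight `w` (the actual algorithm cycles through a pattern of weights), and the use of SciPy's `RBFInterpolator` as the surrogate are all assumptions made here for concreteness.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

def select_candidate(evaluated_x, evaluated_f, bounds, n_candidates=100, w=0.5):
    """Pick the next evaluation point from random candidates by a weighted
    score of (a) the RBF surrogate prediction and (b) the minimum distance
    to previously evaluated points (the two MSRS criteria).

    evaluated_x : (n, dim) array of points already evaluated
    evaluated_f : (n,) array of expensive-function values at those points
    bounds      : (dim, 2) array of box constraints [low, high] per dimension
    w           : weight on the surrogate criterion (assumed fixed here)
    """
    dim = bounds.shape[0]
    # Fit an RBF surrogate to the samples of the expensive function so far.
    surrogate = RBFInterpolator(evaluated_x, evaluated_f)
    # Generate uniform random candidate points inside the box.
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_candidates, dim))
    # Criterion 1: predicted value, scaled to [0, 1]; lower is better.
    pred = surrogate(cand)
    v = (pred - pred.min()) / max(np.ptp(pred), 1e-12)
    # Criterion 2: minimum distance to evaluated points, scaled so that a
    # larger distance yields a smaller score; farther is better.
    dists = np.min(
        np.linalg.norm(cand[:, None, :] - evaluated_x[None, :, :], axis=2),
        axis=1,
    )
    d = (dists.max() - dists) / max(np.ptp(dists), 1e-12)
    # Best candidate minimizes the weighted sum of the two criteria.
    score = w * v + (1 - w) * d
    return cand[np.argmin(score)]
```

A small weight on the distance criterion pushes the search toward exploration (points far from existing samples), while a large weight on the surrogate criterion exploits the current model, which is the trade-off the abstract describes.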

References in zbMATH (referenced in 35 articles)

Showing results 1 to 20 of 35, sorted by year (citations).


  1. Chen, Ray-Bing; Wang, Yuan; Wu, C. F. Jeff: Finding optimal points for expensive functions using adaptive RBF-based surrogate model via uncertainty quantification (2020)
  2. Gao, Han; Zhu, Xueyu; Wang, Jian-Xun: A bi-fidelity surrogate modeling approach for uncertainty propagation in three-dimensional hemodynamic simulations (2020)
  3. He, Xinyu; Reyes, Kristofer G.; Powell, Warren B.: Optimal learning with local nonlinear parametric models over continuous designs (2020)
  4. Macêdo, M. Joseane F. G.; Karas, Elizabeth W.; Costa, M. Fernanda P.; Rocha, Ana Maria A. C.: Filter-based stochastic algorithm for global optimization (2020)
  5. Sun, Luning; Gao, Han; Pan, Shaowu; Wang, Jian-Xun: Surrogate modeling for fluid flows based on physics-constrained deep learning without simulation data (2020)
  6. Ahmadvand, Mohammad; Esmaeilbeigi, Mohsen; Kamandi, Ahmad; Yaghoobi, Farajollah Mohammadi: An improved hybrid-ORBIT algorithm based on point sorting and MLE technique (2019)
  7. He, Jiachuan; Mattis, Steven A.; Butler, Troy D.; Dawson, Clint N.: Data-driven uncertainty quantification for predictive flow and transport modeling using support vector machines (2019)
  8. Larson, Jeffrey; Menickelly, Matt; Wild, Stefan M.: Derivative-free optimization methods (2019)
  9. Costa, Alberto; Nannicini, Giacomo: RBFOpt: an open-source library for black-box optimization with costly function evaluations (2018)
  10. Zhou, Zhe; Bai, Fusheng: An adaptive framework for costly black-box global optimization based on radial basis function interpolation (2018)
  11. Zhou, Zhe; Bai, Fu-Sheng: A stochastic adaptive radial basis function algorithm for costly black-box optimization (2018)
  12. Beiranvand, Vahid; Hare, Warren; Lucet, Yves: Best practices for comparing optimization algorithms (2017)
  13. Boukouvala, Fani; Faruque Hasan, M. M.; Floudas, Christodoulos A.: Global optimization of general constrained grey-box models: new method and its application to constrained PDEs for pressure swing adsorption (2017)
  14. Corveleyn, Samuel; Vandewalle, Stefan: Computation of the output of a function with fuzzy inputs based on a low-rank tensor approximation (2017)
  15. D’Ambrosio, Claudia; Nannicini, Giacomo; Sartor, Giorgio: MILP models for the selection of a small set of well-distributed points (2017)
  16. Martinez, Nadia; Anahideh, Hadis; Rosenberger, Jay M.; Martinez, Diana; Chen, Victoria C. P.; Wang, Bo Ping: Global optimization of non-convex piecewise linear regression splines (2017)
  17. Müller, Juliane; Woodbury, Joshua D.: GOSAC: global optimization with surrogate approximation of constraints (2017)
  18. Rahmanpour, Fardin; Hosseini, Mohammad Mehdi; Maalek Ghaini, Farid Mohammad: Penalty-free method for nonsmooth constrained optimization via radial basis functions (2017)
  19. Vu, Ky Khac; D’Ambrosio, Claudia; Hamadi, Youssef; Liberti, Leo: Surrogate-based methods for black-box optimization (2017)
  20. Akhtar, Taimoor; Shoemaker, Christine A.: Multi objective optimization of computationally expensive multi-modal functions with RBF surrogates and multi-rule selection (2016)

