Multi-Objective Bayesian Global Optimization using expected hypervolume improvement gradient

Kaifeng Yang, Michael Emmerich, André Deutz, Thomas Bäck

Research output: Contribution to journal › Article › peer-review

87 Citations (Scopus)

Abstract

The Expected Hypervolume Improvement (EHVI) is a frequently used infill criterion in Multi-Objective Bayesian Global Optimization (MOBGO), owing to its ability to guide exploration. Recently, the computational complexity of the EHVI calculation was reduced to O(n log n) for both the 2-D and 3-D cases. However, the optimizer in MOBGO still requires a significant amount of time, because the EHVI is computed in every iteration and tens of thousands of such computations are usually required. This paper derives a formula for the Expected Hypervolume Improvement Gradient (EHVIG) and proposes an efficient algorithm to calculate it. The new criterion is exploited through two strategies to improve the efficiency of the optimizer discussed in this paper. First, it enables gradient ascent methods to be used in MOBGO. Second, since the EHVIG at an optimal solution must be the zero vector, it can serve as a stopping criterion in global optimization, e.g., in Evolution Strategies. Empirical experiments are performed on seven benchmark problems. The results show that the second proposed strategy, using EHVIG as a stopping criterion for local search, can outperform standard MOBGO on problems whose optimal solutions lie in the interior of the search space. On the ZDT series of test problems, EHVIG still performs better when gradient projection is applied.
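The abstract's second strategy, stopping a local search when the acquisition gradient vanishes, can be illustrated with a minimal sketch. This is a hypothetical illustration, not the paper's algorithm: the real EHVI and EHVIG are computed from Kriging surrogate models, so a simple concave placeholder function stands in for the acquisition here, and central finite differences stand in for the paper's analytic EHVIG formula. All names (`acquisition`, `gradient`, `ascend`) are invented for this sketch.

```python
def acquisition(x):
    # Placeholder for EHVI: a concave surface maximized at x = (1.0, -2.0).
    # In MOBGO this would be the EHVI evaluated on the Kriging models.
    return -((x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2)

def gradient(x, h=1e-6):
    # Central finite differences as a stand-in for the analytic EHVIG.
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((acquisition(xp) - acquisition(xm)) / (2 * h))
    return g

def ascend(x, step=0.1, tol=1e-8, max_iter=1000):
    # Gradient ascent with the zero-gradient stopping rule: terminate
    # once the gradient norm (mirroring EHVIG at an optimum) vanishes.
    for _ in range(max_iter):
        g = gradient(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        x = [xi + step * gi for xi, gi in zip(x, g)]
    return x
```

Under these assumptions, `ascend([0.0, 0.0])` converges to roughly `(1.0, -2.0)`, the maximizer of the placeholder acquisition; the stopping test plays the role EHVIG plays in the paper's local search.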

Original language: English
Pages (from-to): 945-956
Number of pages: 12
Journal: Swarm and Evolutionary Computation
Volume: 44
DOIs
Publication status: Published - Feb 2019

Keywords

  • Bayesian global optimization
  • Expected hypervolume improvement
  • Expected hypervolume improvement gradient
  • Kriging stopping criterion
