An analysis of covariance parameters in Gaussian Process-based optimization

Authors

  • Hossein Mohammadi, University of Exeter
  • Rodolphe Le Riche, Ecole des Mines de Saint Etienne
  • Xavier Bay, Ecole des Mines de Saint Etienne
  • Eric Touboul, Ecole des Mines de Saint Etienne

Abstract

The need to globally optimize expensive-to-evaluate functions arises in many real-world applications. Among the methods developed for solving such problems, Efficient Global Optimization (EGO) is regarded as one of the state-of-the-art algorithms for unconstrained continuous optimization. The surrogate model used in EGO is a Gaussian Process (GP) conditioned on data points. The most important factor controlling the efficiency of the EGO algorithm is the GP covariance function (or kernel), which is taken as a parameterized function. In this article, we theoretically and empirically analyze the effect of the covariance parameters, the so-called "characteristic length-scale" and "nugget", on EGO performance. More precisely, we analyze the EGO algorithm with fixed covariance parameters and compare it to the standard setting where they are statistically estimated. The limit behavior of EGO with very small or very large characteristic length-scales is identified. Experiments show that a "small" nugget should be preferred to its maximum likelihood estimate. Overall, this study contributes to a better understanding of a key optimization algorithm from both a practical and a theoretical point of view.
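To illustrate the setting studied in the abstract, the following is a minimal sketch (not the authors' code) of one EGO iteration in which the covariance parameters are held fixed rather than re-estimated by maximum likelihood. It uses scikit-learn's GP regressor with optimizer=None so that the chosen length-scale is not refit, passes a "small" nugget through the alpha argument, and selects the next evaluation point by Expected Improvement; the toy objective, the length-scale value 0.5, and the nugget value 1e-8 are illustrative assumptions, not values from the paper.

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def objective(x):
        # Toy stand-in for an expensive-to-evaluate function (placeholder).
        return np.sin(3 * x) + 0.5 * x**2

    rng = np.random.default_rng(0)
    X = rng.uniform(-2, 2, size=(5, 1))   # initial design points
    y = objective(X).ravel()

    length_scale = 0.5                     # fixed characteristic length-scale (assumed value)
    nugget = 1e-8                          # "small" nugget instead of its MLE
    gp = GaussianProcessRegressor(
        kernel=RBF(length_scale=length_scale),
        alpha=nugget,                      # added to the covariance diagonal, acting as a nugget
        optimizer=None,                    # keep covariance parameters fixed (no ML re-estimation)
    )
    gp.fit(X, y)

    # Expected Improvement acquisition evaluated on a candidate grid.
    X_cand = np.linspace(-2, 2, 401).reshape(-1, 1)
    mu, sigma = gp.predict(X_cand, return_std=True)
    f_best = y.min()
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (f_best - mu) / sigma
        ei = (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
        ei[sigma == 0.0] = 0.0

    x_next = X_cand[np.argmax(ei)]         # next point at which to evaluate the objective
    print("next evaluation point:", x_next)

In the standard EGO setting analyzed for comparison in the paper, the length-scale (and possibly the nugget) would instead be re-estimated by maximum likelihood at every iteration, which in this sketch corresponds to leaving the default optimizer in place rather than passing optimizer=None.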

Published

2018-07-24

Issue

Section

CRORR Journal Regular Issue