Topic:
Gradient-free optimization from noisy data and nonparametric regression
Abstract: We study the problem of estimating the minimizer or the minimal value of a smooth function from evaluations of its values corrupted by possibly adversarial noise. We consider both active (sequential) and passive settings of the problem, and several approximations of the gradient descent algorithm in which the gradient is estimated via function evaluations at randomized points combined with a smoothing kernel, building on ideas from nonparametric regression. The objective function is assumed to be Hölder smooth, possibly satisfying additional assumptions such as strong convexity or the Polyak-Łojasiewicz condition. In all scenarios, we propose polynomial-time algorithms achieving non-asymptotic minimax optimal or near minimax optimal rates of convergence. The talk is based on joint work with Arya Akhavan, Evgenii Chzhen, Davit Gogolashvili and Massimiliano Pontil.
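To give a flavor of the kind of procedure the abstract describes, below is a minimal sketch of gradient descent with a randomized two-point gradient estimator using a kernel-smoothed direction. The specific choices here (a uniform direction on the sphere, a scalar r uniform on [-1,1], the kernel K(r) = 3r, the test objective, and all step-size and smoothing parameters) are illustrative assumptions, not the exact scheme or tuning from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)


def grad_estimate(f, x, h, rng):
    """Two-point randomized gradient estimator with a smoothing kernel.

    Illustrative sketch only: the direction zeta is uniform on the unit
    sphere, r is uniform on [-1, 1], and K(r) = 3r is chosen so that
    E[r * K(r)] = 1, making the estimator unbiased to first order.
    """
    d = x.size
    zeta = rng.standard_normal(d)
    zeta /= np.linalg.norm(zeta)      # uniform direction on the unit sphere
    r = rng.uniform(-1.0, 1.0)        # random scalar fed to the kernel
    K = 3.0 * r                       # kernel value K(r) = 3r (assumption)
    # Central finite difference along zeta, rescaled by dimension and kernel.
    return (d / (2.0 * h)) * (f(x + h * r * zeta) - f(x - h * r * zeta)) * K * zeta


def gradient_free_descent(f, x0, steps=500, h=0.05, lr=0.1, rng=rng):
    """Plain gradient descent driven by the randomized gradient estimate."""
    x = x0.astype(float).copy()
    for _ in range(steps):
        x -= lr * grad_estimate(f, x, h, rng)
    return x


# Toy strongly convex objective f(x) = ||x||^2, observed with additive noise.
def noisy_f(x):
    return np.sum(x**2) + rng.normal(scale=0.01)


x_hat = gradient_free_descent(noisy_f, np.array([1.0, -1.0]))
print(x_hat)  # should land near the minimizer at the origin
```

With noisy evaluations only, the iterate does not converge exactly to the minimizer; the smoothing parameter h trades off the bias of the finite-difference estimate against the amplified observation noise, which is precisely the bias-variance trade-off familiar from nonparametric regression.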