Speaker:   Andrew R. Conn
  Thomas J. Watson Research Center, IBM


Title: Derivative-Free Optimization - Some New Results

Derivative-free optimization methods have been extensively developed in the past decade. To employ the well-developed convergence theory of derivative-based methods, the models used (whether in a trust-region or a line-search approach) have to satisfy Taylor-like error bounds. We will present a unified framework for studying these error bounds, initially for the case of polynomial interpolation. These bounds depend on the geometry of the interpolation set; we will analyse this geometry and present generalisations which include a viable way of measuring the quality of the geometry. We will also present extensions to least-squares and minimum-norm methods. Practical techniques for ensuring that the geometric requirement is satisfied and for improving the geometry of a sample set, as well as connections with other results, will also be included.
Joint work with Katya Scheinberg and Luis Vicente.
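
For readers unfamiliar with the objects the abstract refers to: the Taylor-like bounds in question say, roughly, that a well-built interpolation model m of f on a ball of radius Delta satisfies |f(y) - m(y)| <= kappa_ef * Delta^2 and ||grad f(y) - grad m(y)|| <= kappa_eg * Delta, with constants that depend on the geometry of the sample set. The sketch below, in Python with NumPy, illustrates the linear-interpolation case only; the function names and the condition-number proxy for geometry quality are illustrative assumptions, not the measure developed in the talk.

    import numpy as np

    def linear_model(Y, fvals):
        """Fit the unique linear interpolant m(x) = c + g.(x - y0) through
        n+1 sample points: Y has shape (n+1, n), fvals has length n+1."""
        y0, f0 = Y[0], fvals[0]
        D = Y[1:] - y0                          # rows are displacement directions
        g = np.linalg.solve(D, fvals[1:] - f0)  # interpolation conditions D g = df
        return f0, g                            # evaluate as f0 + g @ (x - y0)

    def geometry_measure(Y, radius):
        """Crude proxy for sample-set geometry quality: the condition number
        of the displacement directions scaled by the region radius.
        Well-poised sets keep this bounded; degenerate sets blow it up."""
        Dhat = (Y[1:] - Y[0]) / radius
        return np.linalg.cond(Dhat)

    # A well-spread sample set versus a nearly collinear one in R^2:
    Y_good = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    Y_bad  = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1e-6]])
    print(geometry_measure(Y_good, 1.0))   # 1.0   -> good geometry
    print(geometry_measure(Y_bad, 1.0))    # ~2e6  -> poor geometry, poor model

The conditioning of this system is one crude stand-in for how the geometry of the sample set enters the error-bound constants, which is why, as the abstract indicates, practical methods must monitor and, when necessary, improve the geometry of the sample set.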