An Optimization Algorithm Based On Stochastic Sensitivity Analysis For Noisy Objective Landscapes

We propose a function minimization algorithm in which the solution is updated using derivative information approximated from sample points. The algorithm generates sample points with Gaussian white noise and approximates derivatives based on stochastic sensitivity analysis. Unlike standard trust-region methods, which compute gradients from $n$ or more sample points, where $n$ is the number of variables, the proposed algorithm allows the number of sample points $M$ to be less than $n$. Furthermore, it ignores small amounts of noise within a trust region. This paper addresses two questions: To what extent does the derivative approximation degrade when the number of sample points is small? Does the algorithm converge to a good solution with inexact derivative information when the objective landscape is noisy? Extensive numerical experiments using quadratic functions show that the algorithm can approximate derivatives when $M$ is about $n/10$ or more. Experiments using a formulation of the traveling salesman problem (TSP) show that the algorithm finds reasonably good solutions on noisy objective landscapes despite inexact derivatives.
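The idea of approximating derivatives from a small number of Gaussian-perturbed sample points can be illustrated with a generic Gaussian-smoothing gradient estimator. This is a minimal sketch of the general technique, not the authors' exact stochastic sensitivity formula or trust-region update; the function names, step sizes, and the simple gradient-descent loop are illustrative assumptions.

```python
import numpy as np

def estimate_gradient(f, x, sigma=0.1, M=5, rng=None):
    """Estimate the gradient of f at x from M Gaussian-perturbed samples.

    Generic Gaussian-smoothing estimator (an assumption, not the paper's
    exact formula):  g ~ (1/(M*sigma)) * sum_i (f(x + sigma*e_i) - f(x)) * e_i,
    with e_i ~ N(0, I).  Note that M may be smaller than the dimension n.
    """
    rng = np.random.default_rng() if rng is None else rng
    f0 = f(x)
    g = np.zeros_like(x, dtype=float)
    for _ in range(M):
        eps = rng.standard_normal(x.size)
        g += (f(x + sigma * eps) - f0) * eps
    return g / (M * sigma)

def minimize(f, x0, step=0.05, iters=500, sigma=0.05, M=5, seed=0):
    """Plain gradient descent driven by the stochastic gradient estimate."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        x -= step * estimate_gradient(f, x, sigma, M, rng)
    return x

# Quadratic test case in n = 20 dimensions, using only M = 5 < n samples
# per iteration, mirroring the paper's M < n setting.
quad = lambda x: float(x @ x)
x_star = minimize(quad, np.ones(20))
```

Even with far fewer samples than variables, each noisy estimate is an unbiased direction on average, so the descent loop still drives the quadratic objective toward its minimum; the estimate is merely noisier than a full finite-difference gradient.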

By: Hiroyuki Okano and Masato Koda

Published in: Reliability Engineering and System Safety, volume 79, no. 2, pages 245-252, 2003
