Derivative Free Algorithms for Optimization of Function with Noise

We consider a class of optimization problems that satisfy the following properties: (a) the objective function can only be evaluated with some error and at high computational cost; (b) the error can be decreased with more computational effort; (c) higher-order derivatives of the objective function are unavailable. Such problems commonly arise in engineering design (e.g., helicopter rotor blade design), simulation optimization (e.g., revenue management), etc. Our aim is to develop convergent algorithms that can solve such problems while requiring the fewest possible objective function evaluations.
In order to do so, we first develop a general framework of convergence for optimization algorithms. Using this framework, we can show the convergence of traditional nonlinear programming algorithms that have been suitably modified to use approximations of the objective function and its gradient. Then, we present one particular scheme for approximating the gradient and Hessian of the objective function using linear regression. Finally, we describe a trust region algorithm that uses linear regression to form a linear or quadratic model of the objective function, and provide computational results for such an algorithm run on problems from the CUTE test set.
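The regression-based gradient approximation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it samples the noisy function at points near x, fits an affine model by least squares, and reads the gradient estimate off the fitted slope. The sampling radius and sample count are illustrative choices, not values from this work.

```python
import numpy as np

def regression_gradient(f, x, radius=1e-2, n_samples=None, rng=None):
    """Estimate the gradient of a noisy function f at x.

    Fits an affine model c + g.(y - x) to evaluations of f at points
    sampled in a ball of the given radius around x; the slope g is the
    gradient estimate. radius and n_samples are hypothetical defaults
    for illustration only.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = len(x)
    m = 2 * d + 1 if n_samples is None else n_samples
    # Random displacements around x; include the center point itself.
    steps = radius * rng.standard_normal((m, d))
    steps[0] = 0.0
    # Design matrix [1, (y - x)] for the affine model.
    A = np.hstack([np.ones((m, 1)), steps])
    fvals = np.array([f(x + s) for s in steps])
    # Least-squares fit; averaging over samples damps evaluation noise.
    coef, *_ = np.linalg.lstsq(A, fvals, rcond=None)
    return coef[1:]  # the slope is the gradient estimate
```

In a trust region method of the kind described, such an estimate (together with an analogous quadratic fit for the Hessian) would define the local model minimized within the trust region at each iteration.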

