Augmented Downhill Simplex: A Modified Heuristic Optimization Method

Author

Department of Electrical Engineering, Center of Excellence on Soft Computing and Intelligent Information Processing (SCIIP), Ferdowsi University of Mashhad, Mashhad, Iran

Abstract

The Augmented Downhill Simplex Method (ADSM), a heuristic combination of the Downhill Simplex Method (DSM) with a random search algorithm, is introduced here. DSM is an interpretable nonlinear local optimization method; however, because it relies purely on local exploitation, it can become trapped in a local minimum. Random search, in contrast, explores globally but is less efficient. Here, random search serves as a global exploration operator in combination with DSM as the local exploitation method. The resulting algorithm is a derivative-free, fast, simple, nonlinear optimization method that is easy to implement numerically. The efficiency and reliability of the presented algorithm are compared with those of several other optimization methods, namely the traditional downhill simplex, random search, and steepest descent. Simulations verify the merits of the proposed method.
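The abstract does not specify how ADSM interleaves its two operators, so the following is only a minimal sketch of the general idea it describes: random search as a global exploration operator that supplies starting points, and a standard downhill simplex (Nelder-Mead) as the local exploitation step. The function names `nelder_mead` and `augmented_simplex`, the restart scheme, and all coefficient values are illustrative assumptions, not the paper's algorithm.

```python
import random

def nelder_mead(f, x0, step=0.5, iters=200):
    # Standard downhill simplex (Nelder-Mead) with the usual coefficients:
    # reflection 1, expansion 2, contraction 0.5, shrink 0.5.
    n = len(x0)
    simplex = [list(x0)]
    for i in range(n):                      # initial simplex: x0 plus one
        p = list(x0)                        # perturbed point per dimension
        p[i] += step
        simplex.append(p)
    for _ in range(iters):
        simplex.sort(key=f)                 # best first, worst last
        best, worst = simplex[0], simplex[-1]
        centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [2 * centroid[i] - worst[i] for i in range(n)]   # reflection
        if f(refl) < f(best):
            exp = [centroid[i] + 2 * (centroid[i] - worst[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl     # expansion
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            contr = [centroid[i] + 0.5 * (worst[i] - centroid[i])
                     for i in range(n)]                         # contraction
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                           # shrink all points toward best
                simplex = [best] + [[(best[i] + p[i]) / 2 for i in range(n)]
                                    for p in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]

def augmented_simplex(f, bounds, restarts=20, seed=0):
    # Global exploration: uniformly random starting points within bounds;
    # local exploitation: DSM refines each one. Keep the overall best.
    rng = random.Random(seed)
    best = None
    for _ in range(restarts):
        x0 = [rng.uniform(lo, hi) for lo, hi in bounds]
        x = nelder_mead(f, x0)
        if best is None or f(x) < f(best):
            best = x
    return best
```

Restarting DSM from random points is one simple way to realize the exploration/exploitation split the abstract describes; the actual ADSM may couple the two operators more tightly.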

Keywords



Volume 5, Issue 2
Summer and Autumn 2012
Pages 1-6
  • Received: 22 September 2011
  • Revised: 06 October 2011
  • Accepted: 08 November 2011