An Adaptive Population-based Simplex Method for Continuous Optimization

Mahamed G.H. Omran, Maurice Clerc
Copyright: © 2016 | Pages: 29
DOI: 10.4018/IJSIR.2016100102

Abstract

This paper proposes a new population-based simplex method for continuous function optimization. The proposed method, called Adaptive Population-based Simplex (APS), is inspired by the Low-Dimensional Simplex Evolution (LDSE) method. LDSE is a recent optimization method that uses the reflection and contraction steps of the Nelder-Mead Simplex method. Like LDSE, APS uses a population from which different simplexes are selected. In addition, a local search is performed using a hyper-sphere generated around the best individual in a simplex. APS is a tuning-free approach that is easy to code and easy to understand. APS is compared with five state-of-the-art approaches on 23 functions, five of which are quasi-real-world problems. The experimental results show that APS generally performs better than the other methods on the test functions. In addition, a scalability study has been conducted, and its results show that APS works well on relatively high-dimensional problems.
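
For illustration, the hyper-sphere local search mentioned above can be sketched as follows. This is a minimal sketch, not the authors' exact APS rule: the uniform-in-ball sampling, the radius parameter, and the fixed number of trials are assumptions made here for concreteness.

```python
import numpy as np

def sample_in_hypersphere(center, radius, rng):
    """Draw one point uniformly inside a D-dimensional hyper-sphere
    centred on `center`: a normalized Gaussian gives a uniform direction,
    and radius * u**(1/D) gives a volume-uniform distance."""
    d = center.size
    direction = rng.standard_normal(d)
    direction /= np.linalg.norm(direction)
    r = radius * rng.random() ** (1.0 / d)
    return center + r * direction

def hypersphere_local_search(f, best, radius, trials, rng):
    """Sample a few points around the simplex's best individual and
    keep any improvement (greedy acceptance, assumed here)."""
    best_val = f(best)
    for _ in range(trials):
        candidate = sample_in_hypersphere(best, radius, rng)
        val = f(candidate)
        if val < best_val:
            best, best_val = candidate, val
    return best, best_val
```

Drawing the distance as u**(1/D) rather than u is what makes the samples uniform over the ball's volume instead of clustered near the centre.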

1. Introduction

Many optimization algorithms have been proposed for solving the continuous nonlinear function optimization problem:

$$\min_{\vec{x}\in[\vec{x}_{\min},\,\vec{x}_{\max}]} f(\vec{x})$$

The vector $\vec{x} \in \mathbb{R}^D$ is composed of D real-valued variables, and the bound vectors $\vec{x}_{\min}$ and $\vec{x}_{\max}$ are assumed finite and to satisfy $\vec{x}_{\min} < \vec{x}_{\max}$ componentwise. The function $f$ is typically multimodal, so local optima do not in general coincide with the global optimum. Well-known optimization methods used to solve this problem include Genetic Algorithms (GA) (Goldberg 1989), Particle Swarm Optimization (PSO) (Kennedy and Eberhart 1995) and Differential Evolution (DE) (Storn and Price 1995).
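
As a concrete instance of this formulation, the sketch below sets up a standard multimodal benchmark together with its box constraints. The Rastrigin function is chosen here purely for illustration; it is not necessarily among the paper's 23 test functions.

```python
import numpy as np

# Illustrative instance: the Rastrigin function, a standard multimodal
# benchmark, defined over the box [-5.12, 5.12]^D.
def rastrigin(x):
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

D = 10
x_min = np.full(D, -5.12)   # finite lower bounds
x_max = np.full(D,  5.12)   # finite upper bounds, x_min < x_max componentwise

# A feasible random point and its objective value:
rng = np.random.default_rng(0)
x = x_min + rng.random(D) * (x_max - x_min)
print(rastrigin(x))
```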

In this paper we are looking for a method that is easy to understand, easy to code, easy to use, but nevertheless efficient:

  • Easy to use: requires no tuning and has few parameters (ideally, none visible to the user).

  • Easy to implement: typically around 50 instructions in a language like MATLAB®.

  • Easy to understand: one can “visualize” what happens on 2D or 3D problems.

  • Efficient: based on properties that are valid in any dimension (or, at least, over a wide range of dimensions, say 2 to 100).

Bio-inspired algorithms (Olariu and Zomaya 2005) may be good candidates, but they originate in the three-dimensional physical world, and most of them perform well only on low-dimensional problems.

Hence the idea was to “revisit” the classical Nelder-Mead (NM) simplex method (Nelder and Mead 1965), which is purely geometrical yet intuitive and easy to implement. However, NM is well known to perform poorly on multimodal problems because it is easily trapped in a local optimum (Lagarias et al. 1998). In addition, NM struggles on high-dimensional problems (Wright 1996). Consequently, several modifications of NM have been proposed in the literature to improve its performance for global optimization (e.g. Zhao et al. 2009; Gao and Han 2012). A recent modification was proposed by Karimi and Siarry (2012): a global optimization method based on NM, called Global Simplex Optimization (GSO), which incorporates a multi-stage, stochastic and weighted version of the NM reflection operator. GSO generally performed better than five optimization methods (including CMA-ES (Hansen 2006)) on 17 functions. However, no problem in their study had dimension greater than 10.
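
For reference, the two NM operators that LDSE borrows (and that APS inherits) can be sketched as follows. This is a deliberately simplified single NM step with the classical coefficients (reflection alpha = 1, contraction beta = 0.5); the full NM method also includes expansion and shrink steps, which are omitted here.

```python
import numpy as np

def reflect_and_contract(simplex, f, alpha=1.0, beta=0.5):
    """One simplified NM step on a (D+1)-point simplex.
    `simplex` has shape (D+1, D); alpha and beta are the classical
    reflection and contraction coefficients."""
    order = np.argsort([f(p) for p in simplex])
    simplex = simplex[order]               # sort: best first, worst last
    worst = simplex[-1]
    centroid = simplex[:-1].mean(axis=0)   # centroid of all but the worst

    # Reflection: mirror the worst point through the centroid.
    reflected = centroid + alpha * (centroid - worst)
    if f(reflected) < f(simplex[-2]):      # better than second worst: accept
        simplex[-1] = reflected
    else:
        # Contraction: pull the worst point toward the centroid.
        contracted = centroid + beta * (worst - centroid)
        if f(contracted) < f(worst):
            simplex[-1] = contracted
        # (The full NM method would shrink the simplex here; omitted.)
    return simplex
```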
