Hybridization of Chaotic Maps and Gravitational Search Algorithm for Constrained Mechanical and Civil Engineering Design Frameworks: CGSA for Mechanical and Civil Engineering Design Optimization

Sajad Ahmad Rather, P. Shanthi Bala
Copyright: © 2022 |Pages: 39
DOI: 10.4018/IJAMC.2022010102

Abstract

The Chaotic Gravitational Search Algorithm (CGSA) is a physics-based heuristic algorithm inspired by Newton's law of universal gravitation. It employs 10 chaotic maps to achieve an effective global search and a fast convergence rate. The advantages of CGSA have been incorporated into various mechanical and civil engineering design frameworks, including Speed Reducer Design (SRD), Gear Train Design (GTD), Three Bar Truss Design (TBTD), Stepped Cantilever Beam Design (SCBD), Multiple Disc Clutch Brake Design (MDCBD), and Hydrodynamic Thrust Bearing Design (HTBD). CGSA has been compared with eleven state-of-the-art stochastic algorithms. In addition, a non-parametric statistical test, namely the signed Wilcoxon rank-sum test, has been carried out at a 5% significance level to statistically validate the results. The simulation results indicate that CGSA achieves high convergence speed and lower design parameter values compared with the other heuristic algorithms. The source code is publicly available on GitHub at https://github.com/SajadAHMAD1.

1. Introduction

Over the last decade, heuristic optimization algorithms have created ripples in the international computational intelligence research community. The body of research on, and applications of, heuristic algorithms across various fields has been phenomenal. Heuristic algorithms are popular because of their stochastic nature and simplicity. According to the No Free Lunch theorem (NFLT) (Wolpert and Macready, 1997), no single algorithm can solve all optimization problems. In simpler words, if an optimization algorithm solves some problems with high performance, there is a high probability that it performs badly on other types of optimization problems. Hence, researchers have invented many optimization algorithms, and new ones are proposed every year. Well-known optimization algorithms include Particle Swarm Optimization (PSO) (Kennedy et al., 1993), inspired by the social behavior of bird flocks and fish schools; Ant Colony Optimization (ACO) (Dorigo et al., 1996), based on the foraging behavior of ants; Biogeography-Based Optimization (BBO) (Simon, 2008), inspired by the distribution and migration models of species; and Differential Evolution (DE) (Storn and Price, 1995) and the Genetic Algorithm (GA) (Tang et al., 1996), both motivated by the theory of evolution. Heuristic Algorithms (HAs) have been applied to various problems in computer science and other fields of study such as electronics, biology, and the oil industry. In computer science, HAs have been utilized for function optimization (Du and Li, 2008; Yao et al., 1999), control objectives (Baojiang and Shiyong, 2007; Karakuza, 2008; Kim et al., 2008), pattern recognition (Liu et al., 2008; Tan and Bhanu, 2006), filter modeling (Kalinlia and Karabogab, 2005), and image processing (Cordon et al., 2006; Nezamabadi-pour et al., 2006).
In electrical engineering, load dispatch (Beigvand et al., 2016; 2017) and optimal power flow (Bhowmik et al., 2015) problems have been solved efficiently by optimization algorithms.

All such optimization algorithms maintain a random population of agents that search for candidate solutions in the search space. The process starts with the initialization of the agents in the search space. The algorithm then runs for many iterations, each producing feasible candidate solutions, until a termination criterion is met. The best feasible candidate solution across all iterations is returned as the optimal solution. Stochastic algorithms balance two fundamental behaviors: exploration and exploitation. Exploration is the global search of the space, during which candidate solutions undergo large changes. Exploitation is the ability to refine the search around promising feasible solutions to locate the global optimum; candidate solutions undergo only small changes during this phase. It has been observed that an algorithm with strong exploration capability tends to lack strong exploitation capability and vice versa (Eiben and Schipper, 1998); the two pull against each other. Previously, researchers used random walks and gradient descent methods to improve exploration and exploitation, respectively. However, these have the drawback of increasing the overall computational cost of the algorithm. Over the last decade, researchers have instead utilized chaotic maps to increase the diversification and local exploitation of the search in order to find optimal candidate solutions (Mirjalili et al., 2017; Gandomi et al., 2012).
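The loop described above (initialize a population, perturb agents each iteration, keep the best solution until the termination criterion is met) can be sketched as follows. This is a minimal generic illustration, not the CGSA update rule; the function names, the sphere benchmark objective, and the shrinking step size (large early moves for exploration, small late moves for exploitation) are all assumptions made for the sketch.

```python
import random

def sphere(x):
    # Benchmark objective: global minimum 0 at the origin.
    return sum(xi * xi for xi in x)

def heuristic_search(obj, dim=5, pop_size=20, iters=200,
                     lo=-5.0, hi=5.0, seed=42):
    """Generic population-based search: random initialization,
    iterative perturbation of agents, and best-so-far tracking
    until the iteration budget (termination criterion) is met."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(pop_size)]
    best = min(pop, key=obj)
    for t in range(iters):
        # Step size shrinks over time: early iterations explore the
        # space with large moves, later ones exploit around promising
        # candidates with small moves.
        step = (hi - lo) * (1.0 - t / iters)
        for i, agent in enumerate(pop):
            candidate = [min(hi, max(lo, xi + rng.uniform(-step, step)))
                         for xi in agent]
            if obj(candidate) < obj(agent):  # greedy acceptance
                pop[i] = candidate
        current_best = min(pop, key=obj)
        if obj(current_best) < obj(best):
            best = current_best
    return best, obj(best)

best, val = heuristic_search(sphere)
```

Real heuristics such as GSA replace the uniform perturbation with a physics-inspired update (gravitational attraction between agents), but the overall skeleton is the same.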

Chaos theory is the study of dynamical systems. The defining property of these systems is that a minor change in the initial parameter(s) propagates variations throughout the system. Moreover, randomness is not necessary for chaotic behavior; fully deterministic systems also exhibit chaos (Kellert, 2017). Optimization algorithms utilize the Initial Parameter Sensitivity (IPS) property of chaotic maps for fast exploitation, global exploration, and escape from local minima entrapment.
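The initial-parameter sensitivity that makes chaotic maps useful here can be demonstrated with the logistic map, one of the standard maps used in chaotic variants of heuristic algorithms. The sketch below is illustrative only: the particular initial values and perturbation size are assumptions, and which 10 maps CGSA uses is specified in the paper itself.

```python
def logistic_map(x0, r=4.0, n=50):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k).
    With r = 4 the orbit on (0, 1) is chaotic."""
    seq = [x0]
    for _ in range(n - 1):
        seq.append(r * seq[-1] * (1.0 - seq[-1]))
    return seq

# Two orbits whose starting points differ by only 1e-7.
a = logistic_map(0.7)
b = logistic_map(0.7000001)

# The tiny initial difference is amplified until the orbits
# become completely uncorrelated: sensitivity to initial conditions.
divergence = max(abs(x - y) for x, y in zip(a, b))
```

In chaotic optimizers, such a sequence typically replaces a uniform random number inside the update rule, so the search trajectory inherits this non-repeating, well-spread behavior.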
