A New Bio-Inspired Algorithm Based on the Hunting Behavior of Cheetah


D. Saravanan, P. Victer Paul, S. Janakiraman, Ankur Dumka, L. Jayakumar
Copyright © 2020 | Pages: 18
DOI: 10.4018/IJITPM.2020100102

Abstract

Soft computing is recognized as the fusion of methodologies designed to model and formulate solutions to real-world problems that are too difficult to model mathematically. The grey wolf optimizer (GWO) is a recently proposed bio-inspired optimization algorithm based on the foraging and hunting behavior of grey wolves. GWO has proven effective in solving complex problems, but it still suffers from low solution precision, slow convergence, and poor local search ability. To overcome these shortcomings of existing algorithms, this paper proposes a novel algorithm based on the foraging behavior of the cheetah. Cheetahs are well known for their leadership hierarchy, decision making, and efficient communication among teammates during group hunting. Well-known unimodal and multimodal benchmark functions are chosen as the testbed, and experiments are performed on them. The proposed scheme outperforms existing approaches in terms of computational time and quality of the optimal solution.
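As an illustration of the kind of testbed described above, the following is a minimal sketch (not taken from the paper) of two benchmark functions commonly used in such experiments: the sphere function (unimodal) and the Rastrigin function (multimodal). Which specific functions the authors evaluated is an assumption here.

import math

def sphere(x):
    # Unimodal: a single global minimum, f(0, ..., 0) = 0.
    return sum(xi * xi for xi in x)

def rastrigin(x):
    # Multimodal: many local minima; global minimum f(0, ..., 0) = 0.
    return 10.0 * len(x) + sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) for xi in x)

print(sphere([0.0, 0.0]), rastrigin([0.0, 0.0]))   # both evaluate to 0.0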

1. Introduction

Optimization is an integral part of the engineering design process. It focuses on discovering optimum solutions to a design problem through systematic consideration of alternatives while satisfying resource and cost constraints. Many engineering problems are open-ended and complex. The overall design objective in these problems may be to minimize cost, maximize profit, streamline production, increase process efficiency, and so on. Optimization is a commonly encountered mathematical problem in all engineering disciplines; it literally means finding the best possible or most desirable solution. Optimization problems are wide-ranging and numerous, so methods for solving them remain an active research topic (Shufen Liu, Huang Leng, Lu Han, 2017).

Optimization techniques are methods for finding the optimal solution; they include both system structures and computational methods. They are commonly applied to problems that admit more than one candidate solution. Many such techniques are available today, including numerical optimization, linear optimization, nonlinear optimization, constrained optimization, combinatorial optimization, stochastic programming, evolutionary algorithms (EA), particle swarm optimization (PSO), genetic algorithms (GA), and others. Many techniques are appropriate only for certain types of problems, so it is important to recognize the characteristics of a problem and to identify an appropriate technique for it; for each class of problems there are different minimization methods, varying in computational requirements, convergence properties, and so on. Optimization problems are classified according to the mathematical characteristics of the objective function, the constraints, and the control variables; the most important characteristic is the nature of the objective function (Martin Fleck, Javier Troya, et al., 2017) (Chunquan Li, Zhenshou Song, et al., 2017).
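As an illustration of one of the population-based techniques listed above, the following is a minimal sketch of standard particle swarm optimization; it is not the algorithm proposed in this paper, and the parameter values (inertia weight, acceleration coefficients, bounds, swarm size) are illustrative assumptions.

import random

def pso(f, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0), w=0.7, c1=1.5, c2=1.5):
    # Illustrative parameter values; not tuned for any particular problem.
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal best positions
    pbest_val = [f(p) for p in pos]                  # personal best values
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # global best so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Example: minimize the 2-D sphere function; the result should approach (0, 0).
best_pos, best_val = pso(lambda x: sum(xi * xi for xi in x), dim=2)
print(best_pos, best_val)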

Although the word “optimization” shares the same root as “optimal”, it is rare for the process of optimization to produce a truly optimal system. The optimized system will typically be optimal only for one application or for one audience. One might reduce the amount of time a program takes to perform some task at the price of making it consume more memory; in an application where memory space is at a premium, one might deliberately choose a slower algorithm in order to use less memory. Often there is no “one size fits all” design that works well in all cases, so engineers make trade-offs to optimize the attributes of greatest interest. Additionally, the effort required to make a piece of software completely optimal, that is, incapable of any further improvement, is almost always more than is reasonable for the benefits that would accrue, so the process of optimization may be halted before a completely optimal solution has been reached. Fortunately, the greatest improvements often come early in the process. The two broad types of optimization techniques in existence are deterministic optimization and stochastic optimization (P. Victer Paul, et al., 2013) (L. Jayakumar, et al., 2016).

Deterministic optimization aims at designing and analyzing mathematical methods and algorithms for optimization problems. In particular, the focus is on topological and geometrical methods for combinatorial, nonlinear, and integer optimization problems. Deterministic processes never involve probability; outcomes follow from predictable and exact input values. The problems investigated originate from operations research, computer science, and technology, and discrete mathematics is used to solve them.
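A minimal sketch of the deterministic setting described above: fixed-step gradient descent on a simple quadratic. No randomness is involved, so repeated runs from the same starting point always produce identical iterates; the objective, step size, and starting point are illustrative assumptions.

# Deterministic example: gradient descent on f(x) = (x - 3)^2.
def gradient_descent(grad, x0, step=0.1, iters=50):
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)       # exact, repeatable update
    return x

print(gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0))  # approaches 3.0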

Stochastic optimization is the process of maximizing or minimizing the value of a mathematical or statistical function when one or more of the input parameters is subject to randomness. The word stochastic means involving chance or probability. Stochastic processes are commonly involved in business analytics (BA), sales, service, manufacturing, finance, and communications. They always involve probability, such as when predicting the water level in a reservoir at a certain time based on a random distribution of rainfall and water usage, or estimating the number of dropped connections in a communication network based on randomly varying traffic but constant available bandwidth (Yongxin Chen, et al., 2018).
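A minimal sketch of the stochastic setting, loosely modeled on the reservoir example above: the inflow is random, so the expected cost of a release decision is estimated by Monte Carlo sampling and minimized over a small set of candidates. The inflow distribution, cost function, and candidate values are illustrative assumptions, not from the paper.

import random

def expected_cost(release, samples=5000):
    # Estimate the expected deviation of the reservoir level from its target
    # when the inflow is random (Monte Carlo sample average).
    target = 50.0
    total = 0.0
    for _ in range(samples):
        inflow = random.gauss(10.0, 2.0)        # assumed random inflow (rainfall)
        level = target + inflow - release       # resulting water level
        total += abs(level - target)            # penalty for missing the target
    return total / samples

# Pick the release decision with the lowest estimated expected cost.
candidates = [6.0, 8.0, 10.0, 12.0, 14.0]
best_release = min(candidates, key=expected_cost)
print(best_release)                             # usually 10.0, the mean inflow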
