Parameter Selection for Swarm Intelligence Algorithms: Case Study on Parallel Implementation of FSS

Breno A. M. Menezes, Fabian Wrede, Herbert Kuchen, Fernando B. Lima Neto
Copyright: © 2018 | Pages: 20
DOI: 10.4018/IJSIR.2018100101

Abstract

Swarm intelligence (SI) algorithms are handy tools for solving complex optimization problems. When problems grow in size and complexity, an increase in population size or in the number of iterations might be required in order to achieve a good solution. These adjustments also impact the execution time. This article investigates the trade-off involving population size, number of iterations, and problem complexity, aiming to improve the efficiency of SI algorithms. Results based on a parallel implementation of Fish School Search show that increasing the population size is beneficial for finding good solutions. However, we observed an asymptotic behavior, i.e., increasing the population beyond a certain threshold only leads to slight improvements. Furthermore, the execution time was analyzed.

Introduction

In many real-world optimization problems, finding the optimal solution with exact methods can be hard and extremely time-consuming. To tackle such problems, metaheuristics are often used to find suitable solutions, not necessarily optimal ones, in a reasonable amount of time (Talbi, 2009).

Swarm Intelligence (SI) algorithms belong to the family of population-based metaheuristics. Inspired by the behavior of gregarious species, their main characteristic is a population of very simple, short-sighted agents. These agents cooperate through simple communication and behavior patterns that guide each agent's movement through the problem's search space (Eberhart & Kennedy, 1995).

Fish School Search (FSS) was first introduced in 2008 (Bastos-Filho, de Lima Neto, Lins, Nascimento & Lima, 2008). FSS is an SI algorithm inspired by the behavior of a fish school. It incorporates the concept of feeding: each particle (fish) has its own weight and gains or loses weight according to its success. Heavier fish are more influential among the other members of the school and have more power to attract them. The collective movement operator also plays an important role in FSS. If the overall weight of the fish school is increasing, the school is in a promising area that should be exploited, and in this case it contracts: each fish moves towards the barycenter of the school. If, on the other hand, the fish school loses weight, it expands, allowing the fish to explore new areas. With this feature, FSS is capable of automatically switching between exploration and exploitation during the search process.
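To make these operators concrete, the following is a minimal, self-contained C++ sketch of the feeding and collective-volitive movement steps described above. All names (Fish, feeding, volitiveMove, stepVol) and the sphere objective are illustrative assumptions; this is not the authors' parallel implementation.

#include <algorithm>
#include <cmath>
#include <iostream>
#include <random>
#include <vector>

// Hypothetical minimal fish representation (illustrative only).
struct Fish {
    std::vector<double> pos;
    double weight;
    double deltaF;  // fitness improvement of the last individual move
};

// Sphere function as a stand-in objective (minimization).
double fitness(const std::vector<double>& x) {
    double s = 0.0;
    for (double xi : x) s += xi * xi;
    return s;
}

// Feeding operator: a fish gains weight in proportion to its fitness
// improvement, normalized by the best improvement in the school.
void feeding(std::vector<Fish>& school) {
    double maxDelta = 1e-12;
    for (const Fish& f : school) maxDelta = std::max(maxDelta, f.deltaF);
    for (Fish& f : school) f.weight += f.deltaF / maxDelta;
}

// Collective-volitive movement: contract towards the weighted barycenter
// if the school gained weight, expand away from it otherwise.
void volitiveMove(std::vector<Fish>& school, double prevTotalWeight,
                  double stepVol, std::mt19937& rng) {
    const size_t dims = school[0].pos.size();
    double totalWeight = 0.0;
    for (const Fish& f : school) totalWeight += f.weight;

    std::vector<double> bary(dims, 0.0);  // weighted barycenter
    for (const Fish& f : school)
        for (size_t d = 0; d < dims; ++d) bary[d] += f.pos[d] * f.weight;
    for (size_t d = 0; d < dims; ++d) bary[d] /= totalWeight;

    const double sign = (totalWeight > prevTotalWeight) ? -1.0 : 1.0;
    std::uniform_real_distribution<double> u(0.0, 1.0);
    for (Fish& f : school) {
        double dist = 0.0;
        for (size_t d = 0; d < dims; ++d)
            dist += (f.pos[d] - bary[d]) * (f.pos[d] - bary[d]);
        dist = std::sqrt(dist) + 1e-12;
        for (size_t d = 0; d < dims; ++d)
            f.pos[d] += sign * stepVol * u(rng) * (f.pos[d] - bary[d]) / dist;
    }
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> init(-10.0, 10.0);
    std::vector<Fish> school(30, Fish{std::vector<double>(5, 0.0), 1.0, 0.0});
    for (Fish& f : school)
        for (double& x : f.pos) x = init(rng);
    // Placeholder fitness improvements; in full FSS these come from
    // the individual movement step, which is omitted here.
    for (Fish& f : school) f.deltaF = (init(rng) > 0.0) ? 1.0 : 0.1;
    feeding(school);
    volitiveMove(school, 30.0, 0.1, rng);
    std::cout << "fitness of fish 0: " << fitness(school[0].pos) << "\n";
}

In full FSS, the individual and collective-instinctive movement steps precede these operators in each iteration; they are omitted here to keep the sketch focused on the weight-based mechanism described above.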

Algorithms such as FSS have already been applied to many optimization problems and have shown good performance on many of them. SI algorithms have distinct sources of inspiration, but they share many characteristics, such as a population size and a number of iterations to be executed in order to achieve an acceptable solution. Here, we use FSS as a case study.

The population size, a parameter common to every SI algorithm, can directly affect the performance of the search process on an optimization problem. Depending on the complexity of the problem (considering the landscape, the number of dimensions, and the range of each dimension), more particles might be needed to ensure convergence to a good solution. With a larger number of particles, it is possible to generate more diversity in the initial phase of the algorithm and to cover a larger area of the search space. On the other hand, adding too many particles unnecessarily increases the computational cost of every iteration and might even mislead the swarm and slow down convergence (Yassin, Taib, Adnan, Salleh & Hamzah, 2012). Normally, this parameter is set empirically, an approach that is prone to error.
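To illustrate the diminishing returns of ever-larger populations, here is a small, self-contained C++ experiment. As a deliberately crude stand-in for a swarm, it uses plain random sampling on the sphere function (an assumption made for brevity; this is not FSS, and the name bestAfterSearch and all chosen ranges are purely illustrative).

#include <algorithm>
#include <cmath>
#include <cstdio>
#include <initializer_list>
#include <random>
#include <vector>

// Sphere objective as a stand-in problem (minimization).
double sphere(const std::vector<double>& x) {
    double s = 0.0;
    for (double xi : x) s += xi * xi;
    return s;
}

// Best fitness found by pure random search with `pop` samples per
// iteration; a crude stand-in for a swarm, just to expose the trend.
double bestAfterSearch(int pop, int iterations, std::mt19937& rng) {
    std::uniform_real_distribution<double> u(-100.0, 100.0);
    double best = 1e300;
    for (int it = 0; it < iterations; ++it)
        for (int i = 0; i < pop; ++i) {
            std::vector<double> x(10);
            for (double& xi : x) xi = u(rng);
            best = std::min(best, sphere(x));
        }
    return best;
}

int main() {
    std::mt19937 rng(1);
    for (int pop : {10, 50, 250, 1250})
        std::printf("pop=%4d  best=%g\n", pop, bestAfterSearch(pop, 100, rng));
}

In a run of this toy experiment, the best value typically improves quickly at first and then flattens as the population grows, which is the kind of asymptotic behavior discussed in this article.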

The number of iterations is also an important parameter of SI algorithms, and it is often used as a termination criterion. Depending on the problem being solved, too few iterations might not suffice to find an acceptable solution, and the execution might stop while the algorithm is still improving. On the other hand, too many iterations might waste time and resources once the algorithm has already found the best solution it can (Van Den Bergh & Engelbrecht, 2001). In real-world applications, it is desirable to base the termination condition on the quality of the results. A not uncommon difficulty, however, is that the desired precision may never be reached, or may take too long to reach, so a fixed maximum number of iterations may still be needed.
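One common way to combine both criteria is a stopping rule that terminates on stagnation but is capped by a fixed iteration budget. The following C++ sketch is a generic illustration under assumed names (shouldStop, epsilon, patience); it is not taken from the article.

#include <cmath>
#include <cstdio>

// Hypothetical combined stopping rule: terminate when the best fitness
// has improved by less than `epsilon` for `patience` consecutive
// iterations, but never run longer than `maxIterations`.
bool shouldStop(double bestFitness, double& lastBest, int& stagnant,
                int iteration, int maxIterations,
                double epsilon = 1e-8, int patience = 50) {
    if (iteration >= maxIterations) return true;  // hard cap
    if (std::fabs(lastBest - bestFitness) < epsilon)
        ++stagnant;                               // no meaningful progress
    else
        stagnant = 0;
    lastBest = bestFitness;
    return stagnant >= patience;
}

int main() {
    double lastBest = 1e9;
    int stagnant = 0;
    double best = 100.0;
    for (int it = 0; ; ++it) {
        best *= 0.99;  // stand-in for one optimization iteration
        if (shouldStop(best, lastBest, stagnant, it, 10000)) {
            std::printf("stopped at iteration %d, best = %g\n", it, best);
            break;
        }
    }
}

In this toy run, the loop stops as soon as 50 consecutive iterations fail to improve the best fitness by more than 1e-8, well before the 10,000-iteration cap is reached.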

Both parameters, population size and number of iterations, also increase the computational cost of the search process when they grow. If they are set too large, the execution time grows significantly.

Parallel implementations of SI algorithms are a solution for enhancing the performance when tackling problems with a higher computational cost. Nowadays, easy access to multi- and many-core processors has created many possibilities to speed up such algorithms. Many tools are available and have already been explored for this purpose, from low-level frameworks, such as OpenMP, MPI, and CUDA, to high-level frameworks, such as Muesli (Kuchen, 2002; Ciechanowicz & Kuchen, 2010; Ernsting & Kuchen, 2012; Ernsting & Kuchen, 2017) or MALLBA (Alba, Luque, García-Nieto, Ordóñez & Leguizamón, 2007). The key issue is then to determine a reasonable setup for the parameters, considering the trade-off between the quality of the solution and the effort required to achieve it.
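The main source of parallelism that such frameworks exploit is that the fitness evaluations within one iteration are independent across fish. A minimal OpenMP sketch of this pattern follows; the Rastrigin objective and all names are illustrative assumptions, and this is not the Muesli-based implementation evaluated in the article.

#include <cmath>
#include <cstdio>
#include <vector>

// Rastrigin function as an illustrative, moderately expensive objective.
double rastrigin(const std::vector<double>& x) {
    const double pi = std::acos(-1.0);
    double s = 10.0 * x.size();
    for (double xi : x) s += xi * xi - 10.0 * std::cos(2.0 * pi * xi);
    return s;
}

int main() {
    const int population = 4096, dims = 512;
    std::vector<std::vector<double>> school(
        population, std::vector<double>(dims, 0.5));
    std::vector<double> fit(population);

    // The evaluations are independent, so the loop parallelizes trivially;
    // for large populations this is usually the dominant cost per iteration.
    #pragma omp parallel for
    for (int i = 0; i < population; ++i)
        fit[i] = rastrigin(school[i]);

    std::printf("fitness[0] = %f\n", fit[0]);
}

Compiled with -fopenmp, the loop is distributed over all available cores; without OpenMP the pragma is simply ignored and the code runs sequentially, which keeps the sketch portable.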
