Sparse Based Particle Swarm Optimization Algorithm

Particle Swarm Optimization (PSO) is a well-known metaheuristic optimization algorithm inspired by the collective behavior of swarms of species. PSO has been applied to a wide range of engineering and science problems, including, but not limited to, increasing the heat transfer of systems and diagnosing health problems from microscopic imaging. One limitation of Standard-PSO and other swarm-based algorithms is their large computational time, since position vectors are dense. In this study, a sparse-initialization-based PSO (Sparse-PSO) algorithm is proposed and compared with Standard-PSO on several standard benchmark objective functions. The proposed Sparse-PSO method takes less computation time and provides better solutions than Standard-PSO for almost all benchmark functions. The results also show that Sparse-PSO is particularly effective for multimodal functions such as the Qing, Salomon, Styblinski-Tang, Xin-She Yang, and Xin-She Yang N.2 functions, achieving the global best value in less computation time than Standard-PSO. Furthermore, the Wilcoxon test confirms that Sparse-PSO is significantly better than Standard-PSO at achieving optimal results for most of the functions. Future work will refine the algorithm's parameters and evaluate the method on more complex functions.


Introduction
Nature Inspired Algorithms (NIA) are prominent strategies derived from natural surroundings (Yang 2014) and are mostly used in optimization tasks. Swarm intelligence algorithms form one category of NIA, based on the cooperative behavior of swarm members. J. Kennedy and R. Eberhart (1995) proposed the Particle Swarm Optimization algorithm (PSO), which belongs to the family of swarm-based metaheuristic algorithms (Kennedy 2011; Marini and Walczak 2015). PSO is inspired by the dynamic movement of insects, birds, fishes, etc. In science and engineering, PSO has been used to tackle non-differentiable, non-linear, and multimodal optimization problems (Shi 2001; Poli 2008; Liu et al. 2016; Harrison et al. 2018), such as finding optimal risk investment portfolios (Zhu et al. 2011), satisfying temperature distribution and heat flow constraints (Farahmand et al. 2012), optimizing integrated solar combined cycle power plants (Mabrouk et al. 2018), and many more. Limitations of PSO reported in some studies, namely inadequate speed in reaching the optimum point and unsatisfactory accuracy, motivate researchers to develop and enhance it further. PSO is a simple-to-implement algorithm and has fewer adjustable parameters than comparable algorithms. In recent years many improvements have been made to PSO for different application areas of optimization, such as chaos-based PSO to improve convergence speed (Gao et al. 2019), diminishing-population PSO to converge the swarm on the most favorable point (Soudan and Saad 2008), FCPSO, which balances the diversity of locations to achieve convergence (Sahu et al. 2012), and many more. Nature-inspired optimization techniques are also used in signal processing, for example to improve the range of the search space, to filter noisy signals (Ahirwal et al. 2015), and to optimize adaptive noise cancellers (Ahirwal et al. 2016).
Basically, the execution of PSO starts with randomly distributed particles (a population of solutions) inside the search space. As the iterations continue, the particles move as a group towards the most favorable point (Kennedy 2011). To appraise the optimality of each solution (particle), the fitness function is evaluated in each cycle; an update mechanism is then applied to the location of each particle so that the swarm moves towards the optimal point and convergence is aided. This process is repeated many times until the particles converge at the global optimum. It is challenging to complete all of the above steps within a reasonable amount of computation time when the objective function is complex, and it is also difficult to achieve significant improvement (Mahdavi et al. 2015; Cheng 2016; Yang 2018; Sengupta et al. 2019). Several variants of the Standard-PSO algorithm have been developed to improve computation time for particular applications, such as task scheduling management (Pandey et al. 2010; Zhan and Huo 2012).
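The loop described above can be sketched in plain NumPy. This is only an illustrative minimal implementation, not the paper's code; the parameter values (inertia w, acceleration coefficients c1 and c2, swarm size, bounds) are typical defaults assumed for the sketch.

```python
import numpy as np

def pso(objective, dim=2, swarm_size=30, iters=200,
        bounds=(-5.0, 5.0), w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise `objective` with a basic PSO loop (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    # Random initial positions and (small) velocities inside the search space
    x = rng.uniform(lo, hi, size=(swarm_size, dim))
    v = rng.uniform(-(hi - lo), hi - lo, size=(swarm_size, dim)) * 0.1
    pbest = x.copy()                              # personal best positions
    pbest_cost = np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_cost.argmin()].copy()     # best position in the swarm
    for _ in range(iters):
        r1 = rng.random((swarm_size, dim))
        r2 = rng.random((swarm_size, dim))
        # Velocity update: inertia + cognitive + social terms
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                # position update, in bounds
        cost = np.apply_along_axis(objective, 1, x)
        improved = cost < pbest_cost
        pbest[improved] = x[improved]
        pbest_cost[improved] = cost[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()

sphere = lambda z: float(np.sum(z * z))  # simple unimodal test function
best_x, best_f = pso(sphere)
```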
The main aim of this paper is to gain speed-up by reducing computation time and to accomplish better efficiency with the proposed Sparse-PSO. The rest of the paper is organized as follows: a basic description of the Standard-PSO algorithm is provided in Section 2. The proposed Sparse-PSO algorithm is explained in Section 3. Section 4 presents the experimental results for the proposed approach and compares them with results for Standard-PSO. Section 5 contains a summary and concluding remarks.

Standard-Particle Swarm Optimization
Particle Swarm Optimization is a very popular optimization algorithm, as it has only a few parameters. PSO consists of a group of particles known as a swarm. PSO is an iterative method, since many iterations of the procedure have to be performed to reach the optimal value. In PSO, an initial population of particles is first created by arbitrarily initializing the position and velocity vectors within the defined search space. Each particle has a fitness value to check solution quality at each iteration. This fitness value is obtained through an objective function, which may be of minimization or maximization type depending on the problem. In this algorithm, two quantities are significant: the personal best (pbest) solution a particle has accomplished so far, and the global best (gbest) particle, which is the best among all individual best solutions achieved so far. The dynamic journey of the particle is controlled by its personal flying knowledge as well as the flying experience of other particles (pbest and gbest) in the swarm. Each particle's velocity and position are updated by using equations (1) and (2), respectively.
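Equations (1) and (2) are referenced but not reproduced in this text; the standard PSO formulation, consistent with the term-by-term description that follows, is:

```latex
v_i^{t+1} = w\, v_i^{t} + c_1 r_1 \left( pbest_i^{t} - x_i^{t} \right)
          + c_2 r_2 \left( gbest^{t} - x_i^{t} \right) \tag{1}

x_i^{t+1} = x_i^{t} + v_i^{t+1} \tag{2}
```

Here $x_i^t$ and $v_i^t$ are the position and velocity of particle $i$ at iteration $t$, $w$ is the inertia weight, $c_1$ and $c_2$ are the cognitive and social acceleration coefficients, and $r_1, r_2 \in [0,1]$ are uniform random numbers.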
In equation (1), the first term of the velocity update carries the influence of the previously used velocity into the current one, keeping the particle moving in the same direction with the same velocity (Shi and Eberhart 1998). The second term pulls the particle back towards its previous personal best position whenever that position is better, according to the objective function, than its current one. The third term drives the particle to follow the best particle in the swarm.

Proposed Initialization Process of Sparse-PSO
In this work, a particle is represented as a position vector of magnitude M, where M indicates the number of dimensions or variables. Our aim is to speed up the computation time of the algorithm. To this end, the population of particles, called a swarm, of size N is arbitrarily generated within the defined boundaries, but in sparse matrix representation, so that the optimum point is reached rapidly. After initialization of the sparse-matrix-based population, each particle is evaluated according to the fitness function. Note that this procedure is performed only in the initialization phase of the algorithm.
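The sparse initialization can be sketched with SciPy's sparse utilities. This is a hypothetical illustration: the paper does not report the sparsity level used, so the density value of 0.1 below is an assumption, as is the helper name `sparse_swarm`.

```python
import numpy as np
from scipy import sparse

def sparse_swarm(n_particles, dim, lo, hi, density=0.1, seed=0):
    """Initialise a swarm as a sparse matrix: most coordinates start at zero,
    and only a `density` fraction of entries get random values in [lo, hi]."""
    rng = np.random.default_rng(seed)
    # scipy.sparse.random draws the nonzero values via data_rvs,
    # which here samples uniformly inside the search-space bounds
    swarm = sparse.random(n_particles, dim, density=density, format="csr",
                          random_state=seed,
                          data_rvs=lambda n: rng.uniform(lo, hi, n))
    return swarm

# Swarm of 150 particles in 30 dimensions, as in the paper's experiments
swarm = sparse_swarm(150, 30, -5.0, 5.0, density=0.1)
```

Because most entries are exactly zero, the swarm stores (and the fitness loop touches) far fewer values than a dense 150 x 30 array.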

Proposed Algorithm
In the proposed method, the swarm of size N is initialized as a sparse matrix, as discussed in Section 3.1 (the details are described in the following sub-section). Next, test functions are used to evaluate the solution quality of every particle of the swarm; at this point, quality depends on whether the test function poses a minimization or a maximization problem. Following this, the personal best and global best are selected and revised repeatedly according to the quality of the particles. Among the benchmark functions, the Step function (5) and Zakharov function (6) are unimodal in nature, whereas the Ackley function (7), Periodic function (8), Quartic function (9), Qing function (10), Salomon function (11), Styblinski-Tang function (12), Xin-She Yang function (13), and Xin-She Yang N.2 function (14) are multimodal.
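The paper's exact formulations of the benchmark functions are not reproduced here. As an illustration, the following are the commonly used textbook definitions of two of the benchmark families that appear in the experiments: the Sphere function (unimodal) and the Ackley function (multimodal), both with a global minimum of 0 at the origin.

```python
import numpy as np

def sphere(x):
    """Sphere function (unimodal): f(x) = sum(x_i^2), minimum 0 at origin."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2))

def ackley(x):
    """Ackley function (multimodal): many local minima, global minimum 0
    at the origin (standard textbook form with a=20, b=0.2, c=2*pi)."""
    x = np.asarray(x, dtype=float)
    d = x.size
    term1 = -20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / d))
    term2 = -np.exp(np.sum(np.cos(2.0 * np.pi * x)) / d)
    return float(term1 + term2 + 20.0 + np.e)
```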

Parameter Setup
The basic parameter settings are defined in Table 2. All experimental results for the two algorithms, Standard-PSO and Sparse-PSO, are collected from 25 independent runs, each involving 2000 iterations.

Experiment Results
In this work, the swarm size is taken as 150 and the dimension as 30 for all functions. Both algorithms are run independently 25 times, for 2000 iterations each, on every benchmark function.
The data obtained from the 25 independent runs are given in Tables 3 and 4. Furthermore, we applied the Wilcoxon rank-sum test to all functions for both optimizers; the results are provided in Table 6.
A comparison of the computation time, reported as mean, best, and worst time per run for both algorithms, is shown in Table 3. From the outcomes in Table 3, it can be inferred that our proposed Sparse-PSO algorithm has a far superior execution time compared with the Standard-PSO algorithm.
Convergence graphs for each function are shown in Fig. 2, where the horizontal axis gives the number of iterations and the vertical axis the cost of each benchmark function over those iterations. Fig. 2(a) shows that, for the Brown function, the starting point of the Sparse-PSO method is very close to the optimal point, far better than that of the Standard-PSO method. Fig. 2(b) shows that, for the Exponential function, the starting point of the proposed method is better than Standard-PSO's; thereafter, both methods converge to the optimal point in almost the same manner. The convergence graph for the Powell Sum function, reported in Fig. 2(c), shows that the starting point of Sparse-PSO is nearer to the optimal point than that of Standard-PSO. Fig. 2(d) presents the convergence graph for the Sphere function, where both the starting point and the convergence curve of the proposed method are better than Standard-PSO's. The convergence graphs for the Step function, reported in Fig. 2(e), are almost identical for both algorithms. Fig. 2(f) shows the better convergence of Sparse-PSO compared to Standard-PSO on the Zakharov function.
Table 4 shows that Sparse-PSO outperforms Standard-PSO in terms of average speed-up of execution time for all benchmark functions. Wilcoxon F. (1945) proposed a non-parametric statistical significance test known as the Wilcoxon rank-sum test (Wilcoxon 1992). The test ranks all observations considered together as one group and then sums the ranks within each group. Table 6 shows that the score obtained by the proposed Sparse-PSO method is significantly better than Standard-PSO's on almost all functions.
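The rank-sum comparison described above can be reproduced with SciPy. The cost arrays below are hypothetical stand-ins for the final costs of the 25 independent runs of each optimizer (the paper's actual per-run data live in Tables 3-6); the point is the procedure, not the numbers.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(1)
# Hypothetical final costs from 25 independent runs of each optimizer
sparse_pso_costs = rng.normal(loc=0.01, scale=0.005, size=25)
standard_pso_costs = rng.normal(loc=0.05, scale=0.005, size=25)

# Wilcoxon rank-sum test: ranks the pooled observations, sums ranks per
# group; a small p-value indicates the two samples differ significantly
stat, p_value = ranksums(sparse_pso_costs, standard_pso_costs)
significant = p_value < 0.05
```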

Conclusion and Future Directions
A new initialization technique on the basis of sparse representation has been proposed for PSO algorithm in this paper. The main objective of the proposed Sparse-PSO algorithm is to reduce the computation time required.
Fourteen benchmark functions have been used to test the proposed method, and the outcomes are compared with the standard approach. It is clear from the results that Sparse-PSO takes less computation time than Standard-PSO. The results also show that the Sparse-PSO method is more effective for multimodal functions such as the Qing, Salomon, Styblinski-Tang, Xin-She Yang, and Xin-She Yang N.2 functions. The proposed method achieves the global best value with less computation time than Standard-PSO. Furthermore, the Wilcoxon test confirms that the Sparse-PSO method is significantly better than Standard-PSO in terms of achieving optimal results for most of the functions. A future direction of this work is to achieve further improvement by refining and updating the parameters used and by applying the method to more complex functions.
