A novel meta-heuristic optimization algorithm based on cell division: Cell Division Optimizer


A novel meta-heuristic algorithm named the Cell Division Optimizer (CDO) is proposed. The algorithm is inspired by reproduction at the cellular level, formulated through the well-known cell division processes of mitosis and meiosis. In the proposed model, meiosis and mitosis govern the exploration and exploitation aspects of the optimization algorithm, respectively, and the solutions are updated in two phases to reach the global optimum. The proposed algorithm can also be easily adapted to solve combinatorial optimization problems. To evaluate the method, 50 well-known benchmark test functions and 2 classical engineering optimization problems (one mechanical and one electrical) are employed. The results are compared with the latest versions of state-of-the-art algorithms such as Particle Swarm Optimization, Cuckoo Search, Grey Wolf Optimizer, Fruit Fly Optimization, Whale Optimizer and Water-Wave Optimizer, as well as recently proposed variants of top-performing algorithms such as SHADE (success-history-based adaptive differential evolution) and CMA-ES (covariance matrix adaptation evolution strategy). Moreover, the convergence speed of the proposed algorithm is better than that of the competing methods in most cases.


Introduction
Many real-world design problems are defined as a decision-making process to build products that satisfy defined requirements. Usually, these design problems involve complicated objective functions with a large number of decision variables. A solution that satisfies the specified constraints on the decision variables while attaining the best possible value of the objective function is considered a potential solution. In the existing literature, different algorithms have been developed to solve optimization problems. The majority of these algorithms are based on linear and nonlinear methods, which require substantial gradient information and are often stuck at local optima. Such methods are good for simple problems; however, most real-world problems are very complex in nature, and this complexity makes the methods difficult to apply. The limitations of traditional numerical methods have motivated the exploration of meta-heuristic algorithms for solving real-world problems.
Over the past few decades, meta-heuristic algorithms have been explored to solve a wide variety of problems. Researchers have contributed a number of algorithms inspired by natural phenomena, such as Particle Swarm Optimization [1], inspired by bird flocking and fish schooling. John Holland [2] proposed the Genetic Algorithm, which was inspired by Darwinian concepts of evolution. Karaboga et al [3] proposed the Artificial Bee Colony Algorithm, inspired by the foraging behaviour of bees in a honey bee swarm. Yang et al [4] proposed the Cuckoo Search Algorithm, inspired by the egg-laying behaviour of cuckoo birds. Seyedali Mirjalili [5] proposed the Whale Optimization Algorithm, inspired by the hunting behaviour of whales.
Since their development, these algorithms have been used to solve a wide variety of real-life problems such as the Travelling Salesman Problem [6][7][8], the Job Scheduling Problem [9,10], Pressure Vessel Design [11], Optical Buffer Design [12,13], etc. Some of the important reasons why meta-heuristics are widely used [12] are mentioned below:
1. Simplicity: Meta-heuristic algorithms are generally very simple in concept. They draw inspiration from natural phenomena or evolutionary concepts, which makes the resulting algorithms easy to understand and to apply to diverse problems.
2. Flexibility: Meta-heuristics are extremely flexible in the sense that they can easily be adapted to a specific problem without any major changes to the algorithm itself.
3. Derivative-free mechanism: Due to the random initialization in these algorithms, there is no need to calculate derivatives over the search space to find an optimum.
4. Local optimum avoidance: Meta-heuristics fare better than many other algorithms when they encounter a local optimum; due to the randomness incorporated within them, they have a better chance of escaping local optima.
It is worth noting that no single meta-heuristic algorithm is suitable for all optimization problems. The No Free Lunch (NFL) Theorem proves that a meta-heuristic algorithm can give very promising solutions on some optimization problems but cannot show good performance on all of them. This is why proposing new meta-heuristic algorithms and improving existing ones is imperative. Therefore, the aim of this paper is to propose a novel meta-heuristic inspired by the cell division behaviour of all cells. The architecture of the agents, the iterative steps and the optimization strategy of the proposed algorithm contribute to its superiority over other well-known and recently proposed meta-heuristics.

Contributions and Novelty
The main contributions of our work are summarized below:
1. This work proposes a novel meta-heuristic optimization algorithm that works better for complex global search.
2. The proposed algorithm balances exploration and exploitation innately and efficiently, better than other algorithms (refer to Section 6).
3. The proposed algorithm is able to find a near-global optimum solution for most multimodal problems.
4. The algorithm uses the concept of genetic diversity inside a colony of cells, which is the reason for its efficiency and better performance.
5. The proposed algorithm can be easily adapted to any kind of optimization problem.
The proposed algorithm takes inspiration from the process of cell division that happens in the cells of all living organisms on earth, including humans. The algorithm has the following characteristics:
• The algorithm has no fixed total-population parameter; the total population is variable. This way, the total computational resources used over the course of a run are lower than in existing algorithms.
• The processes of mitosis and meiosis govern both exploration and exploitation. Both are managed innately by these two processes, which makes the convergence of the algorithm more efficient.
• Stagnation and getting stuck at local optima are innately avoided by adding genetic diversity:
1. As soon as the population of cells in the algorithm reaches its limit, a colony-level disaster is simulated. This disaster not only culls low-fitness cells but also introduces genetic diversity.
2. The processes of meiosis and mitosis also add to the genetic diversity of the colony.
• This genetic diversity is the reason why CDO does not get trapped in local optima and still maintains a good balance of exploration and exploitation capabilities.

Paper Organisation
The rest of this paper is organized as follows. Section 3 presents a brief review of the meta-heuristic algorithms proposed in the literature. Section 4 provides the background of the proposed methodology. Section 5 gives a detailed description of the proposed meta-heuristic algorithm. Section 6 describes and discusses the extensive experimental results on the benchmark functions, and Section 7 presents two real-world applications. Section 8 discusses the possible reasons for the algorithm's superiority, even on composite benchmarks with complex landscapes and many local optima. Finally, the conclusions and future work are presented in Section 9.

Review of literature
Swarm Intelligence is a meta-heuristic approach that is widely used to solve optimization problems. The core of Swarm Intelligence lies in the collective behaviours of social creatures. Algorithms have been developed to solve various types of optimization problems [6,9] including but not limited to non-linear [14], non-convex [15], multi-modal [16] or combinatorial optimization problems [17]. The two most important properties for adapting the behaviour of social creatures are self-organisation and division of labour [18][19][20][21].
1. Self-organisation: This feature defines the structure of the many organisms that form the swarm. Since there is no central authority, the global-level response of the swarm depends heavily on the local interactions of the swarm members. Self-organisation is based upon four important characteristics, defined by Bonabeau et al [22]:
(a) Positive feedback: Similar to backpropagation, feedback is the output of the system for the current input, which acts as an input for further iterations of the meta-heuristic. This provides diversity and brings the system to a new stable state.
(b) Negative feedback: This helps to stabilize the current system and compensates for the effect of positive feedback.
(c) Fluctuations: These define the rate or magnitude of changes in the system. As randomness helps to locate new solutions and to escape local optima, it is considered a crucial element of meta-heuristic algorithms.
(d) Multiple interactions: These interactions provide a way of learning from the other members of a swarm.
2. Division of labour: Performing tasks simultaneously is believed to be more efficient than performing them sequentially; thus division of labour is another feature of immense importance for Swarm Intelligence [22].
Meta-heuristics are mainly classified into three categories:
1. Evolutionary Algorithms (EA)
2. Physics-based Algorithms
3. Swarm Intelligence based Algorithms
A pictorial representation of the different optimization algorithms along with their categories is given in Fig. 1.
Evolutionary Algorithms are inspired by the concept of evolution in nature. One of the most important algorithms in this category is the Genetic Algorithm [2], which is widely used today for a plethora of problems [23][24][25]. This algorithm simulates concepts of Darwinian evolution. Engineering applications of the Genetic Algorithm were investigated by Goldberg [26]. In evolutionary algorithms, the optimization starts with a randomly generated solution. In each successive iteration, new members are generated with selection, crossover, and mutation operations. The solutions are evolved with the aim of generating better solutions in each iteration. A few evolutionary algorithms are Differential Evolution [27], Evolutionary Programming [28,29], Evolution Strategy [30,31] and Genetic Programming [32]. The second type of meta-heuristic is the physics-based algorithms, which take inspiration from physical concepts and phenomena. Some of the most popular algorithms are Gravitational Local Search (GLSA) [33], Big-Bang Big-Crunch (BBBC) [34], Gravitational Search Algorithm (GSA) [35], Charged System Search (CSS) [36], Central Force Optimization (CFO) [37], Artificial Chemical Reaction Optimization Algorithm (ACROA) [38], Black Hole (BH) algorithm [39], Ray Optimization (RO) algorithm [40], Small-World Optimization Algorithm (SWOA) [41], Galaxy-based Search Algorithm (GbSA) [42], and Curved Space Optimization (CSO) [43]. The difference between evolutionary and physics-based algorithms is that in physics-based algorithms a random set of agents communicates and moves in the search space according to physical rules.
The third sub-class is the one containing Swarm Intelligence methods. These metaheuristics mimic the social behaviour of swarms, flocks, herds etc. The simulated individuals act as agents for exploring a search space.

Background
In this section the inspiration of the proposed method is first discussed. Then, the mathematical model is provided.

Cell Division
All living beings are composed of cells and Cell Division [92,93] is the process in which a cell divides into two or more daughter cells. In all living organisms cells continuously divide and new cells replace old cells every second. There are mainly two processes by which a cell can undergo cell division: Mitosis [93][94][95] and Meiosis [96,97]. Mitosis is a form of asexual reproduction which results in the birth of two identical daughter cells. Meiosis is a process that facilitates sexual reproduction, in which a diploid cell divides into four different haploid cells, called gametes, which further combine with other haploid cells to produce a diploid daughter cell.

Mitosis
Mitosis [93][94][95] is the process in which a parent cell divides into two identical daughter cells. In multi-cellular organisms, mitosis serves as a way for growth and replacing worn-out cells. Thus, mitosis is extremely crucial for life to go on. Defects formed during mitosis can potentially lead to genetic disorders. Mitosis mainly consists of 5 stages based on the physical state of the chromosomes and spindle. These phases are prophase, prometaphase, metaphase, anaphase, and telophase. Interphase is a stage that takes place before mitosis, hence, it is not counted as a part of mitosis.
• Prophase: In this stage, chromosomes recruit a protein complex called condensin and begin to condense. As the two pairs of centrioles travel to opposite poles of the cell, the spindle begins to develop. • Anaphase: The chromatids are separated at the centromere and pulled to opposite poles of the cell as the spindle fibres contract. Spindle fibres that are not linked to chromatids elongate the cell in order to prepare it for division. • Telophase: The cell has lengthened and has nearly completed its division at this point. Cell-like traits, such as the reconstruction of the two nuclei (one for each daughter cell), begin to reappear. The chromosomes decondense and the mitotic spindle's fibres break down.

Meiosis
Meiosis [96,97] is a type of cell division exclusive to sexually reproducing organisms. In this process, there are two rounds of cell division which result in the birth of four daughter cells with each having half the number of chromosomes of the parent cell. These haploid cells then further fuse with haploid cells produced by other organisms to give birth to a new zygote of that organism's species. The two rounds of meiosis are termed as Meiosis I and Meiosis II.

Meiosis I
Meiosis I segregates homologous chromosomes and produces two haploid cells. It consists of 4 stages: prophase I, metaphase I, anaphase I, and telophase I.
• Prophase I: Similar to the prophase of mitosis, the chromosomes start to undergo a condensation process. The difference here is that homologous chromosomes pair up and exchange genetic material through crossing over. This step does not take place in mitosis and is the reason why the resulting cells are haploid.

Meiosis II
This process involves equational segregation, i.e., separation of sister chromatids (a chromatid is one of two identical halves of a replicated chromosome). This is similar to mitosis; however, the difference lies in the results. Mitosis produces two diploid offspring (the resulting cells are complete cells and contain paired chromosomes), while meiosis produces four haploid gametes (gametes are like incomplete cells; they have unpaired chromosomes). It consists of 4 stages: prophase II, metaphase II, anaphase II, and telophase II.
• Prophase II: Similar to Prophase I, this stage prepares for the division of the two haploid cells produced in Meiosis I. The nuclear envelopes disappear and centrioles are formed. The chromosomes begin to be pulled toward the metaphase plate.

Levy Flight
Levy Flight [98] is a random walk algorithm in which the step lengths have a heavy-tailed Levy Distribution. Named after the French Mathematician Paul Lévy, the term was coined by Benoît Mandelbrot. Levy flight operates under the following function.
The equation to find the new position is:

X_{t+1} = X_t + α · S

where X_t and X_{t+1} are the current and new positions respectively, S is a step drawn from the heavy-tailed Levy distribution (Levy ∼ |S|^{−λ}), λ is a constant (1 ≤ λ ≤ 3), and α is a random number generated in [−1, 1]. Figure 4 shows a simulation of Levy Flight in a 2-dimensional plane.
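A common way to draw such heavy-tailed steps is Mantegna's algorithm, sketched below in Python. The step scale (0.01) and the function names are illustrative choices, not part of the paper's formulation:

```python
import math
import random

def levy_step(lam=1.5):
    """Draw one Levy-distributed step via Mantegna's algorithm.

    lam is the Levy exponent (the paper's lambda, 1 <= lam <= 3); the
    resulting steps are heavy-tailed, so most are small but occasional
    steps are very large.
    """
    # Standard deviation of the numerator Gaussian (Mantegna's scheme).
    sigma_u = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2)
               / (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    u = random.gauss(0, sigma_u)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / lam)

def levy_flight(x, alpha_scale=0.01, lam=1.5):
    """One Levy-flight move, X_{t+1} = X_t + alpha * S, per dimension."""
    return [xi + random.uniform(-1, 1) * alpha_scale * levy_step(lam) for xi in x]
```

Most calls to `levy_flight` therefore move a point only slightly, while a rare call relocates it far away, which matches the exploitation/exploration trade-off described above.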

Cell Division Optimization
The algorithm proposed by this paper works by simulating a colony of cells. Following Darwin's law of survival of the fittest, the algorithm iteratively weeds out solutions with poor fitness. The algorithm starts with an initial population of cells that act as the progenitors of the colony; these cells are randomly generated within the given bounds. Each iteration in the algorithm represents the lifespan of the cells. In each iteration, the cells undergo one of the two cell divisions, mitosis or meiosis. In mitosis, the parent cell divides to produce two identical daughter cells. Mutation occurs in the prophase of the mitosis process, and thus mitosis becomes a very powerful tool for exploiting a given region to find a local optimum. In meiosis, the DNA of two parent cells merges through a crossover of genes, which drives the exploration of the search space.

Cells
Each cell in the colony represents two possible solutions for the given problem. The inspiration for this is taken from homologous chromosomes that occur in each of our bodies. Humans have 23 pairs of chromosomes and for each pair, there is a maternal and a paternal chromosome. Thus, each cell in the algorithm has a paternal and a maternal influence which is generated through meiosis.

Gametes
Gametes are the products of Meiosis. They are haploid, which means they contain half the number of chromosomes as the parent cells. This implies that each gamete represents one possible solution. The solutions contained in Gametes are intermediary and they are not evaluated. As Gametes are haploid, i.e., do not contain a complete solution, each gamete needs to be paired with another gamete of a different cell. These two gametes are used to get a new cell, which has equal contributions from both the gametes. Algorithm description of this model is presented in Algorithm 5.2.5.
In each iteration, cells either go for mitosis or meiosis, based on a Cell Division Probability given to the algorithm while instantiating it. This probability governs the exploration and exploitation capabilities of the proposed algorithm. A higher mitosis probability (lower chance of meiosis) focuses more on exploitation while a low mitosis probability (higher chance of meiosis) focuses more on exploration.
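The per-iteration choice between the two divisions can be sketched as a Bernoulli draw against the Cell Division Probability. The function and variable names below are illustrative, and the fallback for an odd-sized meiosis list is an assumption (the paper does not specify how an unpaired cell is handled):

```python
import random

def choose_division(colony, mitosis_prob):
    """Split the colony into a mitosis list and a meiosis list using the
    Cell Division Probability. A higher mitosis_prob biases the run
    toward exploitation; a lower one biases it toward exploration."""
    mitosis_list, meiosis_list = [], []
    for cell in colony:
        (mitosis_list if random.random() < mitosis_prob else meiosis_list).append(cell)
    # Meiosis needs pairs of cells, so an unpaired cell falls back to mitosis.
    if len(meiosis_list) % 2 == 1:
        mitosis_list.append(meiosis_list.pop())
    return mitosis_list, meiosis_list
```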

Interphase
For each cell in the colony, the two chromosomes are duplicated and stored with the cell object. These four chromosomes then go on to make the new cells. As interphase occurs before division and prepares the cell for it, it is not considered part of either meiosis or mitosis.

Mitosis
The process of mitosis starts after the interphase stage. Since mitosis has to focus on exploiting a region around each cell, we rely on a mutation factor.

Levy Flight
Levy Flight has been used in mitosis to facilitate the random alteration capability (usually known as mutation). Breaking down Levy Flight into its bare essentials:
• There is a high probability that after a Levy flight, the object in question will be somewhere close to its original position.
• There is a very low probability that after a Levy flight, the object will cover a far greater distance and abandon the neighbourhood it was in.
The mathematical formulation of the step size of a single gene for Levy Flight is determined by:

x'_{ij} = x_{ij} + Rand(·) · S_j,   S_j ∼ Levy(λ)

where X_i = (x_{i1}, ..., x_{in}) represents one of the four chromosomes of the cell after interphase, and Rand(·) represents a random number between 0 and 1. Levy flight is performed for each gene in all four chromosomes of the cell; this represents the prophase to metaphase of the mitosis process. After these steps, the chromosomes are interchanged and paired together, and the two pairs are then used to create the new daughter cells; this is the anaphase and telophase of the mitosis process.

Algorithm 1 Mitosis
cell ← Cell that has to undergo Mitosis
Interphase Duplication(cell)
SET new chromosomes to empty list
for chromosome ∈ (cell → chromosomes) do
    SET new genes to empty list
    p best ← x_i
    for i ← (0 → length of chromosome) do
        Perform Levy Flight
    end for
    APPEND new genes to new chromosomes
end for
Generate 2 cells from the 2 pairs of chromosomes in new chromosomes
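A minimal Python sketch of this mitosis step is given below. The clamping to the bounds, the 0.01 step scale and the simplified Levy draw are illustrative assumptions, not the paper's exact settings:

```python
import random

def mitosis(cell, lb, ub):
    """Sketch of Algorithm 1: one cell -> two daughter cells.

    A 'cell' is a pair of chromosomes (lists of genes). Interphase
    duplicates the two chromosomes, a Levy-flight mutation perturbs
    every gene, and the four chromosomes are re-paired into two
    daughter cells (the anaphase/telophase step)."""
    def levy():
        # Crude heavy-tailed step: Gaussian over a power of a Gaussian.
        return random.gauss(0, 1) / abs(random.gauss(0, 1)) ** (1 / 1.5)

    # Interphase: duplicate the two chromosomes into four.
    chromosomes = [list(c) for c in cell] + [list(c) for c in cell]
    mutated = []
    for chrom in chromosomes:
        # Perturb each gene and keep it inside the search bounds.
        mutated.append([min(ub, max(lb, g + 0.01 * (ub - lb) * levy()))
                        for g in chrom])
    # Interchange and pair into two daughter cells.
    return (mutated[0], mutated[2]), (mutated[1], mutated[3])
```

Because each daughter stays near its parent, repeated mitosis concentrates search effort in one region, which is exactly the exploitation role described above.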

Meiosis
In each iteration, the cells chosen to perform meiosis are randomly paired together, and meiosis is carried out for each of them. The gametes produced are then made to reproduce new cells with their pairs. Like mitosis, meiosis also requires interphase to happen; thus, again, the chromosomes are duplicated. In meiosis, a crossover of genes occurs in prophase I. This crossover is a facilitating factor for exploration in a region. After interphase duplication, two intermediate daughter cells are birthed.
These intermediary cells then further divide via Meiosis II to produce four gametes.

Algorithm 2 Meiosis
cell ← Cell that has to undergo Meiosis
Interphase Duplication(cell)
comment: Meiosis I
SET new daughters to empty list
Perform Crossover of genes
Separate out the odd and even pairs from the four chromosomes in the cell
ADD all daughters to new daughters
comment: Meiosis II
SET gametes to empty list
for daughter cell ∈ (new daughters) do
    SET gamete 1 to empty list
    SET gamete 2 to empty list
    ADD gamete 1 and gamete 2 to gametes
end for
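A minimal Python sketch of this meiosis step follows. The single-point crossover and the odd/even pairing below are illustrative assumptions; the paper does not fix the crossover operator here:

```python
import random

def meiosis(cell):
    """Sketch of Algorithm 2: one diploid cell -> four haploid gametes.

    Interphase duplicates the two chromosomes, a single-point crossover
    mixes genes between the homologues (Prophase I), Meiosis I groups
    the four chromosomes into two intermediate daughters, and Meiosis II
    separates each daughter into two gametes (one chromosome each)."""
    a, b = [list(c) for c in cell]
    a2, b2 = list(a), list(b)                      # interphase duplication
    point = random.randrange(1, len(a))            # Prophase I crossover point
    a[point:], b[point:] = b[point:], a[point:]
    a2[point:], b2[point:] = b2[point:], a2[point:]
    daughters = [(a, b2), (a2, b)]                 # Meiosis I: odd/even pairing
    return [chrom for d in daughters for chrom in d]  # Meiosis II: 4 gametes
```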

Reproduction of cells after Meiosis
Since the gametes produced by meiosis are haploid, each contains only one solution. As a cell contains two solutions, two gametes must fuse to produce the zygote, i.e., the new cell.
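This fusion step can be sketched as follows, with random pairing across the gamete pool; the names are illustrative and the shuffle-based pairing is an assumption:

```python
import random

def reproduce(gametes):
    """Randomly pair gametes from the pool and fuse each pair into a
    new cell; each gamete contributes one chromosome (one solution)."""
    random.shuffle(gametes)
    return [(gametes[i], gametes[i + 1]) for i in range(0, len(gametes) - 1, 2)]
```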

Population Control
Like a cell colony, the colony in this algorithm has an exponential growth rate. Since each cell divides in each iteration, the population almost doubles in each iteration. The concept of lifespan cannot be introduced in this context since the previous cell dies to produce new daughter cells.
Hence, inspiration is taken from nature to keep a check on the population of the colony. Natural disasters and other factors are simulated in which only the fittest cells survive. The distribution of survival probability is such that the better the fitness of a cell, the higher its survival probability.

Algorithm 3 Cell Division Optimizer
SET genes 1 to empty list
SET genes 2 to empty list
for j ← (1 to dim) do
    Append random value in the given bounds to genes 1
    Append random value in the given bounds to genes 2
end for
Append genes 1 and genes 2 to colony
SET best = ∞
for cell ∈ colony do
    EVALUATE cell with obj func
    if cell → fitness < best then
        Update best
    end if
end for
for t ← (0 to max iter) do
    SET new cells to empty list
    SET mitosis list to empty list
    SET meiosis list to empty list
    for cell ∈ colony do
        Use the Cell Division Probability to assign the cell to mitosis list or meiosis list
    end for
end for

Experiments
In this section, the comparative performance of the proposed method against the state-of-the-art algorithms is presented. All algorithms were implemented in Python 3.7.9 and run on a 3.1 GHz Dual-Core Intel Core i5 with 8 GB of memory running macOS 11.2.

Benchmark Functions
In this paper, 20 classical benchmark functions and 30 benchmark functions with varying landscapes and modalities have been used for empirical analysis. Tables 2 and 3 show functions f 1 to f 19, which are the simple functions used by the complex functions. Tables 4 and 5 present the 20 classical benchmark functions. Table 6 shows the basic functions F 1 to F 10; function F 2 has been declared deprecated and is therefore not considered. Tables 7 and 8 show functions F 11 to F 20, which are hybrid functions. In hybrid functions, sub-components of the variables have different properties. The mathematical formulation of these hybrid functions is given by the following equation.
The tabulated summaries of the complex functions F 21 to F 30 are presented in Tables 9 and 10.
The mathematical formulation of these composite functions is given by the following equation.
F(x): composition function
g_i(x): i-th basic function used to construct the composition function
N: number of basic functions
o_i: new shifted optimum position of each g_i(x), defining the positions of the global and local optima
bias_i: defines which optimum is the global optimum
σ_i: controls the coverage range of each g_i(x); a small σ_i gives that g_i(x) a narrow range
λ_i: controls the height of each g_i(x)
w_i: weight value of each g_i(x), calculated as below:

In addition to these functions, two real-world problems, pressure vessel design and frequency-modulated sound wave, have been used to verify the performance of the proposed meta-heuristic algorithm on real-world problems. Refer to Section 7.
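A composition function of this kind can be evaluated as sketched below. The weight formula follows the usual CEC-style definition, which is an assumption here, since the text references but does not reproduce the weight equation:

```python
import math

def composition(x, basics, optima, sigmas, lambdas, biases):
    """Evaluate a CEC-style composition function F(x).

    basics: list of basic functions g_i; optima: shifted optima o_i;
    sigmas, lambdas, biases: per-function range, height and bias
    controls as defined in the text."""
    d = len(x)
    weights = []
    for o, sigma in zip(optima, sigmas):
        dist2 = sum((xj - oj) ** 2 for xj, oj in zip(x, o))
        if dist2 == 0:  # exactly at an optimum: that g_i fully dominates
            weights.append(float("inf"))
        else:
            weights.append(math.exp(-dist2 / (2 * d * sigma ** 2)) / math.sqrt(dist2))
    if any(math.isinf(w) for w in weights):
        weights = [1.0 if math.isinf(w) else 0.0 for w in weights]
    total = sum(weights)
    # Weighted sum of the scaled, biased basic functions.
    return sum(w / total * (lam * g(x) + b)
               for w, g, lam, b in zip(weights, basics, lambdas, biases))
```

Near each shifted optimum o_i the corresponding g_i dominates, which is what produces the many distinct local basins these functions are designed to test.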
The performance of the proposed algorithm is tested against ten well-known meta-heuristic optimization algorithms, including Particle Swarm Optimization (PSO) [1].

Parameters
The algorithms considered in this paper were initialized randomly and run 10 times independently with a maximum of 500 iterations, while the population size in each algorithm is kept between 10 and 500. Table 11 shows the parameters that govern the experiments performed. The parameters have been set so that a fair comparison of the algorithms under consideration can be made.

Results and Discussion
Tables 12 and 13 present the outputs of the considered meta-heuristics on the 20 classical benchmark functions. The algorithms are run 10 times on each benchmark function and the best, mean and standard deviation are recorded. Similar experiments have been performed for the 30 benchmark functions from the CEC2017 repository [99]. Table 14 shows the results for the basic functions F1 to F10. Table 15 shows the results for the hybrid functions, F11 to F20.
The conducted analysis demonstrates that CDO performs better than the other competitive methods on functions with complex landscapes and on real-life problems. Section 8 discusses the possible reasons behind the performance of the proposed algorithm.
The results on the various benchmark functions show that the proposed algorithm is generally superior to the other algorithms in most cases. Section 6.5 presents the statistical tests on these results, which show a significant difference between the proposed and currently used algorithms. Section 8 explores the reasons why CDO performs better based on its inspiration and implementation.

Convergence Evaluation
Convergence is evaluated by recording the progress of each algorithm in each iteration. Horizontal lines in the graphs represent an algorithm making no progress; having no progress for multiple consecutive iterations implies that the algorithm is stuck in a local optimum. Figs. 8 and 9 show the convergence characteristics of the algorithms in comparison to the proposed CDO algorithm on the Ackley and Xin She Yang N2 benchmark functions. It can be observed in these two cases that all considered algorithms except Cuckoo Search and CDO converge prematurely. In both cases, CDO consistently improves without horizontal lines, which implies that it keeps improving across iterations; similar behaviour is observed for the remaining functions (Figs. 10, 11, 12 and 13).

Statistical Significance
It is important to assess the statistical significance of the data collected for the proposed algorithm against the competing algorithms. Student's T-Test has been used for this purpose; this statistical test assesses whether the means of two populations differ. The results are listed in Tables 17 and 18. These scores have been calculated at a 95% confidence level (α = 0.05). P-values have been calculated for each pair of algorithms under comparison; a p-value less than 0.05 indicates that the difference between the means of the two algorithms is statistically significant. Here, + denotes that the proposed algorithm performs significantly better than the compared algorithm, whereas − denotes that it is not significantly better than the competing method.
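Such a comparison can be sketched in pure Python with a two-sample t-statistic. This uses Welch's (unequal-variance) form, which is a common choice but an assumption here; the paper does not state which t-test variant it applies:

```python
import math

def welch_t(sample_a, sample_b):
    """Welch's two-sample t-statistic for comparing two sets of runs
    (e.g. CDO's 10 results vs. a competitor's 10 results on one
    benchmark). |t| well above ~2 suggests a significant difference in
    means at the 95% level; the exact threshold depends on the degrees
    of freedom."""
    na, nb = len(sample_a), len(sample_b)
    ma, mb = sum(sample_a) / na, sum(sample_b) / nb
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)
```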
On the CEC2017 benchmark functions, for F1 CDO performs better than all competitive methods except Krill Herd. For F14 and F27, it performs better than all competitive methods. As shown in Tables 17 and 18, CDO shows a statistically significant improvement over the compared algorithms on the majority of the functions.

Real-Life Applications
In this section, the proposed algorithm is evaluated on two constrained real engineering problems, namely the pressure vessel design problem and the frequency-modulated sound wave problem. These problems contain several inequality constraints. For both problems, all algorithms have been run for 500 iterations with the same parameters as defined in Section 6.2.

Pressure Vessel Design
The pressure vessel design problem is to minimize the total cost of material, forming and welding of a cylindrical vessel [104]. There are four design variables: the thickness of the shell (T_s), the thickness of the head (T_h), the inner radius (R) and the length of the cylindrical section (L).
The mathematical formulation becomes: Table 19 shows the results of the algorithms on the pressure vessel optimization problem. The performance of the proposed algorithm is compared with the recently proposed Warm Starting CMA-ES [103] and iL-SHADE [102]. It is evident from Table 19 that CDO gives results on par with the currently used algorithms, while its consistency is better than that of the compared algorithms.
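The standard formulation of this benchmark (assumed here, since the paper's equations are only referenced via [104]) can be coded as follows:

```python
import math

def pressure_vessel_cost(ts, th, r, l):
    """Cost of material, forming and welding of the cylindrical vessel,
    in the usual formulation: Ts = shell thickness, Th = head thickness,
    R = inner radius, L = length of the cylindrical section."""
    return (0.6224 * ts * r * l + 1.7781 * th * r ** 2
            + 3.1661 * ts ** 2 * l + 19.84 * ts ** 2 * r)

def pressure_vessel_constraints(ts, th, r, l):
    """Inequality constraints; each g_i(x) <= 0 must hold for a
    feasible design."""
    return [
        -ts + 0.0193 * r,                                              # shell thickness vs. radius
        -th + 0.00954 * r,                                             # head thickness vs. radius
        -math.pi * r ** 2 * l - (4 / 3) * math.pi * r ** 3 + 1296000,  # minimum enclosed volume
        l - 240,                                                       # maximum length
    ]
```

An optimizer such as CDO would minimize `pressure_vessel_cost` while penalizing any positive constraint value.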

Frequency-Modulated sound wave
FM sound wave synthesis plays an important role in the modern music industry, and the problem here is to optimize the six parameters [a 1 , a 2 , a 3 , w 1 , w 2 , w 3 ] that govern it. The goal is to generate a sound similar to a given target sound. The expressions of the estimated sound wave and the target sound wave are given as:

y(t) = a_1 sin(w_1 t θ + a_2 sin(w_2 t θ + a_3 sin(w_3 t θ)))
y_0(t) = 1.0 sin(5.0 t θ − 1.5 sin(4.8 t θ + 2.0 sin(4.9 t θ)))

where θ = 2π/100 and the domain of the variables is [−6.4, 6.35].
The final function to optimize becomes:

f(a_1, a_2, a_3, w_1, w_2, w_3) = Σ_{t=0}^{100} (y(t) − y_0(t))²
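Assuming the standard formulation of this benchmark (target coefficients 1.0, −1.5, 2.0, 5.0, 4.8, 4.9 in the paper's parameter order), the objective can be coded as:

```python
import math

def fm_fitness(params):
    """Squared error between the estimated wave y(t) and the target
    wave y0(t) over t = 0..100, with theta = 2*pi/100. params is
    (a1, a2, a3, w1, w2, w3), following the paper's ordering."""
    theta = 2 * math.pi / 100

    def wave(a1, a2, a3, w1, w2, w3, t):
        return a1 * math.sin(w1 * t * theta
                             + a2 * math.sin(w2 * t * theta
                                             + a3 * math.sin(w3 * t * theta)))

    target = (1.0, -1.5, 2.0, 5.0, 4.8, 4.9)
    return sum((wave(*params, t) - wave(*target, t)) ** 2 for t in range(101))
```

The global optimum is f = 0, attained exactly at the target parameter vector.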

Exploration and exploitation evaluation
Unimodal benchmark functions provide suitable verification of the exploitation ability of an algorithm in finding an optimal solution, while multimodal, hybrid and composite functions verify its exploration ability. When compared to the other algorithms, the results show the superiority of the proposed algorithm in terms of either the best optimum value or the standard deviation of the results. Owing to the genetic diversity inherited from its inspiration in the cell division process, CDO shows competitive exploration abilities while using Levy flight and mutation to exploit a particular region.

Local Optima Avoidance
Local optima avoidance can be directly linked to the exploration ability of an algorithm. Thus, multimodal, hybrid and composite functions provide a way to test the proposed algorithm in this respect. Moreover, the balance between exploitation and exploration can concurrently be benchmarked by these functions. The convergence graphs in Section 6.4 show how the proposed algorithm escapes local optima instead of stagnating in them.

Conclusion and Future Work
This work proposed a novel meta-heuristic called the Cell Division Optimizer. The proposed algorithm takes inspiration from the reproduction methods of cells in a colony. Fifty well-known benchmark functions were used to benchmark the performance of the algorithm. The results show that CDO provides highly competitive results when compared to other meta-heuristics such as PSO, Cuckoo Search, Krill Herd, the Fruit Fly algorithm, Whale Optimizer, Artificial Bee Colony, Grey Wolf Optimizer and Water-Wave Optimizer. The benchmarks considered included unimodal, multimodal and composite functions, and the results show the superiority of the proposed algorithm. Convergence analysis was performed on these algorithms, which confirmed the convergence of the proposed algorithm. The algorithm was also tested on two real-life engineering problems, which showed its versatility in tackling different problems. Finally, a statistical analysis was performed using Student's T-Test, which established the significance of the results.
For future work, we plan to apply the algorithm to different real-life problems to further assess its capabilities.

Statements and Declarations
The authors have no relevant financial or nonfinancial interests to disclose.

Data Availability
All data generated or analysed during this study are included in this published article.