Modified grasshopper optimization algorithm-based genetic algorithm for global optimization problems: the system of nonlinear equations case study

Grasshopper optimization algorithm (GOA) is one of the promising algorithms for optimization problems. However, its main drawback is becoming trapped in a local minimum, which causes slow convergence or failure to detect a solution. Several modifications and combinations have been suggested to overcome this problem. This paper presents a modified grasshopper optimization algorithm (MGOA) based on the genetic algorithm to overcome this problem. The modifications rely on certain mathematical assumptions and on varying the domain of the control parameter Cmax to escape from the local minimum and move the search process to an improved point. Parameter C is one of the essential parameters in GOA, as it balances the exploration and exploitation of the search space. These modifications aim to speed up the convergence rate by reducing repeated solutions and the number of iterations. Both the original GOA and the proposed algorithm are tested on 19 main test functions to investigate the influence of the proposed modifications. In addition, the algorithm is applied to solve five different cases of nonlinear systems with different dimensions and regularity to show the reliability and efficiency of the proposed algorithm. Promising results are achieved compared to the original GOA. The proposed approach shows an average percentage of improvement of 96.18%, as illustrated in the detailed results.

One of the novel MAs based on the swarming nature of grasshoppers is the grasshopper optimization algorithm (GOA). In GOA, the search for the global optimum of the problem depends primarily on social interaction forces. GOA is commonly used in several optimization problems due to its fast implementation, high precision, robustness, and performance (Meraihi et al. 2021; Sezavar et al. 2019a). On the other hand, GOA has some drawbacks, including (1) an imbalance between the exploitation and exploration processes.
Consequently, different GOA modifications have been proposed in the literature to address these limitations (Algamal et al. 2021; Feng et al. 2020). In (Alphonsa and MohanaSundaram 2019), GOA is enhanced by using a nonlinear convergence parameter, a niche mechanism, and the b-hill-climbing technique to overcome the above shortcomings. In (Dwivedi et al. 2020), the authors employed the idea of chaos to create a uniformly distributed population, increase the consistency of initial populations and the capacity to search new spaces, and leverage the existing search space. To resolve the shortcomings of the traditional GOA and to detect autism spectrum disorder at all stages of life, the authors in (Goel et al. 2020) suggested a modified GOA. In (Sezavar et al. 2019b), a modified GOA is proposed as a new search method combined with a convolutional neural network (CNN) to solve modeled problems and retrieve similar images efficiently. In (Purushothaman et al. 2020), hybridizing gray wolf optimization (GWO) with GOA was suggested to create a more acceptable trade-off between diversification and intensification and produce substantially better performance than traditional GWO and GOA. In (Taher et al. 2019), a dynamic population quantum binary GOA is proposed as an enhanced GOA for feature selection based on shared knowledge and rough set theory. In another work, an improved GOA was suggested, using a nonlinear comfort zone parameter over the iterations of the algorithm and the Lévy flight method to maximize randomness and extend the local search range. Moreover, a random hopping technique that helps the algorithm leap out of the local optimum was implemented in Zhao et al. (2019).
This reveals that GOA has undergone several modifications to address the aforementioned flaws. Consequently, efforts should be made to introduce new methods that enhance the performance of the traditional GOA. Motivated by this, this paper proposes a modified grasshopper optimization algorithm (MGOA) to overcome the shortcomings of the original GOA and improve its accuracy. The original GOA has a significant parameter C acting as a decreasing coefficient that shrinks three zones: the comfort zone, the repulsion zone, and the attraction zone. The coefficient C decreases with the number of iterations to balance exploration and exploitation in the grasshopper algorithm. It is calculated based on the parameters Cmax (upper bound of C), Cmin (lower bound of C), and the maximum number of iterations. The proposed modifications depend on varying the upper bound Cmax through the genetic algorithm (GA) to help GOA out of a local optimum when it falls into one; then, the parameter C is updated according to its classical formula. This paper aims to: (1) introduce a local search mechanism based on a genetic algorithm (GA) to bring GOA out of the local minimum;
(2) improve the overall performance of GOA by updating the parameter value Cmax after each GA stage;
(3) prevent the divergence of the objective function value from the best-reached value by controlling the expected objective function space. Even with a strong local minimum, if no noticeable improvement in the objective function occurs, the modifications transfer the search process to another part of the domain, taking GOA out of the local minimum; (4) prevent time consumption, calculation burden, and wasted iterations by limiting the number of repeated solutions yielded by GOA and deciding whether it is trapped in a local minimum or not; (5) prevent GA from acting as a global search or dominating the search process by defining and controlling the search space of the GA parameters and keeping track of the parameter Cmax.
The rest of the paper is organized as follows. Section 2 presents the preliminaries of the problem. Section 3 introduces the genetic algorithm (GA) and the grasshopper optimization algorithm (GOA) and explains the proposed MGOA in detail. Section 4 presents the numerical results with discussions. Finally, the conclusion and future works are given in Sect. 5.

Global optimization
Optimization is the search process for the optimum parameters of a given objective function. For a single-objective optimization problem, its mathematical formulation can be written as follows:

Find x = (x1, x2, ..., xn) such that:
Min f(x)
Subject to: Ld <= xd <= Ud, for all d = 1, 2, ..., n,

where x is the solution vector, f(x) is the objective function, and Ld and Ud represent the lower and upper bounds, respectively, of the decision variable xd.
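As a minimal illustration of this formulation (the sphere objective and the bounds below are arbitrary examples chosen here, not taken from the paper), a bounded candidate solution and its objective value can be sketched as:

```python
import random

def sphere(x):
    # example objective f(x) = sum of squares; global minimum 0 at the origin
    return sum(v * v for v in x)

def random_solution(lower, upper):
    # sample a candidate x with L_d <= x_d <= U_d in every dimension d
    return [random.uniform(lo, hi) for lo, hi in zip(lower, upper)]

x = random_solution([-5.0] * 3, [5.0] * 3)
assert all(-5.0 <= v <= 5.0 for v in x)
```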

The System of Nonlinear Equations
The system of nonlinear equations (SNEs) is defined mathematically as:

f_i(x) = 0, i = 1, ..., m,

where each function f_i, i = 1, ..., m, is a nonlinear function mapping the vector x = (x1, x2, ..., xn) of the n-dimensional space R^n to the real line. Some of the functions may be linear and others nonlinear. Solving a nonlinear system means finding solutions such that each of the above functions f_i, i = 1, ..., m, equals zero.

Definition 1: If f_i(x*) = 0 for all i = 1, ..., m, then the solution x* = (x1*, x2*, ..., xn*) is called the optimal solution of the SNEs.
Different methods convert the SNEs into an optimization problem (Zhao et al. 2019; Nie 2004, 2006; Grosan and Abraham 2008). To solve SNEs, the system is usually transformed into an equivalent single unconstrained optimization problem, where the objective function is the sum of squared residuals of the nonlinear equations. The problem can be represented as:

Min F(x) = sum_{i=1}^{m} f_i(x)^2,     (2)

where F(x) is the objective function.
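For instance, the two-equation system below (a hypothetical system chosen purely for illustration, with a root at x = (1, 2)) becomes a single unconstrained objective F(x):

```python
def residuals(x):
    # hypothetical system: f1 = x1^2 + x2 - 3 = 0, f2 = x1 + x2^2 - 5 = 0
    x1, x2 = x
    return [x1 ** 2 + x2 - 3.0, x1 + x2 ** 2 - 5.0]

def F(x):
    # sum of squared residuals, as in Eq. (2); F(x*) = 0 exactly at a root
    return sum(f * f for f in residuals(x))

assert F([1.0, 2.0]) == 0.0   # (1, 2) solves both equations
```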

The proposed algorithm
This section presents an introduction to genetic algorithms (GA) and the grasshopper optimization algorithm (GOA). In addition, the proposed algorithm is discussed in detail.

Genetic algorithms (GA)
The invention of genetic algorithms (GA) dates back to the 1960s with Holland, and GA was further described by Goldberg (More et al. 1978). GA has been successfully applied to problems in many areas, including optimization design, neural networks, fuzzy logic control, planning, expert systems, and many more (Failed 2014). GA encodes a candidate solution of the optimization problem as a chromosome. It then determines the first population of chromosomes, which is part of the problem's solution space. The search space is defined as the space of solutions in which every chromosome represents a feasible solution.
Before the search begins, the initial population is picked as random chromosomes from the search space. Individuals are then chosen competitively based on their fitness, which is calculated by a particular objective function. The genetic search operators, including selection, crossover, and mutation, are then applied one after the other to produce a new generation of chromosomes. This process is repeated until the stopping criterion is met, and the final solution is the best chromosome of the latest generation. The general GA pseudo-code is displayed in Fig. 1.
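The loop described above can be sketched as follows (a generic real-coded GA with tournament selection, one-point crossover, and uniform mutation; these operator choices are illustrative, not the paper's exact configuration):

```python
import random

def genetic_algorithm(fitness, lower, upper, pop_size=20, generations=50,
                      crossover_rate=0.8, mutation_rate=0.1):
    dim = len(lower)
    # initial population: random chromosomes picked from the search space
    pop = [[random.uniform(lower[d], upper[d]) for d in range(dim)]
           for _ in range(pop_size)]

    def tournament(k=2):
        # competitive selection based on fitness (minimization)
        return min(random.sample(pop, k), key=fitness)

    for _ in range(generations):
        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(), tournament()
            if dim > 1 and random.random() < crossover_rate:
                cut = random.randrange(1, dim)        # one-point crossover
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            for d in range(dim):                      # uniform mutation
                if random.random() < mutation_rate:
                    child[d] = random.uniform(lower[d], upper[d])
            children.append(child)
        pop = children
    # finished solution: best chromosome of the latest generation
    return min(pop, key=fitness)

random.seed(1)
best = genetic_algorithm(lambda x: sum(v * v for v in x),
                         [-5.0, -5.0], [5.0, 5.0])
```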

Grasshopper optimization algorithm (GOA)
GOA is a recent swarm algorithm that simulates the swarming nature of grasshoppers. The main feature of the grasshoppers' swarm is described in two phases. The first is the larval phase, where steps are small and slow, while in the second, adulthood phase, steps are long-range and fast. Seeking food sources is another important characteristic of grasshopper swarming. The foraging process can be divided into two branches: exploration and exploitation.
In GOA, each grasshopper represents a solution in the population. The calculation of a grasshopper's position depends on three types of forces: the social interaction between the grasshopper and the other grasshoppers S_i, the gravity force G_i, and the wind movement A_i. The resultant force that affects each grasshopper is defined as:

X_i = S_i + G_i + A_i.

The social interaction force between each grasshopper and the other grasshoppers is defined as:

S_i = sum_{j=1, j != i}^{N} s(d_ij) d̂_ij,

where d_ij is the distance between grasshopper i and grasshopper j, and d̂_ij = (x_j − x_i)/d_ij is the unit vector from grasshopper i to grasshopper j.
There are two main types of forces between grasshoppers: the attraction force and the repulsion force. The function s represents the strength of these two social forces and is defined as:

s(d) = f e^(−d/l) − e^(−d),

where f is the intensity of attraction and l is the attractive length scale. Changing these parameters affects the social behavior of the grasshoppers. The interval [0, 2.079] represents the distances at which a repulsion force occurs between two grasshoppers. It should be mentioned that when the distance between two grasshoppers is exactly 2.079, there is no social force between them, which forms the comfort zone. If the distance exceeds 2.079, the attraction force increases, and it then gradually decreases as the distance approaches 4. The distance between every two grasshoppers affects the calculation of the social forces. When the distance between grasshoppers is more than 10, the function s may fail to apply forces between them. As a result, the distances between grasshoppers are mapped into the interval [1, 4] (Dorigo and Stützle 2004). The second force is the gravity force, which is defined as:

G_i = −g ê_g,

where g is the gravitational constant and ê_g is a unit vector toward the center of the earth.
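The social-force function can be checked numerically as follows (f = 0.5 and l = 1.5 are the defaults commonly used in the GOA literature; the paper does not restate them here, so they are an assumption):

```python
import math

def s(d, f=0.5, l=1.5):
    # strength of the social force at distance d: attraction minus repulsion
    return f * math.exp(-d / l) - math.exp(-d)

assert s(1.0) < 0             # inside [0, 2.079]: repulsion dominates
assert abs(s(2.079)) < 1e-3   # comfort distance: the two forces balance
assert s(3.0) > 0             # beyond 2.079: attraction dominates
```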
The nymph grasshoppers' movements are correlated with the wind direction because they have no wings. The wind advection can be calculated as:

A_i = u ê_w,

where u is a constant drift and ê_w is a unit vector in the wind direction.
The position of the grasshopper can then be calculated as:

X_i = S_i + G_i + A_i.     (9)

To achieve convergence and prevent the grasshoppers from reaching the comfort zone before reaching the solution, Eq. (9) can be reformulated as:

X_i^d = C ( sum_{j=1, j != i}^{N} C ((ub_d − lb_d)/2) s(|x_j^d − x_i^d|) (x_j − x_i)/d_ij ) + T̂_d,     (10)

where ub_d and lb_d are the upper and lower bounds and T̂_d is the target value in the dth dimension. C is a decreasing parameter for controlling the attraction, repulsion, and comfort zones. It decreases proportionally with the number of iterations to balance the exploration and exploitation of the algorithm. This coefficient is calculated as:

C = Cmax − l (Cmax − Cmin)/L,     (11)

where Cmax and Cmin are the maximum and minimum values, l is the current iteration, and L is the maximum number of iterations.

The steps of the GOA algorithm can be summarized as follows:
1. Initialization. The values of the parameters are defined, including Cmax, Cmin, f, l, and the maximum number of iterations.
2. Population initialization. The initial population of grasshoppers is generated randomly.
3. Evaluation. The fitness of each candidate (grasshopper) in the population is evaluated.
4. Assigning the best solution. After evaluating all solutions, the best solution is defined as the target position T̂_d, and its corresponding fitness is assigned as the target fitness.
5. Updating the decreasing coefficient. The value of the decreasing coefficient C is calculated at each iteration using Eq. (11).
6. Updating solutions. Each solution in the population is updated as in Eq. (10).
7. Boundary violation check. After updating, all solutions are examined to ensure they are within the predetermined boundaries (feasible). If any solution violates its upper or lower bounds, it is returned to its domain.
8. Repeating the main loop. Steps 3-7 are repeated until the termination condition is fulfilled.
9. Termination. The algorithm terminates, and the best solution is displayed, when either the termination condition or the maximum number of iterations is met.
Figure 2 displays the flow chart of the proposed algorithm. The figure shows that the GOA algorithm starts the optimization process using the steps in subsection 3.2. If the termination criterion is met, the algorithm terminates, and the best solution T̂_d with its objective value is displayed as the desired solution. Otherwise, the algorithm checks whether it is trapped in a local minimum. If not, the optimization process continues with GOA. If it is, the genetic algorithm (GA) is applied to help GOA escape the local minimum. This goal is achieved by detecting a new best target position T̂_d and defining a modified value of the parameter Cmax. Again, if the termination criterion is reached, the algorithm terminates, and the best solution T̂_d and its corresponding objective value are displayed. Otherwise, the target position T̂_d, its fitness, and Cmax are updated, and GOA continues the optimization process based on these updated values. These steps are repeated until the defined termination criterion is reached and the optimal solution is obtained.
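The decreasing-coefficient schedule of step 5 (Eq. (11)) is simple enough to state directly; with the Cmax = 1 and Cmin = 1e−05 used later in the experiments:

```python
def decreasing_coefficient(l, L, c_max=1.0, c_min=1e-5):
    # Eq. (11): C shrinks linearly from Cmax toward Cmin as iteration l goes 0..L
    return c_max - l * (c_max - c_min) / L

assert decreasing_coefficient(0, 7000) == 1.0
assert abs(decreasing_coefficient(7000, 7000) - 1e-5) < 1e-12
assert decreasing_coefficient(3500, 7000) < decreasing_coefficient(0, 7000)
```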

GA stage to get GOA out of the local minimum
In this stage, the optimization problem, i.e., its variables and objective functions, is defined based on certain assumptions and mathematical formulations. As shown in Fig. 3, suppose GOA is stuck at a local minimum point with target position X1, fitness value F1, and parameter value C = C1. GA aims to transfer the search process to a better point to obtain a new T̂_d and target fitness. Assume the next point yielded by GA is (C2, F2P), as shown in Fig. 3. Hence, according to the figure, the value of C will be changed, where C2 is the new value of parameter C and F2P is the predetermined enhanced objective value.
According to Fig. 3, the change in parameter C can be calculated from the geometry of the figure. Since F2P < F1, the presumed change in the fitness function can be expressed accordingly. According to the values of {θ, α}, the change in parameter C can be determined and the new value of C calculated. To ensure that the new point is always better than the previous one, the domain of the angle θ is restricted to the interval [π, 2π]. According to the previous equations and assumptions, the variables of the genetic algorithm are [θ, α, β1, β2, ..., βn], where θ ∈ [π, 2π], α ∈ [0, 1), and βi ∈ [−π/2, π/2]. As shown in Fig. 4, the old point has a target position (solution) X1 = [X11, X21, ..., Xn1] at C = C1, where n is the number of variables in the system.
At the new point, where C = C2, the value of each variable can increase or decrease according to the value of the assumed angle β, giving a new candidate solution. For the n variables of the system, each new component of the candidate solution is calculated in the same way, yielding the new solution position X2. Then, the actual fitness value F2A of the new point X2 is calculated. The parameters in GA are adapted to force the actual fitness value to equal the presumed (desired) improved fitness value yielded by Eq. (13); the objective function of GA is therefore to minimize the difference between F2A and F2P. After applying GA, the algorithm terminates if the new target fitness achieves the termination criterion, and the final solution is displayed. Otherwise, the new target position X2 is entered as T̂_d, the new fitness value F2A becomes the target fitness, and the new parameter value C2 becomes the value of Cmax; these are then passed to GOA. The new parameters introduced to GOA are T̂_d = X2, TargetFitness = F2A, and Cmax = C2. Then, the calculations of GOA start again, and the parameter C decreases according to the formula of Eq. (11) of the classical GOA, based on the updated value of Cmax.
For further illustration of the methodology and its steps, Fig. 5 contains the pseudo-code of MGOA. Figure 5 summarizes the MGOA algorithm in the following steps: 1. The standard GOA algorithm begins the search process until it can no longer improve the best solution obtained in the population without meeting the termination condition, i.e., GOA is stuck in a local minimum. 2. As detailed in Sect. 3.3.1, the GA seeks to pull GOA out of the local minimum. 3. At the end of the GA loop, Cmax, the best solution, and its corresponding best fitness value are updated.
3.1 If the termination criterion is met, the algorithm terminates and the optimum solution is displayed.
4. The GOA search continues based on the updated parameter and best solution until the optimal solution is found or another local minimum is reached. 5. If GOA becomes trapped in a new local minimum, the GA loop will be repeated. 6. These steps are repeated until the termination criterion is met or the maximum number of iterations is reached.
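This control flow can be sketched at a high level as follows (the `goa_step` and `ga_stage` callables stand in for the paper's GOA update and GA escape stage, which are not reproduced here; the stall limit of 3 repeated solutions and the 7000-iteration cap follow the experimental setup in Sect. 4):

```python
def mgoa(goa_step, ga_stage, evaluate, x0, c_max=1.0, max_iter=7000,
         stall_limit=3, tol=1e-6):
    # Skeleton of the MGOA loop: run GOA, detect stalls, invoke the GA stage.
    best_x, best_f = x0, evaluate(x0)
    stall = 0
    for it in range(max_iter):
        x, f = goa_step(best_x, c_max, it)        # one GOA iteration
        if f < best_f:
            best_x, best_f, stall = x, f, 0
        else:
            stall += 1                            # same (or worse) solution again
        if best_f <= tol:                         # termination criterion met
            break
        if stall >= stall_limit:                  # judged trapped in a local minimum
            best_x, best_f, c_max = ga_stage(best_x, best_f, c_max)
            stall = 0
    return best_x, best_f

# toy demo: a "GOA" step that halves x on f(x) = x^2 never needs the GA stage
best_x, best_f = mgoa(lambda x, c, it: (x / 2, (x / 2) ** 2),
                      lambda x, f, c: (x, f, c),
                      lambda x: x * x, 1.0)
assert best_f <= 1e-6
```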

Numerical results and discussions
The performance of the proposed algorithm is investigated in this section using the 19 test functions that were assessed with the original GOA algorithm (Shehab et al. 2019). These test functions are unimodal, multimodal, and composite. Their mathematical formulations are available in Shehab et al. (2019). The results are compared with those obtained by the original GOA to show the benefits of the proposed modifications and their influence on reaching an optimal solution. Furthermore, five systems of nonlinear equations are solved as a case study for MGOA (Broyden 1996). (Fig. 4 caption: Predicted change in X1 according to changes in C.) The proposed algorithm was coded in MATLAB 2020a and run on an Intel(R) Core(TM) i7 CPU at 3.2 GHz with 16 GB RAM. For the computational studies, a population size of 20 and 50 generations are used, the crossover fraction is 0.8, and all other parameters and functions are the defaults of the GA tool. Also, the termination criterion for both algorithms (GOA and MGOA) is defined in terms of ‖F_optimum‖, the optimum value of the objective function (which is 0 for all test functions and nonlinear system cases), and ‖F_l‖, the calculated objective function at each iteration l.
It should be noted that both algorithms (GOA and MGOA) have the same maximum number of iterations, set to 7000, and all results were taken from the first run. In addition, when either of them meets the termination criterion, the calculations terminate and the number of consumed iterations is displayed. Furthermore, the mechanism by which MGOA decides whether GOA is trapped in a local minimum is to count a limited number of iterations that produce the same solution; this number was limited to 3 consecutive iterations in MGOA. In Eq. (13), F2P is calculated as F2P = αF1, where α ∈ [0, 1). The upper limit of the domain of α must be strictly less than 1 (α < 1) to avoid continuous trapping in the same local minimum point and wasting more iterations without significant improvement of the objective function. In MGOA, the domain of α is defined as [0, 0.9]. Finally, it should be mentioned that the parameter Cmax equals 1 and the parameter Cmin equals 1e−05. Table 1 shows the results for the unimodal functions, comparing the original GOA and the proposed MGOA in terms of obtained solution and number of iterations, while Fig. 6 shows the convergence curves of the function value. Besides, in test function F5, the resulting objective values were too large, so they were normalized by the maximum reached value (Fmax) to show the results of both algorithms more clearly.

Results
As shown in Table 1, MGOA found the optimal solutions in fewer iterations than GOA. For example, GOA found the solution of F1 after 4960 iterations, while MGOA found it after 131 iterations, 37 of which were used by the GA stage to avoid trapping in a local minimum. In addition, GOA could not find a solution before reaching the maximum number of iterations in the cases of test functions F2 and F7. The results illustrate how MGOA converged faster and was more reliable than the original GOA. The last column of Table 1 indicates the suggested algorithm's improvement as a percentage relation. As shown in Fig. 6, the straight segments in the GOA curves represent periods without improvement of the objective function due to trapping in a local minimum. In some cases, GOA succeeded in getting out of the local minimum; during the computations, however, it fell into other local minima before reaching the optimal solution, as illustrated in F1, F3, F4, F5, F6, and F8. In addition, GOA was unable to solve the two cases F2 and F7 before the maximum number of iterations had been used. On the other hand, MGOA improved all results significantly, as indicated in Table 1. This reveals that MGOA guides GOA out of the local minimum and enhances the search results, reducing the number of iterations, and hence the time, by preventing iterations from being spent without improvement and convergence to the solution.
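The extracted text does not spell out the last-column percentage formula; one plausible reading, consistent with the F1 numbers above (4960 iterations for GOA vs. 131 for MGOA), is the relative saving in iterations:

```python
def improvement_percent(iters_goa, iters_mgoa):
    # hypothetical reconstruction: relative iteration saving, as a percentage
    return 100.0 * (iters_goa - iters_mgoa) / iters_goa

# F1 from Table 1: 4960 iterations (GOA) vs. 131 (MGOA), about 97.36%
assert abs(improvement_percent(4960, 131) - 97.36) < 0.01
```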
For the multimodal test functions, Table 2 shows the results of both GOA and MGOA, and Fig. 7 shows the convergence curves of the function value for these test functions. The figures are drawn as r versus F, where r is the ratio of the current iteration to the algorithm's total number of iterations and F is the value of the objective function, to show the convergence curves of the two algorithms more clearly. It is worth mentioning that r was calculated after the termination of each algorithm and according to its results:

r = iteration / (total number of iterations consumed by the algorithm).

As shown in Table 2 and Fig. 7, MGOA reached the solution in fewer iterations than the original GOA. In addition, the horizontal segments in the convergence curves of GOA show the iterations wasted remaining in the same local minimum. On the other hand, MGOA overcame this drawback and reached an optimum solution by continuously minimizing the objective function. The results and figures illustrate the effectiveness of the proposed GA stage, with its parameters and objective function, in improving the performance of GOA. Table 3 presents the results of the composite test functions, along with the percentage improvements of the results. As shown in Table 3, MGOA found the optimum solution for all cases in fewer iterations, whereas GOA resolved only two test functions after consuming a large number of iterations. Figure 8 shows the curves of the objective value of the absolute error function (F) versus the parameter (C) for both algorithms. The aim of plotting this relationship is to show how changing the parameter Cmax can improve MGOA's performance in terms of convergence to the best solution. In addition, it shows that the classical decreasing Cmax leads to trapping in a local minimum.
For example, in the test function F17, the value of the parameter Cmax in GOA starts at 1 and decreases according to Eq. (11) during the optimization process. MGOA also starts with 1 until GOA falls into a local minimum; its domain is then edited using the GA stage, whereby the parameter Cmax increased to 1.05 and then started to decrease again from this updated value according to Eq. (11). In test function F18, the value of Cmax decreased continuously until it reached almost 0.85; then, GOA was trapped in a local minimum, and the GA stage was needed. By applying the GA stage, the value of Cmax was updated to nearly 0.95 and introduced with the other parameters to GOA again. Furthermore, in the test functions F15 and F16, the value of Cmax was updated by decreasing, but with a different slope than that resulting from Eq. (11). It is worth stating that the main target of the GA stage is to update the value of Cmax, by increasing or decreasing it, to escape from local optima. Finally, these results show how the proposed MGOA led to a continuous decrease of the objective function value until the desired tolerance and the optimum solution were achieved.

Analysis of results and discussion
In this subsection, several important remarks on the proposed algorithm and an analysis of the results are discussed.
1. MGOA does not use both GOA and GA together to solve the optimization problem and find an optimal solution. 3. Liberating the domain of the parameter Cmax helps MGOA get GOA out of the local minimum and achieve a balance between exploration and exploitation. 4. In addition, GA is not concerned with solving the original optimization problem, since it is applied only when GOA has fallen into a local minimum. So, in MGOA, GA acts as a local search algorithm, not a global one. 5. The proposed hybridization, or modification, achieves the intended enhancement of GOA performance, where the average percentage of improvement of MGOA over all 19 test cases illustrated above is about 96.18%. 6. For further illustration, Fig. 9 shows the parameter space and search history for the two test functions F1 and F14 using GOA and MGOA.
(a) It is evident from Fig. 9 that the GOA distribution of grasshoppers around the solution was highly condensed because of remaining in a local minimum for extensive periods, i.e., a large number of iterations. After 4960 iterations, GOA was nevertheless able to identify a solution for F1, while for F14 GOA was unable to detect a solution because it was trapped in a local minimum and spent all iterations at the same point. (b) On the contrary, MGOA escaped several local minimum points and continued to explore the search space until an optimum solution was found for both the F1 and F14 test functions in a smaller number of iterations. In addition, since trapping in a local minimum is eliminated as quickly as possible using the GA stage, the distribution of grasshoppers is more widespread around the optimal solution point, allowing MGOA to explore and exploit more rapidly.

Case study: solving nonlinear systems of equations
In this subsection, MGOA is applied to various types of nonlinear systems of equations. All details of these nonlinear systems can be found in More et al. (1978). First, the system of nonlinear equations is converted into an optimization problem according to Eq. (2). Then, this optimization problem is solved by the proposed algorithm MGOA. The results are compared to Newton's method (NM) (Nie 2004, 2006; Grosan and Abraham 2008) and the Levenberg-Marquardt algorithm (LMA) (Failed 2014; Broyden 1996; Jarry and Beneat 2016). In each case, n represents the number of variables and m the number of equations. The descriptions of these nonlinear equation systems are as follows:
1. Freudenstein and Roth function: this system is square with n = m = 2.
2. Kowalik and Osborne function: this system is non-square with n = 4, m = 11, where the y_i's and u_i's are constants whose definitions can be found in Broyden (1996).
3. Biggs EXP6 function: this system is square with n = 6, m = n, f_i(x) = x3 e^(−t_i x1) − x4 e^(−t_i x2) + x6 e^(−t_i x5) − y_i, where t_i = 0.1i.
4. Osborne 1 function: this system is non-square with n = 5, m = 33, where the y_i's are constants whose values can be found in More et al. (1978).
5. Penalty function II.
Table 4 shows the results for these nonlinear systems using MGOA compared with Newton's method (NM) and the Levenberg-Marquardt (LM) algorithm, the most popular conventional algorithms for solving nonlinear systems. Figure 10 shows the convergence curves of the three algorithms. Table 4 clarifies how the MGOA algorithm solved all the nonlinear systems in fewer iterations. In addition, despite the high ability of the LM algorithm to deal with nonlinear systems, it failed to detect the solution in cases of singularity. MGOA overcame this issue, as it does not require derivative calculations. Table 4 and Fig. 10 show the ability of the proposed MGOA algorithm to detect a solution despite the singularity of these systems, because it does not depend on derivatives, whereas both NM and LM failed to obtain the solution, in addition to their calculation burden due to computing the Jacobian matrices. The results show that MGOA successfully handles nonlinear systems, whether singular, square, or non-square.
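To make the setup concrete, the first case can be written out. The residuals below are the standard Freudenstein and Roth form from the benchmark literature; the paper cites More et al. (1978) without restating them, so the exact coefficients are sourced from that literature rather than from this text:

```python
def freudenstein_roth(x):
    # standard Freudenstein-Roth test system (More et al. 1978); square, n = m = 2
    x1, x2 = x
    f1 = -13.0 + x1 + ((5.0 - x2) * x2 - 2.0) * x2
    f2 = -29.0 + x1 + ((x2 + 1.0) * x2 - 14.0) * x2
    return [f1, f2]

def F(x):
    # sum of squared residuals per Eq. (2): the objective handed to the optimizer
    return sum(f * f for f in freudenstein_roth(x))

assert F([5.0, 4.0]) == 0.0   # known root x* = (5, 4)
```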

Conclusions
In this paper, a modified grasshopper optimization algorithm (MGOA) is proposed by combining the classic grasshopper optimization algorithm (GOA) with the genetic algorithm (GA). The modification aims to save GOA from falling into local minima and to improve the convergence rate. MGOA achieves this by reducing the number of repeated solutions and decreasing the number of spent iterations. These modifications are based on mathematical assumptions and relations that rely on releasing the calculation of the parameter C to move to a new and improved point in the search space. MGOA was tested on the same 19 test functions solved by the original GOA to investigate and verify the influence and responses of the proposed modifications. The results showed a significant improvement in the acceleration of convergence and the success of MGOA in finding the optimal solution compared to the classic GOA. The test function results illustrate, via tables and figures, how MGOA succeeded in reaching the optimum solution in fewer iterations than GOA. Besides, the convergence curves show how GOA consumes time trapped in local minima; MGOA terminates this sticking in the local minimum and moves GOA to a new, better search space by using GA. The proposed hybridization achieves the intended enhancement of GOA performance, where the average percentage of improvement of MGOA was about 96.18%. Also, MGOA was used to solve five applications of nonlinear systems based on transforming their equations into a single unconstrained optimization problem with the objective of minimizing the overall residual function. MGOA demonstrated remarkable success in solving the various square, non-square, and singular nonlinear system cases, whereas classical methods such as Newton's method and the Levenberg-Marquardt algorithm failed to detect the solution.
All results and comparisons ensure the effectiveness, reliability, and validity of the modified algorithm.
In future works, the proposed approach can be modified to solve other optimization problems such as constrained optimization problems, nonlinear bilevel programming problems, interval quadratic programming problems, and data clustering problems. Large-scale engineering challenges, such as resource allocation problems, economic load dispatch problems, unit commitment problems, wind farm turbine-placement optimization, and real-time applications, may also be addressed using the suggested approach. Finally, it can be extended to solve multiobjective optimization problems.

Author contributions MAE contributed to conceptualization; MAE and HAO contributed to investigation and writing (review and editing), and have read and agreed to the published version of the manuscript; HAO contributed to methodology and writing (original draft).
Funding The authors received no specific funding for this work.
Data availability All data used to support the findings of this study are included in the article.

Declarations
Conflict of interest The authors declare that there is no conflict of interest regarding the publication of this paper.
Ethical approval This study is not supported by any sources or any organizations.
Human and animal rights This article does not contain any studies with human participants or animals performed by any of the authors.

Fig. 10 Convergence curves for solving the nonlinear systems of equations (panels: Freudenstein and Roth function, Kowalik and Osborne function, Biggs EXP6 function, Osborne 1 function, Penalty function II)