2.2.2. Optimization with MHGA
Since 1960, imitations of natural phenomena have been used as the basis of powerful algorithms for solving optimization problems, collectively known as evolutionary computation methods. Over the past 20 years, a new type of heuristic algorithm has emerged that essentially seeks to combine basic exploration methods within higher-level frameworks, with the aim of searching the search space efficiently and effectively. In simple terms, such an algorithm creates a population of solutions to the problem; the best solutions are then selected, reproduced, and used to create a new population of solutions different from the previous one. This process ends when the solutions obtained meet certain criteria. Nowadays, these methods are called meta-heuristic algorithms. The genetic algorithm is one of the most important heuristic algorithms used to optimize the performance of various functions. In this algorithm, information from past generations is inherited and reused in the search process (Masihi et al. 2012).
The perceptron is a machine learning algorithm that falls into the category of supervised learning. It is a binary classification algorithm, i.e., a classifier that decides whether or not an input belongs to a class, depending on the input vector. It is also a linear classifier, in the sense that it predicts according to a weighted linear combination of its inputs. In addition, it is an online algorithm, because it processes its inputs one at a time. The perceptron was invented by Rosenblatt (1957) at the Cornell Aeronautical Laboratory and is, in fact, one of the first artificial neural networks.
Holland (1975) introduced the basic concept of the genetic algorithm. In this algorithm, after the formation of the initial population, operators such as crossover and mutation act on the population in each cycle, and the population of the next cycle is formed by a fitness-biased random selection. Over successive cycles, the generations improve, leading to continuous improvement of the individuals in the population.
The ant colony optimization (ACO) algorithm was introduced by Dorigo (1992) as a multi-agent approach to optimization problems such as the traveling salesman problem (TSP). The algorithm is based on the way ants find food, in which each ant contributes to the search. If an ant finds food, it marks the path twice (once on the way out and once on the way back), whereas an ant that finds no food marks it only once. An ant choosing between paths therefore prefers the more strongly marked one, so better routes are selected in subsequent cycles.
The bird algorithm, or particle swarm optimization (PSO), was proposed by Kennedy and Eberhart (1995). It is a nature-inspired, iteration-based evolutionary computational algorithm. Its source of inspiration was the social behavior of animals, such as the flocking of birds and the schooling of fish.
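As a minimal sketch of the perceptron rule described above, the following Python fragment trains a linear classifier online, one input vector at a time; the function name, learning rate, and label convention are illustrative assumptions, not part of Rosenblatt's original formulation.

    import numpy as np

    def perceptron_train(X, y, epochs=10, lr=1.0):
        # X: (n_samples, n_features) inputs; y: labels in {-1, +1}
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                # Predict from the weighted linear combination of inputs;
                # update only when the sample is misclassified.
                if yi * (np.dot(w, xi) + b) <= 0:
                    w += lr * yi * xi
                    b += lr * yi
        return w, b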
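Similarly, the pheromone mechanism of ACO can be illustrated with a deliberately simplified sketch in which ants choose among a few fixed paths; the path costs, evaporation rate, and deposit rule below are hypothetical values chosen only for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    path_length = np.array([4.0, 2.0, 5.0])   # hypothetical path costs
    pheromone = np.ones(3)                    # equal attractiveness at first

    for _ in range(100):                      # one ant per iteration
        p = pheromone / pheromone.sum()       # prefer strongly marked paths
        k = rng.choice(3, p=p)
        pheromone *= 0.95                     # evaporation weakens old trails
        pheromone[k] += 1.0 / path_length[k]  # shorter paths get larger deposits

    print(pheromone / pheromone.sum())        # the shortest path ends up most attractive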
The imperialist competitive algorithm (ICA) solves mathematical optimization problems by mathematically modeling the socio-political evolutionary process. The ICA, developed by Atashpaz-Gargari and Lucas (2007), starts from an initial set of candidate solutions; these candidates are known as chromosomes in the genetic algorithm, as particles in PSO, and as countries in the ICA. The ICA gradually improves these initial solutions (countries) through a defined process and finally yields an appropriate solution to the optimization problem (the desirable country). The main foundations of this algorithm are the policies of assimilation, imperialistic competition, and revolution. By imitating the social, economic, and political development of countries and by mathematically modeling parts of this process, the algorithm casts these mechanisms as operators in a regular algorithmic form that can help solve complex optimization problems.
The simulated annealing algorithm is a popular algorithm that has been widely used to solve optimization problems since its introduction by Kirkpatrick et al. (1983). It is a stochastic local search method based on principles of nature: it mimics the annealing of metals, in which a material is slowly cooled so that its structure settles into a low-energy state. In the algorithm, worse solutions are occasionally accepted with a probability that decreases according to a prescribed cooling schedule.
Von Frisch (1967) was among the first to describe the basic and simple behaviors of bees that were later used to solve combinatorial optimization problems; the bee system inspired by these behaviors was applied to the famous traveling salesman problem. The bees algorithm is a population-based search algorithm, first developed in 2005, that simulates the foraging behavior of bee colonies. After finding food, each bee conveys three messages to the other bees through its dance: the distance, the type, and the direction of the desired flower location relative to the hive. This information enables the other bees to reach the food location and find the best flowers.
Other algorithms have been introduced in recent years, including the intelligent water drops (IWD) algorithm, a swarm-intelligence-based optimization algorithm developed by Shah-Hosseini (2009). The IWD algorithm is a population-based, nature-inspired algorithm originally intended for combinatorial optimization, although it can also be adapted to continuous optimization. It was first proposed in 2007 to solve the traveling salesman problem, and since then a number of researchers have sought to improve it and apply it to a variety of problems. The basis of the IWD algorithm is the way water finds its path to the lowest point of a surface. Another algorithm, introduced by Aghay-Kaboli et al. (2017), is rain-fall optimization (RFO), which searches for the lowest (optimal) answer to a problem in the way raindrops find the lowest point of a terrain. Its efficiency, speed, and convergence were compared with those of other algorithms on several mathematical problems, and the results were satisfactory.
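A minimal Python sketch of the simulated annealing acceptance rule described above follows; the objective function and the Gaussian neighborhood move are illustrative assumptions, not taken from the cited work.

    import math, random

    def simulated_annealing(f, x0, neighbor, T0=1.0, alpha=0.95, steps=1000):
        # Minimize f starting from x0; neighbor(x) returns a nearby candidate.
        x, T = x0, T0
        for _ in range(steps):
            x_new = neighbor(x)
            delta = f(x_new) - f(x)
            # Accept improvements always; accept worse moves with
            # probability exp(-delta / T), which shrinks as T cools.
            if delta < 0 or random.random() < math.exp(-delta / T):
                x = x_new
            T *= alpha  # geometric cooling schedule
        return x

    # Usage: minimize a simple 1-D function with Gaussian moves
    best = simulated_annealing(lambda x: (x - 3.0) ** 2, 0.0,
                               lambda x: x + random.gauss(0.0, 0.5))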
Sajedi and Razak (2010) modeled the relationship between the 7-day and 28-day strengths of high-strength concrete. To do this, they prepared 650 different specimens made of cement, water, microsilica, aggregate, and superplasticizer using 50 mix designs, and then measured compressive strength at ages of 1, 3, 7, and 28 days. Their results showed how well multivariable regression, neural network, and linear regression models performed relative to one another in estimating 28-day strength from 7-day strength. The proposed relationship was well suited to the curing conditions of the specimens (steam room and immersion in saturated limewater at 25°C) and to specimens without slag. Grunewald et al. (2012) proposed an optimal mix design to improve the tensile microstructure of high-performance fiber-reinforced cementitious composites (HPFRCC) using an ant algorithm. Guerra and Kiousis (2006) provided an optimal design for reinforced concrete structures using a nonlinear algorithm in MATLAB. The labor costs of formwork, concreting, and reinforcement, as well as material costs, were taken as inputs of the problem. The aim of this study was to optimize the dimensions of beams and columns in reinforced concrete, which was achieved through various experiments and algorithm programming in MATLAB. Chopra et al. (2016) used an artificial neural network and a genetic algorithm to predict the increase of strength at 28, 56, and 91 days of age. In this study, 49 mix designs with common materials and 27 designs with added fly ash were studied under laboratory conditions, and the neural network showed better performance in predicting strength increase over time. Wang et al. (2019) proposed a systematic approach for producing random models with realistic stone properties to simulate a range of stone-based materials. In the first step, the inverse discrete Fourier transform (IDFT) method was used to randomly generate realistic stones with predetermined particle-shape properties. Then, an overlap-detection algorithm was proposed to facilitate the rapid, random placement of irregular stones, accounting for stone content, stone size distribution, and stone orientation. Finally, they provided various examples showing that the proposed approach could quickly and accurately produce stone-based material models with the prescribed stone properties. Hosein and Elsadadedy (2019) examined the effect of high temperature on the strength reduction of HSC using 617 concrete specimens drawn from various researchers' studies. To find an equation for the effect of temperature on strength, the dependent and independent variables were first identified using a neural network, and then the effect of the independent variables on the dependent variable was obtained by applying regression. Finally, the aim of this study, which was to derive several simple regression equations for predicting the strength of concrete under the influence of temperature, was achieved. The results showed that, as expected, the maximum temperature plays an important role in controlling the reduction of the compressive strength of high-strength concrete. The remaining parameters showed almost the same level of sensitivity, and their effect, compared with that of temperature, was almost negligible. Algorithms have a great ability to find the optimal answer in large search spaces. Concretes with multiple components likewise admit a wide variety of mix designs, so finding the optimal mix design has always been an open question.
The genetic algorithm is one of the most widely used heuristic algorithms for optimizing various functions. In this algorithm, information from past generations is inherited and reused in the search process. Holland first introduced the basic concept of the genetic algorithm, which others later expanded. Genetic algorithms are random search methods that work on the basis of natural selection and natural genetics. They differ fundamentally from conventional search and optimization methods, in ways that researchers have summarized as follows (Masihi et al. 2012):
1. The genetic algorithm works with a coded form of the solutions, not with the solutions themselves.
2. The genetic algorithm starts its search from a population of solutions, not from a single solution.
3. The genetic algorithm uses only the information of the fitness function, not derivatives or other auxiliary knowledge.
4. The genetic algorithm uses probabilistic transition rules, not deterministic rules.
The genetic algorithm is a mathematical algorithm that applies Darwinian patterns of reproduction and survival of the fittest: it converts a collection (population) of individual mathematical objects (usually fixed-length character strings, acting as chromosomes), each with a specific fitness value, into a new population (for example, the next generation) following the natural processes of genetics.
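To make these four properties concrete, the following is a minimal Python sketch of a binary genetic algorithm with tournament selection, one-point crossover, and bit-flip mutation; the population size, rates, and the toy "OneMax" fitness are illustrative choices, not parameters from the cited studies.

    import random

    def genetic_algorithm(fitness, n_bits=16, pop_size=30, generations=50,
                          p_cross=0.9, p_mut=0.02):
        # Coded solutions: fixed-length bit strings (the "chromosomes").
        pop = [[random.randint(0, 1) for _ in range(n_bits)]
               for _ in range(pop_size)]
        for _ in range(generations):
            def select():  # tournament: fitter chromosomes reproduce more often
                a, b = random.sample(pop, 2)
                return a if fitness(a) >= fitness(b) else b
            children = []
            while len(children) < pop_size:
                p1, p2 = select(), select()
                if random.random() < p_cross:        # one-point crossover
                    cut = random.randint(1, n_bits - 1)
                    p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
                for c in (p1, p2):                   # bit-flip mutation
                    children.append([1 - g if random.random() < p_mut else g
                                     for g in c])
            pop = children[:pop_size]
        return max(pop, key=fitness)

    # Usage: maximize the number of 1-bits (the toy "OneMax" problem)
    best = genetic_algorithm(fitness=sum)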
Shahsavari Pour et al. (2010) used a meta-heuristic genetic algorithm to solve the time-cost-quality balance problem of multi-state projects; the results showed that the algorithm performed well. Masihi et al. (2015) solved the time-cost-resource constrained balance in multi-state projects using a modified genetic algorithm in which the better members of the population were given greater chances, and the resulting balance was very satisfactory. Taheri Amiri et al. (2017) presented the time-cost balance for planning a project's critical network with the genetic algorithm; in other words, time scheduling and cost-time balancing were performed simultaneously using the genetic algorithm. Haghighi and Ahmadi-Najl (2016) optimized operating rules and rule curves for multi-reservoir systems using a self-adaptive simulation-genetic algorithm model, in which the algorithm performed the optimization over successive cycles. Han et al. (2022) combined an artificial neural network with particle swarm optimization to optimize the mix proportion of wet-mix shotcrete. Sixteen specimens were made, and this hybrid model was applied to optimize the mix proportions of wet-mix shotcrete in the Jinchuan mine. The results revealed that the ANN model yielded a mean relative error (MRE) of 2.755% and an R2 of 0.980, indicating excellent prediction for establishing a reasonable objective function. Additionally, PSO spent less than 60 s obtaining the optimal mix proportion of wet-mix shotcrete required by the mine. Consequently, this ANN-PSO model can be used as an efficient design guide to facilitate decision making prior to the construction phase. Shirzadi Javid et al. (2021) proposed a numerical method and meta-heuristic algorithms for estimating the optimal mixture design of concrete pavements. A novel method was developed to predict the optimal mixture proportion, enhancing fundamental characteristics of concrete, including flexural strength, abrasion resistance, slump, drying shrinkage, and freezing-thawing resistance, as well as its unit cost. The outcomes of this model revealed that the performance of this combined method is considerably better than that of the individual heuristic algorithms, and that the mixtures designed by all algorithms performed much better than the initial specimens designed by experimental tests. By virtue of multi-objective optimization, the averages of abrasion resistance, drying shrinkage, freeze-thaw resistance, flexural strength, and cost in the introduced model were improved by 18%, 12%, 5%, 4%, and 1%, respectively; however, the average slump deteriorated by 9%. Sadrossadat et al. (2021) proposed machine learning algorithms and metaheuristics for the mixture design and optimization of steel-fiber-reinforced UHPC. Their investigation aimed to demonstrate a procedure integrating machine learning (ML) algorithms, such as ANN and Gaussian process regression (GPR), to develop high-accuracy models, with PSO for the multi-objective mixture design and optimization of UHPC reinforced with steel fibers. The comparison of the obtained results with the experimental results validates the capability of the proposed procedure for the multi-objective mixture design and optimization of steel-fiber-reinforced UHPC. The proposed procedure not only reduces the effort in the experimental design of UHPC but also leads to optimal mixtures when the designer faces strength-flowability-cost trade-offs.
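As a hedged sketch of how PSO searches a continuous design space in such hybrid ANN-PSO schemes, the following Python fragment minimizes a stand-in objective over box constraints; the objective (in practice, e.g., a trained ANN surrogate of cost or strength), the bounds, and the coefficients are illustrative assumptions only.

    import numpy as np

    rng = np.random.default_rng(1)

    def pso(objective, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
        # Minimize objective over a box; bounds is a (dim, 2) array of [low, high].
        lo, hi = bounds[:, 0], bounds[:, 1]
        x = rng.uniform(lo, hi, (n_particles, len(bounds)))
        v = np.zeros_like(x)
        pbest, pbest_f = x.copy(), np.apply_along_axis(objective, 1, x)
        gbest = pbest[pbest_f.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            # Velocity: inertia + pull toward personal best + pull toward global best
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            f = np.apply_along_axis(objective, 1, x)
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], f[improved]
            gbest = pbest[pbest_f.argmin()].copy()
        return gbest

    # Usage: a stand-in objective over four normalized mix variables
    best_mix = pso(lambda z: np.sum((z - 0.5) ** 2),
                   bounds=np.array([[0.0, 1.0]] * 4))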