Intelligent recommendation system of the injection molding process parameters based on CAE simulation, process window, and machine learning

In this research, a recommendation system was designed for optimizing injection molding process parameters. The system combines process windows, eXtreme Gradient Boosting (XGBoost), and genetic algorithms. Computer-aided engineering (CAE) simulations were conducted to generate the process window and simulation data. Automatic hyperparameter optimization of XGBoost was performed using grid search and cross-validation. The system employs five injection molding feature parameters as input and one product feature as output, and the strengthened elitist genetic algorithm (SEGA) was used to predict the optimal injection molding process parameters. The prediction model achieved an RMSE of 0.0202 and an R2 of 0.9826. The accuracy of the system was verified through real production: the deviation of the molded product weight from the desired weight was 0.22%, meaning the prediction model achieves a correct rate of 99.78%. This recommendation system has significant application value in reducing production cost and cycle time, as it can provide initial injection process parameter suggestions solely from the mold's digital data.


Introduction
Injection molding is a crucial polymer molding technology with widespread application in various sectors, including the automotive and communication industries. Its rapid production cycle and excellent product consistency make it the most widely used molding technology among manufacturers [1][2][3]. However, the quality of an injection-molded product is significantly affected by process parameters such as melt temperature, mold temperature, packing pressure, and packing time [4,5]. During initial molding, an intricate commissioning process is necessary to determine the optimal combination of process parameters, resulting in significant material waste and prolonged trial time.
To address the challenges associated with manual tuning of process parameters, computer simulation-aided technology emerged as a viable approach for optimizing product design and assessing quality. However, the simulation results may deviate from actual molding conditions owing to variations in resin and machine response and deviations in operating conditions, which makes it challenging to rely entirely on computer-aided engineering (CAE) simulation to recommend optimal injection molding process parameters [6]. To overcome these limitations, machine learning techniques have become increasingly popular for recommending optimal injection molding process parameters and predicting product quality. However, acquiring accurate molding data is expensive, and training data for machine learning models are mostly obtained through CAE simulations, leading to overly complicated and opaque prediction models. Crucially, hyperparameter adjustment is vital to the accuracy of the prediction model, and manual hyperparameter adjustment increases the complexity of the model and reduces its replicability and generalizability. As a result, although machine learning has long been used in injection molding, these limitations have hindered its suitability for commercialization.
Intelligence has been widely recognized as an essential tool for cost reduction in injection molding. With the support of CAE simulation and machine learning, the recommendation of optimal process parameters has been shown to be feasible. However, to make the predictive model commercially viable, it must be validated and trained under real molding conditions. This paper aims to address this need.

Literature review
The complexity of process parameters in injection molding renders linear models inadequate for accurately describing them. As a result, the setting of process parameters in injection molding predominantly relies on the expert knowledge and experience of engineers. Machine learning is a data-fitting optimization technique that includes various artificial intelligence algorithms such as decision trees [7], random forests [8], support vector machines [9], Gaussian processes [10,11], and artificial neural networks [12,13]. These algorithms mainly involve gathering sample data, fitting data models, and iteratively searching for data relationship models.
With the advancement of computer technology, machine learning is increasingly used for recommending and optimizing injection molding process parameters. Hashimoto et al. [14] optimized the injection speed with a radial basis function network to balance product weld lines against production cycle time. Yang et al. [15] established an RBF prediction model relating process parameters to product quality in water-assisted injection molding, achieving accurate prediction of the product hollow rate and wall-thickness difference with average errors below 5%. The Gaussian process has also been used for injection molding process optimization because of its strength in handling small samples of complex data. Turng et al. [11] first applied Gaussian processes to the optimization of injection molding process parameters; the method allowed simultaneous result prediction and confidence assessment, providing a clear direction for model training and confirming the applicability of Gaussian prediction in this field. Luo et al. [10] analyzed the melt flow length during mold filling by building a mixed-effect Gaussian process prediction model, and the results showed that the mixed-effect model outperformed the single Gaussian model.
Artificial neural networks are particularly useful for characterizing complex data relationships, making them ideal for mapping the coupled effect of injection molding process parameters on product quality. To improve their application, neural networks are often paired with optimization algorithms. For example, Kwong et al. [16] developed a hybrid system that combined neural networks with genetic algorithms, effectively reducing the adjustment time required for initial process parameters; this study demonstrated the potential of neural networks for recommending initial injection molding process parameters. Mao et al. [17] developed a BP-GA prediction model to obtain initial process parameters with the clamping force and product warpage as targets, effectively reducing the production cycle and cost.
Training data is a key factor in determining the accuracy of machine learning prediction models, and changes in cavity structure can also affect a model's applicability. Lee et al. [18] used a mixed sampling method combining Taguchi sampling and random sampling to collect simulation and experimental results from 36 different molds to construct and train an artificial neural network (ANN) prediction model; however, the model's predictions for a new cavity deviated greatly and conformed poorly to the real molding conditions. Hopmann et al. [19] used induced network transfer learning to analyze the structural similarity of parts, which effectively improved the practical value of neural networks by reducing the amount of training data by up to 88% through a holistic transfer approach. They also proposed a transfer learning-based adaptation technique, performing numerical simulations with different polymer materials on the same part to achieve fast and accurate prediction of process parameters even after material replacement; the results showed that transfer learning can facilitate the adaptation of neural network models to different polymer classes [20].
To improve the accuracy of the training data, process windows are used to constrain the parameters. Park et al. [21] developed an engineering analysis model to simulate a real injection molding process by constructing a process window with mold temperature and pressure as limits; monitoring the process window improved the quality consistency of the product. Wang et al. [8] established a machine learning-based process window for injection molding product quality assessment, in which random forest models were used to predict product quality with 100% prediction accuracy. However, the process parameter prediction systems currently available in industry are still unsatisfactory. Acquiring large amounts of training data is expensive, and predicting new types of cavities from past training data is ineffective owing to differences in mold structures; the impact of training data formed by molds with different structures on the prediction model remains unclear. In addition, current research focuses primarily on constructing the prediction model while neglecting the effect of training data validity on its effectiveness: training data unrelated to the predicted products fails to improve prediction accuracy. Therefore, devising an individualized analysis for each mold is a promising approach to improving the prediction of process parameters.
In this study, an innovative injection molding process parameter recommendation system was developed that combines CAE simulation and machine learning. The system aims to construct the prediction model automatically, enhance the model's applicability across materials and cavity structures, and improve the efficiency of obtaining training data. To achieve this, a process window is generated via CAE simulation, and a Taguchi experimental design is executed within the process window. The simulation results are then used as training data for the prediction model. eXtreme Gradient Boosting (XGBoost) is employed to develop the regression model, while grid search and cross-validation are applied to optimize the hyperparameters automatically. Additionally, the strengthened elitist genetic algorithm (SEGA) is utilized for iterative optimization of the XGBoost regression model predictions. The prediction model recommends optimal process parameters based on the desired product quality targets input by the user. The system's components are depicted in Fig. 1. The remainder of this paper is organized as follows: Section 3 presents the development methodology and operation of the entire recommendation system; Section 4 describes the data acquisition method; Section 5 specifies the prediction model and the optimal process parameter algorithm; Section 6 verifies the prediction accuracy of the recommendation system; Section 7 presents the conclusions.

Methodology
Figure 2 depicts the overall flow of the system for recommending initial process parameters, which consists of three main parts: data acquisition, prediction model, and process recommendation. The digital model of the product's mold can be employed to anticipate the initial injection molding parameters during the mold manufacturing period. Firstly, the characteristics of the digital mold, such as the sprue, runners, cavities, and cooling channels, are identified using the CAE simulation software. The "machine analysis" mode is then utilized, where the machine refers to the injection molding machine model specified by the user. The simulation is carried out according to the principles of scientific molding to generate the process window, i.e., the effective range of the key injection molding parameters. The Taguchi experimental design is executed within the process window and simulated to generate the training and testing sets, with 70% of the simulation results used for training and the remaining 30% for testing. Next, the prediction model is trained using XGBoost on the randomly reorganized training set. Hyperparameter optimization is performed using grid search and cross-validation, and iterative optimization of the XGBoost predictions is achieved using SEGA. Finally, the prediction model recommends initial injection molding process parameters, including melt temperature, mold temperature, injection speed, packing pressure, and packing time, based on the user-defined product quality target. The system was developed in Python 3.9, and its execution relies on the Moldex3D API for data exchange with the simulation software. Using the system is relatively simple, requiring only the import of the digital mold and the definition of the desired product quality target values; initial injection molding parameters can then be recommended within a specified timeframe.

Product information
A flat panel sample is used to test the effectiveness of the recommendation system; for ease of description, the product is divided into cavity 1# and cavity 2#. The geometry of the part is shown in Fig. 3. The resin used is polypropylene (579S, SABIC), and the numerical simulation software is Moldex3D 2022.

Data-driven orthogonal experiment
Geometric information of the product plays a crucial role in setting the process parameters. Although previous studies [18] have attempted to develop a generic description of cavity information in terms of projected area, volume, and surface area, and to build software for automatic recognition of cavity information, the commercial results were not satisfactory. Therefore, to achieve adaptivity to different cavities, Moldex3D is employed to identify the mold information and to define the cooling channels, sprue, runners, and product cavities to match the actual situation.
Previous studies have shown that melt temperature, mold temperature, injection speed, packing time, and packing pressure significantly affect product quality, and these are used as the input variables. Product quality is mainly characterized by weight, warpage, volume shrinkage, and surface quality; however, except for weight, these characterizations are often influenced by subjective human factors. Harry et al. [22] found a strong linear correlation between product length and weight, while Min et al. [23] demonstrated a remarkable correlation between product shrinkage and weight. Therefore, weight is taken as the indicator of product quality, and the optimal product weight can be determined by production personnel from the volume and material density of the product.
The quality and validity of the training data are critical factors in determining the accuracy of the prediction model, and ensuring data validity is equally important for the overall efficiency of model training. Process windows establish valid ranges of the key parameters based on scientific molding principles [24]; they narrow the range of valid data for the Taguchi experiments and improve computing efficiency. The injection molding process windows were obtained by CAE simulation, as listed in Table 1. Using Taguchi's experimental method, the obtained process windows were divided into four 5-level, 5-factor scenarios, resulting in 100 simulation runs; the results are listed in Table S3. The calculation workflow and methodology of the process window are described in the Supporting Information.
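A 5-level, 5-factor Taguchi design like the one described above can be sketched with a standard Latin-square construction of a 25-run orthogonal index array, mapped onto parameter levels. The factor names and level values below are hypothetical placeholders, not the paper's actual process window:

```python
def l25_orthogonal_array():
    """Build a 25-run, 5-level orthogonal index array for 5 factors.

    Latin-square construction: the last three columns are linear
    combinations of the first two index columns, taken modulo 5.
    """
    return [[a, b, (a + b) % 5, (a + 2 * b) % 5, (a + 3 * b) % 5]
            for a in range(5) for b in range(5)]

def taguchi_runs(levels_per_factor):
    """Map the index array onto real parameter levels.

    `levels_per_factor`: dict of factor name -> list of 5 level values.
    """
    names = list(levels_per_factor)
    return [{name: levels_per_factor[name][idx]
             for name, idx in zip(names, row)}
            for row in l25_orthogonal_array()]

# Hypothetical process-window levels for one scenario (illustrative only;
# the real limits come from the CAE-derived process window in Table 1)
window = {
    "melt_temp_C":          [200, 210, 220, 230, 240],
    "mold_temp_C":          [30, 40, 50, 60, 70],
    "injection_speed_pct":  [40, 50, 60, 70, 80],
    "packing_pressure_MPa": [30, 40, 50, 60, 70],
    "packing_time_s":       [1.0, 1.5, 2.0, 2.5, 3.0],
}
runs = taguchi_runs(window)
print(len(runs))  # 25 runs per scenario; four scenarios give 100 in total
```

Each level of each factor appears exactly five times in the array, and every pair of columns covers all 25 level combinations once, which is what makes the design orthogonal.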

XGBoost algorithm
XGBoost is an improved version of GBDT (Gradient Boosting Decision Tree), known as the extreme gradient boosting decision tree [25]. It combines several weak learners into a strong learner with superior accuracy. XGBoost reduces the risk of overfitting by adding a regularization term to the loss function. Furthermore, XGBoost uses the first-order and second-order derivatives of the loss function, and techniques such as greedy algorithms and the weighted quantile sketch make model training faster and more efficient. For regression problems, XGBoost consists of multiple CARTs (Classification and Regression Trees), and the final prediction is the sum of the prediction results of each tree:

$$\hat{y}_i = \sum_{t=1}^{K} f_t(x_i)$$

where $\hat{y}_i$ is the predicted value of $x_i$ and $f_t$ represents a decision tree.
The objective function is

$$\mathrm{Obj} = \sum_{i=1}^{n} l(y_i, \hat{y}_i) + \sum_{k=1}^{K} \Omega(f_k)$$

In the above expression, $l(y_i, \hat{y}_i)$ is the loss function that measures the difference between the prediction $\hat{y}_i$ and the target $y_i$. To prevent overfitting, the regularization term $\Omega(f_k)$ is added to the equation.
XGBoost is trained in an additive manner, so $\hat{y}_i$ at iteration $t$ can be rewritten as

$$\hat{y}_i^{(t)} = \hat{y}_i^{(t-1)} + f_t(x_i)$$

and the objective function of XGBoost becomes

$$\mathrm{Obj}^{(t)} = \sum_{i=1}^{n} l\!\left(y_i, \hat{y}_i^{(t-1)} + f_t(x_i)\right) + \Omega(f_t)$$

Applying a second-order Taylor expansion to this objective function and removing the higher-order infinitesimal terms gives

$$\mathrm{Obj}^{(t)} \simeq \sum_{i=1}^{n} \left[ l\!\left(y_i, \hat{y}_i^{(t-1)}\right) + g_i f_t(x_i) + \tfrac{1}{2} h_i f_t^2(x_i) \right] + \Omega(f_t)$$

where $g_i = \partial_{\hat{y}^{(t-1)}} l(y_i, \hat{y}^{(t-1)})$ and $h_i = \partial^2_{\hat{y}^{(t-1)}} l(y_i, \hat{y}^{(t-1)})$. The objective function can be simplified by removing the constant term. Grouping samples by the leaf $j$ they fall into, with instance set $I_j$, define $G_j = \sum_{i \in I_j} g_i$ and $H_j = \sum_{i \in I_j} h_i$; bringing $G_j$ and $H_j$ into the objective function yields

$$\mathrm{Obj}^{(t)} = \sum_{j=1}^{T} \left[ G_j w_j + \tfrac{1}{2}\left(H_j + \lambda\right) w_j^2 \right] + \gamma T$$

Taking the derivative with respect to $w_j$, the outcome is $w_j^* = -\dfrac{G_j}{H_j + \lambda}$, and the corresponding optimal value can be calculated as

$$\mathrm{Obj}^{*} = -\frac{1}{2} \sum_{j=1}^{T} \frac{G_j^2}{H_j + \lambda} + \gamma T$$

Data preprocessing
The training and test data sets used in this paper were obtained through a combination of CAE simulation and scientific molding principles. The training set consists of 70 sets of data and the test set of 30 sets. As noted previously, the weight of the injection molded product is primarily influenced by the melt temperature, mold temperature, injection speed, packing pressure, and packing time. These five parameters were used as inputs for the prediction model, with the weight of the product as the output. The training data were randomly shuffled and reorganized to form the current training dataset, and the test set was constructed in the same data format.
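The shuffle-and-split step above can be sketched as follows; the placeholder records and the fixed seed are illustrative assumptions, not the paper's actual data handling:

```python
import random

def split_train_test(rows, train_frac=0.7, seed=42):
    """Shuffle the simulated runs and split them 70/30.

    `rows` is a list of (features, weight) pairs; fixing the seed makes
    the random reorganization reproducible between runs.
    """
    rng = random.Random(seed)
    shuffled = rows[:]          # copy, so the caller's ordering is untouched
    rng.shuffle(shuffled)
    n_train = int(round(len(shuffled) * train_frac))
    return shuffled[:n_train], shuffled[n_train:]

# 100 simulated records (placeholders) -> 70 for training, 30 for testing
data = [((i, i, i, i, i), float(i)) for i in range(100)]
train_set, test_set = split_train_test(data)
print(len(train_set), len(test_set))  # 70 30
```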

Hyperparameter tuning
Ten hyperparameters are commonly used in XGBoost: max_depth, n_estimators, min_child_weight, subsample, colsample_bytree, learning_rate, gamma, alpha, lambda, and eval_metric. The value ranges and meanings of these ten hyperparameters are shown in Table 2. Hyperparameters are crucial for optimizing the performance of machine learning models, and appropriate values must be chosen through careful tuning. While manual tuning is a commonly used method of hyperparameter adjustment, it has several limitations. Firstly, domain-specific knowledge is required to select appropriate hyperparameters, and a lack thereof may degrade model performance; moreover, the subjective nature of manual tuning may lead to suboptimal choices. Secondly, given the enormous hyperparameter search space, trying all possible combinations is computationally expensive. Thirdly, manual tuning relies heavily on empirical adjustment and thus provides no guarantee of finding the globally optimal hyperparameter combination.
To address these limitations of manual tuning, alternative methods have been developed to optimize hyperparameters; two such methods are grid search and cross-validation. Grid search systematically evaluates all hyperparameter combinations within a predefined range, while cross-validation splits the data into subsets for training and validation, allowing the evaluation of model performance. These methods offer a more automated, reliable, and efficient approach to hyperparameter tuning, facilitating the discovery of the globally optimal combination. As a result, grid search and cross-validation have gained widespread adoption in hyperparameter optimization for machine learning models, contributing to more robust and effective models.
To save time and computational resources, cross-validation and grid search are typically carried out on only the most crucial hyperparameters of an algorithm. For XGBoost, the critical hyperparameters include max_depth, n_estimators, and learning_rate. Identifying the optimal values of these hyperparameters enables efficient and cost-effective development of a high-performing model.

Measure metric
To evaluate prediction effectiveness, this paper uses RMSE, MAE, and R-squared ($R^2$) as evaluation indicators, defined as follows:

$$\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}$$

$$\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\left|y_i - \hat{y}_i\right|$$

$$R^2 = 1 - \frac{\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2}$$

where $y_i$ represents the actual value of the product weight, $\hat{y}_i$ the predicted value, $\bar{y}$ the average of the actual values, and $n$ the number of training samples. Lower values of RMSE and MAE indicate better model performance, while an $R^2$ value closer to 1 indicates better model performance.
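The three metrics can be implemented directly from their definitions; a minimal sketch:

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error between actual and predicted values."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

print(rmse([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 0.0
```

A perfect prediction gives RMSE = MAE = 0 and R² = 1; predicting the mean of the targets for every sample gives R² = 0.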

Implementation of grid search
Grid search is a widely used method for hyperparameter optimization, enabling highly accurate predictions by systematically exploring predefined search ranges. With adequate computational resources, this approach facilitates the discovery of optimal hyperparameter combinations for superior model performance, and because each experiment is conducted separately, parallel computation is easily implemented. Nevertheless, an excessive number of model hyperparameters may demand a large amount of computational resources and slow the optimization. Consequently, in practice, grid search is typically recommended for models with no more than three hyperparameters [26]. While alternative hyperparameter search algorithms exist, such as random search, genetic algorithms, and Bayesian optimization, grid search remains the most widely used method owing to its mathematical simplicity [27].
In the XGBoost algorithm, max_depth, learning_rate, and n_estimators are the most significant hyperparameters, with the greatest impact on model performance. To evaluate the effect of different hyperparameters on the model results, they were optimized by grid search. Specifically, with MAE as the evaluation metric, a range is set for max_depth and its optimal value is obtained while the other hyperparameters are kept at their default values; the optimal values of learning_rate and n_estimators are obtained in the same way. The process of optimizing the three hyperparameters is shown in Fig. 4.
From Fig. 4a, it can be seen that the XGBoost model performs best during training when max_depth is 2. As max_depth increases further, the MAE of the model begins to increase, indicating overfitting. In Fig. 4b, the MAE gradually decreases as the learning_rate increases; beyond a certain value, the MAE stabilizes and no longer changes. However, an excessively large learning_rate may destabilize the iterative process and prevent the algorithm from converging to the optimal solution, so the learning_rate is set to 0.05. In Fig. 4c, increasing n_estimators does not change the MAE significantly, so 100 is chosen as its final value. When max_depth, learning_rate, and n_estimators are 2, 0.05, and 100, respectively, the MAE is 0.0615. However, when they are 4, 0.05, and 350, respectively, the MAE reaches its minimum of 0.0193, which indicates that the risk of model overfitting can be reduced when the three hyperparameters are optimized jointly.
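The sequential per-parameter grid search with k-fold cross-validation described above can be sketched generically. The model below is a deliberately trivial stand-in (a scaled-mean predictor with one hypothetical hyperparameter `alpha`), not XGBoost; it only demonstrates the search mechanics:

```python
import statistics

def k_fold_indices(n, k=5):
    """Yield (train_idx, val_idx) pairs for k-fold cross-validation."""
    idx = list(range(n))
    fold = n // k
    for i in range(k):
        val = idx[i * fold:(i + 1) * fold] if i < k - 1 else idx[i * fold:]
        val_set = set(val)
        yield [j for j in idx if j not in val_set], val

def cv_mae(fit, predict, X, y, params, k=5):
    """Average MAE over k folds for one hyperparameter setting."""
    fold_maes = []
    for tr, va in k_fold_indices(len(X), k):
        model = fit([X[i] for i in tr], [y[i] for i in tr], **params)
        fold_maes.append(sum(abs(predict(model, X[i]) - y[i]) for i in va) / len(va))
    return statistics.mean(fold_maes)

def grid_search(fit, predict, X, y, grid):
    """Sweep one hyperparameter at a time (others at their defaults or
    already-found optima), keeping the value with the lowest CV MAE."""
    best = {}
    for name, values in grid.items():
        scores = {v: cv_mae(fit, predict, X, y, {**best, name: v}) for v in values}
        best[name] = min(scores, key=scores.get)
    return best

# Toy stand-in model (NOT XGBoost): predicts alpha * mean(training targets)
fit = lambda X, y, alpha=1.0: alpha * sum(y) / len(y)
predict = lambda model, x: model
X = [[i] for i in range(20)]
y = [10.0 + 0.01 * i for i in range(20)]
best = grid_search(fit, predict, X, y, {"alpha": [0.5, 1.0, 1.5]})
print(best)  # {'alpha': 1.0}
```

Note that this one-parameter-at-a-time sweep is exactly why joint optimization (as in the paper's final 4 / 0.05 / 350 combination) can beat the sequentially found values.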

XGBoost results analysis
In constructing the XGBoost model, the greedy algorithm is used to build each decision tree, employing the "gain" metric to select the optimal splitting point. XGBoost evaluates all feasible combinations of features and thresholds, calculates the gain of each combination, and selects the combination with the highest gain as the optimal splitting point. After selecting an optimal split point, XGBoost splits the current node into two children and recursively repeats the operation on the children until a predefined stopping condition is reached. The gain score is calculated as follows:

$$\mathrm{Gain} = \frac{1}{2}\left[\frac{\left(\sum_{i \in I_L} g_i\right)^2}{\sum_{i \in I_L} h_i + \lambda} + \frac{\left(\sum_{i \in I_R} g_i\right)^2}{\sum_{i \in I_R} h_i + \lambda} - \frac{\left(\sum_{i \in I} g_i\right)^2}{\sum_{i \in I} h_i + \lambda}\right] - \gamma$$

where $g_i$ and $h_i$ represent the first-order gradient and second-order derivative of the $i$th sample, respectively, $I_L$ and $I_R$ are the instance sets of the left and right nodes after the split, $I$ is the set of samples at the node, and $\lambda$ and $\gamma$ are the regularization parameters.
By calculating the gain score, the model can rapidly determine which features have the greatest impact on predictions, eliminating irrelevant features, reducing the dimensionality of the feature space, and improving the efficiency and accuracy of the model. The gain score also helps identify features that may trigger overfitting, leading to better generalization performance. Figure 5 displays the gain scores, which were used to evaluate the importance of the input features. The results demonstrate that packing pressure has the most significant influence on predictive performance, followed by packing time; the contributions of melt temperature, mold temperature, and injection speed are nearly indistinguishable. With the max_depth, learning_rate, and n_estimators parameters of the XGBoost algorithm set to 4, 0.05, and 350, respectively, the grid search-eXtreme Gradient Boosting (GS-XGBoost) model for product weight prediction is obtained. The GS-XGBoost model was tested on the test set, and the results are shown in Fig. 6. The abscissa is the serial number of the data points in the test set, and the ordinate is the product weight; a red polyline with circles represents the actual weight, and a blue polyline with circles the predicted weight. The RMSE, MAE, and R2 values of the GS-XGBoost model are 0.0202, 0.0193, and 0.9826, respectively. The trained GS-XGBoost model therefore has high accuracy in this study.
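The gain formula above reduces to a few arithmetic operations once the per-child gradient sums are known; a minimal sketch (function name and defaults are illustrative):

```python
def split_gain(g_left, h_left, g_right, h_right, lam=1.0, gamma=0.0):
    """Gain of splitting a node into left/right children.

    g_* and h_* are the summed first- and second-order gradients of the
    samples falling into each child; lam (L2 weight regularization) and
    gamma (per-leaf complexity penalty) correspond to lambda and gamma
    in the gain formula.
    """
    def leaf_score(g, h):
        return g * g / (h + lam)
    return 0.5 * (leaf_score(g_left, h_left) + leaf_score(g_right, h_right)
                  - leaf_score(g_left + g_right, h_left + h_right)) - gamma

# A split that separates opposite-signed gradients scores high:
print(split_gain(2.0, 1.0, -2.0, 1.0))  # 2.0
```

A split whose children carry the same gradient sign as the parent yields little or negative gain, which is how the complexity penalty γ prunes uninformative splits.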

Optimization algorithm
The genetic algorithm (GA) was first proposed by Holland in 1974 [28]. It is an optimization algorithm that simulates the natural evolution process, through mechanisms such as genetic inheritance, natural selection, and fitness evaluation, to search for the optimal or a suboptimal solution from an initial population. The basic idea of the genetic algorithm is to regard each solution (individual) as a collection of genes, where the genes represent the parameters or variables of the problem. New solutions are continuously generated by performing crossover, mutation, and selection on these genes. The solutions are then assessed by the fitness function, which determines the individuals that survive and reproduce into subsequent generations. This process is repeated until the specified termination criteria are satisfied. SEGA is a variant of GA. Compared with GA, in each generation SEGA retains the best individuals (called elites) in the population, so that good solutions cannot be discarded during evolution. In addition, evolutionary operations such as crossover and mutation are tuned to the specific problem and objective to improve the search efficiency of the algorithm. The workflow of SEGA is illustrated in Fig. 7.
In this study, a GS-XGBoost model was developed to accurately predict the product weight from the injection molding process parameters. To determine the process parameters corresponding to a desired product weight, the inverse problem was solved using SEGA. This approach maintains population diversity and helps avoid convergence to local optima. Following scientific molding principles, the upper and lower bounds of the decision variables (melt temperature, mold temperature, injection speed, packing time, and packing pressure) were set; the ranges are listed in Table 1. In SEGA, the population size is set to 40, and a maximum number of generations is specified. The objective function is defined as

$$f = -\left| y - \hat{y} \right|$$

where $y$ is the desired product weight set by the user and $\hat{y}$ is the product weight predicted by the GS-XGBoost model from the decision variables. When the objective function reaches its maximum value, the optimal solution is obtained, and the decision variables at that point are the optimal injection molding process parameters. In practice, the weight of a product fluctuates within a specific range owing to changes in injection parameters, and production personnel can set desired weight targets in accordance with product requirements. To demonstrate the effectiveness of the proposed model, the target weight of cavity 1# was varied across three levels: 9.75 g, 9.85 g, and 9.95 g. For comparison, GA and SEGA were each combined with the GS-XGBoost model to optimize the injection molding process parameters and determine the optimal parameters for a target weight. Figures 8 and 9 show the evolutionary curves for GA and SEGA, respectively; the horizontal axis denotes the generation number, and the vertical axis the objective function value. The objective function value increases as the population evolves and finally reaches an optimal solution with a value of 0. Evidently, the optimal value of GA fluctuates during the iteration process, and the set target value is not reached in the end, whereas SEGA reaches the optimal solution quickly and its optimization effect is more significant. The optimal process parameters that meet the target values, as finally recommended by SEGA, are listed in Table 3.
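An elitist GA of this kind can be sketched in a few dozen lines. The surrogate below is a toy linear function standing in for the GS-XGBoost predictor, and all coefficients, bounds, and GA settings other than the population size of 40 are illustrative assumptions, not the paper's values:

```python
import random

def elitist_ga(objective, bounds, pop_size=40, generations=100,
               elite_frac=0.1, mut_rate=0.2, seed=0):
    """Minimal elitist GA (SEGA-style) that maximizes `objective`.

    `bounds` is a list of (low, high) pairs, one per decision variable.
    The top `elite_frac` of each generation is copied unchanged into the
    next, so the best solution found is never lost during evolution.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    n_elite = max(1, int(pop_size * elite_frac))
    for _ in range(generations):
        pop.sort(key=objective, reverse=True)           # best first
        elites = [ind[:] for ind in pop[:n_elite]]      # preserved verbatim
        children = []
        while len(children) < pop_size - n_elite:
            p1, p2 = rng.sample(pop[:pop_size // 2], 2)  # parents from top half
            cut = rng.randrange(1, dim) if dim > 1 else 0
            child = p1[:cut] + p2[cut:]                  # one-point crossover
            for d in range(dim):                         # per-gene mutation
                if rng.random() < mut_rate:
                    lo, hi = bounds[d]
                    child[d] = rng.uniform(lo, hi)
            children.append(child)
        pop = elites + children
    return max(pop, key=objective)

# Toy linear surrogate standing in for the GS-XGBoost weight predictor;
# coefficients and bounds are illustrative, not fitted values.
surrogate = lambda p: 0.02 * p[0] + 0.02 * p[1] + 0.05 * p[3] + 0.5 * p[4]
target = 9.85
objective = lambda p: -abs(target - surrogate(p))   # maximum value is 0
bounds = [(200, 240), (30, 70), (40, 80), (30, 70), (1.0, 3.0)]
best = elitist_ga(objective, bounds)
```

Because elites are copied verbatim, the best objective value is monotonically non-decreasing across generations, which is the behavior contrasted with plain GA in Figs. 8 and 9.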

Results and discussion
To verify the accuracy and effectiveness of the process parameter recommendation system, the parameters recommended by the prediction model were verified through real molding. The deviation of the molded product weight from the target weight is calculated by the following equation:

$$\mathrm{deviation} = \frac{\left| m_m - m_t \right|}{m_t} \times 100\%$$

where $m_t$ represents the target weight of the product and $m_m$ the molded weight. An all-electric injection molding machine offers higher response speed and execution accuracy, which effectively reduces the differences between simulation and real molding caused by machine response delays. Accordingly, a ZHAFIR ZE1200 (Ningbo, China) all-electric injection molding machine, the same model used for "machine analysis" in the Moldex3D simulation, was selected to verify the accuracy of the recommendation system. Figure 10 shows the injection molding machine; detailed machine characteristics can be found in Table S4. All molded products were weighed using a high-precision electronic balance. Throughout production, the mold temperature was controlled by a separate temperature control machine.
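The deviation equation is a one-liner in code; the weights below are illustrative numbers, not the paper's measured values:

```python
def weight_deviation_pct(m_target, m_molded):
    """Percent deviation of the molded weight m_m from the target weight m_t."""
    return abs(m_molded - m_target) / m_target * 100.0

# Illustrative: a 10.0 g target molded at 10.022 g deviates by 0.22%
print(round(weight_deviation_pct(10.0, 10.022), 2))  # 0.22
```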

Plate sample validation
The real injection molding conditions are depicted in Fig. 11. The first five samples were discarded owing to instability at the start of injection, and the weight of cavity 1# was measured for 10 consecutive products under a stable production cycle, as shown in Table 4. The average product weight was chosen as the verification indicator because of potential wear of the non-return valve and response lag of the injection molding machine. Across the validation runs for the various target weights, the maximum deviation observed between the actual molded product weight and the target weight was 0.22%; therefore, the proposed prediction model exhibits an accuracy of at least 99.78%.

Goggles sample validation
To further validate the effectiveness of the recommender system, a goggles product was used. Figure 12 presents the assembly line and the goggles used in the validation process, with polycarbonate (LEXAN 940A, SABIC) as the material. Following the same methodology, a process window was established for this product. The recommender system was then trained and optimized on a dataset generated within the process window, with a target weight of 40.5 g. The process window for the goggles and the process parameters recommended by the system are detailed in Tables 5 and 6, respectively. To assess performance, the weights of consecutive products produced under the recommended process parameters were recorded in Table 7. The average deviation of these products' weights from the desired value was 0.46%. The more intricate structure of the goggles may have contributed to this slightly larger deviation; nonetheless, the discrepancy falls well within the limits accepted by current industrial standards.

Future work
The primary objective of the proposed recommender system is to streamline the process of debugging injection molding process parameters, thus reducing both time and costs prior to actual production.By simply inputting the 3D digital mold information into the recommendation system, customers can easily access optimized process parameters.
It is important to note that further improvements can still be made to enhance the system's capabilities, such as implementing automated procedures for determining process windows, calculating plasticizing parameters, and utilizing image recognition to assess residual stress in simulated products.By effectively utilizing these techniques, the system can recommend process parameters that optimize both optical and surface quality.Ultimately, the recommendation system is poised to become a valuable web service for industrial applications.

Conclusions
The optimization of process parameters is critical to achieving high-quality, cost-effective production. Traditional debugging or Taguchi methods used by engineers can be time-consuming and ineffective in finding optimal process parameters. By employing the GS-XGBoost and SEGA methods, engineers can quickly determine the optimal injection molding process parameters. In this research, a process window was generated through CAE simulation during mold manufacturing, and a prediction model was developed to predict process parameters within the window. The GS-XGBoost model demonstrated high accuracy, with RMSE and R2 values of 0.0202 and 0.9826, respectively, and its predictions were verified through real molding. Using the same type of injection molding machine in actual production, the deviation of the product weight from the desired weight was 0.22%, which means that the prediction model achieves a correct rate of 99.78%. This prediction model provides intelligent and reliable prediction of injection molding process parameters from digital data alone, without the need for physical mold trials, yielding significant benefits in reduced cost and cycle time.

Fig. 2 Development and operation flow of the system for recommending initial process parameters

Fig. 4 Curve of the relationship between the MAE and the XGBoost hyperparameters: a max_depth; b learning rate; c n_estimators

Fig. 6 GS-XGBoost model for product weight prediction

Fig. 7 Workflow diagram of SEGA

Fig. 10 Injection molding machine for validation

Table 5 Injection molding process window for the goggles
Fig. 12 Real molding conditions: a assembly line and b parts

Table 1 Injection molding process window

Table 2 Value ranges and meanings of the XGBoost hyperparameters

Table 3 Injection molding process parameter settings

Table 4 Weight of real molded products, cavity 1#

Table 6 Process parameters suggested by the recommendation system