Manufacturing Rev., Volume 12, 2025
Article Number: 10
Number of pages: 12
DOI: https://doi.org/10.1051/mfreview/2025005
Published online: 31 March 2025
Original Article
Process optimization of SiC/(Mo, W)Si2 nanocomposites preparation based on BOA-SSA
Tingyu Liu*
Department of Inspection, Testing and Certification, Changzhou Vocational Institute of Engineering, Changzhou 213164, PR China
* e-mail: liutingyu_czie@126.com
Received: 12 November 2024
Accepted: 28 February 2025
Silicon carbide/(molybdenum, tungsten) disilicide (SiC/(Mo, W)Si2) nanocomposites have important applications in aerospace, high-temperature structural materials, and other fields, but their preparation process still needs to be optimized. To prepare high-performance SiC/(Mo, W)Si2 nanocomposites, a three-layer backpropagation neural network model is constructed, and its construction process and training structure are analyzed. The sparrow search algorithm is used to optimize the model; because it is prone to getting stuck in local optima, the butterfly optimization algorithm is introduced to improve it and to solve for the optimal preparation process. The results indicate that the error between the predicted and experimental values of the bending strength of the prepared material obtained with the backpropagation neural network model was less than 2%, with a minimum relative error of only 0.15%. Meanwhile, the best process combination obtained by the Butterfly Optimization Algorithm-Sparrow Search Algorithm achieved a bending strength of up to 752 MPa, whereas the traditional Sparrow Search Algorithm achieved only 662 MPa. The proposed optimization method therefore has clear advantages: it effectively improves the bending strength of the prepared SiC/(Mo, W)Si2 nanocomposites and provides an effective approach to multidimensional nonlinear optimization problems.
Key words: SiC/(Mo, W)Si2 / Nanometer material / SSA / BOA / BP neural network
© T. Liu, Published by EDP Sciences 2025
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
1 Introduction
In modern aerospace engineering, materials operating in extreme environments must withstand combined challenges such as high temperature, high pressure, and corrosion. The widely used nickel-based and titanium-based alloys can no longer meet these demands, so new high-performance high-temperature structural materials need to be developed [1,2]. Among candidate materials, molybdenum disilicide (MoSi2) has a very high operating temperature and can work continuously above 1500 °C; its melting point reaches 2030 °C, far exceeding that of traditional metal alloys [3,4]. Related studies have shown that alloying with tungsten disilicide (WSi2) and combining the alloy with nano silicon carbide (SiC) particles yields silicon carbide/(molybdenum, tungsten) disilicide (SiC/(Mo, W)Si2) nanocomposites with excellent mechanical properties [5]. Process parameters such as the hot pressing temperature, the addition amounts of WSi2 and SiC particles, the hot pressing pressure, and the holding time all influence the material properties, and their effects are nonlinear. Conventional statistical regression methods therefore struggle to determine the optimal preparation process, and more advanced strategies are needed. The Sparrow Search Algorithm (SSA), owing to its strong global search capability, can support higher-quality preparation of nanocomposites and is increasingly applied to the solution of artificial neural network models.
The traditional SSA converges slowly and is prone to getting stuck in local optima, and researchers have proposed a series of improvements. Yue et al. designed an optimized SSA to address the original algorithm's susceptibility to local optima and slow convergence, applying it to image processing, fault diagnosis, and power grid load forecasting; in function tests and performance analyses, the method was significantly superior to particle swarm optimization and the monarch butterfly method [6]. Zhou et al. built a hybrid SSA for predicting dissolved oxygen. The method modified the global search process, introduced the Butterfly Optimization Algorithm (BOA) to facilitate information exchange, and used Cauchy mutation and greedy rules to enhance the ability to escape local optima; the hybrid improved SSA outperformed other intelligent methods on 10 benchmark functions [7]. Kathiroli et al. combined the SSA with a differential evolution algorithm to address limited node energy and high data transmission energy consumption in wireless sensor networks. The method improved energy efficiency by selecting optimal cluster heads and showed significant advantages in remaining power, throughput, and number of active nodes [8]. Geng et al. designed a new SSA to address the traditional algorithm's tendency to get stuck in local optima and its poor convergence in feature selection. The algorithm introduced three chaotic mappings to enhance population diversity and used reverse learning to strengthen global search; it outperformed other methods on 18 classic test functions [9].
Driven by advances in computing, artificial neural networks are increasingly applied to optimize the preparation of composite materials. Razavi et al. used artificial neural networks and extreme learning machines to analyze input factors and predict conductivity in order to optimize conductive polymer-based composites. The predictions of both models were consistent with experimental data, and the highest conductivity was obtained with carbon nanotube, expanded graphite, and carbon black contents of 10%, 15%, and 25%, respectively [10]. Gao et al. proposed a Bagging model to predict the AC conductivity of polyimide-based composites; with 5-fold cross validation, its predictions agreed with measured values better and with lower errors than existing methods [11]. Vidakis et al. used a full factorial design combined with artificial neural networks to predict the mechanical response of antibacterial PA12/TiO2 nanocomposites, analyzing the effects of different 3D printing parameters. Increasing the TiO2 loading reduced the mechanical response but improved antibacterial performance, while higher nozzle temperatures and a zero deposition angle optimized mechanical performance [12]. Shabani et al. used a novel ternary CoFe2O4/TiO2/Au catalyst for the photocatalytic degradation of methyl orange to remove organic pollutants from drinking water, with artificial neural networks and genetic algorithms used for model prediction. The catalyst showed good catalytic performance and recyclability under UV irradiation, and the model predicted the removal efficiency with high accuracy [13].
In summary, researchers have designed various artificial neural networks for optimizing the preparation of composite materials, providing reliable modeling tools for material design. However, existing studies have not used Backpropagation (BP) neural networks for this kind of model construction, and traditional genetic algorithms show low efficiency and accuracy when optimizing BP models. This study therefore develops a BP model and combines the BOA with the SSA to optimize it and to solve for the optimal preparation process of SiC/(Mo, W)Si2 nanocomposites, with the aim of obtaining materials with better performance. The innovation of the research lies in using the BOA-SSA composite algorithm to solve the nonlinear optimization model. For nonlinear problems, the BOA-SSA introduces a simulated-annealing-like jump mechanism that helps avoid premature convergence to local optima and accelerates convergence during optimization. At the same time, the BOA enlarges the search space and provides richer initialization information for the SSA, avoiding the slow convergence and local-optimum problems of traditional optimization algorithms such as the genetic algorithm, particle swarm optimization, and simulated annealing. This improvement provides an effective optimization solution for material preparation.
2 Methods and materials
The study first transforms the preparation of SiC/(Mo, W)Si2 nanocomposites into a nonlinear optimization problem and accordingly constructs a BP model with a three-layer structure. The BOA-SSA is then introduced to optimize the model, and its optimization process is described in detail.
2.1 Construction of nonlinear optimization model for BP
In the preparation of SiC/(Mo, W)Si2 nanocomposites, the process parameters associated with the WSi2 and SiC particles act on the material properties in a nonlinear way, and general statistical regression methods struggle to determine the optimal preparation process [14,15]. To handle these nonlinear characteristics, artificial neural networks, which have strong self-learning and adaptive abilities, are introduced: they can infer the performance of the prepared materials without prior knowledge of the raw materials and process parameters. Because a BP network can quickly learn complex nonlinear relationships between inputs and outputs, the study mainly uses a BP network for model construction. This is a general-purpose model suitable for tasks with nonlinear functional relationships between input and output, including material design and optimization, image processing and pattern recognition, and time series prediction. In general, a three-layer BP network can approximate any continuous nonlinear function, so this study uses a three-layer BP network to approximate the functional relationships in the data. The specific procedure is shown in Figure 1.
Figure 1 shows the input matrix used for model training and its corresponding target output matrix, where p and t denote the input and output matrices, respectively. To facilitate learning and convergence, the input and output matrices are normalized, mainly with the premnmx function; the normalized matrices are P and T. The newff function is used to define the structure of the BP network. The hidden layer uses the S-shaped tangent activation function tansig, which introduces nonlinearity and helps the network learn complex input-output mappings, while the output-layer neuron uses the linear activation function purelin, allowing the network to output a continuous value. The training function is trainlm, which converges quickly. The newff function returns a trainable BP network, which is then trained with the train function using a training target of 0.0001 and a maximum of 1000 training epochs. Finally, net, maxP, minP, maxT, and minT are saved for later use. The training structure of the BP network used in the study is shown in Figure 2.
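The construction and training steps described above can be summarized in the following sketch, which assumes the legacy MATLAB Neural Network Toolbox interface (premnmx/newff/train) named in the text; the sample matrices p and t, the hidden-layer size, and the file name bpnet.mat are illustrative placeholders rather than values from the study.

```matlab
% Minimal sketch of the BP construction and training steps described above.
% Assumes the legacy Neural Network Toolbox interface (premnmx/newff/train);
% p, t, the hidden-layer size and the file name are placeholders.
p = rand(4, 10);                         % 4 process parameters x 10 samples (illustrative)
t = rand(1, 10);                         % bending strength targets (illustrative)

[P, minP, maxP, T, minT, maxT] = premnmx(p, t);   % normalize inputs and targets

net = newff(minmax(P), [10, 1], {'tansig', 'purelin'}, 'trainlm');  % 3-layer BP
net.trainParam.goal   = 1e-4;            % training target stated in the text
net.trainParam.epochs = 1000;            % maximum number of training epochs

net = train(net, P, T);                  % Levenberg-Marquardt training

save('bpnet.mat', 'net', 'minP', 'maxP', 'minT', 'maxT');  % kept for later use
```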
In Figure 2, the activation function of the BP model is the sigmoid function, which maps any real-valued input to a value in the [0,1] interval, as expressed in equation (1).
\[ f(z) = \frac{1}{1 + e^{-z}} \qquad (1) \]
In equation (1), z signifies the function's input value. After training, the BP model is reproduced with the sim function to realize the functional relationship y = f(x). The first step of the program is to define the function, as displayed in equation (2).
function y = f(x)    (2)
In equation (2), function is the keyword that defines the function, x represents an individual, and y represents that individual's fitness value. The neural network parameters are then loaded, represented as load file net minP maxP minT maxT. The tramnmx function is used to normalize the input values to the same range used during training, as displayed in equation (3).
xn = tramnmx(x, minP, maxP)    (3)
Next, the sim function is called to forward propagate the normalized input through the trained network net, as shown in equation (4).
an = sim(net, xn)    (4)
In equation (4), an represents the network output. Finally, the postmnmx function is used to denormalize the output, restoring the original fitness range, and the result is assigned to y, as shown in equation (5).
y = postmnmx(an, minT, maxT)    (5)
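Equations (2)–(5) correspond to a single fitness-evaluation function. A hedged sketch is given below; the function name bp_fitness and the file name bpnet.mat are placeholders consistent with the earlier sketch, not identifiers from the study.

```matlab
function y = bp_fitness(x)
% Sketch of the fitness evaluation in equations (2)-(5): normalize an
% individual x, forward-propagate it through the trained network, and
% denormalize the result to obtain the fitness value y.
persistent net minP maxP minT maxT
if isempty(net)
    s = load('bpnet.mat');               % load trained network and scaling bounds
    net = s.net; minP = s.minP; maxP = s.maxP; minT = s.minT; maxT = s.maxT;
end
xn = tramnmx(x(:), minP, maxP);          % equation (3): normalize the input
an = sim(net, xn);                       % equation (4): network forward pass
y  = postmnmx(an, minT, maxT);           % equation (5): denormalize the output
end
```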
Fig. 1 Specific procedures for model construction.
Fig. 2 BP training structure.
2.2 Nonlinear optimization model solution based on BOA-SSA
After the neural network model has been trained, its output needs to be converted into fitness values suitable for algorithm optimization in order to handle the proposed nonlinear optimization problem [16]. To obtain the optimal raw material ratio and hot pressing process, and thereby the best performance of the SiC/(Mo, W)Si2 nanocomposites, the SSA is introduced. In this algorithm, the sparrow population is divided into two categories, discoverers and joiners. Discoverers search for food and guide the joiners, while joiners seek food based on the discoverers' positions. The space occupied by the sparrows can be expressed as equation (6).
\[
X = \begin{bmatrix}
x_{1,1} & x_{1,2} & \cdots & x_{1,d} \\
x_{2,1} & x_{2,2} & \cdots & x_{2,d} \\
\vdots & \vdots & \ddots & \vdots \\
x_{n,1} & x_{n,2} & \cdots & x_{n,d}
\end{bmatrix} \qquad (6)
\]
In equation (6), the m-th row xm represents an individual sparrow, d signifies the number of optimized variables, and n the number of sparrows. The position update of the discoverer is shown in equation (7).
\[
x_{i,j}^{w+1} = \begin{cases}
x_{i,j}^{w} \cdot \exp\!\left(\dfrac{-i}{\alpha_1 \cdot w_{\max}}\right), & R < S_w \\[6pt]
x_{i,j}^{w} + Q_1 \cdot L, & R \ge S_w
\end{cases} \qquad (7)
\]
In equation (7), w signifies the current iteration count, j the dimension, and i the index of the sparrow. $x_{i,j}^{w}$ signifies the position of the discoverer in the w-th iteration, α1 is a random number within [0,1], $w_{\max}$ signifies the maximum number of iterations, Q1 is a random number drawn from a normal distribution, L is a matrix with all elements equal to 1, R represents the warning value, and Sw represents the safety threshold. The updated position of the joiner is shown in equation (8).
\[
x_{i,j}^{w+1} = \begin{cases}
Q_1 \cdot \exp\!\left(\dfrac{x_{m,\mathrm{worst}}^{w} - x_{i,j}^{w}}{i^{2}}\right), & i > \dfrac{n}{2} \\[6pt]
x_{m,\mathrm{best}}^{w+1} + \left| x_{i,j}^{w} - x_{m,\mathrm{best}}^{w+1} \right| \cdot A^{+} \cdot L, & i \le \dfrac{n}{2}
\end{cases} \qquad (8)
\]
In equation (8), $x_{m,\mathrm{worst}}$ and $x_{m,\mathrm{best}}$ represent the current worst and best positions, respectively. A+ is derived from a matrix A whose elements are randomly assigned +1 or −1, as shown in equation (9).
\[ A^{+} = A^{\mathrm{T}} \left( A A^{\mathrm{T}} \right)^{-1} \qquad (9) \]
When faced with a threat, sparrows adjust their positions, as shown in equation (10).
\[
x_{i,j}^{w+1} = \begin{cases}
x_{m,\mathrm{best}}^{w} + Q_2 \cdot \left| x_{i,j}^{w} - x_{m,\mathrm{best}}^{w} \right|, & f_i > f_g \\[6pt]
x_{i,j}^{w} + \alpha_2 \cdot \dfrac{\left| x_{i,j}^{w} - x_{m,\mathrm{worst}}^{w} \right|}{\left( f_i - f_w \right) + \varepsilon}, & f_i = f_g
\end{cases} \qquad (10)
\]
In equation (10), Q2 signifies a random number that follows a normal distribution, α2 is a random number within [0,1], fw is the current global worst fitness, fi is the fitness of the current sparrow, and fg is the current global best fitness. ε is a small constant that prevents the denominator from being zero. When fi > fg, the sparrow is more susceptible to predator attacks; when the two are equal, the sparrow in danger approaches other sparrows to avoid being caught. The foraging behavior in the SSA is displayed in Figure 3.
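The update rules of equations (7)–(10) can be collected into a single iteration step. The following is a hedged sketch of the standard SSA updates using the paper's symbols; the function name ssa_step, the 20% fraction of alarmed sparrows, and the treatment of discoverers as the best-ranked individuals are assumptions, not details from the study.

```matlab
function X = ssa_step(X, fit, wMax, R, Sw, pNum)
% One SSA iteration following equations (7)-(10). X is an n-by-d population,
% fit an n-by-1 fitness vector (lower is better), R the warning value,
% Sw the safety threshold, pNum the number of discoverers.
[n, d] = size(X);
[~, idx] = sort(fit);                         % rank sparrows (best first)
best = X(idx(1), :);  worst = X(idx(end), :);

for k = 1:pNum                                % discoverer update, equation (7)
    i = idx(k);
    if R < Sw
        X(i,:) = X(i,:) .* exp(-i / (rand * wMax));
    else
        X(i,:) = X(i,:) + randn * ones(1, d); % Q1 * L
    end
end

for k = pNum+1:n                              % joiner update, equation (8)
    i = idx(k);
    if k > n/2
        X(i,:) = randn * exp((worst - X(i,:)) / i^2);
    else
        A = sign(rand(1, d) - 0.5);           % elements randomly +1 or -1
        Aplus = A' / (A * A');                % A+ = A'(AA')^{-1}, equation (9)
        X(i,:) = best + (abs(X(i,:) - best) * Aplus) * ones(1, d);
    end
end

nAlarm = round(0.2 * n);                      % sparrows aware of danger (assumed fraction)
sel = randperm(n, nAlarm);
for k = 1:nAlarm                              % anti-predation update, equation (10)
    i = idx(sel(k));
    if fit(i) > fit(idx(1))
        X(i,:) = best + randn * abs(X(i,:) - best);
    else
        X(i,:) = X(i,:) + (2*rand - 1) * abs(X(i,:) - worst) ...
                           / (fit(i) - fit(idx(end)) + eps);
    end
end
end
```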
The conventional SSA has few model parameters and simple principles; it translates a biological behavior into a mathematical model for problem solving [17,18]. During initialization, however, the algorithm randomly generates sparrows with different fitness values and positions, which can yield individuals with low fitness. As iterations proceed, population diversity may decrease, so the search for optimal weights and thresholds can fall into local optima [19]. To address this issue, the BOA is introduced to strengthen global search and reduce the likelihood of local optima. In the BOA, each butterfly has its own olfactory ability and fragrance. The fragrance is determined by the sensory modality, the power exponent, and the stimulus intensity; the sensory modality includes temperature, light, sound, and odor, and the stimulus intensity is correlated with the individual's fitness. The stronger the fragrance an individual produces, the stronger its attraction. The power exponent describes the degree to which the response is linear or compressive. In the BOA, the fragrance produced by an individual is expressed in equation (11).
\[ f = c I^{a} \qquad (11) \]
In equation (11), c represents the sensory modality, a the power exponent, I the stimulus intensity, and f the perceived intensity of fragrance. The value of c ranges from 0 to 1. A larger c enhances the butterfly's ability to perceive fragrance, making it easier to sense the position of the global optimum and strengthening the algorithm's global search; however, an excessively large c makes the algorithm overly sensitive to global information and causes it to ignore local information, reducing search accuracy. The value of c should therefore be chosen reasonably. The algorithm comprises a local search and a global search; the global search is computed as in equation (12).
\[ x_d^{u+1} = x_d^{u} + \left( r^{2} \times g^{*} - x_d^{u} \right) \times f_d \qquad (12) \]
In equation (12), fd signifies the fragrance of the d-th butterfly, $x_d^{u}$ the solution vector of the d-th butterfly at the u-th iteration, g* the best solution among all current solutions, and r a random number in [0,1]. The local search is computed as in equation (13).
\[ x_d^{u+1} = x_d^{u} + \left( r^{2} \times x_{d_1}^{u} - x_{d_2}^{u} \right) \times f_d \qquad (13) \]
In equation (13), $x_{d_1}^{u}$ and $x_{d_2}^{u}$ signify the solution vectors of the d1-th and d2-th butterflies, both drawn from the same population. In addition, the algorithm uses a switching probability Sp to select between the two search modes; Sp controls how frequently butterflies switch between global and local search. A higher Sp encourages more butterflies to perform global search, avoiding premature convergence and maintaining good exploration ability; however, if the algorithm leans too heavily toward global search, it may neglect local refinement and miss potential local optima. A lower Sp favors local search and enables detailed optimization in promising regions, but if Sp is too low the algorithm falls into local optima too early and misses the global optimum. The flowchart of the BOA is shown in Figure 4.
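Equations (11)–(13) and the switching probability can likewise be sketched as a single update step. The function name boa_step is a placeholder, and treating the absolute fitness value as the stimulus intensity I is an assumption of this sketch rather than a detail given in the text.

```matlab
function X = boa_step(X, fit, gBest, c, a, Sp)
% One BOA position update following equations (11)-(13). X is the n-by-d
% butterfly population, fit the fitness vector, gBest the best solution found
% so far; c, a and Sp are the sensory modality, power exponent and switching
% probability discussed in the text.
n = size(X, 1);
f = c * abs(fit).^a;                     % fragrance, equation (11): f = c*I^a
for i = 1:n
    r = rand;
    if rand < Sp                         % global search, equation (12)
        X(i,:) = X(i,:) + (r^2 * gBest - X(i,:)) * f(i);
    else                                 % local search, equation (13)
        j = randi(n);  k = randi(n);     % two butterflies of the same population
        X(i,:) = X(i,:) + (r^2 * X(j,:) - X(k,:)) * f(i);
    end
end
end
```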
The study optimizes the SSA with the BOA to obtain the BOA-SSA, which effectively enhances the global search ability of the SSA and reduces the likelihood of local optima. After the SSA step is completed, the BOA is used to further improve the optimization results; at this stage its main task is to perturb the individuals' positions according to their fitness values in order to explore better solutions. The BOA-SSA model is shown in Figure 5.
In Figure 5, the fitness function of both the SSA and the BOA is chosen as the mean squared error, as shown in equation (14).
\[ \mathrm{MSE} = \frac{1}{O} \sum_{o=1}^{O} \varepsilon_o^{2} \qquad (14) \]
In equation (14), εo signifies the difference between the predicted and actual values, and O the number of samples in the testing set. First, the sparrow positions found by the SSA are used to obtain the initial fitness of the butterfly individuals in the BOA. The BOA then applies its switching probability mechanism to alternate flexibly between global and local search: during global search, butterflies explore a broader solution space to find new positions, while in the local search phase they focus on the area near the current solution. In both stages the butterflies' positions are continuously updated, and whenever an individual generates a new position, its fitness is recalculated. If the new fitness is better than the current one, the individual's position and fitness are updated; otherwise the individual remains in its original position. Through this mechanism the BOA adjusts itself effectively, combining the breadth of global search with the depth of local search and improving the efficiency of finding the global optimum. Finally, the results are fed into the trained BP model to obtain the optimal hot pressing process parameters and raw material ratios.
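The loop described above can be sketched as follows, reusing the hypothetical ssa_step, boa_step, and bp_fitness sketches. Population size, bounds, iteration count, and the BOA power exponent (a = 0.1) are placeholders, while c = 0.1 and Sp = 0.8 follow the parameter study in Section 3.1; minimizing the negated BP prediction so that bending strength is maximized is an assumption about how the components are wired together, since the text states the training fitness as the MSE of equation (14).

```matlab
% Skeleton of the BOA-SSA loop in Figure 5: each iteration runs an SSA update,
% then lets the BOA perturb every individual and keeps a new position only if
% its fitness improves (greedy acceptance). All constants are placeholders.
n = 100; d = 4; wMax = 500;                  % population, variables, iterations
lb = zeros(1, d); ub = ones(1, d);           % normalized parameter bounds
obj = @(x) -bp_fitness(x);                   % maximize predicted bending strength
X = lb + rand(n, d) .* (ub - lb);            % random initial population
fit = arrayfun(@(i) obj(X(i,:)), (1:n)');

for w = 1:wMax
    X = ssa_step(X, fit, wMax, rand, 0.8, round(0.2*n));   % SSA pass
    fit = arrayfun(@(i) obj(X(i,:)), (1:n)');
    [~, b] = min(fit);
    Xnew = boa_step(X, fit, X(b,:), 0.1, 0.1, 0.8);        % BOA perturbation
    for i = 1:n                                            % greedy acceptance
        fNew = obj(Xnew(i,:));
        if fNew < fit(i)
            X(i,:) = Xnew(i,:);  fit(i) = fNew;
        end
    end
end
[~, b] = min(fit);
bestProcess = X(b,:);                        % optimal (normalized) parameters
```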
Fig. 3 Sparrow foraging behavior graph in SSA.
Fig. 4 The flowchart of BOA.
Fig. 5 Flowchart of BOA-SSA model.
3 Results
The study first verifies the performance impact of parameters on the BOA and determines the algorithm parameters. Then, standard test functions are used to prove the performance advantages of BOA-SSA, including convergence performance, optimization accuracy, and stability. Subsequently, the prediction accuracy of the BP model for material preparation under different experimental conditions is validated, and the superiority of the material properties obtained by the BOA-SSA is verified.
3.1 Performance analysis of BOA-SSA
In the BOA, the sensory modality parameter c and the switching probability Sp directly affect performance, so the study first explores the impact of different parameter values on the BOA. A standard benchmark test function is used, and different values of c and Sp are selected for simulation tests of 30 runs each. Table 1 lists the experimental environment.
The simulation results for different parameter values are shown in Figure 6. As shown in Figure 6a, the solving speed of the BOA improved significantly as c increased; when c was 0.3, the algorithm approached 0 after only 8 iterations. A larger c helps the algorithm explore the solution space but reduces accuracy, so after comprehensive consideration the study sets c to 0.1. As shown in Figure 6b, the solving speed of the BOA increased as the switching probability Sp decreased; when Sp was 0.6, convergence required only 24 iterations. However, too small a switching probability can cause local optima, so Sp is set to 0.8.
To verify the effectiveness of the BOA-SSA, standard test functions are used to evaluate its performance against three comparison algorithms: the Firefly Algorithm (FA), the standard SSA, and an Improved Sparrow Search Algorithm (ISSA) that incorporates Levy flight and sine-cosine strategies. The population size of all four algorithms is 100 and the maximum number of iterations is 500. Four standard test functions from CEC2017 are selected to verify the optimization capability of the BOA-SSA; their formulas and related information are given in Table 2. The first two functions are high-dimensional unimodal functions, and the last two are high-dimensional multimodal functions.
The study conducts 40 independent simulation tests of the four algorithms on the four standard test functions, recording the standard deviation, mean, optimal value, and worst value, as presented in Table 3. The BOA-SSA obtained the optimal solution of each function with a standard deviation of 0, showing good stability and significantly better convergence accuracy than the other methods; the FA had the lowest convergence accuracy and poor stability.
The convergence behavior of the algorithms on the two high-dimensional unimodal functions is further investigated, with the results shown in Figure 7. From Figure 7a, on the F1 function the BOA-SSA converged faster: by around 310 iterations the optimal value of the objective function had fallen below $10^{-300}$. From Figure 7b, on the F2 function the optimal value found by the BOA-SSA fell below $10^{-300}$ after about 430 iterations, whereas the convergence curves of the other algorithms changed little over 500 iterations.
The convergence behavior on the two high-dimensional multimodal functions is shown in Figure 8. From Figure 8a, on the F3 function the optimal value of the objective function approached 0 after about 80 BOA-SSA iterations; from Figure 8b, on the F4 function it approached 0 after about 150 iterations. Compared with the other algorithms, the BOA-SSA converged significantly faster.
Table 1 Experimental environment setup.
Fig. 6 Simulation results with different parameter sizes.
Table 2 Formulas and related information for each test function.
Table 3 Independent simulation test results.
Fig. 7 The convergence performance of high-dimensional unimodal functions.
Fig. 8 The convergence performance of high-dimensional multimodal functions.
3.2 The practical application effects of BOA-SSA and BP
To verify the practical effect of the BOA-SSA and the BP model in the preparation of SiC/(Mo, W)Si2 nanocomposites, a BP model was first constructed from the experimental materials. The raw materials are (Mo, W)Si2 composite powders with different tungsten contents and polycarbosilane (PCS). The PCS is dissolved in the organic solvent n-hexane and, to ensure sufficient mixing with the metal silicides, the dissolved PCS is uniformly blended with the (Mo, W)Si2 powder in a set proportion. After processing, the mixture is loaded into a mold and hot-press sintered under argon protection; once sintering is complete, the mold is removed to obtain the final composite. Finally, the three-point bending method is applied to determine the bending strength of the samples: the sample size is 3 × 4 × 36 mm, the span between the two support points is 30 mm, and the loading rate is 0.5 mm/min. The experimental results were used for neural network modeling, with the preparation process parameters of each test sample as the BP inputs and the bending strength as the output. Because the parameters modeled by the BP network are coupled with the BOA-SSA optimization objective of maximizing bending strength, the experimental design is consistent with the optimization goal, and the measured flexural strengths provide a quantitative basis for it. In training the BP model, 10 samples were used as training data and 3 samples as testing data; the network converged to 0.0001 after 9 training epochs. The convergence process is shown in Figure 9.
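For reference, the flexural strength in a three-point bend test with the stated geometry is conventionally computed from the failure load F as below; assigning the 4 mm dimension to the specimen width b and the 3 mm dimension to the thickness h is an assumption, since the text gives only the overall sample size.

\[
\sigma_f = \frac{3 F L_s}{2 b h^{2}}, \qquad L_s = 30\ \text{mm},\quad b = 4\ \text{mm},\quad h = 3\ \text{mm}
\]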
The study selects the holding time, hot pressing temperature, nano SiC particle content, and WSi2 content used in the preparation of SiC/(Mo, W)Si2 nanocomposites as variables, and investigates the error between the predicted and experimental values of the BP model under different combinations. A total of 10 samples are collected; the experimental conditions for each sample are listed in Table 4.
The comparison and relative error between the predicted and actual values of each sample by the BP are shown in Figure 10. As shown in Figure 10, the BP model had a minimum error of only 0.15% and a maximum error of only 1.91% between the predicted and experimental values for the bending strength of the prepared material. This indicates that the model can effectively predict the material preparation effect under different experimental conditions.
The study then inputs the optimal combinations of holding time, hot pressing temperature, nano SiC particle content, and WSi2 content obtained by the various algorithms into the BP neural network and compares the output bending strengths; the results are shown in Table 5. According to Table 5, the optimal condition combination obtained by the BOA-SSA algorithm is a holding time of 55 min, a hot pressing temperature of 1704 °C, a nano SiC particle content of 17.2%, and a WSi2-to-MoSi2 content ratio of 0.48. Feeding this combination into the BP network yields a bending strength of 752 MPa, whereas the optimal combinations obtained by the SSA and ISSA algorithms reach only 662 MPa and 701 MPa, respectively, significantly lower than the BOA-SSA result.
Finally, the optimal combination obtained by the BOA-SSA algorithm was tested experimentally, with the three-point bending method used to determine the bending strength; the results are shown in Figure 11. In the actual test, with a holding time of 55 min, a hot pressing temperature of 1704 °C, a nano SiC particle content of 17.2%, and a WSi2-to-MoSi2 content ratio of 0.48, the bending strength of the SiC/(Mo, W)Si2 nanocomposite is 749 MPa, a relative error of only 0.4% from the BP prediction. The preparation scheme obtained by the BOA-SSA algorithm therefore achieves better material properties.
Fig. 9 Model training convergence process.
Table 4 Experimental conditions for each sample.
Fig. 10 Comparison between model prediction results and actual values.
Table 5 The optimal bending strength obtained through different algorithms.
Fig. 11 Actual measured values and model predicted values.
4 Discussion
To optimize the preparation process of SiC/(Mo, W)Si2 nanocomposites, a BP model was constructed and optimized with the BOA-SSA to improve material properties. In the tests on two high-dimensional unimodal functions, the BOA-SSA converged faster than the other methods: on the F1 function the optimal objective value fell below $10^{-300}$ after about 310 iterations, and on the F2 function after about 430 iterations, whereas the convergence curves of the remaining algorithms changed little over 500 iterations. In the tests on two high-dimensional multimodal functions, the optimal value approached 0 after only about 80 iterations on the F3 function and about 150 iterations on the F4 function. Compared with the other algorithms, the BOA-SSA thus converged significantly faster, because it strengthens both global exploration and local exploitation during the search, accelerating convergence and improving solution accuracy. In related work, an improved optimization technique combining the BOA and SSA was proposed for fault diagnosis of charging piles, optimizing the weights and thresholds of a BP network; compared with the traditional BP model, it improved the diagnostic accuracy by 14.85% [20]. In the optimal condition combination obtained by the BOA-SSA, the holding time is 55 min, the hot pressing temperature is 1704 °C, the nano SiC particle content is 17.2%, and the WSi2-to-MoSi2 content ratio is 0.48; feeding this combination into the BP network yields a bending strength of 752 MPa, higher than the results of the SSA and ISSA algorithms. The reason is that the BOA-SSA combines the advantages of the BOA and SSA, giving stronger global search capability and faster convergence. The performance of a BP network depends heavily on its parameter settings, and the BOA-SSA optimizes these parameters more effectively, so the BP network predicts bending strength more accurately. Similarly, Zhu et al. proposed an adaptive SSA for model parameter optimization and showed that, compared with the traditional SSA, it achieved smaller errors and higher efficiency [21]. In summary, the BOA-SSA has significant advantages in optimizing BP network parameters and can effectively improve prediction accuracy and efficiency.
5 Conclusion
A BP neural network model was developed for optimizing the preparation process of SiC/(Mo, W)Si2 nanocomposites, and the BOA-SSA algorithm was used to optimize the model. The results show that the solving speed of the BOA improved significantly as c increased and as the switching probability Sp decreased; however, improper settings of these two parameters cause local optima or reduce solution accuracy, so the study set c to 0.1 and Sp to 0.8. Comparing the BP model's predictions with the actual values shows a minimum error of only 0.15% and a maximum error of only 1.91% in flexural strength. In the actual test, with a holding time of 55 min, a hot pressing temperature of 1704 °C, a nano SiC particle content of 17.2%, and a WSi2-to-MoSi2 content ratio of 0.48, the bending strength of the SiC/(Mo, W)Si2 nanocomposite is 749 MPa, a relative error of only 0.4% from the BP prediction. The BOA-SSA can therefore effectively solve the BP model and optimize the preparation process of SiC/(Mo, W)Si2 nanocomposites. At present, the BOA-SSA performs well under the specific parameter settings used here, but its adaptability and generalization under different material systems or process conditions remain important research directions. Future work should examine how the BOA-SSA adjusts to different process conditions, objective functions, optimization targets, material-system diversity, and algorithm parameter settings. In particular, differences in raw material ratios and compositions between SiC/(Mo, W)Si2 nanocomposites and other metal-matrix or ceramic-matrix composites may affect the convergence speed and accuracy of the BOA-SSA, and comparative experiments will be needed to evaluate its generalization performance.
Funding
This research received no external funding.
Conflicts of interest
The author declares no conflict of interest.
Data availability statement
Data are provided within the manuscript.
Author contribution statement
T.L. conducted the experiments, recorded the data, analyzed the results, and wrote the manuscript. T.L. is the sole author and contributed fully to this study, with no conflicts of interest.
References
- S. Fooladpanjeh, A. Dadrasi, A.A. Gharahbagh, V. Parvaneh, Fuzzy neural network and coupled gene expression programming/multivariate non-linear regression approach on mechanical features of hydroxyapatite/graphene oxide/epoxy: empirical and optimization study, J. Mech. Eng. Sci. 235 (2021) 7169–7179
- I. Najjar, A.M. Sadoun, A. Ibrahim, H. Ahmadian, A. Fathy, A modified artificial neural network to predict the tribological properties of Al-SiC nanocomposites fabricated by accumulative roll bonding process, J. Compos. Mater. 57 (2023) 3433–3445
- N.X. Ho, T.T. Le, M.V. Le, Development of artificial intelligence based model for the prediction of Young's modulus of polymer/carbon-nanotubes composites, Mech. Adv. Mater. Struct. 29 (2022) 5965–5978
- P.K. Kharwar, R.K. Verma, A. Singh, Neural network modeling and combined compromise solution (CoCoSo) method for optimization of drilling performances in polymer nanocomposites, J. Thermoplastic Compos. Mater. 35 (2022) 1604–1631
- M. Wang, W. Wang, S. Feng, L. Li, Adaptive multi-class segmentation model of aggregate image based on improved sparrow search algorithm, KSII Trans. Internet Inf. Syst. 17 (2023) 391–411
- Y. Yue, L. Cao, D. Lu, Z. Hu, M. Xu, S. Wang, H. Ding, Review and empirical analysis of sparrow search algorithm, Artif. Intell. Rev. 56 (2023) 10867–10919
- X. Zhou, J. Wang, H. Zhang, Q. Duan, Application of a hybrid improved sparrow search algorithm for the prediction and control of dissolved oxygen in the aquaculture industry, Appl. Intell. 53 (2023) 8482–8502
- P. Kathiroli, K. Selvadurai, Energy efficient cluster head selection using improved Sparrow Search Algorithm in wireless sensor networks, J. King Saud Univ. Comput. Inf. Sci. 34 (2022) 8564–8575
- J. Geng, X. Sun, H. Wang, X. Bu, D. Liu, F. Li, Z. Zhao, A modified adaptive sparrow search algorithm based on chaotic reverse learning and spiral search for global optimization, Neural Comput. Appl. 35 (2023) 24603–24620
- S.M. Razavi, A. Sadollah, A.K. Al-Shamiri, Prediction and optimization of electrical conductivity for polymer-based composites using design of experiment and artificial neural networks, Neural Comput. Appl. 34 (2022) 7653–7671
- S. Gao, X. Liu, X. Liu, D. Chen, H. Guo, J. Yin, Predicting the AC conductivity of nanocomposite films using the bagging model, Polym. Sci. Ser. A 64 (2022) 662–672
- N. Vidakis, M. Petousis, N. Mountakis, E. Maravelakis, S. Zaoutsos, J. Kechagias, Mechanical response assessment of antibacterial PA12/TiO2 3D printed parts: parameters optimization through artificial neural networks modeling, Int. J. Adv. Manufactur. Technol. 121 (2022) 785–803
- A. Shabani, G. Nabiyouni, D. Ghanbari, Preparation and photocatalytic study of CoFe2O4/TiO2/Au nanocomposites and their applications in organic pollutant degradation and modeling by an artificial neural network (ANN), J. Mater. Sci.: Mater. Electr. 33 (2022) 9885–9904
- Y. Elmoghazy, E.M.O. Abuelgasim, S.A. Osman, Y.R. Afaneh, O.M. Eissa, B. Safaei, Effective mechanical properties evaluation of unidirectional and bidirectional composites using virtual domain approach at microscale, Arch. Adv. Eng. Sci. 1 (2023) 27–37
- Z. Zhang, Z. Jiao, R. Shen, P. Song, Q. Wang, Accelerated design of flame retardant polymeric nanocomposites via machine learning prediction, ACS Appl. Eng. Mater. 1 (2022) 596–605
- A.G. Gad, K.M. Sallam, R.K. Chakrabortty, M.J. Ryan, A.A. Abohany, An improved binary sparrow search algorithm for feature selection in data classification, Neural Comput. Appl. 34 (2022) 15705–15752
- I. Najjar, A. Sadoun, A. Fathy, On the understanding and prediction of tribological properties of Al-TiO2 nanocomposites using artificial neural network, J. Compos. Mater. 57 (2023) 2325–2337
- L. Sun, S. Si, W. Ding, J. Xu, Y. Zhang, BSSFS: binary sparrow search algorithm for feature selection, Int. J. Mach. Learn. Cybern. 14 (2023) 2633–2657
- X. Yuan, J.S. Pan, A.Q. Tian, S.C. Chu, Binary sparrow search algorithm for feature selection, J. Internet Technol. 24 (2023) 217–232
- A. Naik, S.C. Satapathy, A comparative study of social group optimization with a few recent optimization algorithms, Complex Intell. Syst. 7 (2021) 249–295
- Y. Zhu, N. Yousefi, Optimal parameter identification of PEMFC stacks using adaptive sparrow search algorithm, Int. J. Hydrogen Energy 46 (2021) 9541–9552
Cite this article as: Tingyu Liu, Process optimization of SiC/(Mo, W)Si2 nanocomposites preparation based on BOA-SSA, Manufacturing Rev. 12, 10 (2025), https://doi.org/10.1051/mfreview/2025005