Supplier selection in supply chain management (SCM) has attracted considerable research interest in recent years. Recent literature shows that neural networks achieve better performance than traditional statistical methods. However, neural networks have inherent drawbacks, such as convergence to local optima, poor generalization, and uncontrolled convergence behavior. A relatively new machine learning technique, the support vector machine (SVM), which overcomes these drawbacks, is introduced to provide a model with better explanatory power for selecting ideal supplier partners. Moreover, in practice, supplier samples are scarce, and SVMs are well suited to training and testing on small samples. The prediction accuracies of the BPNN and SVM methods are compared for choosing appropriate suppliers. Practical examples illustrate that the SVM methods are superior to the BPNN.
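To make the comparison concrete, the minimal sketch below trains an SVM and a small back-propagation-style network on a synthetic supplier dataset and compares their test accuracies; the feature names, data, and scikit-learn calls are illustrative assumptions rather than the paper's actual data or implementation.

```python
# Hypothetical sketch: comparing an SVM and a BPNN-style classifier on a small,
# synthetic supplier-selection dataset (stand-in for the paper's examples).
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Assumed supplier features: price index, on-time delivery rate, defect rate, capacity.
X = rng.normal(size=(60, 4))                                    # small sample, as in practice
y = (X[:, 1] - X[:, 2] + 0.3 * X[:, 3] > 0).astype(int)         # 1 = acceptable supplier

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

svm = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train, y_train)
bpnn = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X_train, y_train)

print("SVM accuracy :", svm.score(X_test, y_test))
print("BPNN accuracy:", bpnn.score(X_test, y_test))
```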
The classical job shop scheduling problem (JSP) is the most popular machine scheduling model in practice and is known to be NP-hard. The formulation of the JSP is based on the assumption that for each part type or job there is only one process plan, which prescribes the sequence of operations and the machine on which each operation has to be performed. The JSP with alternative machines for various operations is an extension of the classical JSP that allows an operation to be processed by any machine from a given set. Since this problem requires an additional machine-allocation decision during scheduling, it is much more complex than the JSP. We present a domain-independent genetic algorithm (GA) approach for the job shop scheduling problem with alternative machines. The GA is implemented in a spreadsheet environment. The performance of the proposed GA is analyzed by comparison on various problem instances taken from the literature. The results show that the proposed GA is competitive with existing approaches. A simplified approach that would benefit both practitioners and researchers is presented for solving scheduling problems with alternative machines.
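To illustrate the kind of encoding such a GA can use, the sketch below decodes one hypothetical chromosome for a toy job shop with alternative machines: an operation sequence fixes the dispatch order, and a machine-assignment map picks one machine from each operation's alternative set. The instance and representation are assumptions for illustration, not the paper's spreadsheet implementation.

```python
# Hypothetical decoder for a GA chromosome in a job shop with alternative machines.
# Each operation is described by a {machine: processing_time} map of alternatives;
# the chromosome holds a dispatch order over job ids plus one machine choice per operation.

# Toy instance: 2 jobs x 2 operations, each operation with alternative machines.
jobs = [
    [{0: 3, 1: 4}, {1: 2, 2: 5}],   # job 0: op0, op1
    [{0: 2, 2: 3}, {0: 4, 1: 3}],   # job 1: op0, op1
]

def makespan(op_sequence, machine_choice):
    """Decode a chromosome into a semi-active schedule and return its makespan.

    op_sequence    -- job ids, each repeated once per operation (dispatch order)
    machine_choice -- machine_choice[(job, op)] = machine chosen for that operation
    """
    next_op = [0] * len(jobs)        # next unscheduled operation of each job
    job_ready = [0] * len(jobs)      # completion time of each job's last operation
    machine_ready = {}               # completion time of each machine's last operation
    for job in op_sequence:
        op = next_op[job]
        machine = machine_choice[(job, op)]
        duration = jobs[job][op][machine]
        start = max(job_ready[job], machine_ready.get(machine, 0))
        finish = start + duration
        job_ready[job] = finish
        machine_ready[machine] = finish
        next_op[job] += 1
    return max(job_ready)

# One example chromosome (a GA would evolve many of these and minimize makespan).
sequence = [0, 1, 0, 1]
choice = {(0, 0): 0, (0, 1): 1, (1, 0): 2, (1, 1): 1}
print("makespan:", makespan(sequence, choice))
```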
Robustly stable multi-step-ahead model predictive control (MPC) based on parallel support vector machines (SVMs) with a linear kernel is proposed. First, an analytical solution for the optimal control laws of the parallel-SVM-based MPC is derived; then the necessary and sufficient stability condition for the MPC closed loop is given according to the SVM model; finally, a method for judging the discrepancy between the SVM model and the actual plant is presented, and constraint sets that keep the stability condition robust to model/plant mismatch within given bounds are obtained by applying the small-gain theorem. Simulation experiments show that the proposed stability condition and robust constraint sets provide a convenient way of adjusting controller parameters to ensure a closed loop with a larger stability margin.
Energy consumption of block-cutting machines represents a major cost item in the processing of travertines and other natural stones. Therefore, determining the optimum sawing conditions for a particular stone is of major importance in the natural stone-processing industry. An experimental study was carried out using a fully instrumented block-cutter to investigate the sawing performance of five different types of travertine blocks during cutting with a circular diamond saw. The sawing tests were performed in the down-cutting mode. Performance was assessed by measuring the cutting speed and energy consumption, from which the specific energy was determined. The main cutting parameter, cutting speed, was varied in the investigation of optimum cutting performance. Furthermore, some physico-mechanical properties of the travertine blocks were determined in the laboratory. The results show that the energy consumption (specific energy) of block-cutting machines is highly affected by cutting speed: the specific energy usually decreases as the cutting speed increases. When the cutting speed exceeds a certain value, however, the diamond saw can become stuck in the travertine block, which is a problem for the block-cutting machine. The optimum cutting speed obtained for the travertine mines examined is approximately 1.5-2.0 m/min.
This paper considers the uniform parallel machine scheduling problem with unequal release dates and delivery times, with the objective of minimizing the maximum completion time. For this NP-hard problem, a rule that gives priority to the job with the largest sum of release date, processing time, and delivery time is designed to assign each job to a machine, and a rule that gives priority to the largest difference between delivery time and release date is designed to sequence the jobs assigned to the same machine; a novel algorithm for the scheduling problem is then built from these rules. To evaluate the performance of the proposed algorithm, a lower bound for the problem is proposed. The accuracy of the proposed algorithm is tested on data with problem sizes varying from 200 to 600 jobs. The computational results indicate that the average relative error between the proposed algorithm and the lower bound is only 0.667%, so the solutions obtained by the proposed algorithm are very accurate.
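As a rough illustration of the two priority rules, the sketch below applies them to a toy instance; the interpretation of the assignment step (placing each job on the machine that currently finishes earliest, scaled by machine speed) and all job data are assumptions, not the paper's exact algorithm.

```python
# Hypothetical sketch of the two priority rules on a toy uniform-parallel-machine instance.

jobs = {  # job id: (release date r, processing time p, delivery time q)
    "J1": (0, 5, 7), "J2": (2, 3, 9), "J3": (1, 6, 2), "J4": (4, 2, 5),
}
speeds = {"M1": 1.0, "M2": 1.5}   # uniform machines: same work, different speeds

# Rule 1: consider jobs in decreasing order of r + p + q and place each on the
# machine that currently finishes earliest (accumulated work divided by speed).
load = {m: 0.0 for m in speeds}
assignment = {m: [] for m in speeds}
for job, (r, p, q) in sorted(jobs.items(), key=lambda kv: -sum(kv[1])):
    m = min(speeds, key=lambda m: load[m] / speeds[m])
    assignment[m].append(job)
    load[m] += p

# Rule 2: on each machine, sequence jobs by decreasing (delivery time - release date).
for m, assigned in assignment.items():
    assigned.sort(key=lambda j: -(jobs[j][2] - jobs[j][0]))
    print(m, assigned)
```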
The origin and influence factors of sand liquefaction were analyzed, and the relation between liquefaction and its influence factors was established. A model based on support vector machines (SVM) was built whose input parameters were selected from the influence factors of sand liquefaction: earthquake magnitude (M), SPT value, effective overburden pressure, clay content, and average grain diameter. Sand was divided into two classes, liquefaction and non-liquefaction, and the class label was treated as the output parameter of the model. The model was then used to classify sand samples; 20 support vectors and 17 borderline support vectors were obtained. After the parameters were optimized, 14 support vectors and 6 borderline support vectors were obtained, and the prediction precision reached 100%. To verify the generalization of the SVM method, practical sample data from two other cities, Tangshan in Hebei province and Sanshui in Guangdong province, were processed with a more intricate multi-class model, which also used influence factors of sand liquefaction as input parameters and divided sand into four liquefaction grades as outputs: serious liquefaction, medium liquefaction, slight liquefaction, and non-liquefaction. The simulation results show that the latter model has very high precision and that using an SVM model to estimate sand liquefaction is completely feasible.
The performance of cutting machines in terms of energy consumption and vibration directly affects production costs. In this work, our aim was to evaluate the performance of cutting machines using hybrid intelligent models. For this purpose, a systematic experimental program was carried out and a database of carbonate and granite rocks was established, in which the physical and mechanical properties of the rocks (i.e., UCS, elastic modulus, Mohs hardness, and Schmiazek abrasivity factor) and the operational parameters (i.e., depth of cut and feed rate) were taken as the input parameters. The predictive models were developed by combining multi-layer perceptron artificial neural networks with a genetic algorithm (GANN-BP) and support vector regression with the cuckoo optimization algorithm (COA-SVR). The results indicated that the performance of the developed GANN-BP and COA-SVR models was close to each other and that both models agreed well with the measured values. These results also showed that the proposed models are suitable tools for evaluating the performance of cutting machines.
The gears of new energy vehicles are required to withstand higher rotational speeds and greater loads, which places higher precision requirements on gear manufacturing. However, machining process parameters can cause changes in cutting force and heat, which in turn affect gear machining precision. Therefore, this paper studies the effect of different process parameters on gear machining precision. A multi-objective optimization model is established for the relationship between the process parameters of the worm wheel gear grinding machine (cutting speed, feed rate, and cutting depth) and the tooth surface, tooth profile, and tooth lead deviations. The response surface method (RSM) is used for experimental design, and the corresponding experimental results and optimal process parameters are obtained. Subsequently, grey relational analysis-principal component analysis (GRA-PCA), particle swarm optimization (PSO), and genetic algorithm-particle swarm optimization (GA-PSO) are used to analyze the experimental results and obtain different sets of optimal process parameters. The results show that the optimal process parameters obtained by the GRA-PCA, PSO, and GA-PSO methods all improve gear machining precision, and that the precision obtained with GA-PSO is superior to the other methods.
Machine picking of cotton is an emerging practice in India, intended to address labour shortages and rising production costs. Cotton production has been declining in recent years; the high-density planting system (HDPS) offers a viable way to enhance productivity by increasing plant population per unit area, optimizing resource utilization, and facilitating machine picking. Cotton is an indeterminate plant that produces excessive vegetative growth under favourable soil fertility and moisture conditions, which poses challenges for efficient machine picking. To address this issue, the application of plant growth retardants (PGRs) is essential for controlling canopy architecture. PGRs reduce internode elongation, promote regulated branching, and increase plant compactness, making cotton plants better suited to machine picking. PGR application also optimizes the distribution of photosynthates between vegetative and reproductive growth, resulting in higher yields and improved fibre quality. Integrating HDPS with PGR application produces a plant architecture well suited to efficient machine picking. However, the success of this integration depends on several factors, including cotton variety, environmental conditions, and geographical variation. These approaches not only address yield stagnation and labour shortages but also help to establish more effective and sustainable cotton farming practices, resulting in higher cotton productivity.
The graded density impactor (GDI) dynamic loading technique is crucial for acquiring the dynamic physical property parameters of materials used in weapons. The accuracy and timeliness of GDI structural design are key to achieving controllable stress-strain-rate loading. In this study, we have, for the first time, combined one-dimensional fluid-dynamics computation software with machine learning methods. We first elucidated the mechanisms by which GDI structures control stress and strain rates. Subsequently, we constructed a machine learning model to create a structure-property response surface. The results show that altering the loading velocity and interlayer thickness has a pronounced regulatory effect on stress and strain rates. In contrast, the impedance distribution index and target thickness have less significant effects on stress regulation, although there is a matching relationship between target thickness and interlayer thickness. Compared with traditional design methods, the machine learning approach offers a 10^4-10^5-fold increase in efficiency and the potential to reach a global optimum, holding promise for guiding the design of GDIs.
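The following minimal sketch shows what a structure-property response surface of the kind described above might look like in code, using a Gaussian-process surrogate on synthetic samples; the design variables, ranges, and surrogate choice are assumptions, not the study's actual model.

```python
# Hypothetical sketch: a surrogate "response surface" mapping GDI design variables
# (loading velocity, interlayer thickness, impedance index, target thickness) to a
# peak-stress output, built with Gaussian-process regression on synthetic samples.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(6)
X = rng.uniform(size=(80, 4)) * [5.0, 3.0, 1.0, 2.0]      # assumed design ranges
stress = 12.0 * X[:, 0] + 4.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.3, 80)  # stand-in response

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True).fit(X, stress)
candidate = np.array([[3.2, 1.5, 0.4, 1.0]])              # one trial design point
mean, std = gp.predict(candidate, return_std=True)
print(f"predicted peak stress: {mean[0]:.1f} +/- {std[0]:.1f} (arbitrary units)")
```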
This paper focuses on applications of support vector machines to the problem of blind recovery in digital communication systems. We briefly introduce the support vector machine technique and the development of blind equalization, and analyze the problems that remain to be resolved in blind equalization. The applicability of support vector machines to the blind problem is then highlighted and derived. Finally, the merits and shortcomings of existing blind equalization methods based on support vector machines are discussed, and directions for further research are indicated.
3-Nitro-1,2,4-triazol-5-one (NTO) is a typical high-energy, low-sensitivity explosive, and accurate concentration monitoring is critical for crystallization process control. In this study, a high-precision quantitative analytical model for NTO concentration in ethanol solutions was developed by integrating real-time ATR-FTIR spectroscopy with chemometric and machine learning techniques. Dynamic spectral data were obtained from multi-concentration gradient heating-cooling cycle experiments, abnormal samples were eliminated using the isolation forest algorithm, and the effects of various preprocessing methods on model performance were systematically evaluated. The results show that partial least squares regression (PLSR) exhibits superior generalization ability compared with other models. Vibrational bands corresponding to C=O and -NO2 were identified as key predictors for concentration estimation. This work provides an efficient and reliable solution for real-time concentration monitoring during NTO crystallization and holds significant potential for process analytical applications in energetic material manufacturing.
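A minimal sketch of such a pipeline, assuming synthetic spectra and scikit-learn implementations of the isolation forest and PLSR, is shown below; the spectral dimensions and concentration range are illustrative only.

```python
# Hypothetical sketch: outlier screening with an isolation forest followed by a
# PLSR calibration, mimicking the pipeline described above on synthetic spectra.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_samples, n_wavenumbers = 200, 300
concentration = rng.uniform(0.5, 5.0, n_samples)                    # assumed concentration scale
spectra = np.outer(concentration, rng.normal(size=n_wavenumbers))   # linear Beer-Lambert-like signal
spectra += 0.05 * rng.normal(size=spectra.shape)                    # measurement noise

# Step 1: drop abnormal spectra flagged by the isolation forest.
mask = IsolationForest(contamination=0.05, random_state=1).fit_predict(spectra) == 1
X, y = spectra[mask], concentration[mask]

# Step 2: fit and evaluate a PLSR model.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
print("R^2 on held-out spectra:", pls.score(X_te, y_te))
```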
Background: The geo-traceability of cotton is crucial for ensuring the quality and integrity of cotton brands. However, effective methods for achieving this traceability are currently lacking. This study investigates the potential of explainable machine learning for the geo-traceability of raw cotton. Results: The findings indicate that principal component analysis (PCA) has limited effectiveness in tracing cotton origins. In contrast, partial least squares discriminant analysis (PLS-DA) demonstrates superior classification performance, identifying seven discriminating variables: Na, Mn, Ba, Rb, Al, As, and Pb. Decision tree (DT), support vector machine (SVM), and random forest (RF) models for origin discrimination yielded accuracies of 90%, 87%, and 97%, respectively. Notably, the light gradient boosting machine (LightGBM) model achieved perfect performance, with accuracy, precision, and recall all reaching 100% on the test set. The output of the LightGBM model was further evaluated using the SHapley Additive exPlanations (SHAP) technique, which highlighted differences in the elemental composition of raw cotton from various countries; the elements Pb, Ni, Na, Al, As, Ba, and Rb significantly influenced the model's predictions. Conclusion: These findings suggest that explainable machine learning techniques can provide insights into the complex relationships between geographic information and raw cotton, and that these methodologies enhance the precision and reliability of geographic traceability for raw cotton.
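As an illustration of the LightGBM-plus-SHAP workflow, the sketch below fits a classifier on synthetic elemental-concentration data and ranks features by mean absolute SHAP value; the elements, labels, and data are placeholders, the `lightgbm` and `shap` packages are assumed to be installed, and the exact return shape of `shap_values` varies across shap versions.

```python
# Hypothetical sketch of the LightGBM + SHAP workflow on synthetic elemental data.
import numpy as np
import lightgbm as lgb
import shap
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
elements = ["Na", "Mn", "Ba", "Rb", "Al", "As", "Pb", "Ni"]
X = rng.lognormal(mean=0.0, sigma=1.0, size=(300, len(elements)))   # mg/kg-style values
signal = X[:, elements.index("Pb")] + X[:, elements.index("Rb")]
y = (signal > np.median(signal)).astype(int)                        # two stand-in "origins"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=2)
model = lgb.LGBMClassifier(n_estimators=200, random_state=2).fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))

# SHAP attributions: which elements drive the origin prediction?
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
sv = np.asarray(shap_values[1] if isinstance(shap_values, list) else shap_values)
mean_abs = np.abs(sv).mean(axis=0)
if mean_abs.ndim > 1:              # some shap versions add a per-class axis
    mean_abs = mean_abs.mean(axis=-1)
for name, value in sorted(zip(elements, mean_abs), key=lambda t: -t[1]):
    print(f"{name}: mean |SHAP| = {value:.3f}")
```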
Background: Plant tissue culture has emerged as a tool for improving cotton propagation and genetics, but the recalcitrant nature of cotton makes it difficult to develop in vitro regeneration. Cotton's recalcitrance is influenced by genotype, explant type, and environmental conditions. To overcome these issues, this study applies several machine learning-based predictive models using multiple input factors. Cotyledonary node explants of two commercial cotton cultivars (STN-468 and GSN-12) were isolated from 7-8-day-old seedlings and preconditioned with 5, 10, and 20 mg·L^-1 kinetin (KIN) for 10 days. Thereafter, explants were postconditioned on full Murashige and Skoog (MS), 1/2 MS, 1/4 MS, and full MS + 0.05 mg·L^-1 KIN, and cultured in a growth room illuminated with a combination of red and blue light-emitting diodes (LEDs). Statistical analysis (analysis of variance, regression analysis) was employed to assess the impact of the different treatments on shoot regeneration, with artificial intelligence (AI) models used to confirm the findings. Results: GSN-12 exhibited superior shoot regeneration potential compared with STN-468, with an average of 4.99 shoots per explant versus 3.97. Optimal results were achieved with 5 mg·L^-1 KIN preconditioning, 1/4 MS postconditioning, and 80% red LED, with a maximum shoot count of 7.75 for GSN-12 under these conditions, while STN-468 reached 6.00 shoots with 10 mg·L^-1 KIN preconditioning, MS with 0.05 mg·L^-1 KIN postconditioning, and 75.0% red LED. Rooting was successfully achieved with naphthalene acetic acid and activated charcoal. Additionally, three powerful AI-based models, namely extreme gradient boosting (XGBoost), random forest (RF), and the artificial neural network-based multilayer perceptron (MLP) regression model, validated the findings. Conclusion: GSN-12 outperformed STN-468, with optimal results from 5 mg·L^-1 KIN + 1/4 MS + 80% red LED. Applying machine learning-based prediction models to optimize cotton tissue culture protocols for shoot regeneration helps improve cotton regeneration efficiency.
Blast-induced ground vibration, quantified by peak particle velocity (PPV), is a crucial factor in mitigating environmental and structural risks in mining and geotechnical engineering. Accurate PPV prediction facilitates safer and more sustainable blasting operations by minimizing adverse impacts and ensuring regulatory compliance. This study presents an advanced predictive framework integrating CatBoost (CB) with nature-inspired optimization algorithms, including the Bat Algorithm (BAT), Sparrow Search Algorithm (SSA), Butterfly Optimization Algorithm (BOA), and Grasshopper Optimization Algorithm (GOA). A comprehensive dataset from the Sarcheshmeh Copper Mine in Iran was used to develop and evaluate these models with key performance metrics such as the Index of Agreement (IoA), Nash-Sutcliffe Efficiency (NSE), and the coefficient of determination (R^2). The hybrid CB-BOA model outperformed the other approaches, achieving the highest accuracy (R^2 = 0.989) and the lowest prediction errors. SHAP analysis identified distance (Di) as the most influential variable affecting PPV, while uncertainty analysis confirmed CB-BOA as the most reliable model, featuring the narrowest prediction interval. These findings highlight the effectiveness of hybrid machine learning models in refining PPV predictions, contributing to improved blast design strategies, enhanced structural safety, and reduced environmental impact in mining and geotechnical engineering.
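A minimal sketch of a CatBoost PPV regressor evaluated with the three quoted metrics is given below; the blast-record features are invented, and the nature-inspired hyperparameter search is replaced by fixed settings for brevity.

```python
# Hypothetical sketch: a CatBoost regressor for PPV, evaluated with NSE, IoA, and R^2.
# The nature-inspired hyperparameter search (e.g., BOA) is omitted; fixed settings and
# synthetic blast records stand in for the paper's data.
import numpy as np
from catboost import CatBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)
n = 400
distance = rng.uniform(50.0, 800.0, n)          # m (assumed feature)
charge = rng.uniform(50.0, 500.0, n)            # kg per delay (assumed feature)
burden = rng.uniform(2.0, 5.0, n)
spacing = rng.uniform(2.5, 6.0, n)
stemming = rng.uniform(1.5, 4.0, n)
X = np.column_stack([distance, charge, burden, spacing, stemming])
ppv = 500.0 * charge**0.8 / distance**1.6 + rng.normal(0.0, 0.2, n)   # scaled-distance-like law

X_tr, X_te, y_tr, y_te = train_test_split(X, ppv, test_size=0.25, random_state=3)
model = CatBoostRegressor(depth=6, iterations=500, learning_rate=0.05, verbose=False)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

def nse(obs, sim):   # Nash-Sutcliffe Efficiency
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def ioa(obs, sim):   # Willmott's Index of Agreement
    denom = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return 1.0 - np.sum((obs - sim) ** 2) / denom

print(f"NSE = {nse(y_te, pred):.3f}  IoA = {ioa(y_te, pred):.3f}  R^2 = {r2_score(y_te, pred):.3f}")
```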
Driven by rapid technological advancement and economic growth, mineral extraction and metal refining have increased dramatically, generating huge volumes of tailings and mine wastes (TMWs). Investigating the morphological fractions of heavy metals and metalloids (HMMs) in TMWs is key to evaluating their leaching potential into the environment; however, traditional experiments are time-consuming and labor-intensive. In this study, 10 machine learning (ML) algorithms were used and compared for rapidly predicting the morphological fractions of HMMs in TMWs. A dataset comprising 2376 data points was used, with mineral composition, elemental properties, and total concentration as inputs and the concentration of each morphological fraction as output. After grid-search optimization, the extra trees model performed best, achieving coefficients of determination (R^2) of 0.946 and 0.942 on the validation and test sets, respectively. Electronegativity was found to have the greatest impact on the morphological fraction. Model performance was further enhanced by applying an ensemble method to the top three ML models: gradient boosting decision tree, extra trees, and categorical boosting. Overall, the proposed framework can accurately predict the concentrations of the different morphological fractions of HMMs in TMWs. This approach can minimize detection time and aid in the safe management and recovery of TMWs.
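The sketch below illustrates a grid-searched extra trees regressor and a simple averaging ensemble on synthetic data; the features, target, and the substitution of a second gradient-boosting configuration for categorical boosting are assumptions made so the example needs only scikit-learn.

```python
# Hypothetical sketch: grid-searched extra trees regressor for one morphological
# fraction, plus a simple averaging ensemble of tree-based models.
import numpy as np
from sklearn.ensemble import (ExtraTreesRegressor, GradientBoostingRegressor,
                              VotingRegressor)
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(4)
# Assumed inputs: total concentration, electronegativity, ionic radius, pH, sulfide share.
X = rng.uniform(size=(500, 5))
y = 0.6 * X[:, 0] * (1.0 - X[:, 1]) + 0.2 * X[:, 4] + 0.05 * rng.normal(size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=4)

grid = GridSearchCV(ExtraTreesRegressor(random_state=4),
                    {"n_estimators": [200, 400], "max_depth": [None, 10]},
                    cv=5, scoring="r2").fit(X_tr, y_tr)
print("best extra-trees params:", grid.best_params_)
print("extra-trees test R^2   :", grid.score(X_te, y_te))

# Averaging ensemble of the strongest tree models (a second GBDT configuration
# stands in for categorical boosting so only scikit-learn is required).
ensemble = VotingRegressor([
    ("et", ExtraTreesRegressor(**grid.best_params_, random_state=4)),
    ("gbdt", GradientBoostingRegressor(random_state=4)),
    ("gbdt_slow", GradientBoostingRegressor(learning_rate=0.05, random_state=4)),
]).fit(X_tr, y_tr)
print("ensemble test R^2      :", ensemble.score(X_te, y_te))
```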
Background: Cotton is one of the most important commercial crops after food crops, especially in countries like India, where it is grown extensively under rainfed conditions. Because of its use in multiple industries, such as the textile, medicine, and automobile industries, it has great commercial importance. The crop's performance is strongly influenced by prevailing weather dynamics. As the climate changes, assessing how weather changes affect crop performance is essential. Among the available techniques, crop models are the most effective and widely used tools for predicting yields. Results: This study compares statistical and machine learning models to assess their ability to predict cotton yield across the major producing districts of Karnataka, India, using a long-term dataset spanning 1990 to 2023 that includes yield and weather factors. The artificial neural networks (ANNs) performed best, with yield deviations within ±10% during both the vegetative stage (F1) and mid stage (F2) of cotton. The model evaluation metrics, root mean square error (RMSE), normalized root mean square error (nRMSE), and modelling efficiency (EF), were also within acceptable limits in most districts. Furthermore, the tested ANN model was used to assess the importance of the dominant weather factors influencing crop yield in each district. In particular, morning relative humidity as an individual parameter and its interaction with maximum and minimum temperature had a major influence on cotton yield in most of the districts for which yield was predicted. These differences highlight the differential interactions of weather factors in each district for cotton yield formation, reflecting the individual response of each weather factor under different soils and management conditions across the major cotton-growing districts of Karnataka. Conclusions: Compared with statistical models, machine learning models such as ANNs proved more efficient in forecasting cotton yield because of their ability to consider the interactive effects of weather factors on yield formation at different growth stages. This highlights the suitability of ANNs for yield forecasting under rainfed conditions and for studying the relative impacts of weather factors on yield. The study thus aims to provide valuable insights to support stakeholders in planning effective crop management strategies and formulating relevant policies.
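For reference, the sketch below computes the quoted evaluation metrics (RMSE, nRMSE, EF) and the per-year yield deviation for one hypothetical district; the yield values are invented for illustration.

```python
# Hypothetical sketch of the evaluation metrics mentioned above (RMSE, nRMSE, EF),
# computed for one district's observed vs. ANN-predicted cotton yields.
import numpy as np

observed = np.array([310.0, 295.0, 340.0, 280.0, 325.0])    # kg/ha, assumed values
predicted = np.array([298.0, 301.0, 332.0, 290.0, 318.0])

rmse = np.sqrt(np.mean((observed - predicted) ** 2))
nrmse = 100.0 * rmse / observed.mean()                       # % of mean observed yield
ef = 1.0 - np.sum((observed - predicted) ** 2) / np.sum((observed - observed.mean()) ** 2)
deviation = 100.0 * (predicted - observed) / observed        # per-year deviation, %

print(f"RMSE = {rmse:.1f} kg/ha, nRMSE = {nrmse:.1f}%, EF = {ef:.2f}")
print("all deviations within +/-10%? ", bool(np.all(np.abs(deviation) <= 10.0)))
```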
A typical Whipple shield consists of two plates separated by a certain gap. Space debris impacts the outer plate and is broken into a debris cloud (shattered, molten, vaporized) with dispersed energy and momentum, which reduces the risk of penetrating the bulkhead. In the realm of hypervelocity impact, strain-rate (>10^5 s^-1) effects are negligible, and fluid dynamics is employed to describe the impact process. Efficient numerical tools for precisely predicting the damage degree can greatly accelerate the design and optimization of advanced protective structures. Current hypervelocity impact research primarily focuses on the interaction between the projectile and the front plate and on the movement of the debris cloud. However, the damage mechanism of debris cloud impacts on the rear plate, the critical threat component, remains underexplored owing to the complex multi-physics processes and prohibitive computational costs involved. Existing approaches, ranging from semi-empirical equations to machine learning-based ballistic limit prediction methods, are constrained to binary penetration classification. Moreover, uneven data from experiments and simulations render these methods ineffective when the projectile has an irregular shape and a complicated flight attitude. It is therefore urgent to develop a new method for predicting rear plate damage, which can help to gain a deeper understanding of the damage mechanism. In this study, a machine learning (ML) method is developed to predict the damage distribution in the rear plate. Based on the unit velocity space, discretized information on the debris cloud and rear plate damage from a small number of simulation cases is used as input data for training the ML models, while the generalization ability for damage distribution prediction is tested on other simulation cases with different attack angles. The results demonstrate that the training and prediction accuracies using the Random Forest (RF) algorithm significantly surpass those using artificial neural networks (ANNs) and the support vector machine (SVM). The RF-based model effectively identifies damage features in sparsely distributed debris clouds and the cumulative effect. This study establishes an expandable dataset that accommodates additional parameters to improve prediction accuracy. The results demonstrate the model's ability to overcome data-imbalance limitations through debris cloud features, enabling rapid and accurate rear plate damage prediction across wider scenarios with minimal data requirements.
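A minimal sketch of the prediction step, assuming synthetic per-cell debris-cloud features and damage labels in place of simulation output, is shown below; the feature construction is an illustrative assumption, not the study's discretization.

```python
# Hypothetical sketch: a random forest mapping discretized debris-cloud features to
# per-cell damage on the rear plate, echoing the workflow described above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(5)
n_cells = 4000
# Assumed per-cell features in unit velocity space: fragment count, mean fragment
# mass, mean normal velocity, and radial position of the cell on the rear plate.
X = np.column_stack([
    rng.poisson(3.0, n_cells),
    rng.lognormal(-2.0, 0.5, n_cells),
    rng.uniform(1.0, 7.0, n_cells),      # km/s
    rng.uniform(0.0, 1.0, n_cells),
])
# Damage label: 1 = cratered/perforated cell; denser, faster, heavier debris damages more.
score = X[:, 0] * X[:, 1] * X[:, 2] * (1.0 - 0.5 * X[:, 3])
damage = (score > np.quantile(score, 0.8)).astype(int)       # imbalanced, as in practice

X_tr, X_te, y_tr, y_te = train_test_split(X, damage, test_size=0.3, random_state=5,
                                          stratify=damage)
rf = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=5)
rf.fit(X_tr, y_tr)
print("F1 on held-out cells:", f1_score(y_te, rf.predict(X_te)))
```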
This paper introduces the concept, data model, and structural characteristics of STEP-NC. By comparing how features and operations are defined in MLP (Machining Line Planner) and in STEP-NC NC programs, it analyzes the correspondence between features and machining processes in MLP and those in STEP-NC, and discusses a method for generating part machining programs in STEP-NC format as MLP output.