High-entropy alloys (HEAs) exhibit significant potential across multiple domains due to their unique properties. However, conventional research methodologies face limitations in composition design, property prediction, and process optimization, characterized by low efficiency and high costs. The integration of artificial intelligence (AI) technologies has provided innovative solutions for HEA research. This review presents a detailed overview of recent advancements in AI applications for structural modeling and mechanical property prediction of HEAs. Furthermore, it discusses the advantages of big data analytics in facilitating alloy composition design and screening, quality control, and defect prediction, as well as the construction and sharing of specialized material databases. The paper also addresses existing challenges in AI-driven HEA research, including data quality, model interpretability, and cross-domain knowledge integration, and outlines prospects for the synergistic development of AI-enhanced computational materials science and experimental validation systems.
Objective To observe the value of self-supervised deep learning artificial intelligence (AI) noise reduction technology based on the nearest adjacent layer applied in ultra-low dose CT (ULDCT) for urinary calculi. Methods Eighty-eight patients with urinary calculi were prospectively enrolled. Low dose CT (LDCT) and ULDCT scanning were performed, and the effective dose (ED) of each scanning protocol was calculated. The patients were then randomly divided into a training set (n=75) and a test set (n=13), and a self-supervised deep learning AI noise reduction system based on the nearest adjacent layer, constructed with ULDCT images in the training set, was used to reduce noise in ULDCT images in the test set. In the test set, the quality of ULDCT images before and after AI noise reduction was compared with that of LDCT images using Blind/Referenceless Image Spatial Quality Evaluator (BRISQUE) scores, image noise (SD ROI), and signal-to-noise ratio (SNR). Results The tube current, volume CT dose index, and dose-length product of the abdominal ULDCT scanning protocol were all lower than those of the LDCT protocol (all P<0.05), with a decrease in ED of approximately 82.66%. For the 13 patients with urinary calculi in the test set, BRISQUE scores showed that the quality of ULDCT images reached 54.42% of the level of LDCT images before AI noise reduction and rose to 95.76% of that level afterwards. Both ULDCT images after AI noise reduction and LDCT images had lower SD ROI and higher SNR than ULDCT images before AI noise reduction (all adjusted P<0.05), whereas no significant difference was found between the former two (both adjusted P>0.05). Conclusion Self-supervised deep learning AI noise reduction technology based on the nearest adjacent layer can effectively reduce noise and improve the image quality of urinary calculi ULDCT images, facilitating clinical application of ULDCT.
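The quantitative image-quality metrics above (SD ROI and SNR) are straightforward to compute. The sketch below shows one conventional way, on a synthetic noisy phantom; the function name, the rectangular-ROI convention, and the numbers are illustrative assumptions, not taken from the study.

```python
import numpy as np

def roi_noise_and_snr(image, roi):
    """Estimate image noise (SD in a ROI) and signal-to-noise ratio.

    image : 2-D array of CT attenuation values (HU)
    roi   : (row0, row1, col0, col1) bounds of a rectangular ROI
    """
    r0, r1, c0, c1 = roi
    patch = image[r0:r1, c0:c1]
    sd_roi = patch.std(ddof=1)       # image noise: standard deviation in the ROI
    snr = patch.mean() / sd_roi      # SNR: mean attenuation over noise
    return sd_roi, snr

# Synthetic example: a uniform 100 HU phantom with Gaussian noise (SD 10 HU).
rng = np.random.default_rng(0)
img = 100.0 + rng.normal(0.0, 10.0, size=(128, 128))
sd, snr = roi_noise_and_snr(img, (32, 96, 32, 96))
```

On this phantom the estimates recover the simulated values: noise near 10 HU and SNR near 10.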
Artificial intelligence technology is introduced into the simulation of the muzzle flow field to improve simulation efficiency. A data-physics fusion driven framework is proposed. First, known flow field data are used to initialize the model parameters, so that the parameters to be trained start close to their optimal values. Physical prior knowledge is then introduced into the training process, so that the predictions not only match the known flow field information but also satisfy the physical conservation laws. Two examples demonstrate that a model under the fusion driven framework can solve strongly nonlinear flow field problems and has stronger generalization and extensibility. The proposed model is used to solve a muzzle flow field and to delineate the safety clearance behind the barrel side. The shape of the safety clearance is roughly the same under different launch speeds, and the pressure disturbance within 9.2 m behind the muzzle section exceeds the safety threshold, marking a dangerous area. Comparison with CFD results shows that the computational efficiency of the proposed model is greatly improved at the same accuracy. The proposed model can quickly and accurately simulate the muzzle flow field under various launch conditions.
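As a toy illustration of the fusion-driven idea, the sketch below combines a data-fit term with a finite-difference residual of a 1-D advection conservation law, which stands in for the paper's muzzle-flow conservation equations; the PDE choice, names, and grid settings are all our assumptions.

```python
import numpy as np

def fused_loss(u_pred, u_data, mask, dx, dt, c=1.0, lam=1.0):
    """Data-physics fusion loss for a field u(t, x) on a uniform grid.

    u_pred : (nt, nx) predicted field
    u_data : (nt, nx) known flow-field samples, valid where mask is True
    The physics term penalises the residual of the 1-D advection
    conservation law  u_t + c * u_x = 0  via forward differences.
    """
    data_loss = np.mean((u_pred[mask] - u_data[mask]) ** 2)
    u_t = (u_pred[1:, :-1] - u_pred[:-1, :-1]) / dt   # forward difference in t
    u_x = (u_pred[:-1, 1:] - u_pred[:-1, :-1]) / dx   # forward difference in x
    physics_loss = np.mean((u_t + c * u_x) ** 2)
    return data_loss + lam * physics_loss

# An exact travelling wave u(t, x) = sin(x - t) satisfies the law, so its
# fused loss against its own sparse samples is near zero (only FD truncation).
nx, nt, dx, dt = 64, 64, 0.05, 0.05
x = np.arange(nx) * dx
t = np.arange(nt) * dt
u_true = np.sin(x[None, :] - t[:, None])
mask = np.zeros_like(u_true, dtype=bool)
mask[::8, ::8] = True                     # sparse "known data" locations
loss = fused_loss(u_true, u_true, mask, dx, dt)
```

In a training loop this scalar would be minimised over network parameters; here it only demonstrates that a physically consistent field scores (near) zero.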
Objective To observe the value of artificial intelligence (AI) models based on non-contrast chest CT for measuring bone mineral density (BMD). Methods A total of 380 subjects who underwent both non-contrast chest CT and quantitative CT (QCT) BMD examination were retrospectively enrolled and divided into a training set (n=304) and a test set (n=76) at a ratio of 8∶2. The mean BMD of the L1-L3 vertebrae was measured with QCT. The spongy bone of the T5-T10 vertebrae was segmented as the ROI, radiomics (Rad) features were extracted, and machine learning (ML), Rad, and deep learning (DL) models were constructed for classification of osteoporosis (OP) and evaluation of BMD, respectively. Receiver operating characteristic curves were drawn, and areas under the curves (AUC) were calculated to evaluate the efficacy of each model for classification of OP. Bland-Altman analysis and Pearson correlation analysis were performed to explore the consistency and correlation of each model with QCT for measuring BMD. Results Among the ML and Rad models, ML Bagging-OP and Rad Bagging-OP performed best for classification of OP. In the test set, the AUC of ML Bagging-OP, Rad Bagging-OP, and DL OP for classification of OP was 0.943, 0.944, and 0.947, respectively, with no significant difference (all P>0.05). BMD obtained with all the above models had good consistency with that measured with QCT (most differences fell within the mean difference ±1.96 s), and the measurements were highly positively correlated (r=0.910-0.974, all P<0.001). Conclusion AI models based on non-contrast chest CT had high efficacy for classification of OP, and BMD measurements showed good consistency between the AI models and QCT.
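The Bland-Altman consistency check reported in the Results (most differences within the mean difference ±1.96 s) can be sketched in a few lines on simulated paired BMD readings; the data, function name, and units below are illustrative, not from the study.

```python
import numpy as np

def bland_altman_limits(a, b):
    """Bland-Altman agreement between two paired measurement methods.

    Returns the mean difference (bias), the 95% limits of agreement
    (bias ± 1.96 s, with s the SD of the differences), and the fraction
    of differences falling inside those limits.
    """
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    s = diff.std(ddof=1)
    lower, upper = bias - 1.96 * s, bias + 1.96 * s
    within = np.mean((diff >= lower) & (diff <= upper))
    return bias, (lower, upper), within

# Simulated paired measurements: model-estimated BMD vs. QCT reference,
# with a small systematic bias (0.5) and random error (SD 3.0), in mg/cm^3.
rng = np.random.default_rng(1)
qct = rng.uniform(80, 160, size=76)
model = qct + rng.normal(0.5, 3.0, size=76)
bias, (lo, hi), frac = bland_altman_limits(model, qct)
```

For approximately normal differences, roughly 95% of points land inside the limits, which is the pattern the abstract describes.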
Artificial intelligence (AI) technology has been used increasingly in the medical field as it develops rapidly. Echocardiography is one of the best imaging methods for clinical diagnosis of heart diseases, and combining it with AI could further improve its diagnostic efficiency. Though the application of AI in echocardiography remains at a relatively early stage, a variety of automated quantitative and analytical techniques are rapidly emerging and have begun to enter clinical practice. This article reviews the status of clinical applications of AI in echocardiography.
The paper presents the coupling of artificial intelligence (AI) and object-oriented methodology applied to the construction of the model-based decision support system MBDSS. The MBDSS is designed to support strategic decision making leading to an optimal path from China's central-planning situation towards a market economy. To meet users' various requirements, a series of innovations in software development have been carried out, such as system formalization with OBFRAMEs in an object-oriented paradigm for problem-solving automation, techniques for intelligent cooperation of modules, a hybrid reasoning system, and utilization of a connectionist framework. Integration technology is highly emphasized and discussed in this article, and an outlook on future software engineering is given in the conclusion.
The use of artificial intelligence (AI) has increased since the middle of the 20th century, as evidenced by its applications to a wide range of engineering and science problems. Air traffic management (ATM) is becoming increasingly automated and autonomous, making it attractive for AI applications. This paper presents a systematic review of studies that employ AI techniques for improving ATM capability. A brief account of the history, structure, and advantages of these methods is provided, followed by a description of their applications to several representative ATM tasks, such as air traffic services (ATS), airspace management (AM), air traffic flow management (ATFM), and flight operations (FO). The major contribution of the current review is a professional survey of AI applications to ATM along with a description of their specific advantages: (i) these methods provide alternative approaches to conventional physical modeling techniques, (ii) they do not require knowledge of relevant internal system parameters, (iii) they are computationally more efficient, and (iv) they offer compact solutions to multivariable problems. In addition, this review offers a fresh outlook on future research. One direction is providing a clear rationale for model type and structure selection for a given ATM mission. Another is understanding what makes a specific architecture or algorithm effective for a given ATM mission. These are among the most important issues that will continue to attract the attention of the AI research community and ATM work teams.
In order to optimize the sintering process, a real-time operation guide system with artificial intelligence was developed, mainly including an online data acquisition subsystem, a sinter chemical composition controller, a sintering process state controller, and an abnormal-conditions diagnosis subsystem. A knowledge base for sintering process control was constructed, and the inference engine of the system was established. Sinter chemical compositions were controlled by the strategies of self-adaptive prediction, internal optimization, and centering on basicity, and the sintering state was stabilized centering on permeability. To meet the needs of process change and keep the system transparent, the system has learning ability and an explanation function. The software was developed in the Visual C++ programming language. Application of the system shows that the hitting accuracy of sinter composition and burning-through point prediction are both above 85%; the first-grade rate of sinter chemical composition, the stability rate of the burning-through point, and the stability rate of the sintering process are increased by 3%, 9%, and 4%, respectively.
I firmly believe that systems engineering is the requirement-driven force for the progress of software engineering, artificial intelligence, and electronic technologies. The development of software engineering, artificial intelligence, and electronic technologies, in turn, is the technical support for the progress of systems engineering. INTEGRATION can be considered as "bridging" the existing technologies and the people together into a coordinated SYSTEM.
The history of educational technology in the last 50 years contains few instances of dramatic improvements in learning based on the adoption of a particular technology. An example involving artificial intelligence occurred in the 1990s with the development of intelligent tutoring systems (ITSs). The success of ITSs was limited to well-defined and relatively simple declarative and procedural learning tasks (e.g., learning how to write a recursive function in LISP; doing multi-column addition), and the improvements observed tended to be more limited than promised (e.g., one standard deviation of improvement at best rather than the promised two standard deviations). Still, there was some progress in how to conceptualize learning. A seldom documented limitation was the notion of viewing learning only from content and cognitive perspectives (i.e., in terms of memory limitations, prior knowledge, bug libraries, learning hierarchies and sequences, etc.). Little attention was paid to education conceived more broadly than developing specific cognitive skills on highly constrained problems. New technologies offer the potential to create dynamic and multi-dimensional models of a particular learner, and to track large data sets of learning activities, resources, interventions, and outcomes over a great many learners. Using those data to personalize learning for a particular learner developing knowledge, competence, and understanding in a specific domain of inquiry is finally a real possibility. While significant progress is clearly possible, the reality is less promising. There are many as yet unmet challenges, some of which are mentioned in this paper. A persistent worry is that educational technologists and computer scientists will again promise too much, too soon, at too little cost, and with too little effort and attention to the realities in schools and universities.
The application of various artificial intelligence (AI) techniques, namely artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS), genetic algorithm optimized least squares support vector machine (GA-LSSVM), and multivariable regression (MVR) models, is presented to identify the real power transfer between generators and loads. These AI techniques adopt supervised learning, which first uses the modified nodal equation (MNE) method to determine the real power contribution from each generator to the loads. The results of the MNE method and load flow information are then used to estimate the power transfer with the AI techniques. The 25-bus equivalent system of south Malaysia is used as a test system to illustrate the effectiveness of the various AI methods compared with the MNE method.
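Of the compared techniques, multivariable regression (MVR) is the simplest to sketch: fit a linear map from generator outputs to the MNE-derived power-transfer labels by least squares. The data below are synthetic stand-ins with hypothetical contribution factors, not the 25-bus Malaysian system.

```python
import numpy as np

# Synthetic supervised-learning setup: generator outputs as features and
# "MNE" transfer figures as labels (hidden linear contribution + noise).
rng = np.random.default_rng(2)
n_samples, n_gen = 200, 4
G = rng.uniform(10, 100, size=(n_samples, n_gen))        # generator outputs (MW)
true_share = np.array([0.4, 0.3, 0.2, 0.1])              # hidden contribution factors
y = G @ true_share + rng.normal(0, 0.5, size=n_samples)  # labels with noise

# Ordinary least squares recovers the contribution factors.
coef, *_ = np.linalg.lstsq(G, y, rcond=None)
pred = G @ coef
```

The ANN, ANFIS, and GA-LSSVM models in the paper replace this linear map with nonlinear learners, but the supervised pipeline (MNE labels in, transfer estimates out) is the same.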
A modified artificial bee colony optimizer (MABC) is proposed for image segmentation, using a pool of optimal foraging strategies to balance the exploration-exploitation tradeoff. The main idea of MABC is to enrich artificial bee foraging behaviors by combining local search and comprehensive learning using a multi-dimensional PSO-based equation. With comprehensive learning, the bees incorporate information from the global best solution into the solution search equation to improve exploration, while local search enables the bees to exploit deeply around promising areas, providing a proper balance between exploration and exploitation. Experimental results comparing MABC with several successful EA and SI algorithms on a set of benchmarks demonstrate the effectiveness of the proposed algorithm. Furthermore, the MABC algorithm is applied to the image segmentation problem, and experimental results again verify its effectiveness.
The recently invented artificial bee colony (ABC) algorithm is an optimization algorithm based on swarm intelligence that has been used to solve many kinds of numerical function optimization problems. It performs well in most cases; however, the ABC algorithm has a shortcoming: it ignores the fitness of related pairs of individuals in the mechanism for finding a neighboring food source. This paper presents an improved ABC algorithm with mutual learning (MutualABC) that adjusts the produced candidate food source using the higher fitness of two individuals selected by a mutual learning factor. The performance of the improved MutualABC algorithm is tested on a set of benchmark functions and compared with the basic ABC algorithm and some classical improved ABC variants. The experimental results show that the MutualABC algorithm with appropriate parameters outperforms the other ABC algorithms in most experiments.
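For readers unfamiliar with the baseline that MutualABC modifies, here is a minimal sketch of the basic ABC algorithm (employed, onlooker, and scout phases with the standard one-dimension neighbor search), minimizing the sphere benchmark; parameter names and settings are generic assumptions, not the paper's.

```python
import numpy as np

def abc_minimize(f, dim, bounds, n_food=20, limit=50, iters=200, seed=0):
    """Basic artificial bee colony minimiser for a nonnegative objective f."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    foods = rng.uniform(lo, hi, size=(n_food, dim))   # one food source per employed bee
    vals = np.apply_along_axis(f, 1, foods)
    trials = np.zeros(n_food, dtype=int)              # stagnation counters

    def try_neighbor(i):
        # v_ij = x_ij + phi * (x_ij - x_kj): perturb one randomly chosen
        # dimension relative to a random partner food source k.
        k = rng.choice([j for j in range(n_food) if j != i])
        j = rng.integers(dim)
        v = foods[i].copy()
        v[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
        v = np.clip(v, lo, hi)
        fv = f(v)
        if fv < vals[i]:                              # greedy selection
            foods[i], vals[i], trials[i] = v, fv, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                       # employed-bee phase
            try_neighbor(i)
        fit = 1.0 / (1.0 + vals)                      # fitness (vals assumed >= 0)
        probs = fit / fit.sum()
        for i in rng.choice(n_food, size=n_food, p=probs):
            try_neighbor(i)                           # onlooker-bee phase
        worn = np.argmax(trials)
        if trials[worn] > limit:                      # scout phase: abandon stale source
            foods[worn] = rng.uniform(lo, hi, size=dim)
            vals[worn] = f(foods[worn])
            trials[worn] = 0

    best = np.argmin(vals)
    return foods[best], vals[best]

sphere = lambda x: float(np.sum(x * x))
x_best, f_best = abc_minimize(sphere, dim=5, bounds=(-5.0, 5.0))
```

The mutual-learning variant changes only the neighbor-search step, choosing between the two paired individuals by fitness rather than ignoring it.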
An optimal PID controller with incomplete derivation is proposed based on fuzzy inference and the genetic algorithm, called the fuzzy-GA PID controller with incomplete derivation. It consists of an off-line part and an on-line part. In the off-line part, by taking the overshoot, rise time, and settling time of the system unit step response as performance indexes and by using the genetic algorithm, a group of optimal PID parameters K*p, T*i, and T*d are obtained, which are used as initial values for the on-line tuning of the PID parameters. In the on-line part, based on K*p, T*i, and T*d and according to the current system error e and its time derivative, a dedicated program optimizes and adjusts the PID parameters on line through a fuzzy inference mechanism to ensure that the system response has optimal dynamic and steady-state performance. The controller has been used to control the DC motor of the intelligent bionic artificial leg designed by the authors. Computer simulation shows that this kind of optimal PID controller has excellent control performance and robustness.
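"Incomplete derivation" means the derivative term is passed through a first-order low-pass filter so it does not amplify measurement noise. A minimal sketch follows, with a first-order plant standing in for the DC motor and hand-picked gains rather than GA-tuned ones; all names and values are our assumptions.

```python
import numpy as np

def simulate_pid_incomplete_derivative(kp, ti, td, alpha=0.1,
                                       dt=0.01, steps=2000, tau=0.5):
    """Unit-step response of a first-order plant G(s) = 1/(tau*s + 1)
    under a PID controller whose derivative term is low-pass filtered
    (time constant alpha * td), i.e. PID with incomplete derivation."""
    y, integ, d_filt, prev_e = 0.0, 0.0, 0.0, None
    out = []
    for _ in range(steps):
        e = 1.0 - y                                   # unit step reference
        integ += e * dt
        de = 0.0 if prev_e is None else (e - prev_e) / dt
        # incomplete (filtered) derivative: first-order lag applied to de/dt
        d_filt += (dt / (alpha * td + dt)) * (de - d_filt) if td > 0 else 0.0
        u = kp * (e + integ / ti + td * d_filt)       # standard PID form
        prev_e = e
        y += dt * (u - y) / tau                       # Euler step of the plant
        out.append(y)
    return np.array(out)

resp = simulate_pid_incomplete_derivative(kp=2.0, ti=1.0, td=0.05)
```

In the paper's scheme the GA would supply K*p, T*i, T*d off-line and fuzzy inference would adjust them on-line; this sketch shows only the fixed-gain loop with the filtered derivative.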
A design method for intelligent proportional-integral-derivative (PID) controllers is proposed based on the ant system algorithm and fuzzy inference; this kind of controller is called the fuzzy-ant system PID controller. It consists of an off-line part and an on-line part. In the off-line part, for a given control system with a PID controller, by taking the overshoot, settling time, and steady-state error of the system unit step response as performance indexes and by using the ant system algorithm, a group of optimal PID parameters K*p, T*i, and T*d can be obtained, which are used as initial values for the on-line tuning of the PID parameters. In the on-line part, based on K*p, T*i, and T*d and according to the current system error e and its time derivative, a specific program optimizes and adjusts the PID parameters on-line through a fuzzy inference mechanism to ensure that the system response has optimal transient and steady-state performance. This kind of intelligent PID controller can be used to control the motor of the intelligent bionic artificial leg designed by the authors. Computer simulation shows that the controller has less overshoot and a shorter settling time.
The cotton-growing regions of Xinjiang have achieved full mechanization and semi-automation of cotton cultivation. Building on this foundation, an "AI cultivation system" for cotton is proposed by integrating advanced technologies representing new quality productive forces, including fifth-generation mobile communications technology (5G), the Internet of Things, artificial intelligence (AI), big data, and cloud computing. The hardware and software components of the AI cultivation system and its operating mechanism in cotton cultivation are discussed, aiming at technical breakthroughs toward precision throughout the whole cultivation process, intensive production, optimized use of resources, efficient management, intelligent decision making, and unmanned operation. The system injects innovative ideas into the development of smart agricultural equipment and provides practical guidance for the future region-wide promotion of automated and intelligent cotton cultivation in Xinjiang.
Funding: Supported by the Natural Science Foundation of Jiangsu Province of China (Grant No. BK20210347) and the National Natural Science Foundation of China (Grant No. U2141246).
Funding: Supported by the National Natural Science Foundation of China (62073330), the Natural Science Foundation of Hunan Province (2020JJ4339), and the Scientific Research Fund of Hunan Province Education Department (20B272).
文摘The use of artificial intelligence(AI)has increased since the middle of the 20th century,as evidenced by its applications to a wide range of engineering and science problems.Air traffic management(ATM)is becoming increasingly automated and autonomous,making it lucrative for AI applications.This paper presents a systematic review of studies that employ AI techniques for improving ATM capability.A brief account of the history,structure,and advantages of these methods is provided,followed by the description of their applications to several representative ATM tasks,such as air traffic services(ATS),airspace management(AM),air traffic flow management(ATFM),and flight operations(FO).The major contribution of the current review is the professional survey of the AI application to ATM alongside with the description of their specific advantages:(i)these methods provide alternative approaches to conventional physical modeling techniques,(ii)these methods do not require knowing relevant internal system parameters,(iii)these methods are computationally more efficient,and(iv)these methods offer compact solutions to multivariable problems.In addition,this review offers a fresh outlook on future research.One is providing a clear rationale for the model type and structure selection for a given ATM mission.Another is to understand what makes a specific architecture or algorithm effective for a given ATM mission.These are among the most important issues that will continue to attract the attention of the AI research community and ATM work teams in the future.
文摘In order to optimize the sintering process, a real-time operation guide system with artificial intelligence was developed, mainly including the data acquisition online subsystem, the sinter chemical composition controller, the sintering process state controller, and the abnormal conditions diagnosis subsystem. Knowledge base of the sintering process controlling was constructed, and inference engine of the system was established. Sinter chemical compositions were controlled by the strategies of self-adaptive prediction, internal optimization and center on basicity. And the state of sintering was stabilized centering on permeability. In order to meet the needs of process change and make the system clear, the system has learning ability and explanation function. The software of the system was developed in Visual C++ programming language. The application of the system shows that the hitting accuracy of sinter compositions and burning through point prediction are more than 85%; the first-grade rate of sinter chemical composition, stability rate of burning through point and stability rate of sintering process are increased by 3%, 9% and 4%, respectively.
文摘I firmly believe that of systems engineering is the requirement-driven force for the progress ofsoftware engineering, artificial intelligence and electronic technologies. The development ofsoftware engineering, artificial intelligence and electronic technologies is the technical supportfor the progress of systems engineering. INTEGRATION can be considered as "bridging" the ex-isting technologies and the People together into a coordinated SYSTEM.
Abstract: The history of educational technology in the last 50 years contains few instances of dramatic improvements in learning based on the adoption of a particular technology. An example involving artificial intelligence occurred in the 1990s with the development of intelligent tutoring systems (ITSs). Their success was limited to well-defined and relatively simple declarative and procedural learning tasks (e.g., learning how to write a recursive function in LISP, or doing multi-column addition), and the improvements that were observed tended to be more limited than promised (e.g., one standard deviation of improvement at best rather than the promised two standard deviations). Still, there was some progress in terms of how to conceptualize learning. A seldom-documented limitation was the notion of viewing learning only from content and cognitive perspectives (i.e., in terms of memory limitations, prior knowledge, bug libraries, learning hierarchies and sequences, etc.). Little attention was paid to education conceived more broadly than developing specific cognitive skills on highly constrained problems. New technologies offer the potential to create dynamic and multi-dimensional models of a particular learner, and to track large data sets of learning activities, resources, interventions, and outcomes over a great many learners. Using those data to personalize learning for a particular learner developing knowledge, competence, and understanding in a specific domain of inquiry is finally a real possibility. While significant progress is clearly possible, the reality is less promising. There are many as-yet-unmet challenges, some of which are mentioned in this paper. A persistent worry is that educational technologists and computer scientists will again promise too much, too soon, at too little cost, and with too little effort and attention to the realities in schools and universities.
Funding: The Ministry of Higher Education, Malaysia (MOHE) provided financial funding for this project; Universiti Kebangsaan Malaysia and Universiti Teknologi Malaysia provided infrastructure and moral support for the research work.
Abstract: The application of various artificial intelligence (AI) techniques, namely artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS), genetic-algorithm-optimized least squares support vector machine (GA-LSSVM), and multivariable regression (MVR) models, is presented to identify the real power transfer between generators and loads. These AI techniques adopt supervised learning, which first uses the modified nodal equation (MNE) method to determine the real power contribution from each generator to the loads. The results of the MNE method and load-flow information are then used to train the AI techniques to estimate the power transfer. The 25-bus equivalent system of south Malaysia is used as a test system to illustrate the effectiveness of the various AI methods compared with the MNE method.
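The supervised pipeline this abstract describes (MNE-labelled targets, load-flow features, a learned estimator) can be sketched with the simplest of the four models, multivariable regression. Everything below is a hypothetical illustration: the feature layout, the synthetic operating points, and the linear contribution weights are assumptions, not data from the 25-bus south Malaysia system.

```python
import numpy as np

# Hypothetical illustration of the supervised pipeline: features from a
# load-flow solution are mapped to the MNE-labelled real power transfer
# using a multivariable regression (MVR) model fitted by least squares.
rng = np.random.default_rng(0)

# Synthetic "load-flow" features: 50 operating points, 3 features each.
X = rng.uniform(0.5, 1.5, size=(50, 3))
true_w = np.array([0.6, 0.3, 0.1])           # assumed linear contributions
y = X @ true_w                               # stand-in for MNE-labelled targets

# Fit the MVR model, then estimate the transfer at a new operating point.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
estimate = float(np.array([1.0, 1.0, 1.0]) @ w)
print(round(estimate, 3))  # → 1.0, the sum of the assumed contributions
```

The ANN, ANFIS, and GA-LSSVM models in the abstract would replace the least-squares fit with their respective training procedures, but the supervised structure (MNE labels in, transfer estimates out) is the same.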
Funding: Projects (6177021519, 61503373) supported by the National Natural Science Foundation of China; Project (N161705001) supported by the Fundamental Research Funds for the Central Universities, China.
Abstract: A modified artificial bee colony optimizer (MABC) is proposed for image segmentation, using a pool of optimal foraging strategies to balance the tradeoff between exploration and exploitation. The main idea of MABC is to enrich artificial bee foraging behaviors by combining local search with comprehensive learning based on a multi-dimensional PSO-based search equation. With comprehensive learning, the bees incorporate information from the global best solution into the solution search equation to improve exploration, while the local search lets the bees exploit deeply around promising areas, providing a proper balance between exploration and exploitation. Comparative experiments against several successful evolutionary-algorithm (EA) and swarm-intelligence (SI) algorithms on a set of benchmarks, together with an application to image segmentation, verify the effectiveness of the proposed algorithm.
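The two ingredients the abstract combines, the classical ABC neighbour search and a PSO-style comprehensive-learning pull toward the global best, can be sketched as below. This is a minimal illustration, not the authors' exact MABC: the coefficient ranges, the single-dimension perturbation, and the toy population loop are assumptions.

```python
import random

def mabc_candidate(x_i, x_k, g_best, rng=random):
    """One candidate food source: classical ABC step around bee i plus a
    comprehensive-learning term pulling toward the global best solution."""
    j = rng.randrange(len(x_i))              # perturb one random dimension
    phi = rng.uniform(-1.0, 1.0)             # classical ABC step coefficient
    psi = rng.uniform(0.0, 1.5)              # assumed learning coefficient
    v = list(x_i)
    v[j] = x_i[j] + phi * (x_i[j] - x_k[j]) + psi * (g_best[j] - x_i[j])
    return v

def select(x, v, f):
    """Greedy selection: keep the better of parent and candidate (minimisation)."""
    return v if f(v) < f(x) else x

# Tiny usage example on the sphere function.
sphere = lambda p: sum(c * c for c in p)
rng = random.Random(1)
pop = [[rng.uniform(-2, 2) for _ in range(2)] for _ in range(5)]
f0 = sphere(min(pop, key=sphere))            # best fitness before search
for _ in range(100):
    g_best = min(pop, key=sphere)
    for i in range(len(pop)):
        k = rng.randrange(len(pop))
        pop[i] = select(pop[i], mabc_candidate(pop[i], pop[k], g_best, rng), sphere)
print(round(sphere(min(pop, key=sphere)), 6))
```

Because selection is greedy, the best fitness in the population can only improve as the search runs.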
Funding: supported by the National Natural Science Foundation of China (60803074) and the Fundamental Research Funds for the Central Universities (DUT10JR06).
Abstract: The recently invented artificial bee colony (ABC) algorithm is a swarm-intelligence-based optimization algorithm that has been used to solve many kinds of numerical function optimization problems. It performs well in most cases; however, the ABC algorithm has a shortcoming: it ignores the fitness of related pairs of individuals in the mechanism for finding a neighboring food source. This paper presents an improved ABC algorithm with mutual learning (MutualABC), which adjusts the produced candidate food source using the fitter of two individuals selected by a mutual learning factor. The performance of the improved MutualABC algorithm is tested on a set of benchmark functions and compared with the basic ABC algorithm and some classical improved ABC variants. The experimental results show that the MutualABC algorithm with appropriate parameters outperforms the other ABC algorithms in most experiments.
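The mutual-learning idea can be sketched as follows, with the caveat that the function names and the exact update rule are assumptions rather than the paper's verbatim formulation: bee i is compared with a partner k, and the fitter of the pair serves as the base of the new candidate, with a mutual learning factor F scaling the step between them.

```python
import random

def mutual_candidate(x_i, x_k, f, F=1.0, rng=random):
    """Generate a candidate food source anchored at the fitter of the pair
    (x_i, x_k), stepping along the direction between the two individuals."""
    better, worse = (x_i, x_k) if f(x_i) <= f(x_k) else (x_k, x_i)
    j = rng.randrange(len(better))           # perturb one random dimension
    phi = rng.uniform(-1.0, 1.0)
    v = list(better)                         # the fitter individual is the base
    v[j] = better[j] + F * phi * (better[j] - worse[j])
    return v

# Usage on a 1-D quadratic (minimisation): f(b) < f(a), so b anchors the step.
f = lambda p: (p[0] - 3.0) ** 2
rng = random.Random(7)
a, b = [0.0], [5.0]
cand = mutual_candidate(a, b, f, F=1.0, rng=rng)
print(len(cand) == 1)  # → True: one candidate of the same dimension
```

The contrast with basic ABC is that basic ABC always perturbs around bee i itself, regardless of which of the pair is fitter.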
Funding: Project (50275150) supported by the National Natural Science Foundation of China; Project (RL200002) supported by the Foundation of the Robotics Laboratory, Chinese Academy of Sciences.
Abstract: An optimal PID controller with incomplete derivation is proposed based on fuzzy inference and the genetic algorithm, called the fuzzy-GA PID controller with incomplete derivation. It consists of an off-line part and an on-line part. In the off-line part, taking the overshoot, rise time, and settling time of the system's unit step response as the performance indexes and using the genetic algorithm, a group of optimal PID parameters Kp*, Ti*, and Td* is obtained and used as the initial values for on-line tuning of the PID parameters. In the on-line part, based on Kp*, Ti*, and Td* and according to the current system error e and its time derivative, a dedicated program optimizes and adjusts the PID parameters on line through a fuzzy inference mechanism to ensure that the system response has optimal dynamic and steady-state performance. The controller has been used to control the DC motor of the intelligent bionic artificial leg designed by the authors. Computer simulation results show that this kind of optimal PID controller has excellent control performance and robustness.
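The off-line stage can be sketched as a genetic algorithm searching PID gains (Kp, Ti, Td) that minimise a step-response cost built from overshoot, rise time, and settling time, as the abstract describes. The first-order plant, the cost weights, and the GA operators below are all assumptions made for the sketch; the abstract does not specify them.

```python
import random

def step_cost(Kp, Ti, Td, dt=0.01, T=5.0):
    """Score a PID loop around the assumed plant y' = -y + u on a unit step."""
    y, integ, prev_e = 0.0, 0.0, 1.0
    overshoot, rise_t, settle_t = 0.0, T, T
    for k in range(int(T / dt)):
        e = 1.0 - y
        integ += e * dt
        deriv = (e - prev_e) / dt
        prev_e = e
        u = Kp * (e + integ / Ti + Td * deriv)
        y += dt * (-y + u)                   # Euler step of the plant
        if abs(y) > 1e6:
            return 1e6                       # penalise unstable gain sets
        t = k * dt
        overshoot = max(overshoot, y - 1.0)
        if y >= 0.9 and rise_t == T:
            rise_t = t                       # first time output reaches 90%
        if abs(y - 1.0) > 0.02:
            settle_t = t                     # last time outside the 2% band
    return overshoot + 0.5 * rise_t + 0.2 * settle_t

def ga_tune(pop_size=20, gens=15, seed=3):
    """Assumed GA: truncation selection, arithmetic crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0.5, 10), rng.uniform(0.1, 5), rng.uniform(0, 0.5))
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda g: step_cost(*g))
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            p, q = rng.sample(elite, 2)
            children.append((max(0.1, (p[0] + q[0]) / 2 + rng.gauss(0, 0.2)),
                             max(0.05, (p[1] + q[1]) / 2 + rng.gauss(0, 0.1)),
                             max(0.0, (p[2] + q[2]) / 2 + rng.gauss(0, 0.02))))
        pop = elite + children
    return min(pop, key=lambda g: step_cost(*g))

Kp, Ti, Td = ga_tune()
print(round(step_cost(Kp, Ti, Td), 3))       # cost of the tuned gains
```

The tuned triple would then seed the on-line fuzzy tuning stage that both this abstract and the following one describe.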
Abstract: A design method for intelligent proportional-integral-derivative (PID) controllers is proposed based on the ant system algorithm and fuzzy inference; this kind of controller is called the fuzzy-ant system PID controller. It consists of an off-line part and an on-line part. In the off-line part, for a given control system with a PID controller, taking the overshoot, settling time, and steady-state error of the system's unit step response as the performance indexes and using the ant system algorithm, a group of optimal PID parameters Kp*, Ti*, and Td* can be obtained and used as the initial values for on-line tuning of the PID parameters. In the on-line part, based on Kp*, Ti*, and Td* and according to the current system error e and its time derivative, a specific program optimizes and adjusts the PID parameters on-line through a fuzzy inference mechanism to ensure that the system response has optimal transient and steady-state performance. This kind of intelligent PID controller can be used to control the motor of the intelligent bionic artificial leg designed by the authors. Computer simulation results show that the controller has less overshoot and a shorter settling time.
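The on-line stage described in the abstract, where a fuzzy inference mechanism adjusts the PID parameters from their off-line optima Kp*, Ti*, Td* according to the current error e and its derivative, can be sketched as a tiny Mamdani-style inference step. The membership functions, rule table, and scaling factor below are illustrative assumptions, not the authors' actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_dkp(e, de):
    """Correction to Kp in [-1, 1] inferred from error and error rate."""
    neg = lambda x: tri(x, -2.0, -1.0, 0.0)
    zero = lambda x: tri(x, -1.0, 0.0, 1.0)
    pos = lambda x: tri(x, 0.0, 1.0, 2.0)
    # Assumed rule table: large |e| -> raise Kp; near the setpoint but still
    # moving -> lower Kp slightly to damp oscillation.
    rules = [
        (min(pos(e), zero(de)), +1.0),
        (min(neg(e), zero(de)), +1.0),
        (min(zero(e), zero(de)), 0.0),
        (min(zero(e), neg(de)), -0.5),
        (min(zero(e), pos(de)), -0.5),
    ]
    num = sum(w * out for w, out in rules)   # weighted-average defuzzification
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

Kp_star = 2.0                                # off-line optimum (assumed value)
Kp = Kp_star + 0.5 * fuzzy_dkp(e=1.0, de=0.0)
print(Kp)  # → 2.5: a large steady error raises the proportional gain
```

Analogous rule tables would adjust Ti and Td; the same on-line mechanism applies whether the off-line optima come from the ant system algorithm here or the genetic algorithm of the previous abstract.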
Abstract: Cotton-growing regions in Xinjiang have already achieved full mechanization and semi-automation of cotton cultivation. Building on this foundation, and by integrating advanced technologies representing new quality productive forces, such as the fifth generation of mobile communications technology (5G), the Internet of Things, artificial intelligence (AI), big data, and cloud computing, this paper proposes the concept of an "AI cultivation system" for cotton. It discusses the hardware and software components of the "AI cultivation system" and its operating mechanism in cotton cultivation, aiming at technical breakthroughs that make the whole cultivation process precise, production intensive, resource use optimal, management efficient, decision-making intelligent, and field operations unmanned. The work injects innovative ideas into the research and development of smart agricultural equipment and provides practical guidance for the future region-wide promotion of automated and intelligent cotton cultivation in Xinjiang.