High-Entropy Alloys (HEAs) exhibit significant potential across multiple domains due to their unique properties. However, conventional research methodologies face limitations in composition design, property prediction, and process optimization, characterized by low efficiency and high costs. The integration of Artificial Intelligence (AI) technologies has provided innovative solutions for HEA research. This review presents a detailed overview of recent advancements in AI applications for structural modeling and mechanical property prediction of HEAs. Furthermore, it discusses the advantages of big data analytics in facilitating alloy composition design and screening, quality control, and defect prediction, as well as the construction and sharing of specialized material databases. The paper also addresses the existing challenges in current AI-driven HEA research, including issues related to data quality, model interpretability, and cross-domain knowledge integration. Additionally, it proposes prospects for the synergistic development of AI-enhanced computational materials science and experimental validation systems.
Objective To observe the value of self-supervised deep learning artificial intelligence (AI) noise reduction technology based on the nearest adjacent layer applied in ultra-low dose CT (ULDCT) for urinary calculi. Methods Eighty-eight patients with urinary calculi were prospectively enrolled. Low dose CT (LDCT) and ULDCT scanning were performed, and the effective dose (ED) of each scanning protocol was calculated. The patients were then randomly divided into a training set (n=75) and a test set (n=13), and a self-supervised deep learning AI noise reduction system based on the nearest adjacent layer, constructed with ULDCT images in the training set, was used to reduce noise in ULDCT images in the test set. In the test set, the quality of ULDCT images before and after AI noise reduction was compared with that of LDCT images using Blind/Referenceless Image Spatial Quality Evaluator (BRISQUE) scores, image noise (SD ROI) and signal-to-noise ratio (SNR). Results The tube current, volume CT dose index and dose length product of the abdominal ULDCT scanning protocol were all lower than those of the LDCT scanning protocol (all P<0.05), with a decrease in ED of approximately 82.66%. For the 13 patients with urinary calculi in the test set, the BRISQUE score showed that the quality of ULDCT images before AI noise reduction reached 54.42% of the level of LDCT images, rising to 95.76% after AI noise reduction. Both ULDCT images after AI noise reduction and LDCT images had lower SD ROI and higher SNR than ULDCT images before AI noise reduction (all adjusted P<0.05), whereas no significant difference was found between the former two (both adjusted P>0.05). Conclusion Self-supervised deep learning AI noise reduction technology based on the nearest adjacent layer could effectively reduce noise and improve image quality of urinary calculi ULDCT images, being conducive to clinical application of ULDCT.
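For readers unfamiliar with the two reference-free metrics used in this abstract, the sketch below shows how image noise (SD ROI) and SNR are conventionally computed from a homogeneous region of interest on a CT slice; the synthetic HU values, ROI placement and function name are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np

def roi_noise_and_snr(image, roi):
    """Image noise (SD of attenuation values in the ROI) and SNR (mean/SD)."""
    patch = image[roi].astype(np.float64)
    sd_roi = patch.std(ddof=1)      # image noise, SD ROI
    snr = patch.mean() / sd_roi     # signal-to-noise ratio
    return sd_roi, snr

# Toy example: a synthetic 512x512 "CT slice" of soft tissue with Gaussian noise
rng = np.random.default_rng(0)
slice_hu = 40.0 + 12.0 * rng.standard_normal((512, 512))
sd, snr = roi_noise_and_snr(slice_hu, (slice(200, 264), slice(200, 264)))
print(f"SD ROI = {sd:.1f} HU, SNR = {snr:.2f}")
```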
Artificial intelligence technology is introduced into the simulation of the muzzle flow field to improve simulation efficiency in this paper. A data-physical fusion driven framework is proposed. First, known flow field data are used to initialize the model parameters, so that the parameters to be trained start close to their optimal values. Then physical prior knowledge is introduced into the training process so that the prediction results not only match the known flow field information but also satisfy the physical conservation laws. Through two examples, it is shown that a model under the fusion driven framework can solve strongly nonlinear flow field problems and has stronger generalization and extensibility. The proposed model is used to solve a muzzle flow field and to delineate the safety clearance behind the barrel side. It is pointed out that the shape of the safety clearance under different launch speeds is roughly the same, and that the pressure disturbance in the area within 9.2 m behind the muzzle section exceeds the safety threshold, making it a dangerous area. Comparison with CFD results shows that the computational efficiency of the proposed model is greatly improved at the same accuracy. The proposed model can quickly and accurately simulate the muzzle flow field under various launch conditions.
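The fusion-driven training described above combines data-based initialization with a physics-residual penalty. As a minimal sketch of that idea (not the paper's actual equations), the following pretrains a small network on labeled samples and then adds the residual of a 1D conservation law u_t + a*u_x = 0 via automatic differentiation; the network size, the advection stand-in for the real conservation laws, and the loss weighting are assumptions.

```python
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
a = 1.0  # advection speed of the stand-in conservation law u_t + a*u_x = 0

# Stage 1: initialize parameters from known flow-field samples (x, t) -> u
xt_data = torch.rand(256, 2)
u_data = torch.sin(torch.pi * (xt_data[:, :1] - a * xt_data[:, 1:]))
for _ in range(500):
    opt.zero_grad()
    loss = torch.mean((net(xt_data) - u_data) ** 2)
    loss.backward()
    opt.step()

# Stage 2: keep the data term, add a physics residual at collocation points
xt_col = torch.rand(1024, 2, requires_grad=True)
for _ in range(500):
    opt.zero_grad()
    u = net(xt_col)
    g = torch.autograd.grad(u, xt_col, torch.ones_like(u), create_graph=True)[0]
    residual = g[:, 1:] + a * g[:, :1]  # u_t + a*u_x
    loss = torch.mean((net(xt_data) - u_data) ** 2) + torch.mean(residual ** 2)
    loss.backward()
    opt.step()
print(f"fused loss: {loss.item():.3e}")
```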
Artificial intelligence (AI) technology has been increasingly used in the medical field alongside its rapid development. Echocardiography is one of the best imaging methods for the clinical diagnosis of heart diseases, and combining it with AI could further improve its diagnostic efficiency. Although the applications of AI in echocardiography remain at a relatively early stage, a variety of automated quantitative and analytical techniques are rapidly emerging and have begun to enter clinical practice. This article reviews the status of clinical applications of AI in echocardiography.
Objective To observe the value of artificial intelligence (AI) models based on non-contrast chest CT for measuring bone mineral density (BMD). Methods Totally 380 subjects who underwent both non-contrast chest CT and quantitative CT (QCT) BMD examination were retrospectively enrolled and divided into a training set (n=304) and a test set (n=76) at a ratio of 8:2. The mean BMD of the L1-L3 vertebrae was measured based on QCT. Spongy bone of the T5-T10 vertebrae was segmented as the ROI, radiomics (Rad) features were extracted, and machine learning (ML), Rad and deep learning (DL) models were constructed for classifying osteoporosis (OP) and evaluating BMD, respectively. Receiver operating characteristic curves were drawn, and areas under the curves (AUC) were calculated to evaluate the efficacy of each model for classification of OP. Bland-Altman analysis and Pearson correlation analysis were performed to explore the consistency and correlation of each model with QCT for measuring BMD. Results Among the ML and Rad models, ML Bagging-OP and Rad Bagging-OP had the best performance for classification of OP. In the test set, the AUC of ML Bagging-OP, Rad Bagging-OP and DL OP for classification of OP was 0.943, 0.944 and 0.947, respectively, with no significant difference (all P>0.05). BMD obtained with all the above models had good consistency with that measured with QCT (most differences fell within the mean difference±1.96SD limits of agreement), and the two were highly positively correlated (r=0.910-0.974, all P<0.001). Conclusion AI models based on non-contrast chest CT had high efficacy for classification of OP, with good consistency between the BMD measurements of the AI models and QCT.
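The Bland-Altman limits of agreement and Pearson correlation used above can be reproduced in a few lines of NumPy/SciPy; this is a generic sketch with simulated BMD values, not the study's data or code.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
bmd_qct = rng.normal(120.0, 30.0, 76)          # simulated QCT BMD (mg/cm^3)
bmd_ai = bmd_qct + rng.normal(1.0, 6.0, 76)    # simulated AI-model BMD

# Bland-Altman: mean difference (bias) and 1.96 SD limits of agreement
diff = bmd_ai - bmd_qct
bias, sd = diff.mean(), diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)
inside = np.mean((diff >= loa[0]) & (diff <= loa[1]))

r, p = pearsonr(bmd_ai, bmd_qct)
print(f"bias={bias:.2f}, LoA=({loa[0]:.2f}, {loa[1]:.2f}), {inside:.0%} within LoA")
print(f"Pearson r={r:.3f}, P={p:.2e}")
```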
The use of artificial intelligence (AI) has increased since the middle of the 20th century, as evidenced by its applications to a wide range of engineering and science problems. Air traffic management (ATM) is becoming increasingly automated and autonomous, making it an attractive domain for AI applications. This paper presents a systematic review of studies that employ AI techniques for improving ATM capability. A brief account of the history, structure, and advantages of these methods is provided, followed by a description of their applications to several representative ATM tasks, such as air traffic services (ATS), airspace management (AM), air traffic flow management (ATFM), and flight operations (FO). The major contribution of the current review is a professional survey of AI applications to ATM along with a description of their specific advantages: (i) these methods provide alternative approaches to conventional physical modeling techniques, (ii) they do not require knowledge of relevant internal system parameters, (iii) they are computationally more efficient, and (iv) they offer compact solutions to multivariable problems. In addition, this review offers a fresh outlook on future research. One direction is providing a clear rationale for model type and structure selection for a given ATM mission. Another is understanding what makes a specific architecture or algorithm effective for a given ATM mission. These are among the most important issues that will continue to attract the attention of the AI research community and ATM work teams in the future.
The paper presents the coupling of artificial intelligence (AI) and object-oriented methodology applied to the construction of the model-based decision support system MBDSS. The MBDSS is designed to support strategic decision making leading to the achievement of an optimal path toward a market economy from the central-planning situation in China. To meet users' various requirements, a series of innovations in software development have been carried out, such as system formalization with OBFRAMEs in an object-oriented paradigm for problem-solving automation, techniques of intelligent module cooperation, a hybrid system of reasoning, connectionist framework utilization, etc. Integration technology is highly emphasized and discussed in this article, and an outlook on future software engineering is given in the conclusion section.
I firmly believe that systems engineering is the requirement-driven force for the progress of software engineering, artificial intelligence and electronic technologies. The development of software engineering, artificial intelligence and electronic technologies is the technical support for the progress of systems engineering. INTEGRATION can be considered as "bridging" the existing technologies and the People together into a coordinated SYSTEM.
In order to optimize the sintering process, a real-time operation guide system with artificial intelligence was developed, mainly including an online data acquisition subsystem, a sinter chemical composition controller, a sintering process state controller, and an abnormal conditions diagnosis subsystem. A knowledge base for sintering process control was constructed, and the inference engine of the system was established. Sinter chemical compositions were controlled by the strategies of self-adaptive prediction and internal optimization, centering on basicity, and the sintering process state was stabilized centering on permeability. In order to meet the needs of process change and keep the system transparent, the system has learning ability and an explanation function. The software of the system was developed in the Visual C++ programming language. The application of the system shows that the hitting accuracy of sinter compositions and burn-through point prediction are both more than 85%; the first-grade rate of sinter chemical composition, the stability rate of burn-through point and the stability rate of the sintering process are increased by 3%, 9% and 4%, respectively.
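The knowledge base plus inference engine pattern described above can be illustrated with a tiny forward-chaining rule loop; the rules, thresholds and variable names below are invented for illustration and are far simpler than a production sintering controller.

```python
# A toy forward-chaining inference step in the spirit of the described
# knowledge base + inference engine; rules and thresholds are illustrative.
state = {"basicity": 1.78, "permeability": 0.62, "btp_position": 0.88}

rules = [
    (lambda s: s["basicity"] < 1.85, "raise limestone feed ratio"),
    (lambda s: s["permeability"] < 0.70, "increase bed moisture slightly"),
    (lambda s: s["btp_position"] > 0.85, "reduce strand speed"),
]

advice = [action for condition, action in rules if condition(state)]
print("operation guide:", "; ".join(advice) or "hold current settings")
```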
Since the beginning of the 21st century, advances in big data and artificial intelligence have driven a paradigm shift in the geosciences, moving the field from qualitative descriptions toward quantitative analysis, from observing phenomena to uncovering underlying mechanisms, from regional-scale investigations to global perspectives, and from experience-based inference toward data- and model-enabled intelligent prediction. AlphaEarth Foundations (AEF) is a next-generation geospatial intelligence platform that addresses these changes by introducing a unified 64-dimensional shared embedding space, enabling, for the first time, standardized representation and seamless integration of 12 distinct types of Earth observation data, including optical, radar, and lidar. This framework significantly improves data assimilation efficiency and resolves the persistent problem of "data silos" in geoscience research. AEF is helping redefine research methodologies and fostering breakthroughs, particularly in quantitative Earth system science. This paper systematically examines how AEF's innovative architecture, featuring multi-source data fusion, high-dimensional feature representation learning, and a scalable computational framework, facilitates intelligent, precise, and real-time data-driven geoscientific research. Using case studies from resource and environmental applications, we demonstrate AEF's broad potential and identify emerging innovation needs. Our findings show that AEF not only enhances the efficiency of solving traditional geoscientific problems but also stimulates novel research directions and methodological approaches.
This paper presents a high-speed and robust dual-band infrared thermal camera based on an ARM CPU. The system consists of a low-resolution long-wavelength infrared detector, a digital temperature and humidity sensor, and a CMOS sensor. In view of the significant contrast between face and background in thermal infrared images, this paper explores a suitable accuracy-latency tradeoff for thermal face detection and proposes a tiny, lightweight detector named YOLO-Fastest-IR. Four YOLO-Fastest-IR models (IR0 to IR3) with different scales are designed based on YOLO-Fastest. To train and evaluate these lightweight models, a multi-user low-resolution thermal face database (RGBT-MLTF) was collected, and the four networks were trained. Experiments demonstrate that the lightweight convolutional neural network performs well in thermal infrared face detection tasks. The proposed algorithm outperforms existing face detection methods in both positioning accuracy and speed, making it more suitable for deployment on mobile platforms or embedded devices. After obtaining the region of interest (ROI) in the infrared (IR) image, the RGB camera is guided by the thermal infrared face detection results to achieve fine positioning of the RGB face. Experimental results show that YOLO-Fastest-IR achieves a frame rate of 92.9 FPS on a Raspberry Pi 4B and successfully detects 97.4% of faces in the RGBT-MLTF test set. Ultimately, an infrared temperature measurement system with low cost, strong robustness, and high real-time performance was integrated, achieving a temperature measurement accuracy of 0.3 ℃.
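The IR-to-RGB guidance step amounts to mapping a detection box from the low-resolution thermal frame into the higher-resolution RGB frame. Assuming the two sensors are registered by a fixed homography (the paper's actual calibration procedure is not described here), a sketch looks like this; the frame sizes and matrix values are invented.

```python
import numpy as np

def map_roi_ir_to_rgb(box_ir, H):
    """Map an (x1, y1, x2, y2) box from the thermal frame into RGB pixels
    using a 3x3 homography H from prior camera registration (assumed)."""
    x1, y1, x2, y2 = box_ir
    corners = np.array([[x1, y1, 1.0], [x2, y1, 1.0], [x2, y2, 1.0], [x1, y2, 1.0]])
    mapped = (H @ corners.T).T
    mapped = mapped[:, :2] / mapped[:, 2:3]   # perspective divide
    lo, hi = mapped.min(axis=0), mapped.max(axis=0)
    return float(lo[0]), float(lo[1]), float(hi[0]), float(hi[1])  # enclosing box

# Toy registration: 80x60 thermal frame scaled to a 1280x960 RGB frame, small offset
H = np.array([[16.0, 0.0, 8.0], [0.0, 16.0, 6.0], [0.0, 0.0, 1.0]])
print(map_roi_ir_to_rgb((20, 15, 44, 39), H))  # -> (328.0, 246.0, 712.0, 630.0)
```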
Fourier Ptychographic Microscopy (FPM) is a high-throughput computational optical imaging technology first reported in 2013. It effectively breaks through the trade-off between high-resolution imaging and wide-field imaging. In recent years, it has been recognized that FPM is not only a tool for breaking through the trade-off between field of view and spatial resolution, but also a paradigm for breaking through such trade-off problems in general, and it has therefore attracted extensive attention. Unlike previous reviews, this review does not reintroduce its concept, basic principles, optical system and series of applications, but focuses on the three major difficulties FPM technology faces on the way from "looking good" in the laboratory to "working well" in practical applications: mismatch between the numerical model and physical reality, long reconstruction time and high computing power demand, and lack of multi-modal extension. It introduces how key technological innovations in FPM can be achieved through the dual drive of Artificial Intelligence (AI) and physics, including intelligent reconstruction algorithms that introduce machine learning concepts, optical-algorithm co-design, the fusion of frequency-domain extrapolation methods with generative adversarial networks, multi-modal imaging schemes and data fusion enhancement, etc., gradually resolving the difficulties of FPM technology. Conversely, this review also considers the unique value of FPM technology in feeding back into the development of "AI + optics", such as providing AI benchmark tests under physical constraints, inspiration for balancing computing power and bandwidth in miniaturized intelligent microscopes, and photoelectric hybrid architectures. Finally, it introduces the industrialization path and frontier directions of FPM technology, pointing out that, driven jointly by AI and physics, a large number of industrial application cases will emerge, and it looks forward to possible future application scenarios and extensions; for instance, body fluid biopsy and point-of-care testing at the grassroots level represent growth markets.
At present, cotton-growing regions in Xinjiang have achieved full mechanization and semi-automation of cotton cultivation. On this basis, by integrating advanced technologies that represent new quality productive forces, such as the fifth generation of mobile communications technology (5G), the Internet of Things, artificial intelligence (AI), big data and cloud computing, a conception of an "AI cultivation system" for cotton is proposed. The hardware and software composition of the "AI cultivation system" and its operating mechanism in cotton cultivation are discussed, aiming at technical breakthroughs that make the whole cotton cultivation process precise, production intensive, resource use optimal, management efficient, decision-making intelligent and field operations unmanned, thereby injecting innovative ideas into the research and development of smart agriculture equipment and providing practical guidance for the future promotion of region-wide automated and intelligent cotton cultivation in Xinjiang.
The application of various artificial intelligence (AI) techniques, namely artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS), genetic algorithm optimized least squares support vector machine (GA-LSSVM) and multivariable regression (MVR) models, is presented to identify the real power transfer between generators and loads. These AI techniques adopt supervised learning, which first uses the modified nodal equation (MNE) method to determine the real power contribution from each generator to loads. The results of the MNE method and load flow information are then utilized to estimate the power transfer using the AI techniques. The 25-bus equivalent system of south Malaysia is utilized as a test system to illustrate the effectiveness of the various AI methods compared with the MNE method.
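The supervised-learning setup described above (load-flow features as inputs, MNE-derived power contributions as targets) can be mimicked generically. The sketch below trains a plain SVR on synthetic data as a stand-in, since the paper's 25-bus features, LSSVM variant and GA tuning are not reproduced here.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Synthetic stand-in: load-flow features (bus voltages, angles, line flows)
X = rng.normal(size=(400, 8))
# and an MNE-style target: generator-to-load real power transfer (MW)
y = 30 + X @ rng.normal(size=8) + 0.5 * rng.normal(size=400)

model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))
model.fit(X[:320], y[:320])
print("test R^2:", round(model.score(X[320:], y[320:]), 3))
```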
A design method for intelligent proportional-integral-derivative (PID) controllers was proposed based on the ant system algorithm and fuzzy inference. This kind of controller is called a fuzzy-ant system PID controller. It consists of an off-line part and an on-line part. In the off-line part, for a given control system with a PID controller, by taking the overshoot, settling time and steady-state error of the system unit step response as the performance indexes and by using the ant system algorithm, a group of optimal PID parameters K_p*, T_i* and T_d* can be obtained, which are used as the initial values for the on-line tuning of the PID parameters. In the on-line part, based on K_p*, T_i* and T_d* and according to the current system error e and its time derivative, a specific program is written, which optimizes and adjusts the PID parameters on-line through a fuzzy inference mechanism to ensure that the system response has optimal transient and steady-state performance. This kind of intelligent PID controller can be used to control the motor of the intelligent bionic artificial leg designed by the authors. The results of computer simulation experiments show that the controller has less overshoot and a shorter settling time.
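The off-line stage, scoring a candidate (K_p, T_i, T_d) by the overshoot, settling time and steady-state error of a simulated step response and searching over candidates, can be sketched as follows. The search loop is a heavily simplified, ant-inspired stochastic search, not the paper's actual ant system algorithm, and the second-order plant and cost weights are invented for illustration.

```python
import numpy as np

def step_cost(Kp, Ti, Td, dt=0.01, T=10.0):
    """Score a PID on a unit step of the illustrative plant y'' + y' + y = u
    by overshoot, 2% settling time and steady-state error."""
    n = int(T / dt)
    y = v = integ = 0.0
    e_prev = 1.0
    ys = np.empty(n)
    for k in range(n):
        e = 1.0 - y
        integ += e * dt
        u = Kp * (e + integ / Ti + Td * (e - e_prev) / dt)
        e_prev = e
        acc = u - v - y              # plant dynamics (Euler integration)
        v += acc * dt
        y += v * dt
        ys[k] = y
    overshoot = max(ys.max() - 1.0, 0.0)
    outside = np.where(np.abs(ys - 1.0) >= 0.02)[0]
    t_settle = (outside[-1] + 1) * dt if outside.size else 0.0
    return overshoot + 0.1 * t_settle + abs(1.0 - ys[-1])

# Ant-inspired stochastic search for K_p*, T_i*, T_d* (a crude stand-in
# for the paper's ant system algorithm)
rng = np.random.default_rng(3)
center = np.array([2.0, 1.0, 0.2])   # initial guess for (Kp, Ti, Td)
spread = np.array([1.5, 0.8, 0.15])
best_params, best_cost = center, step_cost(*center)
for _ in range(15):                                  # generations of "ants"
    ants = center + spread * rng.standard_normal((20, 3))
    ants = np.clip(ants, [0.1, 0.05, 0.0], None)
    costs = np.array([step_cost(*ant) for ant in ants])
    i = int(costs.argmin())
    if costs[i] < best_cost:
        best_params, best_cost = ants[i], costs[i]
    center = 0.7 * center + 0.3 * ants[i]            # "pheromone" pull to elite ant
    spread *= 0.85                                   # evaporation: narrow the search
Kp, Ti, Td = best_params
print(f"K_p*={Kp:.2f}, T_i*={Ti:.2f}, T_d*={Td:.2f}, cost={best_cost:.3f}")
```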
Machine learning (ML) is well suited to the prediction of high-complexity, high-dimensional problems such as those encountered in terminal ballistics. We evaluate the performance of four popular ML-based regression models, extreme gradient boosting (XGBoost), artificial neural network (ANN), support vector regression (SVR), and Gaussian process regression (GP), on two common terminal ballistics problems: (a) predicting the V50 ballistic limit of monolithic metallic armour impacted by small and medium calibre projectiles and fragments, and (b) predicting the depth to which a projectile will penetrate a target of semi-infinite thickness. To achieve this we utilise two datasets, each consisting of approximately 1000 samples, collated from public release sources. We demonstrate that all four model types provide similarly excellent agreement when interpolating within the training data and diverge when extrapolating outside this range. Although extrapolation is not advisable for ML-based regression models, for applications such as lethality/survivability analysis such capability is required. To circumvent this, we implement expert knowledge and physics-based models via enforced monotonicity, as a Gaussian prior mean, and through a modified loss function. The physics-informed models demonstrate improved performance over both classical physics-based models and the basic ML regression models, providing an ability to accurately fit experimental data when it is available and then revert to the physics-based model when not. The resulting models demonstrate high levels of predictive accuracy over a very wide range of projectile types, target materials and thicknesses, and impact conditions significantly more diverse than is achievable with any existing analytical approach. Compared with numerical analysis tools such as finite element solvers, the ML models run orders of magnitude faster. We provide some general guidelines throughout for the development, application, and reporting of ML models in terminal ballistics problems.
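One of the three physics-informed strategies named above, using a physics-based model as the GP prior mean, can be emulated in scikit-learn by fitting the GP to the residuals of the physics model: near the data the GP corrects the physics curve, and far from the data the residual prediction decays to zero so the physics model is recovered. The penetration "physics model" and data below are made-up stand-ins, not the paper's.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def physics_model(v):
    """Made-up physics estimate of penetration depth (mm) vs impact velocity (m/s)."""
    return 0.004 * v ** 1.4

rng = np.random.default_rng(4)
v_train = rng.uniform(300.0, 900.0, (40, 1))
depth = physics_model(v_train.ravel()) * rng.normal(1.0, 0.08, 40)  # "experiments"

# Fitting the GP to residuals of the physics model is equivalent to using the
# physics model as the GP prior mean (zero-mean GP on the residuals).
kernel = 2.0**2 * RBF(length_scale=200.0) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel)
gp.fit(v_train, depth - physics_model(v_train.ravel()))

v_test = np.array([[600.0], [1500.0]])   # one interpolation, one extrapolation
pred = physics_model(v_test.ravel()) + gp.predict(v_test)
print(pred)  # at 1500 m/s the GP residual is ~0 and the physics model dominates
```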
How to mine valuable information from massive multi-source heterogeneous data and identify the intention of aerial targets is a major research focus at present. Aiming at the long-term dependence in air target intention recognition, this paper deeply explores the potential attribute features in the spatiotemporal sequence data of the target. First, we build an intelligent dynamic intention recognition framework, including a series of specific processes such as data sources, data preprocessing, target space-time features, a convolutional neural network-bidirectional gated recurrent unit-attention (CBA) model and intention recognition. Then, we analyze and reason about the designed CBA model in detail. Finally, through comparison and analysis with experiments on other recognition models, we show that our proposed method can effectively improve the accuracy of air target intention recognition, which is of significance for commanders' operational command and situation prediction.
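The CBA stack named above (CNN feature extraction, then a bidirectional GRU, then attention pooling and a classifier) can be sketched as follows; the layer sizes, sequence format and class count are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class CBA(nn.Module):
    """Illustrative CNN-BiGRU-attention classifier for (batch, time, features)."""
    def __init__(self, n_features=12, n_classes=6, hidden=64):
        super().__init__()
        self.conv = nn.Sequential(nn.Conv1d(n_features, 32, 3, padding=1), nn.ReLU())
        self.bigru = nn.GRU(32, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                                  # x: (B, T, F)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)   # (B, T, 32)
        h, _ = self.bigru(h)                               # (B, T, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)             # attention over time
        ctx = (w * h).sum(dim=1)                           # (B, 2*hidden)
        return self.head(ctx)

logits = CBA()(torch.randn(8, 20, 12))   # 8 tracks, 20 time steps, 12 attributes
print(logits.shape)                      # torch.Size([8, 6])
```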
In a time characterized by the availability of vast amounts of data, the effective utilization of information is critical for timely decision-making in military operations. However, processing large amounts of data requires computational resources and time. Therefore, decision makers have used data-centric technologies to take advantage of public and private data sources to support military operations. This survey explores the integration and application of data-centric technologies, such as data analytics, data science, and machine learning, to optimize decision-making workflows within military contexts supporting the deployment of military assets and resources. To address the information gap, this article presents a literature review, specifically a survey. Our survey examines the use of these technologies to process and analyze information that contributes to the phases of situational awareness and planning in military environments. We then introduce a taxonomy of the approaches associated with implementing these technologies in military scenarios. Furthermore, we discuss relevant factors for the seamless integration of data-centric technologies into military decision-making processes, and highlight the importance of specialized personnel, architectures, and cybersecurity in developing prototypes and models. The findings of this paper aim to provide valuable insights for military institutions, offering a deeper understanding of the use of data-centric technologies as innovative practices to enhance the effectiveness of military decision-making.
Objective To explore the value of a semi-automatic training system for deep learning (DL) models in the automatic optimization of clinical image quality control of transthoracic echocardiography (TTE). Methods Totally 1250 TTE videos from 402 patients were retrospectively collected, including 490 apical four chamber (A4C), 310 parasternal long axis view of the left ventricle (PLAX) and 450 parasternal short axis view of the great vessel (PSAX GV). The videos were divided into a development set (245 A4C, 155 PLAX, 225 PSAX GV), a semi-automatic training set (98 A4C, 62 PLAX, 90 PSAX GV) and a test set (147 A4C, 93 PLAX, 135 PSAX GV) at a ratio of 5:2:3. Based on the development set and the semi-automatic training set, the DL quality control model was semi-automatically and iteratively optimized, and a semi-automatic training system was constructed; the efficacy of the DL models for recognizing TTE views and assessing TTE imaging quality was then verified in the test set. Results After optimization, the overall accuracy, precision, recall, and F1 score of the DL models for recognizing TTE views in the test set improved from 97.33%, 97.26%, 97.26% and 97.26% to 99.73%, 99.65%, 99.77% and 99.71%, respectively, while the overall accuracy for assessing A4C, PLAX and PSAX GV TTE as standard views in the test set improved from 89.12%, 83.87% and 90.37% to 93.20%, 90.32% and 93.33%, respectively. Conclusion The developed semi-automatic training system for DL models could improve the efficiency of clinical imaging quality control of TTE and increase iteration speed.
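The view-recognition metrics reported above are standard multi-class scores. For reference, the sketch below shows how such overall figures are computed; macro averaging is an assumption about how the paper aggregates across the three views, and the toy labels are invented.

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Toy predictions for the three views: A4C, PLAX, PSAX GV
y_true = ["A4C", "A4C", "PLAX", "PSAX GV", "PLAX", "PSAX GV"]
y_pred = ["A4C", "A4C", "PLAX", "PSAX GV", "A4C", "PSAX GV"]

acc = accuracy_score(y_true, y_pred)
prec, rec, f1, _ = precision_recall_fscore_support(y_true, y_pred, average="macro")
print(f"accuracy={acc:.2%}, precision={prec:.2%}, recall={rec:.2%}, F1={f1:.2%}")
```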