Journal articles: 849 results found
High-throughput screening of CO₂ cycloaddition MOF catalyst with an explainable machine learning model
1
Authors: Xuefeng Bai, Yi Li, Yabo Xie, Qiancheng Chen, Xin Zhang, Jian-Rong Li. 《Green Energy & Environment》 (SCIE EI CAS), 2025, No. 1, pp. 132-138 (7 pages)
The high porosity and tunable chemical functionality of metal-organic frameworks (MOFs) make them a promising catalyst design platform. High-throughput screening of catalytic performance is feasible since a large MOF structure database is available. In this study, we report a machine learning model for high-throughput screening of MOF catalysts for the CO₂ cycloaddition reaction. The descriptors for model training were judiciously chosen according to the reaction mechanism, which leads to a high accuracy of up to 97% with the 75% quantile of the training set as the classification criterion. The feature contribution was further evaluated with SHAP and PDP analysis to provide a degree of physical understanding. Using the model, 12,415 hypothetical MOF structures and 100 reported MOFs were evaluated at 100 °C and 1 bar within one day, and 239 potentially efficient catalysts were discovered. Among them, MOF-76(Y) achieved the top performance experimentally among the reported MOFs, in good agreement with the prediction.
Keywords: Metal-organic frameworks; High-throughput screening; machine learning; Explainable model; CO₂ cycloaddition
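The screening setup described above (mechanism-inspired descriptors, labels taken from the 75% quantile of catalytic performance, a classifier, then SHAP interpretation) can be illustrated with a short, hypothetical sketch; the descriptor names and data below are placeholders and the shap package is assumed, so this is not the authors' actual feature set or model:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
import shap  # assumes the shap package is installed

# Hypothetical mechanism-inspired descriptors for each MOF (placeholder data)
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.random((500, 4)),
                 columns=["pore_diameter", "Lewis_acid_site_density",
                          "surface_area", "metal_electronegativity"])
yield_values = rng.random(500)            # stand-in for CO2 cycloaddition yield

# Label MOFs above the 75% quantile of the training yields as "promising"
threshold = np.quantile(yield_values, 0.75)
y = (yield_values > threshold).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))

# SHAP values quantify each descriptor's contribution to the classification
explainer = shap.TreeExplainer(clf)
shap_values = explainer.shap_values(X_te)
```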
A bibliometric analysis using machine learning to track paradigm shifts and analytical advances in forest ecology and forestry journal publications from 2010 to 2022
2
Authors: Jin Zhao, Liyu Li, Jian Liu, Yimei Yan, Qian Wang, Chris Newman, Youbing Zhou. 《Forest Ecosystems》 (SCIE CSCD), 2024, No. 5, pp. 770-779 (10 pages)
Forest habitats are critical for biodiversity, ecosystem services, human livelihoods, and well-being. Capacity to conduct theoretical and applied forest ecology research addressing direct (e.g., deforestation) and indirect (e.g., climate change) anthropogenic pressures has benefited considerably from new field and statistical techniques. We used machine learning and bibliometric structural topic modelling to identify 20 latent topics comprising four principal fields from a corpus of 16,952 forest ecology/forestry articles published in eight ecology and five forestry journals between 2010 and 2022. Articles published per year increased from 820 in 2010 to 2,354 in 2021, shifting toward more applied topics. Publications from China and some countries in North America and Europe dominated, with relatively fewer articles from some countries in West and Central Africa and West Asia, despite globally important forest resources. Most study sites were in some countries in North America, Central Asia, South America, and Australia. Articles utilizing R statistical software predominated, increasing from 29.5% in 2010 to 71.4% in 2022. The most frequently used packages included lme4, vegan, nlme, MuMIn, ggplot2, car, MASS, mgcv, multcomp, and raster. R was more often used in forest ecology than in applied forestry articles. R software offers advantages in script and workflow sharing compared with other statistical packages. Our findings demonstrate that the disciplines of forest ecology/forestry are expanding both in number and scope, aided by more sophisticated statistical tools, to tackle the challenges of redressing forest habitat loss and the socio-economic impacts of deforestation.
Keywords: Forest ecology; Forestry; R software; Structural topic modelling; machine learning; Publication
Unveiling the Re, Cr, and I diffusion in saturated compacted bentonite using machine-learning methods
3
Authors: Zheng-Ye Feng, Jun-Lei Tian, Tao Wu, Guo-Jun Wei, Zhi-Long Li, Xiao-Qiong Shi, Yong-Jia Wang, Qing-Feng Li. 《Nuclear Science and Techniques》 (SCIE EI CAS CSCD), 2024, No. 6, pp. 65-77 (13 pages)
The safety assessment of high-level radioactive waste repositories requires a high predictive accuracy for radionuclide diffusion and a comprehensive understanding of the diffusion mechanism. In this study, a through-diffusion method and six machine-learning methods were employed to investigate the diffusion of ReO₄⁻, HCrO₄⁻, and I⁻ in saturated compacted bentonite under different salinities and compacted dry densities. The machine-learning models were trained using two datasets. One dataset contained six input features and 293 instances obtained from the diffusion database system of the Japan Atomic Energy Agency (JAEA-DDB) and 15 publications. The other dataset, comprising 15,000 pseudo-instances, was produced using a multi-porosity model and contained eight input features. The results indicate that the former dataset yielded a higher predictive accuracy than the latter. Light gradient boosting exhibited a higher prediction accuracy (R² = 0.92) and lower error (MSE = 0.01) than the other machine-learning algorithms. In addition, Shapley Additive Explanations, Feature Importance, and Partial Dependence Plot analyses indicate that the rock capacity factor and compacted dry density had the two most significant effects on predicting the effective diffusion coefficient, thereby offering valuable insights.
Keywords: machine learning; Effective diffusion coefficient; Through-diffusion experiment; Multi-porosity model; Global analysis
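As a rough illustration of the best-performing approach reported here (a light gradient-boosting regressor evaluated with R² and MSE), the sketch below uses placeholder feature names and synthetic data and assumes the lightgbm package; it is not the authors' dataset or tuning:

```python
import numpy as np
from lightgbm import LGBMRegressor        # assumes lightgbm is installed
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# Placeholder stand-ins for the six experimental input features
rng = np.random.default_rng(1)
X = rng.random((293, 6))                  # e.g., dry density, salinity, rock capacity factor, ...
y = 0.5 * X[:, 0] + 0.3 * X[:, 2] + 0.05 * rng.random(293)  # synthetic diffusion targets

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LGBMRegressor(n_estimators=500, learning_rate=0.05).fit(X_tr, y_tr)

pred = model.predict(X_te)
print("R2 :", r2_score(y_te, pred))
print("MSE:", mean_squared_error(y_te, pred))
```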
Navigating challenges and opportunities of machine learning in hydrogen catalysis and production processes: Beyond algorithm development
4
Authors: Mohd Nur Ikhmal Salehmin, Sieh Kiong Tiong, Hassan Mohamed, Dallatu Abbas Umar, Kai Ling Yu, Hwai Chyuan Ong, Saifuddin Nomanbhay, Swee Su Lim. 《Journal of Energy Chemistry》 (SCIE EI CAS CSCD), 2024, No. 12, pp. 223-252 (30 pages)
With the projected global surge in hydrogen demand, driven by increasing applications and the imperative for low-emission hydrogen, the integration of machine learning (ML) across the hydrogen energy value chain is a compelling avenue. This review uniquely focuses on harnessing the synergy between ML and computational modeling (CM) or optimization tools, as well as integrating multiple ML techniques with CM, for the synthesis of diverse hydrogen evolution reaction (HER) catalysts and various hydrogen production processes (HPPs). Furthermore, this review addresses a notable gap in the literature by offering insights, analyzing challenges, and identifying research prospects and opportunities for sustainable hydrogen production. While the literature reflects a promising landscape for ML applications in hydrogen energy domains, transitioning AI-based algorithms from controlled environments to real-world applications poses significant challenges. Hence, this comprehensive review delves into the technical, practical, and ethical considerations associated with the application of ML in HER catalyst development and HPP optimization. Overall, this review provides guidance for unlocking the transformative potential of ML in enhancing prediction efficiency and sustainability in the hydrogen production sector.
Keywords: machine learning; Computational modeling; HER catalyst synthesis; Hydrogen energy; Hydrogen production processes; Algorithm development
A machine learning approach to TCAD model calibration for MOSFET (Cited by: 1)
5
Authors: Bai-Chuan Wang, Chuan-Xiang Tang, Meng-Tong Qiu, Wei Chen, Tan Wang, Jing-Yan Xu, Li-Li Ding. 《Nuclear Science and Techniques》 (SCIE EI CAS CSCD), 2023, No. 12, pp. 133-145 (13 pages)
Machine learning-based surrogate models have significant advantages in terms of computing efficiency. In this paper, we present a pilot study on fast calibration using machine learning techniques. Technology computer-aided design (TCAD) is a powerful simulation tool for electronic devices and has been widely used in research on radiation effects. However, calibration of TCAD models is time-consuming. In this study, we introduce a fast calibration approach for TCAD model calibration of metal-oxide-semiconductor field-effect transistors (MOSFETs). This approach utilized a machine learning-based surrogate model that was several orders of magnitude faster than the original TCAD simulation, and the desired calibration results were obtained within several seconds. A fundamental model containing 26 parameters is introduced to represent the typical structure of a MOSFET. Classifications were developed to improve the efficiency of training-sample generation, and feature selection techniques were employed to identify important parameters. A surrogate model consisting of a classifier and a regressor was built, and a calibration procedure based on the surrogate model was proposed and tested with three calibration goals. Our work demonstrates the feasibility of machine learning-based fast model calibration for MOSFETs. In addition, this study shows that these machine learning techniques learn patterns and correlations from data instead of employing domain expertise, which indicates that machine learning could be an alternative research approach that complements classical physics-based research.
Keywords: machine learning; Radiation effects; Surrogate model; TCAD model calibration
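A minimal sketch of the surrogate structure described above, a classifier that filters candidate parameter sets followed by a regressor that predicts the device response, is given below; the parameter count, labels, data, and models are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(0)
n_params = 26                                  # dimensionality of the TCAD parameter vector
X = rng.random((2000, n_params))               # sampled TCAD parameter sets (placeholder)
valid = (X[:, 0] + X[:, 1] > 0.5).astype(int)  # stand-in "simulation usable" label
response = X @ rng.random(n_params)            # stand-in device response (e.g., an I-V metric)

# Stage 1: classifier screens out parameter sets unlikely to yield a useful simulation
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, valid)

# Stage 2: regressor acts as the fast surrogate of the TCAD response for usable sets
reg = RandomForestRegressor(n_estimators=200, random_state=0).fit(
    X[valid == 1], response[valid == 1])

# Calibration then reduces to searching the parameter space through the cheap surrogate
candidates = rng.random((100000, n_params))
keep = clf.predict(candidates) == 1
target = 7.3                                   # hypothetical measured response to match
best = candidates[keep][np.argmin(np.abs(reg.predict(candidates[keep]) - target))]
print("calibrated parameter set:", best[:5], "...")
```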
Subsurface analytics: Contribution of artificial intelligence and machine learning to reservoir engineering, reservoir modeling, and reservoir management (Cited by: 1)
6
Author: MOHAGHEGH Shahab D. 《Petroleum Exploration and Development》, 2020, No. 2, pp. 225-228 (4 pages)
Traditional numerical reservoir simulation has been contributing to the oil and gas industry for decades. The current state of this technology is the result of decades of research and development by a large number of engineers and scientists. Starting in the late 1960s and early 1970s, advances in computer hardware, along with the development and adaptation of clever algorithms, resulted in a paradigm shift in reservoir studies, moving them from simplified analogs and analytical solution methods to more mathematically robust computational and numerical solution models.
Keywords: Subsurface analytics; contribution of artificial intelligence and machine learning to reservoir engineering; reservoir modeling; reservoir management
Machine-Learning-Assisted Optimization and Its Application to Antenna Designs: Opportunities and Challenges (Cited by: 6)
7
Authors: Qi Wu, Yi Cao, Haiming Wang, Wei Hong. 《China Communications》 (SCIE CSCD), 2020, No. 4, pp. 152-164 (13 pages)
With the rapid development of modern wireless communications and radar, antennas and arrays are becoming more complex, having, e.g., more degrees of design freedom, integration and fabrication constraints, and design objectives. While full-wave electromagnetic simulation can be very accurate and is therefore essential to the design process, it is also very time-consuming, which leads to many challenges for antenna design, optimization, and sensitivity analysis (SA). Recently, machine-learning-assisted optimization (MLAO) has been widely introduced to accelerate the design process of antennas and arrays. Machine learning (ML) methods, including Gaussian process regression, support vector machines (SVM), and artificial neural networks (ANNs), have been applied to build surrogate models of antennas to achieve fast response prediction. With the help of these ML methods, various MLAO algorithms have been proposed for different applications. A comprehensive survey of recent advances in ML methods for antenna modeling is first presented. Then, algorithms for ML-assisted antenna design, including optimization and SA, are reviewed. Finally, some challenges facing future MLAO for antenna design are discussed.
Keywords: antenna designs; machine learning; optimization; sensitivity analysis; surrogate models
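To make the surrogate-assisted idea concrete, the hedged sketch below fits a Gaussian process regression model to a handful of (geometry, simulated response) pairs and uses it to rank new candidate geometries; the "simulator" is a made-up stand-in function, not a real electromagnetic solver:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def fake_em_simulation(x):
    # Stand-in for a costly full-wave simulation: maps geometry parameters to |S11| in dB
    return -20.0 * np.exp(-np.sum((x - 0.6) ** 2, axis=-1) / 0.05)

rng = np.random.default_rng(0)
X_train = rng.random((30, 3))              # 30 simulated antenna geometries (3 parameters each)
y_train = fake_em_simulation(X_train)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.2),
                              normalize_y=True).fit(X_train, y_train)

# Surrogate-assisted search: rank many cheap predictions, simulate only the most promising one
candidates = rng.random((5000, 3))
mean, std = gp.predict(candidates, return_std=True)
best = candidates[np.argmin(mean - 1.0 * std)]   # simple lower-confidence-bound criterion
print("next geometry to simulate:", best, "->", fake_em_simulation(best[None, :]))
```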
Stability analysis of hydro-turbine governing system based on machine learning
8
Authors: Yuansheng Chen, Fei Tong. 《Chinese Physics B》 (SCIE EI CAS CSCD), 2021, No. 12, pp. 301-307 (7 pages)
The hydro-turbine governing system is a time-varying complex system with strong non-linearity, and its dynamic characteristics are jointly affected by hydraulic, mechanical, electrical, and other factors. Aiming at the stability of the hydro-turbine governing system, this paper first builds a dynamic model of the system through mechanism modeling and introduces the transfer coefficient characteristics under different load conditions to obtain the stability category of the system. A BP neural network is used for machine learning, and predictive analysis of the stability of the system under different working conditions is carried out, with the additional momentum method used to optimize the algorithm. The test-set results show that the method can accurately distinguish the stability category of the hydro-turbine governing system (HTGS), and the research results can provide a theoretical reference for the operation and management of smart hydropower stations in the future.
Keywords: hydro-turbine governing system; stability; machine learning; dynamic model
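A compact sketch of this kind of stability classifier, a small neural network trained by back-propagation with stochastic gradient descent plus a momentum term, is shown below using scikit-learn; the operating-condition features and stability labels are synthetic placeholders:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Placeholder operating-condition features, e.g., load, head, governor gains
X = rng.random((1000, 4))
y = (X[:, 0] - 0.5 * X[:, 1] + 0.2 * X[:, 3] > 0.3).astype(int)  # synthetic stable/unstable label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# BP network trained with stochastic gradient descent and an additional momentum term
clf = MLPClassifier(hidden_layer_sizes=(16,), solver="sgd",
                    momentum=0.9, learning_rate_init=0.05,
                    max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```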
Machine learning inspired workflow to revise field development plan under uncertainty
9
Authors: LOOMBA Ashish Kumar, BOTECHIA Vinicius Eduardo, SCHIOZER Denis José. 《Petroleum Exploration and Development》 (SCIE), 2023, No. 6, pp. 1455-1465 (11 pages)
We present an efficient and risk-informed closed-loop field development (CLFD) workflow for recurrently revising the field development plan (FDP) using the accrued information. To make the process practical, we integrated multiple concepts of machine learning, an intelligent selection process to discard the worst FDP options, and a growing set of representative reservoir models. These concepts were combined and used with a cluster-based learning and evolution optimizer to efficiently explore the search space of decision variables. Unlike previous studies, we also accounted for the execution time of the CLFD workflow and worked with more realistic timelines to confirm its utility. To appreciate the importance of data assimilation and new well logs in a CLFD workflow, we carried out the research under rigorous conditions without a reduction in uncertainty attributes. The proposed CLFD workflow was implemented on a benchmark analogous to a giant field with extensively time-consuming simulation models. The results underscore that an ensemble with as few as 100 scenarios was sufficient to gauge the geological uncertainty, despite working with a giant field with highly heterogeneous characteristics. It is demonstrated that the CLFD workflow can improve efficiency by over 85% compared with the previously validated workflow. Finally, we present some acute insights and problems related to data assimilation for the practical application of a CLFD workflow.
Keywords: field development plan; closed-loop field development; reservoir model; machine learning; reservoir uncertainty; optimization; reservoir simulation efficiency
A Novel Tuning Method for Predictive Control of VAV Air Conditioning System Based on Machine Learning and Improved PSO
10
Authors: Ning He, Kun Xi, Mengrui Zhang, Shang Li. 《Journal of Beijing Institute of Technology》 (EI CAS), 2022, No. 4, pp. 350-361 (12 pages)
The variable air volume (VAV) air conditioning system has strong coupling and large time delays, so model predictive control (MPC) is normally used to pursue performance improvement. To address the difficulty of selecting VAV MPC controller parameters that give the system a desired response, a novel tuning method based on machine learning and improved particle swarm optimization (PSO) is proposed. In this method, the relationship between MPC controller parameters and time-domain performance indices is established via machine learning. PSO is then used to optimize the MPC controller parameters to obtain better performance in terms of the time-domain indices. In addition, the PSO algorithm is further modified under the principles of population attenuation and event triggering to tune the MPC parameters and reduce the computation time of the tuning method. Finally, the effectiveness of the proposed method is validated via a hardware-in-the-loop VAV system.
Keywords: model predictive control (MPC); parameter tuning; machine learning; improved particle swarm optimization (PSO)
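The tuning idea, learn a mapping from controller parameters to a time-domain cost and then let PSO search that cheap mapping, can be sketched roughly as follows; the cost surrogate here is a made-up function standing in for the learned model, and the plain PSO loop omits the paper's population-attenuation and event-triggering modifications:

```python
import numpy as np

def surrogate_cost(params):
    # Stand-in for the learned (parameters -> time-domain indices) model,
    # e.g., a weighted sum of predicted overshoot and settling time
    q, r = params
    return (q - 2.0) ** 2 + 0.5 * (r - 0.7) ** 2

rng = np.random.default_rng(0)
n_particles, n_iter = 20, 100
pos = rng.uniform([0.1, 0.01], [10.0, 5.0], size=(n_particles, 2))  # candidate MPC weights
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([surrogate_cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_cost)]

w, c1, c2 = 0.7, 1.5, 1.5                     # inertia and acceleration coefficients
for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    cost = np.array([surrogate_cost(p) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[np.argmin(pbest_cost)]

print("tuned MPC weights:", gbest)
```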
A hybrid agent-based machine learning method for human-centred energy consumption prediction
11
Author: Qingyao Qiao. 《建筑节能(中英文)》 (CAS), 2023, No. 3, p. 41 (1 page)
Occupant behaviour has significant impacts on the performance of machine learning algorithms when predicting building energy consumption. For a variety of reasons (e.g., underperforming building energy management systems or restrictions due to privacy policies), the availability of occupational data has long been an obstacle that hinders the performance of machine learning algorithms in predicting building energy consumption. Therefore, this study proposed an agent-based machine learning model whereby agent-based modelling was employed to generate simulated occupational data as input features for machine learning algorithms for building energy consumption prediction. Boruta feature selection was also introduced in this study to select all relevant features. The results indicated that the performance of machine learning algorithms in predicting building energy consumption was significantly improved when using simulated occupational data, with even greater improvements after conducting Boruta feature selection.
Keywords: Building energy consumption; prediction; machine learning; Agent-based modelling; Occupant behaviour
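Boruta-style "all-relevant" feature selection compares each real feature's importance against shuffled "shadow" copies; the simplified, single-pass sketch below illustrates the idea with a random forest (the real Boruta procedure iterates with statistical testing, and the data here are placeholders):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n, p = 500, 6
X = rng.random((n, p))                         # e.g., weather, schedule, simulated occupancy features
y = 3 * X[:, 0] + 2 * X[:, 2] + 0.1 * rng.normal(size=n)   # synthetic energy consumption

# Shadow features: column-wise shuffled copies that carry no real signal
X_shadow = np.apply_along_axis(rng.permutation, 0, X)
X_all = np.hstack([X, X_shadow])

forest = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_all, y)
imp = forest.feature_importances_
threshold = imp[p:].max()                      # best importance achieved by any shadow feature

selected = [i for i in range(p) if imp[i] > threshold]
print("features kept (indices):", selected)    # expect the informative columns 0 and 2
```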
Prediction of impurity spectrum function by deep learning algorithm
12
Authors: 刘婷, 韩榕生, 陈亮. 《Chinese Physics B》 (SCIE EI CAS CSCD), 2024, No. 5, pp. 52-63 (12 pages)
By using the numerical renormalization group (NRG) method, we construct a large dataset with about one million spectral functions of the Anderson quantum impurity model. The dataset contains the density of states (DOS) of the host material, the strength of the Coulomb interaction between on-site electrons (U), and the hybridization between the host material and the impurity site (Γ). The continuous DOS and the spectral functions are stored with Chebyshev coefficients and wavelet functions, respectively. From this dataset, we build seven different machine learning networks to predict the spectral function from the input data: DOS, U, and Γ. Three evaluation indices, mean absolute error (MAE), relative error (RE), and root mean square error (RMSE), are used to analyze the prediction abilities of the different network models. Detailed analysis shows that, of the two widely used recurrent neural networks (RNNs), the gated recurrent unit (GRU) performs better than the long short-term memory (LSTM) network. A combination of bidirectional GRU (BiGRU) and GRU has the best performance among GRU, BiGRU, LSTM, and BiLSTM, with the MAE peak of BiGRU+GRU reaching 0.00037. We have also tested a one-dimensional convolutional neural network (1DCNN) with 20 hidden layers and a residual neural network (ResNet); the 1DCNN has almost the same performance as the BiGRU+GRU network on the original dataset, while its robustness appears slightly weaker than that of BiGRU+GRU when all models are tested on two other independent datasets. The ResNet has the worst performance among all seven network models. The datasets presented in this paper, including the large dataset of spectral functions of the Anderson quantum impurity model, are openly available at https://doi.org/10.57760/sciencedb.j00113.00192.
Keywords: machine learning; Anderson impurity model; spectral function
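The best-performing architecture described above (a bidirectional GRU feeding a second GRU, with a linear read-out over the spectral grid) might look roughly like the following PyTorch sketch; the layer sizes, input encoding, and data are assumptions for illustration only:

```python
import torch
import torch.nn as nn

class BiGRUGRU(nn.Module):
    """Sequence-to-sequence regressor: per-frequency input features -> spectral function."""
    def __init__(self, in_dim=3, hidden=64):
        super().__init__()
        self.bigru = nn.GRU(in_dim, hidden, batch_first=True, bidirectional=True)
        self.gru = nn.GRU(2 * hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)       # one spectral value per grid point

    def forward(self, x):                      # x: (batch, n_points, in_dim), e.g. [DOS(w), U, Gamma]
        h, _ = self.bigru(x)
        h, _ = self.gru(h)
        return self.head(h).squeeze(-1)        # (batch, n_points)

model = BiGRUGRU()
x = torch.randn(8, 200, 3)                     # toy batch: 8 samples, 200 frequency points
target = torch.rand(8, 200)
pred = model(x)
mae = torch.mean(torch.abs(pred - target))     # the MAE metric reported in the abstract
print(float(mae))
```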
Benchmarking deep learning-based models on nanophotonic inverse design problems (Cited by: 8)
13
Authors: Taigao Ma, Mustafa Tobah, Haozhu Wang, L. Jay Guo. 《Opto-Electronic Science》, 2022, No. 1, pp. 37-51 (15 pages)
Photonic inverse design concerns the problem of finding photonic structures with target optical properties. However, traditional methods based on optimization algorithms are time-consuming and computationally expensive. Recently, deep learning-based approaches have been developed to tackle the inverse design problem efficiently. Although most of these neural network models have demonstrated high accuracy on different inverse design problems, no previous study has examined the potential effects under given constraints in nanomanufacturing. Additionally, the relative strengths of different deep learning-based inverse design approaches have not been fully investigated. Here, we benchmark three commonly used deep learning models for inverse design: tandem networks, Variational Auto-Encoders, and Generative Adversarial Networks. We provide detailed comparisons in terms of their accuracy, diversity, and robustness. We find that tandem networks and Variational Auto-Encoders give the best accuracy, while Generative Adversarial Networks lead to the most diverse predictions. Our findings could serve as a guideline for researchers to select the model that best suits their design criteria and fabrication considerations. In addition, our code and data are publicly available and could be used for future inverse design model development and benchmarking.
Keywords: inverse design; photonics; machine learning; neural networks; generative models
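Of the three benchmarked approaches, the tandem network is the simplest to outline: a forward model learns design to spectrum, is then frozen, and an inverse network is trained so that forward(inverse(spectrum)) reproduces the target. A hedged PyTorch sketch with toy dimensions and random stand-in data follows; it is not the authors' code:

```python
import torch
import torch.nn as nn

d_design, d_spec = 5, 50                      # toy sizes: 5 structure parameters, 50 spectral points
forward_net = nn.Sequential(nn.Linear(d_design, 64), nn.ReLU(), nn.Linear(64, d_spec))
inverse_net = nn.Sequential(nn.Linear(d_spec, 64), nn.ReLU(), nn.Linear(64, d_design))

designs = torch.rand(1024, d_design)          # stand-in training designs
spectra = torch.sin(designs @ torch.rand(d_design, d_spec))   # stand-in simulated spectra

# Stage 1: train the forward surrogate (design -> spectrum)
opt_f = torch.optim.Adam(forward_net.parameters(), lr=1e-3)
for _ in range(500):
    opt_f.zero_grad()
    loss = nn.functional.mse_loss(forward_net(designs), spectra)
    loss.backward(); opt_f.step()

# Stage 2: freeze the forward model and train the inverse net through it (tandem loss)
for p in forward_net.parameters():
    p.requires_grad_(False)
opt_i = torch.optim.Adam(inverse_net.parameters(), lr=1e-3)
for _ in range(500):
    opt_i.zero_grad()
    loss = nn.functional.mse_loss(forward_net(inverse_net(spectra)), spectra)
    loss.backward(); opt_i.step()
```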
Overview of Data-Driven Models for Wind Turbine Wake Flows
14
Authors: Maokun Ye, Min Li, Mingqiu Liu, Chengjiang Xiao, Decheng Wan. 《哈尔滨工程大学学报(英文版)》, 2025, No. 1, pp. 1-20 (20 pages)
With the rapid advancement of machine learning technology and its growing adoption in research and engineering applications, an increasing number of studies have embraced data-driven approaches for modeling wind turbine wakes. These models leverage the ability to capture complex, high-dimensional characteristics of wind turbine wakes while offering significantly greater efficiency in the prediction process than physics-driven models. As a result, data-driven wind turbine wake models are regarded as powerful and effective tools for predicting wake behavior and turbine power output. This paper aims to provide a concise yet comprehensive review of existing studies on wind turbine wake modeling that employ data-driven approaches. It begins by defining and classifying machine learning methods to facilitate a clearer understanding of the reviewed literature. Subsequently, the related studies are categorized into four key areas: wind turbine power prediction, data-driven analytic wake models, wake field reconstruction, and the incorporation of explicit physical constraints. The accuracy of data-driven models is influenced by two primary factors: the quality of the training data and the performance of the model itself. Accordingly, both data accuracy and model structure are discussed in detail within the review.
Keywords: Data-driven; machine learning; Artificial neural networks; Wind turbine wake; Wake models
Machine learning-based grayscale analyses for lithofacies identification of the Shahejie Formation, Bohai Bay Basin, China
15
Authors: Yu-Fan Wang, Shang Xu, Fang Hao, Hui-Min Liu, Qin-Hong Hu, Ke-Lai Xi, Dong Yang. 《Petroleum Science》, 2025, No. 1, pp. 42-54 (13 pages)
It is of great significance to accurately and rapidly identify shale lithofacies for the evaluation and prediction of sweet spots in shale oil and gas reservoirs. To address the problem of low resolution in logging curves, this study establishes a grayscale-phase model based on high-resolution grayscale curves, using clustering analysis algorithms, for shale lithofacies identification in the Shahejie Formation, Bohai Bay Basin, China. The grayscale phase is defined as the sum of absolute grayscale and relative amplitude, together with their features. The absolute grayscale is the absolute magnitude of the gray values and is used to evaluate the material composition (mineral composition + total organic carbon) of the shale, while the relative amplitude is the difference between adjacent gray values and is used to identify the shale structure type. The results show that the grayscale-phase model can identify shale lithofacies well; the accuracy and applicability of the model were verified by the fitted relationship between absolute grayscale and shale mineral composition, as well as the correspondence between relative amplitude and lamina development in shales. Four lithofacies are identified in the target layer of the study area: massive mixed shale, laminated mixed shale, massive calcareous shale, and laminated calcareous shale. This method can not only effectively characterize the material composition of shale, but also numerically characterize the degree of lamina development, addressing the difficulty of identifying millimeter-scale laminae from logging curves. It can provide technical support for shale lithofacies identification and sweet-spot evaluation and prediction in complex continental lacustrine basins.
Keywords: Shale; machine learning; Absolute grayscale; Relative amplitude; Grayscale phase model; Lithofacies identification
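A rough sketch of the clustering step, grouping depth points by the two grayscale attributes described above (absolute grayscale and relative amplitude), is given below with synthetic data; the choice of k-means, the number of clusters, and the feature construction are illustrative assumptions rather than the paper's exact workflow:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
gray = np.clip(rng.normal(120, 40, 3000), 0, 255)       # synthetic grayscale curve along depth

absolute_gray = gray                                     # proxy for material composition
relative_amp = np.abs(np.diff(gray, prepend=gray[0]))    # adjacent-value differences -> lamination proxy

features = StandardScaler().fit_transform(np.column_stack([absolute_gray, relative_amp]))
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)

# Each cluster would then be interpreted as a lithofacies (e.g., massive vs. laminated, mixed vs. calcareous)
for k in range(4):
    print(f"cluster {k}: {np.sum(labels == k)} depth points")
```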
Airship control based on the Q-Learning algorithm and neural networks (Cited by: 5)
16
Authors: 聂春雨, 祝明, 郑泽伟, 武哲. 《北京航空航天大学学报》 (EI CAS CSCD, PKU Core), 2017, No. 12, pp. 2431-2438 (8 pages)
To address the complexity of system modeling and parameter identification caused by dynamic-model uncertainty in modern airship control, a control strategy based on adaptive modeling and an online learning mechanism is proposed. A method is designed to build a Markov decision process (MDP) model for airship control from analysis of the actual motion, giving the approach its adaptivity. The Q-Learning algorithm is used for online learning, and a cerebellar model articulation controller (CMAC) neural network is used to generalize the action-value function and accelerate learning. The proposed method was simulated and compared with a PID controller with tuned parameters, verifying the effectiveness of the control strategy. The results show that the online learning process converges within several hours, and the MDP model built by the adaptive method meets the requirements of common airship control tasks. The proposed controller achieves control accuracy comparable to that of the PID controller while behaving more intelligently.
Keywords: airship; Markov decision process (MDP); machine learning; Q-learning; cerebellar model articulation controller (CMAC)
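The core of the online learning scheme summarized above is the Q-Learning update of an action-value estimate. A minimal tabular sketch is shown below, with a toy state/action space standing in for the airship MDP and no CMAC generalization:

```python
import numpy as np

n_states, n_actions = 10, 3             # toy discretization standing in for the airship MDP
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.95, 0.1      # learning rate, discount factor, exploration rate
rng = np.random.default_rng(0)

def step(s, a):
    # Placeholder environment: random next state, reward favors action 1 in "low" states
    s_next = rng.integers(n_states)
    reward = 1.0 if (a == 1 and s < 5) else -0.1
    return s_next, reward

s = rng.integers(n_states)
for _ in range(5000):
    # epsilon-greedy action selection
    a = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[s]))
    s_next, r = step(s, a)
    # Q-Learning temporal-difference update
    Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])
    s = s_next
```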
Generalization properties of restricted Boltzmann machine for short-range order
17
Authors: M. A. Timirgazin, A. K. Arzhnikov. 《Chinese Physics B》 (SCIE EI CAS CSCD), 2023, No. 6, pp. 556-562 (7 pages)
A biased sampling algorithm for the restricted Boltzmann machine (RBM) is proposed, which allows generating configurations with a conserved quantity. To validate the method, a study of the short-range order in binary alloys with positive and negative exchange interactions is carried out. The network is trained on data collected by Monte Carlo simulations for a simple Ising-like binary alloy model and used to calculate the Warren-Cowley short-range order parameter and other thermodynamic properties. We demonstrate that the proposed method allows us not only to correctly reproduce the order parameters for the alloy concentration at which the network was trained, but also to predict them for any other concentration.
Keywords: machine learning; short-range order; Ising model; restricted Boltzmann machine
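As a schematic of how a conserved quantity can be imposed during RBM sampling (one simple variant: reject Gibbs proposals that change the conserved sum), consider the sketch below; the weights are random, untrained placeholders rather than a trained alloy model, and the rejection rule is only one possible biasing strategy, not necessarily the one used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 16, 8
W = rng.normal(0, 0.1, (n_visible, n_hidden))   # placeholder (untrained) RBM parameters
b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    # One block-Gibbs sweep: sample hidden given visible, then visible given hidden
    h = (rng.random(n_hidden) < sigmoid(v @ W + b_h)).astype(float)
    return (rng.random(n_visible) < sigmoid(W @ h + b_v)).astype(float)

def biased_sample(n_up, n_sweeps=2000):
    # Keep only configurations whose number of "up" sites (alloy concentration) is conserved
    v = np.zeros(n_visible)
    v[:n_up] = 1.0
    for _ in range(n_sweeps):
        proposal = gibbs_step(v)
        if proposal.sum() == n_up:
            v = proposal
    return v

print(biased_sample(n_up=8))
```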
Prediction of (n,2n) reaction cross-sections of long-lived fission products based on tensor model
18
Authors: Jia-Li Huang, Hui Wang, Ying-Ge Huang, Er-Xi Xiao, Yu-Jie Feng, Xin Lei, Fu-Chang Gu, Long Zhu, Yong-Jing Chen, Jun Su. 《Nuclear Science and Techniques》 (SCIE EI CAS CSCD), 2024, No. 10, pp. 208-221 (14 pages)
Interest has recently emerged in potential applications of (n,2n) reactions of unstable nuclei, but challenges have arisen because of the scarcity of experimental cross-section data. This study aims to predict the (n,2n) reaction cross-sections of long-lived fission products based on a tensor model. The tensor model is an extension of the collaborative filtering algorithm used for nuclear data; it relies on tensor decomposition and completion to predict (n,2n) reaction cross-sections, with the corresponding EXFOR data used as training data. The reliability of the proposed tensor model was validated by comparing its calculations with data from EXFOR and different databases. Predictions were made for long-lived fission products such as ⁶⁰Co, ⁷⁹Se, ⁹³Zr, ¹⁰⁷Pd, ¹²⁶Sn, and ¹³⁷Cs, which provide a predicted energy range for effectively transmuting long-lived fission products into shorter-lived or less radioactive isotopes. This method could be a powerful tool for completing (n,2n) reaction cross-section data and shows the possibility of selective transmutation of nuclear waste.
Keywords: (n,2n) reaction cross-section; Tensor model; machine learning; Collaborative filtering algorithm; Selective transmutation
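The tensor decomposition-and-completion idea, factoring a partially observed (nuclide x energy x source) tensor into low-rank factors and using the reconstruction to fill in missing cross-sections, can be sketched with a toy CP factorization fitted by gradient descent on the observed entries only; the dimensions, rank, and data below are all placeholders, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy partially observed tensor: (nuclide, incident-energy bin, data source)
T = rng.random((6, 8, 3))
mask = rng.random(T.shape) > 0.3              # True where a cross-section value is "measured"

rank = 3
A = rng.normal(0, 0.1, (6, rank))             # nuclide factors
B = rng.normal(0, 0.1, (8, rank))             # energy factors
C = rng.normal(0, 0.1, (3, rank))             # data-source factors

def reconstruct(A, B, C):
    # CP/PARAFAC reconstruction: sum of rank-1 outer products
    return np.einsum('ir,jr,kr->ijk', A, B, C)

lr = 0.01
for _ in range(5000):
    R = (reconstruct(A, B, C) - T) * mask     # residual on observed entries only
    gA = np.einsum('ijk,jr,kr->ir', R, B, C)  # gradients of 0.5*||R||^2
    gB = np.einsum('ijk,ir,kr->jr', R, A, C)
    gC = np.einsum('ijk,ir,jr->kr', R, A, B)
    A -= lr * gA; B -= lr * gB; C -= lr * gC

completed = reconstruct(A, B, C)              # provides estimates for the unobserved entries too
print("fit on observed entries:", np.sqrt(np.mean(((completed - T) * mask) ** 2)))
```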
Online Neural Network Tuned Tube-Based Model Predictive Control for Nonlinear System
19
Authors: Yuzhou Xiao, Yan Li, Lingguo Cui. 《Journal of Beijing Institute of Technology》 (EI CAS), 2024, No. 6, pp. 547-555 (9 pages)
This paper proposes a robust control scheme based on sequential convex programming and a learning-based model for nonlinear systems subject to additive uncertainties. To handle the system nonlinearity and unknown uncertainties, we study a tube-based model predictive control scheme that makes use of a feedforward neural network. Based on the boundedness of the average cost function as time approaches infinity, a min-max optimization problem (referred to as the min-max OP) is formulated to design the controller. The feasibility of this optimization problem and the practical stability of the controlled system are ensured. To demonstrate the efficacy of the proposed approach, a numerical simulation of a double-tank system is conducted, and the simulation results verify the effectiveness of the proposed scheme.
Keywords: nonlinear model predictive control; machine learning; neural network control
A machine learning-based method for predicting hydrogen sulfide content in underground gas storage in sour gas reservoirs
20
Authors: 冯国庆, 杜勤锟, 周道勇, 蔡家兰, 程希, 莫海帅. 《天然气工业》 (PKU Core), 2025, No. 2, pp. 159-169 (11 pages)
Underground gas storage (UGS) facilities contain harmful gases such as hydrogen sulfide, which not only affect the safe operation of the storage but also cause serious environmental pollution, so accurately predicting the H₂S content of the produced gas is of great importance. At present, compositional models in reservoir numerical simulation are commonly used to predict H₂S content, but the calculation is complex and time-consuming and cannot be used conveniently and quickly to predict the H₂S content of individual storage wells. Taking the HCX gas storage as the study object, a mechanistic model of the storage was built and numerical simulation was carried out; with the multi-cycle H₂S predictions computed by the mechanistic model as the sample set, three machine learning algorithms, multi-output support vector regression (MSVR), long short-term memory (LSTM) networks, and artificial neural networks (ANN), were used to build intelligent surrogate models of H₂S content, and the prediction accuracy of the three models was compared. The results show that: (1) the LSTM model offers moderate training time and good prediction accuracy and can serve as the intelligent surrogate model for H₂S prediction at the HCX storage; (2) after further optimization of the LSTM training data and of its overfitting behavior, the optimal training set size is 1,500 samples, the optimal dropout rate is 0.2, the number of hidden layers can be kept to 1-2, and the number of nodes to 30-60; (3) field application at the HCX storage shows that the established LSTM surrogate model can accurately predict the H₂S content of the produced gas. It is concluded that the optimized LSTM surrogate model has good extrapolation capability, and the results can provide technical support for the construction and safe, efficient operation of H₂S-bearing gas storage facilities.
Keywords: sour gas storage; numerical simulation; compositional simulation; hydrogen sulfide content prediction; machine learning; long short-term memory network model; machine learning model optimization
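To illustrate the kind of LSTM surrogate the abstract settles on (1-2 hidden layers, 30-60 units, dropout 0.2, trained on simulator-generated cycles), a hedged PyTorch sketch with synthetic stand-in data follows; it is not the authors' model, feature set, or data:

```python
import torch
import torch.nn as nn

class H2SSurrogate(nn.Module):
    """Maps a sequence of operating-cycle features to the produced-gas H2S content."""
    def __init__(self, in_dim=4, hidden=48, layers=2, dropout=0.2):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden, num_layers=layers,
                            dropout=dropout, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                      # x: (batch, time steps, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]).squeeze(-1)   # predict H2S content at the last step

model = H2SSurrogate()
x = torch.rand(1500, 24, 4)                    # 1,500 simulator-generated sequences (placeholder)
y = torch.rand(1500)                           # corresponding H2S contents (placeholder)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(50):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
```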